Job Responsibilities: Data Engineer
Salary: $20-30/Hour
Company: Costco
Location: McKinney, USA
Educational Requirements: Bachelor's Degree
This is an environment unlike anything in the high-tech world, and the secret of Costco's success is its culture. The value Costco places on its employees is well documented in articles from a variety of publishers including Bloomberg and Forbes. Our employees and our members come FIRST. Costco is well known for its generosity and community service and has received many awards for its philanthropy. The company joins with its employees to take an active role in volunteering by sponsoring many opportunities to help others. In 2021, Costco contributed over $58 million to organizations such as United Way and Children's Miracle Network Hospitals.
Costco IT is responsible for the technical future of Costco Wholesale, the third largest retailer in the world with wholesale operations in fourteen countries. Despite our size and explosive international growth, we continue to provide a family, employee-centric atmosphere in which our employees thrive and succeed. As proof, Costco ranks seventh in Forbes "World's Best Employers".
The Data Engineer is responsible for developing data pipelines and/or data integrations for Costco's enterprise certified data sets, which are used for business critical data consumption use cases (i.e., reporting, data science/machine learning, data APIs, etc.). At Costco, we are on a mission to radically leverage data to provide better products and services for our members. This role is focused on data engineering to build and deliver automated data pipelines from a plethora of internal and external data sources. The Data Engineer will partner with product owners, data architects, and data platform teams to design, build, test, and automate data pipelines that are relied upon across the company as the single source of truth.
If you want to be a part of one of the worldwide BEST companies "to work for", simply apply and let your career be reimagined.
Position
- Develops and operationalizes data pipelines to create enterprise certified data sets that are made available for consumption (BI, advanced analytics, APIs/services).
- Works in tandem with Data Architects, Data Stewards, and Data Quality Engineers to design data pipelines, and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
- Designs, develops, and implements ETL/ELT/CDC processes using Informatica Intelligent Cloud Services (IICS).
- Uses Azure services such as Azure SQL DW (Synapse), ADLS, Azure Event Hub, Cosmos DB, Databricks, and Delta Lake to improve and speed delivery of our data products and services (a minimal pipeline sketch follows this list).
- Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights for the organization.
- Identifies, designs, and implements internal process improvements: automating manual processes and optimizing data delivery.
- Identifies ways to improve data reliability, efficiency, and the quality of data management.
- Communicates technical concepts to non-technical audiences in both written and verbal form.
- Performs peer reviews of other data engineers' work.
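By way of illustration, the sketch below shows the general shape of such a pipeline on Databricks with Delta Lake: land raw files from ADLS, apply basic quality rules, and publish a certified table for downstream consumption. All paths, table names, and columns are hypothetical placeholders for illustration, not references to actual Costco systems.
```python
# Minimal pipeline sketch, assuming a Databricks/Spark environment with
# Delta Lake. Every path, table, and column name below is hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-ingest").getOrCreate()

# Ingest raw CSV files landed in ADLS (placeholder path).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("abfss://landing@example.dfs.core.windows.net/sales/"))

# Basic cleansing/quality rules before publishing a certified data set.
clean = (raw
         .dropDuplicates(["order_id"])
         .filter(F.col("order_total") >= 0)
         .withColumn("ingest_ts", F.current_timestamp()))

# Publish as a Delta table for BI/analytics consumption.
(clean.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("certified.sales_orders"))
```
Delta Lake appears here only because the posting lists it; any equivalent lakehouse table format would follow the same ingest-cleanse-publish pattern.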
REQUIRED
- 5+ years' experience engineering and operationalizing data pipelines with large and complex datasets.
- 2+ years' hands-on experience with Informatica IICS or other ETL tools.
- 3+ years' experience working with cloud technologies such as ADLS, Azure Databricks, Spark, Azure Synapse, Cosmos DB, and other big data technologies.
- Extensive experience working with various data sources (DB2, SQL, Oracle, flat files (CSV, delimited), APIs, XML, and JSON).
- Experience implementing data integration techniques such as event/message based integration (Kafka, Azure Event Hub) and ETL.
- Advanced SQL skills; strong knowledge of relational databases and business data; ability to write complex SQL queries against a variety of data sources (see the sketch after this list).
- 5+ years' experience with data pipelines, ETL, and data warehousing.
- Strong understanding of database storage concepts (data lake, relational databases, NoSQL, graph, data warehousing).
- Experience with Git / Azure DevOps.
- Able to work in a fast-paced agile development environment.
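To make the SQL expectation concrete, here is a hedged example of the kind of query the role might involve, run through Spark SQL against the hypothetical certified table from the earlier sketch; every table and column name is an assumption made for illustration.
```python
# Illustrative only: the sort of "complex SQL" the requirement refers to,
# executed via Spark SQL. Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-example").getOrCreate()

top_items = spark.sql("""
    WITH monthly AS (
        SELECT warehouse_id,
               item_id,
               date_trunc('month', sold_at) AS month,
               SUM(order_total)             AS revenue
        FROM certified.sales_orders
        GROUP BY warehouse_id, item_id, date_trunc('month', sold_at)
    )
    SELECT *
    FROM (
        SELECT m.*,
               RANK() OVER (PARTITION BY warehouse_id, month
                            ORDER BY revenue DESC) AS rnk
        FROM monthly m
    )
    WHERE rnk <= 5  -- top 5 items per warehouse per month
""")
top_items.show()
```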
RECOMMENDED
- Azure certifications.
- BA/BS in Computer Science, Engineering, or equivalent software/services experience.
- Experience delivering data solutions through agile software development methodologies.
- Exposure to the retail industry.
- Excellent verbal and written communication skills.
- Experience working with SAP integration tools such as BODS.
- Experience with job scheduling and orchestration tools.
Required documents
- Cover Letter
- Resume
