Job Responsibilities: Data Engineer
Salary: $20-30/Hour
Company: Costco
Location: Connecticut, USA
Educational Requirements: Bachelor's Degree
This is an environment unlike anything in the high-tech world, and the secret of Costco's success is its culture. The value Costco places on its employees is well documented in articles from a variety of publishers, including Bloomberg and Forbes. Our employees and our members come FIRST. Costco is well known for its generosity and community service and has received many awards for its philanthropy. The company joins with its employees to take an active role in volunteering by sponsoring many opportunities to help others. In 2021, Costco contributed over $58 million to organizations including United Way and Children's Miracle Network Hospitals.
Costco IT is responsible for the technical future of Costco Wholesale, the third largest retailer in the world, with wholesale operations in fourteen countries. Despite our size and explosive international expansion, we continue to offer a family, employee-centric environment in which our employees thrive and succeed. As proof, Costco ranks seventh on Forbes' "World's Best Employers" list.
The Data Engineer is responsible for developing data pipelines and/or data integrations for Costco's enterprise certified data sets, which are used for business-critical data consumption use cases (i.e. reporting, data science/machine learning, data APIs, etc.). At Costco, we are on a mission to significantly leverage data to provide better products and services for our members. This role is focused on data engineering to build and deliver automated data pipelines from a plethora of internal and external data sources. The Data Engineer will partner with product owners, data architects, and data platform teams to design, build, test, and automate data pipelines that are relied upon across the company as the single source of truth.
If you want to be part of one of the world's best companies "to work for," simply apply and let your career be reimagined.
FUNCTION
- Develops and operationalizes data pipelines to create enterprise certified data sets that are made available for consumption (BI, advanced analytics, APIs/services).
- Works in tandem with Data Architects, Data Stewards, and Data Quality Engineers to design data pipelines and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
- Designs, develops, and implements ETL/ELT/CDC processes using Informatica Intelligent Cloud Services (IICS).
- Uses Azure services such as Azure SQL DW (Synapse), ADLS, Azure Event Hub, Cosmos DB, Databricks, and Delta Lake to improve and speed delivery of our data products and services.
- Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights to the organization.
- Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery.
- Identifies ways to improve data reliability, efficiency, and quality of data management.
- Communicates technical concepts to non-technical audiences in both written and verbal form.
- Performs peer reviews of other Data Engineers' work.
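To illustrate the kind of pipeline work described above, here is a minimal extract-transform-load sketch using only Python's standard library (the feed, table, and column names are hypothetical; in practice the source would be a flat file, API, or message stream, and the target a warehouse such as Synapse rather than SQLite):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; real pipelines would pull from internal/external sources.
RAW_CSV = """member_id,region,total_spend
1001,Northeast,250.00
1002,Northeast,
1001,Northeast,250.00
1003,Southeast,410.50
"""

def extract(raw: str) -> list:
    """Extract: parse the raw delimited feed into records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Transform: enforce a data-quality rule and deduplicate exact repeats."""
    seen, out = set(), []
    for row in rows:
        spend = row["total_spend"].strip()
        if not spend:
            continue  # quality rule: total_spend is required
        key = (row["member_id"], row["region"], spend)
        if key in seen:
            continue  # drop duplicate records
        seen.add(key)
        out.append((int(row["member_id"]), row["region"], float(spend)))
    return out

def load(rows: list, conn: sqlite3.Connection) -> int:
    """Load: write the certified data set to the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS member_spend "
        "(member_id INTEGER, region TEXT, total_spend REAL)"
    )
    conn.executemany("INSERT INTO member_spend VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
print(loaded)  # 2 clean, unique rows survive the quality checks
```

In production, each stage would also be automated and orchestrated (scheduling, retries, monitoring) rather than run inline as here.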
REQUIRED
- 5+ years' experience engineering and operationalizing data pipelines with large and complex datasets.
- 2+ years' hands-on experience with Informatica IICS or other ETL tools.
- 3+ years' experience working with cloud technologies such as ADLS, Azure Databricks, Spark, Azure Synapse, Cosmos DB, and other big data technologies.
- Significant experience working with various data sources (DB2, SQL, Oracle, flat files (CSV, delimited), APIs, XML, and JSON).
- Experience implementing data integration techniques such as event/message-based integration (Kafka, Azure Event Hub) and ETL.
- Advanced SQL skills; strong understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources.
- 5+ years' experience with data pipelines, ETL, and data warehousing.
- Strong understanding of database storage concepts (data lake, relational databases, NoSQL, graph, data warehousing).
- Experience with Git / Azure DevOps.
- Able to work in a fast-paced agile development environment.
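As a small, self-contained illustration of the advanced SQL expected above, the sketch below runs a windowed aggregation (per-region share and rank) through SQLite via Python's standard library; the table, columns, and values are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (member_id INTEGER, region TEXT, amount REAL);
INSERT INTO sales VALUES
  (1001, 'Northeast', 250.0),
  (1002, 'Northeast', 120.0),
  (1003, 'Southeast', 410.5),
  (1004, 'Southeast', 90.0);
""")

# Window functions: each member's amount alongside its share of the
# regional total and its rank within the region.
query = """
SELECT member_id,
       region,
       amount,
       ROUND(amount * 100.0 / SUM(amount) OVER (PARTITION BY region), 1)
           AS pct_of_region,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS region_rank
FROM sales
ORDER BY region, region_rank
"""
rows = list(conn.execute(query))
for row in rows:
    print(row)
```

The same pattern (partitioned aggregates and ranking in one pass) carries over to warehouse engines such as Synapse.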
RECOMMENDED
- Azure certifications.
- BA/BS in Computer Science, Engineering, or equivalent software/services experience.
- Experience delivering data solutions using agile software development methodologies.
- Exposure to the retail industry.
- Excellent verbal and written communication skills.
- Experience working with SAP integration tools such as BODS.
- Experience with job scheduling and orchestration tools.
REQUIRED DOCUMENTS
- Cover Letter
- Resume
