Job Responsibilities: Data Engineer
Salary: $20-30/Hour
Company: Costco
Location: Lansing, USA
Educational Requirements: Bachelor's Degree
This is an environment unlike anything in the high-tech world, and the secret of Costco's success is its culture. The value Costco places on its employees is well documented in articles from a variety of publishers including Bloomberg and Forbes. Our employees and our members come FIRST. Costco is well known for its generosity and community service and has received many awards for its philanthropy. The company joins with its employees to take an active role in volunteering by sponsoring many opportunities to help others. In 2021, Costco contributed over $58 million to organizations such as United Way and Children's Miracle Network Hospitals.
Costco IT is responsible for the technical future of Costco Wholesale, the third largest retailer in the world with wholesale operations in fourteen countries. Despite our size and explosive international expansion, we continue to provide a family, employee-centric atmosphere in which our employees thrive and succeed. As proof, Costco ranks seventh in Forbes' "World's Best Employers".
The Data Engineer - Data Analytics is responsible for end-to-end data pipelines that power analytics and data services. This position is focused on data engineering to build and deliver automated data pipelines from a plethora of internal and external data sources. The Data Engineer will partner with product owners, engineering, and data platform teams to design, build, test, and automate data pipelines that are relied upon across the organization as the single source of truth.
If you want to be a part of one of the worldwide BEST companies "to work for", simply apply and let your career be reimagined.
Role
- Develops and operationalizes data pipelines to make data available for consumption (BI, advanced analytics, services).
- Works in tandem with data architects and data/BI engineers to design data pipelines, and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
- Designs, develops, and implements ETL/ELT processes using IICS (Informatica Cloud).
- Uses Azure services such as Azure SQL DW (Synapse), ADLS, Azure Event Hub, and Azure Data Factory to improve and speed up delivery of our data products and services.
- Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights to the organization.
- Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery.
- Identifies ways to improve data reliability, efficiency, and quality of data management.
- Communicates technical concepts to non-technical audiences in both written and verbal form.
- Performs peer reviews of other data engineers' work.
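As a rough illustration of the kind of automated pipeline work described above, here is a minimal extract-transform sketch in Python (standard library only). The sample data, field names, and quality rule are hypothetical stand-ins for real internal feeds, not part of the posting:

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical sample extract; in practice this data would land from an
# internal or external source (e.g. a file dropped into ADLS).
raw_csv = StringIO("""order_id,store,amount
1,Lansing,19.99
2,Lansing,
3,Detroit,42.50
""")

def run_pipeline(source):
    """Toy end-to-end step: ingest rows, apply a data-quality rule, aggregate."""
    totals = defaultdict(float)
    for row in csv.DictReader(source):   # extract
        if not row["amount"]:            # quality rule: skip rows with no amount
            continue
        totals[row["store"]] += float(row["amount"])  # transform/aggregate
    return dict(totals)                  # a real load step would write to a warehouse table

totals = run_pipeline(raw_csv)
print(totals)
```

A production version of this step would be orchestrated (e.g. by Azure Data Factory or IICS) rather than run by hand, but the extract / quality-check / aggregate shape is the same.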
REQUIRED
- 5+ years' experience engineering and operationalizing data pipelines with large and complex datasets.
- 5+ years' hands-on experience with Informatica PowerCenter.
- 2+ years' hands-on experience with Informatica IICS.
- 3+ years' experience working with cloud technologies such as ADLS, Azure Databricks, Spark, Azure Synapse, Cosmos DB, and other big data technologies.
- Extensive experience working with various data sources (SQL, Oracle database, flat files (CSV, delimited), web API, XML).
- Advanced SQL skills required. Solid understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources.
- 5+ years' experience with data modeling, ETL, and data warehousing.
- Strong understanding of database storage concepts (data lake, relational databases, NoSQL, graph, data warehousing).
- Scheduling flexibility to meet the needs of the business, including weekends, holidays, and 24/7 on-call responsibilities on a rotational basis.
- Able to work in a fast-paced agile development environment.
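To make the "complex SQL queries" requirement concrete, here is a small self-contained sketch using Python's built-in sqlite3 module. The schema and data are hypothetical stand-ins for the relational sources (SQL Server, Oracle) named above:

```python
import sqlite3

# Hypothetical in-memory schema standing in for a real relational source.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stores (store_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE sales  (sale_id INTEGER PRIMARY KEY, store_id INTEGER, amount REAL);
INSERT INTO stores VALUES (1, 'Midwest'), (2, 'West');
INSERT INTO sales  VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# A join-plus-aggregate query of the kind the role calls for:
# total sales per region, filtered to regions above a threshold.
rows = conn.execute("""
    SELECT s.region, SUM(t.amount) AS total
    FROM sales t
    JOIN stores s ON s.store_id = t.store_id
    GROUP BY s.region
    HAVING SUM(t.amount) > 60
    ORDER BY total DESC
""").fetchall()
print(rows)  # -> [('Midwest', 150.0), ('West', 75.0)]
```

The same join/group/filter pattern applies against Synapse or Oracle; only the connection library and SQL dialect details change.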
Recommended
- BA/BS in Computer Science, Engineering, or equivalent software/services experience.
- Azure certifications.
- Experience implementing data integration techniques such as event/message-based integration (Kafka, Azure Event Hub), ETL.
- Experience with Git / Azure DevOps.
- Experience delivering data solutions using agile software development methodologies.
- Exposure to the retail industry.
- Excellent verbal and written communication skills.
- Experience working with SAP integration tools such as BODS.
- Experience with UC4 Job Scheduler.
Required Documents
- Cover Letter
- Resume
