Lead Data Engineer

at Team TAG Services LLC in Chicago, Illinois, United States

Job Description

Lead Data Engineer, Chicago, IL, for Team TAG Services, LLC (TAG): Partner with business, analytics, and engineering teams to design and build data structures that facilitate reporting, models, and the monitoring of key performance metrics. Collaborate across disciplines, identify internal/external data sources to design data assets, define a data pipeline strategy and automated testing, and implement scalable data solutions. Be part of a team building cloud-native solutions with open-source tools and technologies. Partner with the business to build self-service data assets and modern data science/analytics solutions, work with big data, and bring modern technology standards to an industry that benefits tens of millions of patients and thousands of employees.

Specific duties include: Lead cross-functional engagements to understand data, reporting, and data asset requirements and the architecture of data products. Lead engagements with engineering teams to collect required data from internal and external systems, then design and implement data pipelines for enterprise use. Architect, design, and implement an ETL strategy to build performant data solutions that are reliable and scalable in a fast-growing cloud-native data ecosystem: use SQL for the transformation layer in dbt and extend dbt functionality with Python. Rebuild and automate legacy reporting pipelines on a new platform by implementing event streaming and orchestrating pipelines so that business stakeholders can easily build upon them, utilizing various cloud-native tools: use Python for event-based transformations in GCP Dataflow and/or Cloud Functions. Contribute toward evolving the company's analytical self-service/ad hoc reporting by designing data APIs and curating a data warehouse; facilitate the build-out of a data dictionary and data asset lineage. Develop and conduct automated testing that allows for continuous delivery.
Develop and maintain ETL routines using orchestration tools such as Airflow, Prefect, and Kubeflow. Document and publish metadata and table designs to facilitate data adoption. Perform pipeline tuning as necessary.

Required: Bachelor's degree (or foreign equivalent) in Data Science, Mathematics, or a related data or computational social or hard science field and 5 years of experience in IT, Analytics, and/or Data Science, including: 2 years programming and maintaining SQL code, 2 years programming in Python or similar OOP languages, 2 years creating data pipelines, and 1 year maintaining monitoring for data pipelines; OR a Master's degree (or foreign equivalent) in Data Science, Mathematics, or a related data or computational social or hard science field and 3 years of experience in IT, Analytics, and/or Data Science, including: 2 years programming and maintaining SQL code, 2 years programming in Python or similar OOP languages, 2 years creating data pipelines, and 1 year maintaining monitoring for data pipelines. May work remotely up to 2 days/week.

Resumes to code YZ-LDE, Juliana Ximenes, Global Immigration Manager, TAG, 800 W Fulton Market, Chicago, IL 60607 (juliana.ximenescoutinhodias@aspendental.com).

Job Posting: 11990008

Posted On: Jun 19, 2024

Updated On: Jul 22, 2024
