
Sr. Data Engineer

in Cook County - Chicago, IL

Job Description

Continental Casualty Company (Chicago, IL) seeks a Sr. Data Engineer with a bachelor's degree in computer science, IT, engineering, or a related field and five (5) years of experience as a data engineer, data analyst, software developer, software engineer, systems analyst, technical architect, or a related role.

Responsibilities:

- Lead the design and build of data solutions and applications that enable reporting, analytics, data science, and data management.
- Lead data application development teams and provide guidance to others to support the building of complex technical capabilities.
- Lead teams to design, implement, and automate data pipelines, including sourcing data from internal and external systems and transforming the data for the optimal needs of various systems and business requirements.
- Lead robust unit testing to ensure deliverables match the design, and provide expertise to support subsequent release testing.
- Lead the creation of integrated data views based on business or analytics requirements.
- Build large-scale batch and real-time data pipelines with data processing frameworks on the GCP cloud platform.
- Develop data pipelines and APIs using Python, SQL, Spark, and GCP services (see the PySpark sketch below).
- Develop Apache Airflow DAGs on Cloud Composer using Python scripts in GCP to orchestrate the cloud compute process (see the DAG sketch below).
- Develop Python scripts in GCP to perform batch processing, using predefined APIs to read/write batch data into BigQuery (see the BigQuery sketch below).
- Create SNS or Pub/Sub notification processes for job-failure alerts and consumer notification processes for job completion (see the Pub/Sub sketch below).
- Research, identify, and implement process improvements that address complex technology gaps.
- Maintain professional and technical knowledge by attending educational workshops, reviewing professional publications, establishing personal networks, and participating in professional societies.

Must have some work experience with each of the following:

1.) Building large-scale batch and real-time data pipelines with data processing frameworks on the GCP cloud platform;
2.) Developing Python scripts in AWS or GCP to perform batch processing, using predefined APIs to read/write batch data into BigQuery or Aurora;
3.) Developing State Machines and Lambdas, or Apache Airflow DAGs on Cloud Composer, using Python scripts in AWS or GCP to orchestrate the cloud compute process; and
4.) Creating SNS or Pub/Sub notification processes for job-failure alerts and consumer notification processes for job completion.

100% telecommuting allowed from anywhere within the United States. Submit cover letter and resume at www.cna.com.
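By way of illustration of the Python/SQL/Spark pipeline work listed above, a minimal PySpark batch transform; the bucket paths and column names are hypothetical, not taken from the posting:

    # Minimal PySpark batch sketch: read raw files, aggregate, write
    # curated output. All paths and columns are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("batch_pipeline").getOrCreate()

    df = spark.read.parquet("gs://my-bucket/raw/")  # hypothetical source path
    out = df.groupBy("customer_id").agg(F.sum("amount").alias("total"))
    out.write.mode("overwrite").parquet("gs://my-bucket/curated/")  # hypothetical sink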
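The Airflow orchestration duty can be sketched as a DAG of the kind Cloud Composer (GCP's managed Airflow) schedules. This assumes Airflow 2.x as bundled with current Composer; the DAG id, schedule, and task body are placeholders:

    # Minimal Airflow 2.x DAG sketch: one Python task orchestrated on a
    # daily schedule. All names are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_batch_step():
        # Placeholder for the cloud compute step the DAG orchestrates.
        print("batch step running")

    with DAG(
        dag_id="batch_pipeline",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="run_batch_step", python_callable=run_batch_step)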
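Reading and writing batch data into BigQuery from Python is typically done with the google-cloud-bigquery client; a minimal sketch, with hypothetical project, dataset, and table names:

    # Minimal BigQuery batch read/write sketch using google-cloud-bigquery.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project ID

    # Read: run a query and iterate the result rows.
    rows = client.query(
        "SELECT name, total FROM `my-project.my_dataset.src`"
    ).result()
    for row in rows:
        print(row["name"], row["total"])

    # Write: insert a batch of JSON rows into a destination table.
    errors = client.insert_rows_json(
        "my-project.my_dataset.dst",
        [{"name": "example", "total": 1}],
    )
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")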
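Finally, the Pub/Sub notification process for job-failure alerts and job-completion messages can be sketched with the google-cloud-pubsub publisher; the project and topic names below are placeholders:

    # Minimal Pub/Sub publish sketch for job-status notifications.
    import json

    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-project", "job-status")  # hypothetical

    payload = {"job": "batch_pipeline", "status": "FAILED"}  # or "SUCCEEDED"
    future = publisher.publish(topic_path, json.dumps(payload).encode("utf-8"))
    future.result()  # block until the service acknowledges the message

Consumers subscribed to the same topic can then drive the downstream notification process on job completion.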


Job Posting: 11974317

Posted On: Jun 13, 2024

Updated On: Jul 18, 2024
