AWS Midstream SME - 121554BR

at Infosys Limited in Chicago, Illinois, United States

Job Description

As a Principal Consultant, you will be involved in the entire software development lifecycle, from conception to completion. Your responsibilities will include:

Design and document solutions to ensure maintainability, and conduct reviews to ensure solutions follow best practices and solve the business problem.
Work collaboratively with delivery teams, including users, data scientists, statisticians, and analysts, to develop data solutions and pipelines that meet their analytical and reporting needs.
Assemble large, complex data sets that meet functional and non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Optimize the performance of corporate data platforms and services to ensure they can handle large volumes of data and support real-time processing and analysis.
Design and implement efficient and reliable data models to store and retrieve data.
Build and optimize data ingestion pipelines using AWS services such as Lambda, Kinesis, and Glue.

Candidates must be located within commuting distance of Chicago or be willing to relocate to the area. This position may require travel and/or relocation.

Basic Qualifications:

Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
At least 10 years of experience in Information Technology.
Experience with AWS Glue, AWS Redshift/Spectrum, S3, API Gateway, Athena, Step Functions, and Lambda.
Development experience in one or more object-oriented programming languages (e.g., C++, C#, Java, Python, Golang, PowerShell, Ruby), preferably Python.

Preferred Qualifications:

Ability to identify the algorithm with the best time complexity; understanding of Big-O, Big-Omega, and Big-Theta notation for deriving time complexity.
Experience with the Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) data integration patterns.
Experience in designing and building data pipelines.
Ability to create or consume hosted APIs.
Databases: PostgreSQL, Microsoft SQL Server, Oracle.
Advanced SQL knowledge
Experience designing and implementing large-scale distributed systems.
Use of Azure DevOps for backlogs, wikis, code repositories, and CI/CD pipelines.
Develop a comprehensive understanding of the organization's data structures and metrics, advocating for changes needed for product development.
Build the necessary DevOps capabilities, such as Continuous Integration (CI), Continuous Deployment (CD), and Continuous Testing (CT).
Work collaboratively with team members to deliver all commitments for the iteration and, in turn, the PI.
Document the necessary details of changes in the wiki.
Participate in demonstrations of the product to the Product Owners during the Iteration Review and System Demo. Ability to work in a collaborative, multi-application team environment.
Experience with, and a desire to work in, a global delivery environment.

The job entails an extensive amount of travel. It also entails sitting and working at a computer for extended periods of time. Candidates should be able to communicate by telephone, email, or face to face.

Equal Opportunity Employer - minorities/females/veterans/individuals with disabilities/sexual orientation/gender identity

Job Posting: 12100574

Posted On: Aug 05, 2024

Updated On: Sep 04, 2024
