Technology & Transformation - EAD - Engineering - AWS Data Engineer

Deloitte Bengaluru, IN 46615

Posted 3 weeks ago

What impact will you make?

Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential.


The Team

Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Learn more about Analytics and Information Management Practice

Work you'll do

As a Senior Consultant in our Consulting team, you'll build and nurture positive working relationships with teams and clients, with the intention of exceeding client expectations.

We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.

Key Responsibilities:

1. Design and implement scalable, high-performance data pipelines using AWS services

2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda (see the illustrative sketch after this list)

3. Build and maintain data lakes using S3 and Delta Lake

4. Create and manage analytics solutions using Amazon Athena and Redshift

5. Design and implement database solutions using Aurora, RDS, and DynamoDB

6. Develop serverless workflows using AWS Step Functions

7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL

8. Ensure data quality, security, and compliance with industry standards

9. Collaborate with data scientists and analysts to support their data needs



10. Optimize data architecture for performance and cost-efficiency

11. Troubleshoot and resolve data pipeline and infrastructure issues
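
For illustration only, the kind of work items 1-3 and 7 describe might look like the following minimal PySpark sketch of a batch ETL step that reads raw JSON from S3, cleans it, and writes partitioned Parquet back to the data lake. The bucket paths, column names, and schema are hypothetical; a production Glue or EMR job would add job bookmarks, error handling, and data-quality checks.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical S3 locations -- placeholders, not real buckets.
RAW_PATH = "s3://example-raw-bucket/orders/"
CURATED_PATH = "s3://example-curated-bucket/orders_curated/"

spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

# Read raw JSON files landed in the data lake (S3).
raw = spark.read.json(RAW_PATH)

# Basic cleansing: drop incomplete rows, normalize types, derive a partition column.
curated = (
    raw.dropna(subset=["order_id", "order_ts"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
       .dropDuplicates(["order_id"])
)

# Write partitioned Parquet back to S3 so Athena or Redshift Spectrum can query it.
curated.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)
```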


Required Qualifications:

1. Bachelor's degree in Computer Science, Information Technology, or a related field

2. Relevant years of experience as a Data Engineer, with at least 60% of that experience focused on AWS

3. Strong proficiency in AWS data services: Glue, EMR, Lambda, Athena, Redshift, S3

4. Experience with data lake technologies, particularly Delta Lake

5. Expertise in database systems: Aurora, RDS, DynamoDB, PostgreSQL

6. Proficiency in Python and PySpark programming

7. Strong SQL skills and experience with PostgreSQL

8. Experience with AWS Step Functions for workflow orchestration

9. Familiarity with data modeling and schema design



10. Knowledge of data security and compliance requirements

11. Excellent problem-solving and analytical skills

12. Strong communication and collaboration abilities


Preferred Qualifications:

1. AWS Certified Data Analytics - Specialty

2. AWS Certified Solutions Architect - Associate or Professional

3. Experience with real-time data processing using Kinesis or Kafka

4. Knowledge of machine learning workflows on AWS (e.g., SageMaker)

5. Familiarity with containerization technologies (Docker, Kubernetes)

6. Experience with CI/CD pipelines and infrastructure-as-code (e.g., CloudFormation, Terraform)

Technical Skills:

  • AWS Services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions

  • Big Data: Hadoop, Spark, Delta Lake

  • Programming: Python, PySpark

  • Databases: SQL, PostgreSQL, NoSQL

  • Data Warehousing and Analytics (see the Athena sketch after this list)

  • ETL/ELT processes

  • Data Lake architectures

  • Version control: Git

  • Agile methodologies
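
To make the analytics skills above concrete, here is a hedged sketch of querying a curated S3 data lake table with Amazon Athena through boto3. The database, table, result bucket, and region are hypothetical placeholders; a production workflow would typically drive this from Step Functions or a scheduler rather than a fixed polling loop.

```python
import time
import boto3

# Hypothetical names -- placeholders, not real resources.
DATABASE = "analytics_db"
OUTPUT_LOCATION = "s3://example-athena-results/queries/"

athena = boto3.client("athena", region_name="ap-south-1")

# Submit a SQL query against a curated data-lake table.
response = athena.start_query_execution(
    QueryString="""
        SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM orders_curated
        GROUP BY order_date
        ORDER BY order_date DESC
        LIMIT 30
    """,
    QueryExecutionContext={"Database": DATABASE},
    ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
)
query_id = response["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows[1:]:  # the first row is the header
        print([col.get("VarCharValue") for col in row["Data"]])
```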

How you will grow

At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to help build excellent skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Centre.

Benefits

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose

Deloitte is led by a purpose: To make an impact that matters.

Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

