
Overview
Role description
Job Title: Lead Data Engineer
Location: Pune, Hyderabad, Bangalore, Kochi, Trivandrum, Chennai
Key Technologies:
AWS (Glue, Lambda, Redshift, S3), Python, Spark, Terraform, CloudFormation, SQL, PySpark
About the Role:
UST is seeking a Lead Data Engineer with 8-10 years of experience to spearhead our data engineering initiatives. In this role, you will architect scalable, reliable data solutions and drive technical strategies that align with business goals. You will work closely with stakeholders, clients, and cross-functional teams to deliver comprehensive data solutions from inception to production deployment.
As the Lead Data Engineer, you will provide technical leadership, mentor engineering teams, and ensure the implementation of best practices in data engineering and AWS technologies.
Key Responsibilities:
- Lead and manage data engineering projects, driving technical direction and ensuring successful delivery.
- Mentor engineers at all levels, from junior to senior, providing guidance and fostering skill development.
- Collaborate with stakeholders to define requirements and translate them into data engineering solutions.
- Design and implement scalable, secure, and high-performance data pipelines and systems.
- Ensure best practices in AWS implementations and data engineering are followed across teams.
- Oversee infrastructure management using AWS CloudFormation and Terraform.
- Work with CI/CD pipelines and automation tools to streamline development processes.
- Ensure the team’s technical solutions are aligned with business objectives and performance goals.
- Manage tasks and projects using JIRA, tracking progress and delivering results on time.
Technical Skills Required:
- Programming Languages: Expertise in Python, PySpark, and Spark architecture, with a focus on performance tuning and optimization.
- SQL: Advanced proficiency in writing optimized, performant SQL queries and stored procedures.
- AWS Data Engineering Stack: Extensive experience with AWS services including Glue, Lambda, API Gateway, EMR, S3, Redshift, and Athena.
- Data Pipelines: Proven ability to design and deploy scalable, secure, high-performance data pipelines.
- Infrastructure as Code (IaC): Experience with AWS CloudFormation and Terraform.
- CI/CD & Automation: Hands-on experience with continuous integration, continuous deployment, and automation tools.
- Unix/Linux: Knowledge of Unix/Linux scripting and administration is a plus.
Skills
Glue, Python, AWS Cloud, Spark