Bangalore, Karnataka, India
Information Technology
Photon
Overview
Location: Bangalore
Experience: 6-9 Years
Role Overview:
We are looking for a Data Engineer with strong expertise in Python, PySpark, AWS Glue, and Lambda. The ideal candidate should have experience in data processing, transformation, and cloud-based data workflows. A strong foundation in PL/SQL and PostgreSQL is required, and experience with AWS services such as SNS and Step Functions, along with Java API development, would be a plus.
Key Responsibilities:
- Develop and optimize ETL pipelines using AWS Glue, Lambda, and PySpark.
- Work with large-scale data processing and transformation using Python and Spark.
- Design and implement data workflows for structured and unstructured data.
- Optimize queries and data storage for performance and cost efficiency.
- Develop and maintain data models, schemas, and database structures using PL/SQL and PostgreSQL.
- Integrate AWS services such as SNS and Step Functions into cloud-based workflows.
- Collaborate with cross-functional teams to support business data needs.
- Troubleshoot and optimize data pipelines for scalability and reliability.
Must-Have Skills:
✅ Python & PySpark – Strong hands-on experience
✅ AWS Glue & Lambda – Experience in serverless data processing
✅ PL/SQL & PostgreSQL – Basic database development and querying skills
Good to Have:
- SNS & Step Functions – AWS workflow automation
- Java API – Experience in API development and integration