Overview
AWS Data Engineer (Glue, IAM, S3, Athena, Terraform, Python, PySpark)
Job Description:
We are seeking an AWS Data Engineer with 5+ years of experience to support our data engineering and analytics initiatives. The ideal candidate will have strong expertise in AWS cloud services and big data tools, with a deep understanding of Python and PySpark for data transformation tasks.
Responsibilities:
- Develop and maintain ETL pipelines using AWS Glue and PySpark
- Manage and optimize data storage using Amazon S3, Athena, and the Glue Data Catalog
- Implement and manage IAM policies and roles for secure access control
- Automate infrastructure deployment using Terraform
- Write clean, scalable, and well-documented code in Python
Experience: 5+ years
Work Type: Remote via Zoom (screen sharing)
Mode of Work: Live development sessions conducted over screen share
Duration: 2 hours per session
Budget: ₹1000 per 2-hour session
Timing: Flexible (To be discussed with the client)
Job Type: Part-time
Pay: Up to ₹20,000.00 per month
Expected hours: 10 per week
Benefits:
- Work from home
Schedule:
- Monday to Friday
- Night shift
- US shift
Education:
- Bachelor's (Required)
Experience:
- AWS Data Engineer: 5 years (Required)
Work Location: Remote