Overview
We are seeking a skilled and motivated Data Engineer to join our team. The ideal candidate will have expertise in AWS services such as Lambda and Glue, along with Spark, Python, and Airflow. This role is critical for developing and maintaining scalable data pipelines and ensuring efficient data integration across our platforms.
Key Responsibilities:
- Design, build, and maintain data pipelines using Spark, Python, AWS Lambda, AWS Glue, and Airflow.
- Develop and deploy scalable ETL processes to integrate data from various sources.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Optimize and troubleshoot data workflows to ensure performance and reliability.
- Implement best practices for data governance, security, and compliance.
- Document data integration processes and maintain technical documentation.
- Continuously evaluate new tools and technologies to enhance data engineering capabilities.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5 years of relevant experience in data engineering.
- Proficiency in AWS services (including Lambda and Glue), Spark, Python, and Airflow.
- Experience with Amazon AppFlow and Snowflake is required.
- Strong understanding of ETL processes and data integration techniques.
- Experience with big data technologies and processing large datasets.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team in a fast-paced environment.
Share your CV with HR on WhatsApp at +919051709611
Job Types: Full-time, Contractual / Temporary, Freelance
Pay: ₹80,000.00 - ₹95,000.00 per month
Benefits:
- Work from home
Schedule:
- Day shift
- UK shift
Supplemental Pay:
- Quarterly bonus
Experience:
- Total work: 5 years (Required)
- Snowflake: 1 year (Preferred)
- AppFlow: 1 year (Preferred)
- AWS: 1 year (Required)
Work Location: Remote