Overview
Experience: 2+ Years
Location: Hybrid
Employment Type: C2H (Contract-to-Hire)
Notice Period: Immediate to 15 Days Preferred
Job Summary: We are looking for a passionate and skilled Data Engineer with a minimum of 2 years of experience to join our team. As a Data Engineer, you will play a key role in building and maintaining scalable data pipelines and architectures on Google Cloud Platform (GCP). You will work with cross-functional teams to manage data systems efficiently and to ensure the quality and availability of data for analytics and business intelligence.
Key Responsibilities:
● Design, implement, and optimize data pipelines using GCP tools such as BigQuery,
Cloud Storage, Dataflow, and Pub/Sub, as well as third-party tools.
● Develop and maintain ETL processes to transform and load data from multiple
sources into cloud-based data storage solutions.
● Work with solution architects and analysts to deliver high-quality datasets that
support business intelligence and machine learning models.
● Collaborate with the team to manage and scale data infrastructure on Google Cloud
Platform.
● Ensure data integrity and quality through automated validation checks and
monitoring.
● Troubleshoot and optimize data pipelines and storage solutions for performance and
reliability.
● Automate routine data engineering tasks and workflows for operational efficiency.
● Implement best practices for data security and compliance on cloud platforms.
Required Skills & Qualifications:
● Bachelor’s degree in Computer Science, Information Technology, or a related field.
● At least 2 years of experience working as a Data Engineer or in a similar role.
● Hands-on experience with data modeling, ETL processes, and data pipeline development.
● Hands-on experience with Google Cloud Platform (GCP) services like BigQuery,
Cloud Storage, Dataflow, Pub/Sub, and Cloud Functions.
● Strong proficiency in SQL and experience with NoSQL databases.
● Programming experience in languages like Python, Java, or Scala.
● Knowledge of cloud computing concepts and cloud services architecture.
● Experience with containerization tools like Docker is a plus.
● Good understanding of data security practices in the cloud.
Preferred Skills:
● Familiarity with Apache Spark, Apache Beam, or similar data processing frameworks.
● Experience with CI/CD processes and cloud automation tools like Terraform.
● GCP certifications (e.g., Google Cloud Professional Data Engineer) are
a plus.
● Experience with data warehousing (e.g., BigQuery or Redshift).
Job Type: Full-time
Pay: Up to ₹600,000.00 per year
Work Location: In person
Application Deadline: 11/04/2025