Overview
Location – Whitefield, Bengaluru (Hybrid)
Experience – 4+ Years
Must-Have: Python, PySpark, SQL, AWS or Azure, and 2+ years of experience in a customer-facing role.
Responsibilities:
● Design, build, and maintain robust data pipelines and ETL processes using PySpark and other big data technologies.
● Optimize data processing workflows for cost-effectiveness, performance, and scalability.
● Implement data engineering best practices for data quality, reliability, and accessibility.
● Deploy and manage data solutions on cloud platforms (AWS, Azure, GCP), ensuring security, scalability, and compliance.
● Develop and automate logging, monitoring, and alerting mechanisms to ensure data pipeline health and performance.
● Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
● Prototype and implement new technologies and tools to enhance data infrastructure and analytics capabilities.
Qualifications:
● Bachelor’s degree in Computer Science, Engineering, or a related field.
● 4+ years of experience programming and developing applications using Python.
● Expertise in distributed computing frameworks such as PySpark for big data processing.
● Strong understanding of cloud platforms (AWS, Azure, GCP) and their services for data storage, compute, and analytics.
● Experience with designing and optimizing cost-effective and performance-improved data pipelines.
● Proficiency in SQL and working with both relational (PostgreSQL, MySQL, Microsoft SQL Server) and non-relational (MongoDB, Cassandra, Redis) databases.
● Familiarity with containerization (Docker, Kubernetes) and orchestration tools.
● Experience with logging, monitoring, and alerting tools (e.g., ELK Stack, Prometheus, Grafana).
● Excellent problem-solving skills with the ability to analyze complex data issues and provide solutions efficiently.
Preferred Skills:
● Knowledge of streaming data processing frameworks (e.g., Apache Kafka, Spark Streaming).
● Familiarity with machine learning frameworks and deployment (e.g., TensorFlow, PyTorch).
● Certifications in cloud platforms (e.g., AWS Certified Solutions Architect, Microsoft Azure certifications).
Job Types: Full-time, Permanent
Pay: ₹2,500,000.00 - ₹3,000,000.00 per year
Benefits:
- Work from home
Schedule:
- Day shift
Education:
- Bachelor's (Preferred)
Experience:
- Data Engineer: 4 years (Preferred)
Shift availability:
- Day Shift (Preferred)
Work Location: In person