Overview
Role: Data Engineers (Remote)
Duration: 6 Months (Contract)
Experience: 4-9 Years
Location: Remote
Key Skills & Responsibilities:
✔ SQL, PySpark, Databricks, Snowflake, Azure (ADF, Blob, Data Lake)
✔ ETL/ELT Pipelines, Data Warehousing, Integration (Kafka, REST, Salesforce)
✔ Design & maintain data pipelines (Snowflake/BigQuery)
✔ Automate workflows, enrich data, and optimize performance
✔ Work with Databricks, DLT, Medallion Architecture, Unity Catalog
✔ Strong communication, problem-solving, and stakeholder interaction
Nice-to-Have: Big Data, ETL Tools (Airflow, dbt, Qlik)
Job Type: Contractual / Temporary
Contract length: 6 months
Pay: ₹75,000.00 - ₹120,000.00 per month
Schedule:
- Day shift
Experience:
- SQL: 4 years (Required)
- PySpark: 4 years (Preferred)
- Snowflake: 4 years (Preferred)
- Azure: 4 years (Preferred)
- ETL: 4 years (Required)
- Big Data: 4 years (Required)
- Databricks: 4 years (Preferred)
Work Location: Remote
Application Deadline: 22/02/2025