Overview
Job Title: Snowflake Data Engineer
Experience: 6+ Years
Shift: UK Shift
Location: Remote (India), Ahmedabad Preferred
We are looking for a Snowflake Data Engineer with 6+ years of experience to design, develop, and optimize data pipelines and warehouses on Snowflake. The ideal candidate will have expertise in ETL/ELT processes, data modeling, performance tuning, and cloud data solutions. This is a remote role (Ahmedabad preferred) with a UK shift schedule.
Key Responsibilities:
- Design, develop, and optimize scalable Snowflake-based data warehouses.
- Develop and maintain ETL/ELT pipelines using Snowflake, SQL, and Python.
- Optimize Snowflake performance using clustering keys, micro-partition pruning, caching, and query tuning techniques.
- Implement data ingestion, transformation, and orchestration workflows.
- Work with cloud storage (AWS S3, Azure Blob, GCS) and streaming data sources.
- Ensure data security, governance, and compliance best practices in Snowflake.
- Collaborate with data analysts, BI teams, and business stakeholders to understand data requirements.
- Automate data pipeline deployments using CI/CD and Infrastructure as Code (IaC) tools.
- Troubleshoot and resolve data integrity, performance, and security issues in Snowflake.
Required Skills & Qualifications:
✅ Snowflake Expertise:
- Hands-on experience with Snowflake data warehouse architecture.
- Expertise in Snowflake SQL, Snowpipe, Streams, and Tasks.
- Strong knowledge of Snowflake storage, compute (virtual warehouses), and cost optimization.
✅ ETL/ELT & Data Engineering:
- Strong experience with ETL/ELT development using Snowflake, dbt, Talend, or Informatica.
- Experience with Airflow, Prefect, or similar orchestration tools.
✅ Programming & Scripting:
- Strong SQL skills for query optimization and performance tuning.
- Experience with Python or Scala for data processing.
✅ Cloud & DevOps:
- Experience working with AWS, Azure, or GCP cloud platforms.
- Knowledge of CI/CD pipelines (Git, Jenkins, Terraform, or CloudFormation) for data pipeline automation.
✅ Soft Skills:
- Strong analytical and problem-solving skills.
- Good communication and documentation skills.
- Ability to work independently in a remote environment and collaborate with UK-based teams.
Preferred Qualifications (Nice to Have):
- SnowPro Advanced certification or equivalent Snowflake certification.
- Experience with real-time data streaming (Kafka, Kinesis, Pub/Sub).
- Knowledge of Data Vault modeling and best practices.
Work Mode & Shift Details:
- Remote (India-based), Ahmedabad preferred.
- UK Shift Timings (1 PM – 10 PM IST or similar).
Job Type: Full-time
Pay: ₹1,000,000.00 - ₹2,000,000.00 per year
Application Question(s):
- Please provide your current CTC, expected CTC, and notice period along with your application. Applications without this information may not be considered.
Education:
- Bachelor's (Required)