₹499,428 - ₹1,841,710 per year
Bangalore, Karnataka, India
Information Technology
Full-Time
TekIT Solutions
Overview
Job Title: Senior Data Engineer – Azure, ETL, Snowflake
Experience: 7+ years
Location: Remote
Job Summary:
We are seeking a highly skilled and experienced Senior Data Engineer with a strong background in ETL processes, cloud data platforms (Azure), Snowflake, SQL, and Python scripting. The ideal candidate will have hands-on experience building robust data pipelines, ingesting data from multiple sources, and working with modern data tools such as ADF, Databricks, Fivetran, and DBT.
Key Responsibilities:
- Develop and maintain end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake.
- Write optimized SQL queries, stored procedures, and views to transform and retrieve data.
- Perform data ingestion and integration from various formats, including JSON, XML, Parquet, TXT, and XLSX.
- Work on data mapping, modelling, and transformation tasks across multiple data sources.
- Build and deploy custom connectors using Python, PySpark, or ADF.
- Implement and manage Snowflake as a data storage and processing solution.
- Collaborate with cross-functional teams to ensure code promotion and versioning using GitHub.
- Ensure smooth cloud migration and data pipeline deployment using Azure services.
- Work with Fivetran and DBT for ingestion and transformation as required.
- Participate in Agile/Scrum ceremonies and follow DevSecOps practices.
Mandatory Skills & Qualifications:
- 7+ years of experience in Data Engineering, ETL development, or similar roles.
- Proficient in SQL with strong understanding of joins, filters, and aggregations.
- Solid programming skills in Python (functions, loops, API requests, JSON parsing, etc.).
- Strong experience with ETL tools such as Informatica, Talend, Teradata, or DataStage.
- Experience with Azure Cloud Services, specifically:
- Azure Data Factory (ADF)
- Databricks
- Azure Data Lake
- Hands-on experience in Snowflake implementation (ETL or Storage Layer).
- Familiarity with data modelling, data mapping, and pipeline creation.
- Experience working with semi-structured/unstructured data formats.
- Working knowledge of GitHub for version control and code management.
Good to Have / Preferred Skills:
- Experience using Fivetran and DBT for ingestion and transformation.
- Knowledge of AWS or GCP cloud environments.
- Familiarity with DevSecOps processes and CI/CD pipelines within Azure.
- Proficiency in Excel and macros.
- Exposure to Agile methodologies (Scrum/Kanban).
- Understanding of custom connector creation using PySpark or ADF.
Soft Skills:
- Strong analytical and problem-solving skills.
- Effective communication and teamwork abilities.
- Ability to work independently and take ownership of deliverables.
- Detail-oriented with a commitment to quality.
Why Join Us?
- Work on modern, cloud-based data platforms.
- Exposure to a diverse tech stack and new-age data tools.
- Flexible remote working opportunity aligned with a global team.
- Opportunity to work on critical enterprise-level data solutions.
Job Type: Full-time
Pay: ₹499,427.88 - ₹1,841,709.33 per year
Work Location: Remote