
1. Overview:
The Data Engineer will be a key member of our data team, responsible for designing, building, and maintaining our cloud-based data infrastructure on Azure and Snowflake. The goal of the role is to deliver efficient, reliable, and scalable data pipelines that support our business intelligence and analytics initiatives. It requires a strong understanding of data warehousing principles, ETL processes, and cloud technologies.
2. Key Responsibilities:
Design, develop, and maintain robust and scalable data pipelines using Azure Data Factory (ADF) and other Azure services.
Develop and maintain efficient data transformation and loading processes using SQL and other relevant technologies.
Build and maintain Snowflake schemas, tables, views, and stored procedures.
Implement and monitor data quality checks and validation processes (see the illustrative SQL sketch after this list).
Optimize data pipelines for performance and cost-effectiveness.
Collaborate with data scientists and analysts to understand data requirements and translate them into technical solutions.
Troubleshoot and resolve data-related issues.
Contribute to the development and maintenance of data governance policies and procedures.
Participate in code reviews and contribute to improving team processes.
Document technical designs and processes.
Participate in on-call rotation for production support.
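
For illustration only, a data quality check of the kind described above might look like the following Snowflake SQL sketch; the staging table and column names are hypothetical.

    -- Hypothetical staging table; flag missing keys and duplicate order IDs
    SELECT
        COUNT(*)                            AS total_rows,
        COUNT_IF(customer_id IS NULL)       AS missing_customer_ids,
        COUNT(*) - COUNT(DISTINCT order_id) AS duplicate_order_ids
    FROM staging.orders_raw;
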
3. Technical Skills:
Strong proficiency in SQL.
Extensive experience with Azure Data Factory (ADF).
In-depth knowledge of Snowflake, including data modeling, query optimization, and security best practices (see the sketch after this list).
Experience with data lake architectures (e.g., Azure Data Lake Storage Gen2).
Experience with data integration tools and techniques (ETL/ELT).
Proficiency in at least one scripting language (e.g., Python, PowerShell).
Understanding of data warehousing concepts and dimensional modeling.
Experience with data visualization tools such as Power BI or Tableau is a plus.
Experience with CI/CD pipelines for data engineering projects.
Familiarity with cloud security best practices.
Experience with monitoring and logging tools (e.g., Azure Monitor, Snowflake query history and Account Usage views).
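
As a rough illustration of the Snowflake optimization and monitoring skills above (table and schema names are hypothetical, and ACCOUNT_USAGE access depends on granted privileges):

    -- Cluster a large fact table on its most common filter column
    ALTER TABLE analytics.fct_orders CLUSTER BY (order_date);

    -- Review the slowest queries from the past week via ACCOUNT_USAGE
    SELECT query_id, query_text, total_elapsed_time / 1000 AS elapsed_seconds
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 20;
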
4. Required Qualifications:
Bachelor's degree in Computer Science, Information Systems, or a related field.
3+ years of experience as a Data Engineer.
Proven experience designing and implementing data pipelines in a cloud environment (Azure preferred).
Experience with Snowflake data warehousing.
Demonstrated ability to work independently and as part of a team.
5. Skills & Experience (Emphasis):
Azure Expertise: Deep understanding of Azure services relevant to data engineering, including ADF and Azure Data Lake Storage Gen2; familiarity with Azure Synapse Analytics is a plus, and experience with Azure DevOps is a strong plus.
Snowflake Expertise: Proficient in writing complex SQL queries, optimizing performance, and managing Snowflake resources effectively. Experience with Snowpipe and other advanced Snowflake features is highly valued.
ETL/ELT Proficiency: Proven ability to design and implement efficient and reliable ETL/ELT processes using various tools and technologies.
Data Modeling: Strong understanding of dimensional modeling and other data modeling techniques, with the ability to design efficient and scalable data models for Snowflake (see the sketch below).
Problem-Solving: Ability to quickly identify, diagnose, and resolve complex data issues. Strong analytical and debugging skills are essential.
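
To illustrate the dimensional modeling emphasis above, a minimal star-schema sketch in Snowflake SQL might look like the following; all names are hypothetical, and note that Snowflake treats primary and foreign key constraints as informational rather than enforced.

    -- Dimension: one row per customer
    CREATE TABLE IF NOT EXISTS analytics.dim_customer (
        customer_key  NUMBER AUTOINCREMENT PRIMARY KEY,
        customer_id   VARCHAR NOT NULL,
        customer_name VARCHAR,
        region        VARCHAR
    );

    -- Fact: one row per order line, keyed to the customer dimension
    CREATE TABLE IF NOT EXISTS analytics.fct_order_line (
        order_id     VARCHAR NOT NULL,
        customer_key NUMBER REFERENCES analytics.dim_customer (customer_key),
        order_date   DATE,
        quantity     NUMBER,
        amount       NUMBER(12, 2)
    );
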