Hyderabad, Telangana, India
Information Technology
Full-Time
algoleap
Overview
We are seeking a highly skilled Data Engineer with extensive experience in Snowflake, dbt (Data Build Tool), SnapLogic, SQL Server, PostgreSQL, Azure Data Factory, and other ETL tools. The ideal candidate will have a strong ability to optimize SQL queries and a good working knowledge of Python. A positive attitude and excellent teamwork skills are essential.
Key Responsibilities
Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and other ETL tools.
SQL Optimization: Write and optimize complex SQL queries to ensure high performance and efficiency.
Data Integration: Integrate data from various sources, ensuring consistency, accuracy, and reliability.
Database Management: Manage and maintain SQL Server and PostgreSQL databases.
ETL Processes: Develop and manage ETL processes to support data warehousing and analytics.
Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions.
Documentation: Maintain comprehensive documentation of data models, data flows, and ETL processes.
Troubleshooting: Identify and resolve data-related issues and discrepancies.
Python Scripting: Utilize Python for data manipulation, automation, and integration tasks.
Qualifications
Experience: Minimum of 9 years of experience in data engineering.
Technical Skills
Proficiency in Snowflake, dbt, SnapLogic, SQL Server, PostgreSQL, and Azure Data Factory.
Strong SQL skills with the ability to write and optimize complex queries.
Knowledge of Python for data manipulation and automation.
Knowledge of data governance frameworks and best practices.
Soft Skills
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.
Positive attitude and ability to work well in a team environment.
Certifications: Relevant certifications (e.g., Snowflake, Azure) are a plus.