Chennai, Tamil Nadu, India
Information Technology
Full-Time
TeizoSoft Private Limited
Key Responsibilities
- Design, develop, and maintain scalable and efficient data pipelines using Python, Spark, and Airflow.
- Implement ETL/ELT processes to ingest, transform, and load data from various sources into Snowflake.
- Optimize data pipelines for performance, reliability, and maintainability.
- Troubleshoot and resolve data pipeline issues and performance bottlenecks.
- Design and implement data warehousing solutions using Snowflake.
- Manage and optimize AWS cloud infrastructure for data engineering tasks.
- Ensure data security and compliance with industry best practices.
- Implement and manage data governance and data quality processes.
- Lead and mentor a team of data engineers, providing technical guidance and support.
- Foster a collaborative and high-performance team environment.
- Conduct code reviews and ensure adherence to coding standards.
- Participate in recruitment and onboarding processes.
- Utilize DBT (Data Build Tool) for data transformation and modeling.
- Develop and maintain data models for analytical and reporting purposes.
- Ensure data consistency and accuracy across the data warehouse.
- Write complex SQL queries for data extraction, transformation, and analysis.
- Optimize SQL queries for performance and efficiency.
- Design and implement database schemas and data structures.
- Implement automation for data pipeline deployment and monitoring.
- Utilize Airflow for workflow orchestration and scheduling.
- Implement CI/CD pipelines for data engineering tasks.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
- Document data pipelines, processes, and infrastructure.
Mandatory Skill Set
- Python: Advanced proficiency in Python for data engineering tasks.
- Spark: Extensive experience with Apache Spark for large-scale data processing.
- SQL: Strong SQL skills for data manipulation and analysis.
- Snowflake: Proven experience in designing and implementing data warehousing solutions using Snowflake.
- Airflow: Experience with Apache Airflow for workflow orchestration.
- AWS: Hands-on experience with AWS cloud services related to data engineering.
- DBT (Data Build Tool): Experience with DBT for data transformation and modeling.
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Knowledge of data governance and data quality best practices.
- Experience with containerization (Docker, Kubernetes).
- Experience with real-time data streaming.
- Experience with CI/CD tools.
- Experience with data visualization tools.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Strong leadership and mentorship abilities.
- Ability to manage multiple projects and priorities.
- Strong understanding of software development lifecycle.