Overview
Job Title: Senior Data Architect – Snowflake
Experience: 12+ years
Budget: Up to 38 LPA
Notice Period: Immediate to 30 days
Location: Trivandrum, Bangalore, Chennai
Education: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field
Skill Set: Snowflake, data architecture, ETL processes, large-scale data migration solutions, data modelling and schema design, DBT, Python/Java/Scala, SQL, cloud data warehousing concepts, data integration, AWS, Azure, and GCP, CDC (Change Data Capture), DataOps methodologies, data visualisation tools (e.g., Tableau, Power BI), data security and compliance standards
Job Description:
We are seeking an experienced Senior Data Architect – Snowflake to lead the design of scalable, high-performance data solutions. The ideal candidate will have extensive expertise in data architecture, large-scale data migration, ETL processes, and cloud-based data warehousing. You will play a key role in designing optimized data models, ensuring efficient data integration, and implementing best practices for Snowflake and other cloud platforms.
Key Responsibilities:
- Architect, design, and implement scalable Snowflake-based data solutions.
- Develop data models, schema designs, and ETL pipelines to support business requirements.
- Lead large-scale data migration projects while ensuring performance optimization.
- Implement DataOps methodologies for efficient data management and automation.
- Work with AWS, Azure, and GCP to deploy cloud-based data architectures.
- Ensure data security and compliance with industry standards.
- Optimize CDC (Change Data Capture) processes for real-time data updates.
- Utilize DBT, Python, Java, or Scala to enhance data transformation and integration.
- Design and implement data visualisation solutions using tools like Tableau and Power BI.
Required Skills & Expertise:
- Strong expertise in Snowflake and cloud data warehousing concepts.
- Hands-on experience with ETL processes, data modelling, and schema design.
- Proficiency in SQL, DBT, Python, Java, or Scala for data transformation and automation.
- Experience in data integration and large-scale data migration solutions.
- Knowledge of CDC (Change Data Capture) methodologies.
- Familiarity with DataOps practices and modern data engineering workflows.
- Exposure to AWS, Azure, and GCP cloud platforms.
- Strong understanding of data security, governance, and compliance.
- Experience with data visualisation tools such as Tableau or Power BI.