Overview
Technical:
Minimum of 3 years of experience as a Data Engineer, with exposure to at least one ETL tool and knowledge of developing data pipelines in a cloud ecosystem.
Experience extracting data from heterogeneous sources, including databases, CRM systems, flat files, and web services.
Knowledge of data warehousing concepts and of how data moves from one layer to the next, up to the visualization layer.
Experience transforming extracted data by cleaning, aggregating, mapping, and converting it so that it meets business requirements and is in a format suitable for analysis (a transform sketch follows this list).
Experience designing, implementing, and continuously expanding data pipelines through extraction, transformation, and loading activities.
Strong SQL knowledge, including query performance measurement and tuning, index maintenance, and database tuning, along with an understanding of database structure (an index-tuning sketch follows this list).
Knowledge of cloud-based technologies.
Experience developing ETL pipelines in the Snowflake data warehouse on the cloud.
Performance tuning and setting up resource monitors in Snowflake (a resource monitor sketch follows this list).
Coding knowledge in languages such as Python is an added advantage.
Knowledge of working in the AWS cloud ecosystem.
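
As a purely illustrative sketch of the transform step described above, the following Python snippet cleans, converts, maps, and aggregates a small sample dataset. The pandas library, the column names, and the code-to-label mapping are all assumptions made for this example, not part of the role's actual stack.

    import pandas as pd

    # Extracted records from a heterogeneous source (e.g., a flat file);
    # the columns and values are invented for illustration.
    raw = pd.DataFrame({
        "region_cd": ["N", "S", "N", None, "S"],
        "amount": ["100", "250", "75", "40", "310"],
    })

    # Clean: drop rows that are missing the grouping key.
    clean = raw.dropna(subset=["region_cd"]).copy()

    # Convert: cast the amount column from text to a numeric type.
    clean["amount"] = clean["amount"].astype(float)

    # Map: translate source codes into business-friendly labels.
    clean["region"] = clean["region_cd"].map({"N": "North", "S": "South"})

    # Aggregate: summarize to the grain needed by the visualization layer.
    summary = clean.groupby("region", as_index=False)["amount"].sum()
    print(summary)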
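As a minimal sketch of the kind of query tuning mentioned above, the snippet below uses Python's built-in sqlite3 module purely as a stand-in for a production database: it inspects a query plan, adds an index on the filtered column, and confirms the plan changes from a full scan to an index search. The table, columns, and data are invented for the example.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
    cur.executemany(
        "INSERT INTO orders (region, amount) VALUES (?, ?)",
        [("North", 100.0), ("South", 250.0), ("North", 75.0)],
    )

    query = "SELECT SUM(amount) FROM orders WHERE region = ?"

    # Before indexing: the planner falls back to a full-table scan.
    cur.execute("EXPLAIN QUERY PLAN " + query, ("North",))
    print(cur.fetchall())  # expect: SCAN orders

    # Add an index on the filter column, then re-check the plan.
    cur.execute("CREATE INDEX idx_orders_region ON orders (region)")
    cur.execute("EXPLAIN QUERY PLAN " + query, ("North",))
    print(cur.fetchall())  # expect: SEARCH orders USING INDEX idx_orders_region

    conn.close()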
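As a hedged sketch of setting up a Snowflake resource monitor, the snippet below uses the snowflake-connector-python package; the connection details, monitor name, quota, and warehouse name are placeholders, and creating resource monitors typically requires the ACCOUNTADMIN role.

    import snowflake.connector

    # Placeholder credentials; replace with real connection details.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="...",
    )
    cur = conn.cursor()

    # Create a monthly monitor that suspends assigned warehouses once
    # 90% of a 100-credit quota is consumed (names/values illustrative).
    cur.execute("""
        CREATE RESOURCE MONITOR monthly_quota
          WITH CREDIT_QUOTA = 100
          FREQUENCY = MONTHLY
          START_TIMESTAMP = IMMEDIATELY
          TRIGGERS ON 90 PERCENT DO SUSPEND
    """)

    # Attach the monitor to a (hypothetical) warehouse.
    cur.execute("ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_quota")
    conn.close()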
Behavioral:
Strong communication skills
Self-starter with a strong level of confidence
Flexible in thought and creative in approach to testing
Solid understanding of databases and systems and the ability to communicate effectively across groups
Interacts effectively with peers to resolve issues
Ability to deal with ambiguity
Promotes a positive and professional work environment
The Skills That Are Good to Have for This Role
Experience with ETL test automation tools is an added advantage.
Experience in Python scripting.
Experience with Agile methodologies and best practices.
The Expertise We’re Looking For
3 to 5 years of relevant experience.
Bachelor’s / Master’s Degree (e.g., Computer Science, Engineering)
Location:
Bangalore