Overview
We have a job opening for a Data Engineer (Linux Shell Scripting, Snowflake, Teradata).
Interested candidates, please fill out the Google Form below:
https://forms.gle/58ppgcPtjtYSTaA47
Job Title: Data Engineer (Linux Shell Scripting, Snowflake, Teradata)
Location: Pune, India
Job Type: Full-time
Budget: 7-8 LPA
Minimum Experience: 4+ years
Company Name: PibyThree Consulting Pvt Ltd
Website: http://pibythree.com
About the Role:
We're seeking an experienced Data Engineer to join our team. The ideal candidate will have expertise in Linux shell scripting, Snowflake, and Teradata, with a strong background in data engineering and management.
Key Responsibilities:
- Design and develop data pipelines using Linux shell scripting (Bash, Perl, etc.)
- Work with Snowflake and Teradata databases to optimize data models, queries, and performance
- Develop and maintain data workflows, ensuring data quality, integrity, and security
- Collaborate with cross-functional teams to drive business growth and improve data-driven decision-making
- Troubleshoot and resolve data-related issues, including data pipeline failures and performance issues
Requirements:
- 4+ years of experience in data engineering
- Strong expertise in Linux shell scripting and Snowflake/Teradata databases
- Experience with data warehousing, ETL, and data governance
- Strong problem-solving skills, excellent communication skills, and ability to work in a team environment
Pay: ₹400,000.00 - ₹700,000.00 per year
Schedule:
- Day shift
Application Question(s):
- What is your current CTC?
- What is your expected CTC?
- What is your notice period?
Experience:
- Total: 6 years (Required)
- Linux shell scripting: 4 years (Required)
- Snowflake: 2 years (Required)
- Teradata: 1 year (Required)
Location:
- Pune, Maharashtra (Required)
Work Location: In person