Overview
Experience: 10 - 15 years
Must-Haves:
- Programming: Python/Spark
- Microsoft Fabric experience required (minimum 1 year)
- Mandatory skills: experience building and supporting an EDW using Microsoft technologies – ADF, Synapse, Power BI, Microsoft Fabric
- Monitoring and optimizing pipelines, data quality, and CI/CD processes within Fabric
- Immediate to 20-day joiners only
Strict No-Nos:
- Looking for an early joiner
- Do not share the profile if the candidate does not have Microsoft Fabric experience
Nice to Haves:
- Good if the candidate has a Microsoft Fabric certification
Additional Guidelines:
- Candidates who can join immediately (within 15 - 20 days)
Interview Process: 3 rounds of interviews
Responsibilities:
- Architecting, designing, and implementing the cloud data platform
- Hands-on implementation of the data platform and standing up POCs
- Leading technical discussions and finalizing designs for data quality, security, data ingestion, and data transformation on the cloud data platform
- Developing data models and optimizing cloud storage solutions
- Monitoring data pipeline performance and identifying bottlenecks, optimizing queries and data structures to improve query response times, and scaling data processing infrastructure to handle peak data loads
- Ensuring the solution adheres to security and compliance regulations within the cloud environment
- Working closely with data analysts, data scientists, the infrastructure team, and business stakeholders to understand data needs and requirements
Requirements:
- Bachelor's degree in Computer Science or a related field
- Proven experience in data engineering
- Mandatory skills: experience building and supporting a cloud platform using Microsoft technologies – ADF, Synapse, Power BI, Microsoft Fabric – including monitoring and optimizing pipelines, data quality, and CI/CD processes within Fabric
- Programming: Python/Spark
- Visualization: built or supported the building of reports and dashboards in Power BI
- Strong proficiency in SQL and experience with relational databases (Azure SQL DB, MS SQL Server, managed cloud databases)
- Experience with data warehousing, data warehouse modelling, and ETL processes
- Familiarity with cloud-based data management platforms such as Snowflake, AWS Redshift, or Databricks
- Excellent problem-solving and analytical skills
- Ability to work effectively in a collaborative team environment
- Strong communication and interpersonal skills
Preferred Qualifications:
- Master's degree in Computer Science or a related field
- Certification in database administration or data engineering
- Experience with big data technologies such as Hadoop or Spark
- Knowledge of DevOps practices and tools
Job Types: Full-time, Permanent
Pay: ₹3,000,000.00 - ₹4,000,000.00 per year
Benefits:
- Health insurance
- Provident Fund
Schedule:
- Day shift
Supplemental Pay:
- Performance bonus
Experience:
- Total work: 10 years (required)
Work Location: In person