Location: Bangalore, Karnataka, India
Industry: Information Technology
Job type: Contract
Company: IOWeb3 Technologies
Overview
Role: Data Architect
Experience: 10+ years
Working hours: 5 to 6 hours daily (hourly basis)
Mandatory skills: Databricks, Azure, Python, PySpark, SQL, Salesforce
Budget: 1.1L
Roles and Responsibilities:
- Develop, optimize, and maintain data pipelines using Databricks on Azure.
- Work extensively with PySpark and SQL to transform and analyze large datasets.
- Implement Azure services (Azure Data Lake, Azure Synapse, Azure Functions, etc.) for data processing and storage.
- Integrate and extract data from Salesforce and other data sources.
- Design and develop scalable ETL solutions to ensure efficient data processing.
- Monitor and troubleshoot performance issues in Databricks and PySpark workflows.
- Collaborate with cross-functional teams to support business intelligence and analytics initiatives.
Required Skills:
✅ Databricks – Hands-on experience developing data pipelines
✅ Azure – Expertise in cloud-based data solutions
✅ Python & PySpark – Strong programming and data processing skills
✅ SQL – Writing complex queries and performance tuning
✅ Salesforce – Data extraction and integration experience
Preferred Skills:
- Experience with Azure for CI/CD pipelines
- Strong problem-solving and debugging skills