Job Title: Data Engineer (Azure)
Company: The HubOps
Location: Remote (India, US Time Zone)
Salary: ₹80,000 - ₹1,20,000 per month
Experience: 3-5 years
About The HubOps:
The HubOps is a leading provider of AI and tech talent, connecting top-tier developers with US-based companies. We specialize in AI, blockchain, and machine learning development, offering innovative solutions to businesses worldwide. We are currently hiring a Data Engineer with expertise in Azure, Data Factory, and Data Lakes to work remotely from India while supporting US-based projects.
Role Overview:
As a Data Engineer at The HubOps, you will design, develop, and optimize scalable data pipelines for our US-based clients. You will work with cutting-edge Azure data technologies to process, store, and analyze large datasets while ensuring high performance and reliability. This is a remote role that requires working US business hours and collaborating with global teams to drive data-driven solutions.
Key Responsibilities:
- Develop, maintain, and optimize Azure Data Factory (ADF) pipelines for data ingestion, transformation, and integration.
- Design and manage Azure Data Lakes for structured and unstructured data storage.
- Implement data processing solutions using Azure Synapse Analytics, Azure SQL, and Databricks.
- Build and maintain ETL/ELT workflows for large-scale data processing.
- Ensure high data availability, security, and integrity by following best practices in data governance and compliance.
- Monitor data pipelines for performance bottlenecks, troubleshoot issues, and optimize workflows.
- Collaborate with data scientists, analysts, and software engineers to develop and deploy scalable data solutions.
- Automate data engineering workflows using Azure DevOps, CI/CD pipelines, and Infrastructure as Code (IaC).
- Work with Kafka, Event Hubs, or similar streaming data services for real-time data processing.
- Provide documentation and technical guidance for data architecture and engineering solutions.
Required Skills & Experience:
- 3-5 years of hands-on experience in Data Engineering with a strong focus on Azure cloud technologies.
- Expertise in Azure Data Factory, Azure Data Lake, Synapse Analytics, and Azure SQL.
- Strong proficiency in SQL, Python, or Scala for data processing.
- Experience in building and optimizing ETL/ELT workflows.
- Familiarity with Azure DevOps, CI/CD, and automation tools.
- Ability to design scalable and cost-efficient cloud data architectures.
- Strong problem-solving and debugging skills.
- Excellent communication skills and ability to work remotely during US business hours.
Preferred Qualifications:
- Experience with Databricks, Power BI, or machine learning workflows.
- Knowledge of big data frameworks such as Apache Spark.
- Experience with data streaming technologies like Kafka or Event Hubs.
- Exposure to data security best practices and compliance frameworks.
Why Join The HubOps?
- Competitive compensation: ₹80,000 - ₹1,20,000 per month.
- Work on cutting-edge projects with US-based companies.
- Be part of a global team of AI, blockchain, and ML experts.
- Fully remote role with a flexible work environment.
- Opportunities for career growth and professional development.
Job Type: Full-time
Pay: From ₹80,000.00 per month
Schedule:
- Day shift
Work Location: Remote