Verna, Goa, India
Information Technology
Full-Time
Ericsson
Overview
Join our Team
About this opportunity:
We are looking for a Senior Data Engineer with expertise in SAP HANA and Snowflake to design, develop, and manage scalable data pipelines and analytics solutions. This role involves data modelling, ETL/ELT development, access control, cloud infrastructure, and performance optimization to support real-time business intelligence and analytics.
Key Responsibilities:
- Data Engineering & Modelling
  - Design and optimize data models in SAP HANA (Calculation Views, CDS Views) and Snowflake (star/snowflake schemas, clustering).
  - Develop ETL/ELT workflows for structured and semi-structured data (JSON, Parquet, Avro).
  - Optimize query performance, storage, and compute costs.
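As a flavour of the semi-structured ELT work above, here is a minimal Python sketch of the shaping step such a workflow performs before loading JSON into relational tables (function and field names are illustrative, not part of any actual Ericsson pipeline):

```python
import json

def flatten(record, parent=""):
    """Flatten a nested JSON record into dotted column names --
    the kind of shaping an ELT job applies before loading
    semi-structured data into a relational table."""
    flat = {}
    for key, value in record.items():
        name = f"{parent}.{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

raw = json.loads('{"order_id": 42, "customer": {"id": 7, "region": "EU"}}')
print(flatten(raw))  # {'order_id': 42, 'customer.id': 7, 'customer.region': 'EU'}
```

In Snowflake itself this step is often done in SQL over a VARIANT column; the sketch only illustrates the transformation.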
- Data Integration & Pipeline Automation
  - Ingest data from SAP systems, APIs, databases, and cloud storage (AWS S3, ADLS, GCS).
  - Automate data processing pipelines using Apache Airflow or Snowflake Streams & Tasks.
  - Enable real-time data ingestion and transformation.
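The incremental-ingestion contract behind these tools can be sketched in a few lines of plain Python (a high-water-mark filter, roughly what a Snowflake Stream or a scheduled Airflow task consumes; data and field names are made up for illustration):

```python
def incremental_batch(rows, watermark):
    """Pick up only records newer than the last high-water mark,
    and return the new mark to persist for the next run."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_mark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_mark

source = [
    {"id": 1, "updated_at": "2024-01-01T10:00:00"},
    {"id": 2, "updated_at": "2024-01-02T09:30:00"},
]
batch, mark = incremental_batch(source, "2024-01-01T12:00:00")
print(len(batch), mark)  # 1 2024-01-02T09:30:00
```

ISO-8601 timestamps compare correctly as strings, which keeps the sketch dependency-free.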
- Security & Access Management
  - Implement Role-Based Access Control (RBAC), Row-Level Security (RLS), and Column-Level Masking.
  - Monitor audit logs and access history for compliance.
  - Manage IAM policies and authentication mechanisms (OAuth, SAML, LDAP).
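To make the RLS and masking terms concrete, a minimal Python sketch of per-role row filtering and column redaction (the roles, regions, and policy tables are invented for illustration; in Snowflake these would be row access policies and masking policies):

```python
# Illustrative policy tables -- roles and columns are made up.
ROLE_REGIONS = {"analyst_eu": {"EU"}, "analyst_global": {"EU", "US"}}
MASKED_COLUMNS = {"analyst_eu": {"email"}}

def apply_policies(rows, role):
    """Row-level security (drop rows the role may not see) plus
    column-level masking (redact sensitive columns), resolved per role."""
    allowed = ROLE_REGIONS.get(role, set())
    masked = MASKED_COLUMNS.get(role, set())
    return [
        {col: ("***MASKED***" if col in masked else val) for col, val in row.items()}
        for row in rows
        if row.get("region") in allowed
    ]

data = [
    {"id": 1, "region": "EU", "email": "a@example.com"},
    {"id": 2, "region": "US", "email": "b@example.com"},
]
print(apply_policies(data, "analyst_eu"))
# [{'id': 1, 'region': 'EU', 'email': '***MASKED***'}]
```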
- Cloud & Infrastructure Management
  - Deploy and manage Snowflake workloads on AWS, Azure, or GCP.
  - Automate infrastructure provisioning using Terraform (IaC).
  - Optimize warehouse scaling and auto-suspend configurations.
- BI & Analytics Enablement
  - Support real-time dashboards in Power BI.
  - Integrate SAP HANA views with Snowflake for hybrid analytics.
  - Collaborate with data analysts, BI teams, and business stakeholders.
- Performance Optimization & Cost Control
  - Tune queries, indexes, partitions, and caching strategies.
  - Monitor compute consumption and warehouse usage, and optimize costs.
  - Reduce data redundancy and optimize storage layers.
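Why auto-suspend matters for cost control can be shown with back-of-the-envelope arithmetic (the 2-credits/hour rate is illustrative of a Small Snowflake warehouse; actual rates depend on edition and size):

```python
def monthly_credits(credits_per_hour, active_hours_per_day, days=30):
    """Rough warehouse spend model: credits are only burned while running."""
    return credits_per_hour * active_hours_per_day * days

# Illustrative rate -- check your edition's price sheet.
always_on = monthly_credits(2, 24)  # left running around the clock
tuned = monthly_credits(2, 6)       # auto-suspend keeps it active ~6 h/day
print(always_on, tuned)  # 1440 360
```

The same compute, suspended when idle, costs a quarter as much in this toy model.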
Required Skills & Qualifications:
Strong hands-on experience with SAP HANA – data modelling, SQL scripting, and performance optimization
Proficiency in SAP BODS (BusinessObjects Data Services) for ETL development, data migration, and data integration
Expertise in Snowflake – data warehousing, schema design, performance tuning, and ELT processes
Working knowledge of Python for scripting and automation
Experience with AWS data services (S3, Glue, Redshift, Lambda, etc.)
Exposure to PySpark for distributed data processing and large-scale data handling
Strong analytical and problem-solving skills with the ability to troubleshoot data issues
Excellent communication and collaboration skills to work effectively with cross-functional teams
Ability to translate business requirements into technical solutions
Proactive in identifying data quality issues and implementing robust data validation checks
Experience working in agile environments and participating in sprint planning, reviews, and retrospectives
Self-driven, with a continuous learning mindset and ability to adapt to evolving technologies
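The data validation checks mentioned above could look like the following minimal Python sketch (column names and rules are hypothetical; real pipelines would typically use a dedicated framework):

```python
def validate(rows, required_columns, not_null):
    """Collect every data-quality violation instead of failing fast,
    so one pipeline run can report all issues in a single pass."""
    issues = []
    for i, row in enumerate(rows):
        for col in required_columns:
            if col not in row:
                issues.append((i, col, "missing column"))
        for col in not_null:
            if row.get(col) is None:
                issues.append((i, col, "null value"))
    return issues

rows = [{"id": 1, "amount": 9.5}, {"id": None}]
print(validate(rows, ["id", "amount"], ["id"]))
# [(1, 'amount', 'missing column'), (1, 'id', 'null value')]
```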