Overview
Apply Here: https://forms.gle/dncMWCCyxEQbZMpF9
Job Title: Data Architect
Client: TE Connectivity
Work Schedule: 12 PM - 9 PM IST
Location: 100% Remote
Notice Period: Immediate
Interview Mode: Virtual
Salary: Based on experience
Job Overview:
TE Connectivity believes that data and analytics are strategic drivers for future success. We are building a world-class advanced analytics team to tackle complex strategic problems, drive top-line growth, and enhance operational efficiencies. Our Analytics team, part of the TE Information Solutions (TEIS) Organization, works closely with senior leadership, including the VP and Chief Data Officer and the SVP of Corporate Strategy.
We seek an experienced Data Architect to design, build, and optimize data lakes and data warehouses both on-premises and in the cloud. This role requires deep expertise in AWS Cloud services, data modeling, and ETL tools to develop scalable, reliable, and cost-efficient solutions.
Primary Responsibilities:
- Design & develop data lakes and manage data flows integrating multiple sources into a unified platform using ETL tools.
- Architect scalable data warehouses for optimized performance and storage.
- Design & evaluate data models (Star, Snowflake, Flattened schemas).
- Develop data access patterns for OLTP and OLAP transactions.
- Troubleshoot, debug, and resolve technical issues in data lakes and warehouses.
- Collaborate with business & technical teams across the software development lifecycle.
- Participate in major architectural & technical decisions.
- Conduct hands-on prototyping of new technology solutions.
- Maintain and manage code repositories like Git.
Required Qualifications:
- 5+ years of experience with AWS Cloud and data architectures.
- 3+ years of experience with AWS Data Services (S3, Glue, Lake Formation, EMR, Kinesis, RDS, DMS).
- 3+ years of experience with Data Warehouses (HANA, Redshift, Snowflake).
- 3+ years of experience with ETL tools (Talend, Informatica, SAP Data Services).
- 3+ years of experience working with Apache Spark.
- 3+ years of experience in Python, R, Scala, or Java.
- 3+ years of experience with Data Modeling tools (Erwin, MagicDraw).
- Bachelor’s degree in Computer Science, Information Technology, Data Science, or related fields.
- Hands-on experience with Agile methodologies.
- Excellent problem-solving, communication, and teamwork skills.
- Strong data visualization & analytical skills.
Preferred Qualifications:
- Strong RDBMS & data modeling expertise.
- AWS Cloud certification (preferred but not mandatory).
- Experience with SAP functional modules.
How to Apply:
If you meet the qualifications and are available immediately, submit your updated resume through the Google Form link provided above.
Job Type: Full-time
Pay: Up to ₹3,300,000.00 per year
Schedule:
- Day shift
Experience:
- AWS Cloud: 5 years (Required)
- S3, Glue, Lake Formation, EMR, Kinesis, RDS, DMS: 3 years (Required)
- Data warehouse: 3 years (Required)
- ETL: 3 years (Required)
- Apache Spark: 3 years (Required)
- Python, R, Scala, or Java: 3 years (Required)
- Data Modeling tools: 3 years (Required)
Work Location: Remote