
DataBricks - Data Engineer

Thiruvananthapuram, Kerala, India
Information Technology
Other
Wipro Limited

Overview

About the role:

The Enterprise Document Delivery team is seeking a Sr ETL Software Engineer who will participate in the entire system development lifecycle of applications related to document generation and processing, printing, and electronic and postal delivery for the bank. The team supports high-volume statement applications residing on different platforms. The position will work with various Lines of Business across the organization to understand their requirements and architecture in order to design and develop the required solution. The candidate should have strong knowledge of the SDLC process as well as Agile experience, and should ensure that all phases of our development are 100% Wells Fargo Technology SDLC compliant. This position requires someone who is flexible, wants to be part of a dynamic team, and is able to manage multiple priorities and tasks simultaneously.

Responsibilities of the role include the following:

  • ETL Design and Development: Develop, maintain, and optimize ETL processes for data ingestion, transformation, and data warehousing across multiple platforms, including both SQL and NoSQL databases.
  • Data Pipelines: Design, build, and manage scalable data pipelines using technologies like Databricks, Apache Spark, Python, SQL, and NoSQL databases (a minimal illustrative sketch follows this list).
  • NoSQL/MongoDB Expertise: Work with MongoDB to design efficient document schemas, implement query optimization, and handle large-scale unstructured data.
  • Data Integration: Collaborate with cross-functional teams to ensure seamless data integration between different sources such as databases (both relational and NoSQL), APIs, and external files.
  • Performance Optimization: Implement and monitor performance metrics, optimize data processing performance, and manage ETL job scheduling and dependencies.
  • Data Quality: Ensure data quality and integrity across ETL pipelines, implementing processes for data validation, cleansing, and enrichment.
  • Automation: Automate repeatable ETL tasks and data processing workflows to improve efficiency and accuracy.
  • Collaboration: Work closely with data architects, analysts, and business stakeholders to gather and understand data requirements.
  • Cloud Platforms: Leverage cloud services (Azure, GCP) for data storage, processing, and infrastructure management, ensuring scalability and cost efficiency.
  • Best Practices: Maintain documentation and adhere to data governance and best practices in data management, including security and compliance.
  • Microservice APIs: Build microservice APIs to expose ETL services.
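
As a minimal sketch of the kind of Databricks/PySpark pipeline referenced in the Data Pipelines item above (not this team's actual code), the example below ingests raw statement records, applies basic cleansing, and appends to a curated Delta table. The landing path, column names, and the edd.statements_curated table name are hypothetical placeholders.

from pyspark.sql import SparkSession, functions as F

# Illustrative only: paths, columns, and table names are hypothetical placeholders.
spark = SparkSession.builder.appName("statement-ingest-sketch").getOrCreate()

# Ingest: read raw statement records from a hypothetical landing zone
raw = (spark.read
            .option("header", "true")
            .csv("/mnt/landing/statements/"))

# Transform: basic validation, cleansing, and enrichment
curated = (raw
           .filter(F.col("account_id").isNotNull())
           .dropDuplicates(["statement_id"])
           .withColumn("load_date", F.current_date()))

# Load: append to a Delta table consumed by downstream delivery jobs
(curated.write
        .format("delta")
        .mode("append")
        .saveAsTable("edd.statements_curated"))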


ESSENTIAL QUALIFICATIONS

  • Experience: 5+ years of experience as a Data Engineer or ETL Developer in complex, large-scale data environments.
  • SSIS and Databricks Expertise: Strong hands-on experience with SSIS and Databricks, including Apache Spark for data processing and optimization.
  • ETL Tools: Proficient with various ETL tools and frameworks such as Informatica, Talend, SSIS, or Databricks.
  • Big Data Technologies: In-depth knowledge of big data processing frameworks like Spark, Hadoop, Kafka, etc.
  • NoSQL/MongoDB: Expertise in working with NoSQL databases, especially MongoDB, for large-scale data storage, retrieval, and optimization (a small illustrative example follows this list).
  • Programming Skills: Proficient in SQL, Python or Java, and PowerShell for building data pipelines.
  • SQL and NoSQL Proficiency: Strong knowledge of SQL and experience working with both relational databases and NoSQL databases like MongoDB.
  • Data Modeling: Expertise in designing and implementing data models, including OLAP, OLTP, dimensional, and document-based models (NoSQL).
  • Data Warehousing: Solid understanding of data warehousing as a key part of the ETL process; the warehouse stores data from multiple sources in an organized manner and underpins repeatable ETL workflows that support many different data sources.
  • Legacy Modernization: Ability to redesign and refactor legacy custom ETL processes into reusable ETL workflows that ingest diverse data sources and normalize data to standard JSON/XML output.
  • Data Governance: Knowledge of data governance, security standards, and best practices for managing sensitive data.
  • Version Control: Experience with Git or other version control systems for code management.
  • Certification: Databricks certification, MongoDB certification, or other relevant certifications in data engineering, cloud platforms, or big data technologies.
  • Soft Skills: Strong problem-solving skills, excellent communication, and the ability to work in a collaborative team environment.
  • Analytical Mindset: Ability to translate business requirements into scalable, efficient, and reliable ETL solutions.
  • Solid understanding of legacy communication protocols and migration strategies.
  • Cloud Platforms: Experience with cloud platforms such as AWS, Azure, Google Cloud, or TKGI.
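
As a small illustration of the MongoDB expertise described above (not this team's actual schema), the snippet below creates a compound index to support a common lookup pattern and queries with a projection so only the needed fields are returned. The connection string, database, collection, and field names are hypothetical placeholders.

from pymongo import MongoClient

# Illustrative only: connection string, database, collection, and fields are hypothetical.
client = MongoClient("mongodb://localhost:27017")
coll = client["edd"]["statements"]

# Index the most common lookup pattern: by account, newest statements first
coll.create_index([("account_id", 1), ("statement_date", -1)])

# Query with a projection so only the required fields are returned
cursor = coll.find(
    {"account_id": "ACC-0001", "statement_date": {"$gte": "2024-01-01"}},
    {"_id": 0, "account_id": 1, "statement_date": 1, "amount": 1},
).sort("statement_date", -1)

for doc in cursor:
    print(doc)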


Project Details:

  • The team uses SSIS to transform mainframe-formatted files into standard JSON and XML file formats
  • Converting applications from a hosted platform to a distributed, cloud-hosted environment
  • Evaluating and reengineering custom ETL workflows into reusable ETL microservice APIs
  • Target migration from SSIS to Databricks (a rough illustrative sketch follows this list)
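
As a rough sketch of the SSIS-to-Databricks migration target above (not the project's actual mapping), the example below parses a fixed-width mainframe extract into named fields with PySpark and writes standard JSON output. The field layout, offsets, and paths are hypothetical placeholders, assuming a simple single-record-type file.

from pyspark.sql import SparkSession, functions as F

# Illustrative only: layout, offsets, and paths are hypothetical placeholders.
spark = SparkSession.builder.appName("mainframe-to-json-sketch").getOrCreate()

# Hypothetical fixed-width layout: (field name, 1-based start position, length)
layout = [("account_id", 1, 10),
          ("statement_date", 11, 8),
          ("amount", 19, 12)]

# Read the raw mainframe extract as one string column named "value"
lines = spark.read.text("/mnt/landing/mainframe/statements.dat")

# Slice each record into fields and trim fixed-width padding
parsed = lines.select(
    *[F.trim(F.substring("value", start, length)).alias(name)
      for name, start, length in layout])

# Emit one JSON document per record for downstream delivery processing
parsed.write.mode("overwrite").json("/mnt/curated/statements_json/")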


Role Purpose

The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as to manage its day-to-day operations.

Do

  • Provide adequate support in architecture planning, migration & installation for new projects in own tower (platform/ database/ middleware/ backup)
    • Lead the structural/ architectural design of a platform/ middleware/ database/ backup etc. according to various system requirements to ensure a highly scalable and extensible solution
    • Conduct technology capacity planning by reviewing the current and future requirements
    • Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/ platforms, as applicable
    • Strategize & implement disaster recovery plans and create and implement backup and recovery plans
  • Manage the day-to-day operations of the tower
    • Manage day-to-day operations by troubleshooting any issues, conducting root cause analysis (RCA) and developing fixes to avoid similar issues.
    • Plan for and manage upgrades, migrations, maintenance, backups, installation, and configuration for own tower
    • Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance, and reduce performance challenges
    • Develop a shift roster for the team to ensure no disruption to the tower
    • Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc.
    • Provide weekly status reports to the client leadership team and internal stakeholders on database activities, covering progress, updates, status, and next steps
    • Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness

  • Team Management
    • Resourcing
      • Forecast talent requirements as per the current and future business needs
      • Hire adequate and right resources for the team
      • Train direct reports to make the right recruitment and selection decisions
    • Talent Management
      • Ensure 100% compliance to Wipro’s standards of adequate onboarding and training for team members to enhance capability & effectiveness
      • Build an internal talent pool of HiPos and ensure their career progression within the organization
      • Promote diversity in leadership positions
    • Performance Management
      • Set goals for direct reports, conduct timely performance reviews and appraisals, and give constructive feedback
      • Ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs for themselves and the levels below them
    • Employee Satisfaction and Engagement
      • Lead and drive engagement initiatives for the team
      • Track team satisfaction scores and identify initiatives to build engagement within the team
      • Proactively challenge the team with larger and enriching projects/ initiatives for the organization or team
      • Exercise employee recognition and appreciation

Stakeholder Interaction

Stakeholder Type | Stakeholder Identification | Purpose of Interaction
Internal | Technology Solutions Group, BU Teams, Different Infrastructure teams | Understanding requirements, planning and status updates, maintenance and backup, issue resolution, etc.
Internal | IRMC, QA | Guidance on risk mitigation and quality standards
External | Clients | Understanding requirements, planning and status updates, maintenance and backup, issue resolution, etc.
External | Vendors/ Manufacturers | Development and deployment of platforms, applications, databases, etc.

Display


Lists the competencies required to perform this role effectively:

  • Functional Competencies/ Skill
    • Technical Knowledge - Knowledge of own tower (platform, application, database etc) - Expert
    • Domain Knowledge - Understanding of IT industry and its trends - Competent to Expert

Competency Levels

  • Foundation: Knowledgeable about the competency requirements. Demonstrates (in parts) frequently with minimal support and guidance.
  • Competent: Consistently demonstrates the full range of the competency without guidance. Extends the competency to difficult and unknown situations as well.
  • Expert: Applies the competency in all situations and serves as a guide to others as well.
  • Master: Coaches others and builds organizational capability in the competency area. Serves as a key resource for that competency and is recognised within the entire organization.



  • Behavioral Competencies
    • Managing Complexity
    • Client centricity
    • Execution Excellence
    • Passion for Results
    • Team Management
    • Stakeholder Management

Deliver

No. | Performance Parameter | Measure
1. | Operations of the tower | SLA adherence; knowledge management; CSAT/ Customer Experience; identification of risk issues and mitigation plans
2. | New projects | Timely delivery; avoid unauthorised changes; no formal escalations