Overview
Project description
Our program began as the migration of a legacy mainframe system for financial instruments trading to a new, highly scalable technical platform. The program's success in both the migration and the creation of a scalable platform led to it being selected as the strategic platform providing the full scale of advisory services for one of the biggest financial institutions in the world. This has brought significant further investment in legacy system migrations and technical improvements. We currently have teams working on these projects across several locations (Weehawken, Wroclaw, Toronto, and Pune).
The current goal is to reengineer a set of existing applications and introduce an additional application interface layer that allows third-party applications to query the underlying data.
Responsibilities
Write clean code
Cover own code with tests
Participate in backlog refinement, planning and demos
Perform code reviews
Clarify the requirements with stakeholders
Participate in solution architecture design and implementation
Skills
Must have
5+ years of experience in enterprise software development and data engineering
Strong experience in ETL (3+ years) and Java 17 (3+ years), with a focus on data processing and analytics applications (see the illustrative sketch after this list)
Understanding of cloud platforms, particularly Microsoft Azure, including services such as Azure Data Lake, Azure Blob Storage, Azure Data Factory, and Azure Databricks
At least some practical experience with Spark (or another distributed big data processing engine such as Flink, Trino, etc.)
Experience with AKS (or another managed Kubernetes service)
Familiarity with Kafka and Delta Lake is a big plus
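For illustration only (not an additional requirement), below is a minimal sketch of the kind of Java 17 + Spark data-processing work the role involves, assuming a plain Spark batch job; all class names, storage paths, and column names are hypothetical placeholders, not details of the actual project.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

public class TradeAggregationJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("trade-aggregation-sketch")
                .getOrCreate();

        // Read raw trade records from Parquet (source path is hypothetical).
        Dataset<Row> trades = spark.read()
                .parquet("abfss://raw@example.dfs.core.windows.net/trades");

        // Simple transform: keep executed trades, aggregate notional per instrument.
        Dataset<Row> notionalPerInstrument = trades
                .filter(col("status").equalTo("EXECUTED"))
                .groupBy(col("instrumentId"))
                .agg(sum(col("notional")).as("totalNotional"));

        // Write the aggregate back out (target path is hypothetical).
        notionalPerInstrument.write()
                .mode("overwrite")
                .parquet("abfss://curated@example.dfs.core.windows.net/notional");

        spark.stop();
    }
}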
Nice to have
Microservices architecture and Spring framework experience
Full-stack (Java+React) experience
Experience in the banking domain or with enterprise programs
Other
Languages
English: B2 Upper Intermediate
Seniority
Senior
Pune, India
Req. VR-111712
Java
BCM Industry
17/02/2025