Bangalore, Karnataka, India
Information Technology
Full-Time
Optum
Overview
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities
- Develop and deploy Apache Spark and Scala programs on Azure Databricks in a dynamic and challenging work environment
- Help write analytics code, services and components in Java, Apache Spark, and related technologies such as Scala and PySpark (Python)
- Responsible for systems analysis, design, coding, unit testing and other SDLC activities
- Gather and understand requirements, analyze and convert functional requirements into concrete technical tasks, and provide reasonable effort estimates
- Work proactively, independently and with global teams to address project requirements, and articulate issues/challenges with enough lead time to address project delivery risks
- Work as a DevOps team member to monitor and maintain operational processes
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications
- Undergraduate degree or equivalent experience
- 3+ years of working experience in Python, PySpark and Scala
- 3+ years of experience working with MS SQL Server and NoSQL databases such as Cassandra
- Hands-on working experience with Azure Databricks
- Ability to understand the existing application codebase, perform impact analysis and update the code when required based on the business logic or for optimization
- Exposure to DevOps methodology and creating CI/CD deployment pipelines
- Exposure to Agile methodology, specifically using tools like Rally
- Proven excellent analytical and communication skills (both verbal and written)
- Experience with streaming applications (Kafka, Spark Streaming, etc.)