Bangalore, Karnataka, India
Information Technology
Full-Time
Careernet
Overview
Company: Global Banking Organization
Key Skills: Python, Pyspark, AWS
Roles and Responsibilities:
- Develop, test, and maintain scalable and efficient data processing applications using Python and PySpark.
- Design and implement cloud-based solutions leveraging AWS services such as S3, Lambda, and EC2.
- Work closely with cross-functional teams to define, design, and ship new features.
- Optimize and troubleshoot complex data workflows to ensure high performance and reliability.
- Ensure best practices in software development, data processing, and cloud security.
- Participate in Agile development processes, including sprint planning, stand-ups, and retrospectives.
- Automate deployment and monitoring processes using Jenkins, Terraform, and other DevOps tools.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Maintain and enhance data pipelines using Databricks, Snowflake, and other big data technologies.
Skills Required:
- 3-5 years working on Agile teams
- 3-5 years of scripting experience
- 2+ years of hands-on AWS experience (S3, Lambda)
- 2+ years of experience with PySpark or Python
- 2+ years of experience with cloud technologies such as AWS
- 2+ years of hands-on SQL experience
Experience Desired:
- Experience with GitHub
- Teradata, AWS (Glue, Lambda), Databricks, Snowflake, Angular, REST APIs, Terraform, Jenkins (CloudBees, Jenkinsfile/Groovy, password vault)
Education and Training Required:
- Knowledge of and/or experience with healthcare information domains is a plus
- Computer Science background - good to have
Primary Skills:
- JavaScript, Python, PySpark, TDV, R, Ruby, Perl
- Lambdas, S3, EC2
- Databricks, Snowflake, Jenkins, Kafka, REST APIs, Angular, Selenium, AI & Machine Learning
Education: Bachelor's Degree in related field
Talk to us
Feel free to call, email, or hit us up on our social media accounts.
Email
info@antaltechjobs.in