Kolkata, West Bengal, India
Information Technology
Full-Time
UST
Overview
Role Description
Hiring Location: Chennai/Pune
Experience Range: 8+ years in a relevant DevOps or Cloud Engineering role
Job Summary
We are seeking a skilled and motivated DevOps Engineer with a strong background in AWS big-data solutions and expertise in implementing CI/CD pipelines using Harness.io. As a key member of our dynamic team, you will design, develop, and maintain robust, scalable, and secure data pipelines and infrastructure for our big-data applications.
Key Responsibilities
- AWS Big-Data Solutions: Collaborate with cross-functional teams to design, deploy, and manage AWS-based big-data solutions, including data storage, processing, and analytics services. Leverage services like Amazon S3, Amazon EMR, Amazon Redshift, and AWS Glue for scalable data workflows.
- Harness.io Implementation: Lead the adoption and utilization of Harness.io for the CI/CD pipelines. Design, configure, and automate CI/CD workflows to streamline development, testing, and deployment of big-data applications.
- Security Validation: Implement security practices in CI/CD pipelines and build/release processes. Integrate security checks, vulnerability scanning, and compliance validation to ensure data privacy and protection.
- Infrastructure as Code (IaC): Drive IaC implementation using tools like AWS CloudFormation or Terraform to provision and manage AWS resources, ensuring consistency and reproducibility.
- Automation and Orchestration: Automate repetitive tasks, infrastructure provisioning, and configuration management using scripting languages and tools like Ansible.
- Collaboration and Knowledge Sharing: Foster a culture of collaboration and knowledge sharing within DevOps and the broader engineering teams. Mentor junior team members and engage in peer code reviews.
- Innovation and Best Practices: Stay updated with the latest trends and technologies in AWS and big-data. Introduce best practices and innovative solutions to improve performance, reliability, and security.
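To illustrate the kind of day-to-day automation the role involves, here is a minimal, hedged sketch of a Python helper that generates Hive-style date-partition prefixes for an S3 data lake, a layout commonly consumed by AWS Glue and EMR. The base prefix `raw/events` is a hypothetical example, not a path from this posting.

```python
from datetime import date, timedelta

def partition_prefixes(start: date, end: date, base: str = "raw/events") -> list[str]:
    """Generate Hive-style date partition prefixes (year=/month=/day=)
    for each day in the inclusive range [start, end].

    The `base` prefix is a hypothetical example; real pipelines would
    substitute their own bucket layout.
    """
    days = (end - start).days
    return [
        f"{base}/year={d.year}/month={d.month:02d}/day={d.day:02d}/"
        for d in (start + timedelta(n) for n in range(days + 1))
    ]

# Example: three daily partitions spanning a month boundary
print(partition_prefixes(date(2024, 1, 30), date(2024, 2, 1)))
```

A helper like this would typically feed a boto3 listing or copy loop; it is shown standalone here to keep the sketch self-contained.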
Required Skills
- AWS Big Data Expertise: Proven experience in designing and deploying AWS-based big-data solutions, including services like Amazon S3, Amazon EMR, Amazon Redshift, and AWS Glue.
- CI/CD Implementation: Hands-on experience with CI/CD pipelines, particularly using Harness.io (or similar tools) for big-data applications.
- Security in CI/CD: Strong knowledge of security principles and the ability to integrate security checks (e.g., SonarQube, Checkmarx) into CI/CD pipelines.
- Infrastructure Automation: Proficiency with Infrastructure as Code (IaC) tools like AWS CloudFormation, Terraform, and automation tools like Ansible.
- Scripting Skills: Solid experience in scripting languages such as Python, Bash, or PowerShell.
- Version Control & Collaboration: Familiarity with version control systems like BitBucket and experience working in a collaborative, team-oriented environment.
- Problem-Solving & Troubleshooting: Strong analytical and troubleshooting skills with the ability to resolve complex issues efficiently.
- Communication & Teamwork: Excellent communication and interpersonal skills, with the ability to collaborate effectively in a team environment.
- Cloud Services Experience: Additional experience with other AWS services or cloud providers will be beneficial.
- Containerization & Orchestration: Familiarity with containerization (e.g., Docker, Kubernetes) for big-data applications.
- Agile Practices: Experience working in Agile environments and contributing to agile project management tools.
Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
- At least 3 years of experience in DevOps, AWS, and CI/CD pipeline development, particularly in big-data environments.
Must Have
- Terraform
- Harness/Jenkins
- Repository Management
- DevSecOps (Basics) - SonarQube, Checkmarx, Wiz
- IAM Roles & Policies
- AWS Infra Architecture / Networking Architecture
- Data Services - SNS, SQS, Kinesis
- Basic - Shell Scripting / Python Scripting
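As a gauge of the basic Python scripting expected for the IAM item above, here is a hedged sketch that builds a minimal read-only S3 IAM policy document using only the standard library. The bucket name is a hypothetical example; real policies would be scoped to the team's actual resources.

```python
import json

def s3_read_only_policy(bucket: str) -> str:
    """Build a minimal IAM policy document granting read-only access
    to a single S3 bucket. The bucket name is caller-supplied; the
    actions shown are the usual minimum for listing and reading objects.
    """
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",      # bucket-level (ListBucket)
                    f"arn:aws:s3:::{bucket}/*",    # object-level (GetObject)
                ],
            }
        ],
    }
    return json.dumps(policy, indent=2)

# Example bucket name is hypothetical
print(s3_read_only_policy("example-data-lake"))
```

In practice a document like this would be attached to a role via Terraform or CloudFormation rather than generated ad hoc; the sketch only shows the policy structure itself.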