Overview
Job Summary:
As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability.
Technical Skills:
3+ years of experience in Data Warehousing and BI
Experience working with the Snowflake database (see the SQL sketch after this list)
In-depth knowledge of data warehouse concepts
Experience in designing, developing, testing, and implementing ETL solutions using enterprise ETL tools
Experience with large or partitioned relational databases (Aurora / MySQL / DB2)
Very strong SQL and data analysis capabilities
Familiarity with billing and payment data is a plus
Agile development (Scrum) experience
Other preferred experience includes working with DevOps practices, SaaS, IaaS, code management (CodeCommit, Git), deployment tools (CodeBuild, CodeDeploy, Jenkins, shell scripting), and continuous delivery
Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation
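To illustrate the kind of Snowflake and SQL work this role involves, here is a minimal Python sketch using the snowflake-connector-python library. All connection values, the payments table, and the query itself are hypothetical placeholders, not details from this posting:

import snowflake.connector  # pip install snowflake-connector-python

# Connect to Snowflake; every value below is a hypothetical placeholder.
conn = snowflake.connector.connect(
    user="ANALYTICS_USER",
    password="<from-secrets-manager>",  # never hard-code credentials
    account="xy12345.us-east-1",
    warehouse="ANALYTICS_WH",
    database="BILLING",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Example analysis: monthly payment totals from a hypothetical payments table.
    cur.execute(
        """
        SELECT DATE_TRUNC('month', paid_at) AS month,
               SUM(amount) AS total_paid
        FROM payments
        GROUP BY 1
        ORDER BY 1
        """
    )
    for month, total_paid in cur.fetchall():
        print(month, total_paid)
finally:
    conn.close()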
Responsibilities:
Be a key contributor to the design and development of a scalable, cost-effective cloud data platform built on a data lake architecture
Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties
Build automated pipelines and data services to validate, catalog, aggregate, and transform ingested data (a minimal sketch follows this list)
Build automated data delivery pipelines and services to integrate data from the data lake to internal and external consuming applications and services
Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models
Keep knowledge and skills current with the latest cloud services, features, and best practices
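As a hedged sketch of the ingestion and validation work listed above, the Python handler below reacts to an S3 put event, applies a basic check, and promotes the object into a curated prefix. The bucket name, key prefix, and validation rule are assumptions for illustration only:

import json
import boto3

s3 = boto3.client("s3")
CURATED_BUCKET = "curated-data-lake"  # hypothetical destination bucket

def handler(event, context):
    """Triggered by an S3 put event; validates each new object and promotes it."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Fetch the raw object that just landed in the ingestion bucket.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        payload = json.loads(body)

        # Minimal validation: require an 'id' field before promoting the record.
        if "id" not in payload:
            raise ValueError(f"Rejected {key}: missing required 'id' field")

        # Write the validated object into the curated zone of the data lake.
        s3.put_object(
            Bucket=CURATED_BUCKET,
            Key=f"validated/{key}",
            Body=json.dumps(payload).encode("utf-8"),
        )

In practice such a handler would be deployed and monitored through the CloudFormation and CodeBuild/CodeDeploy tooling named above, consistent with the DevOps model this role describes.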
Who We Are:
unifyCX is an emerging global business process outsourcing (BPO) company with a strong presence in the U.S., Colombia, Dominican Republic, India, Jamaica, Honduras, and the Philippines. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide. In nearly two decades, unifyCX has grown from a small team to a global organization with staff members all over the world dedicated to supporting our international clientele.
At unifyCX, we leverage advanced AI technologies to elevate the customer experience (CX) and drive operational efficiency for our clients. Our commitment to innovation positions us as a trusted partner, enabling businesses across industries to meet the evolving demands of a global market with agility and precision.
unifyCX is a certified minority-owned business and an equal opportunity employer that welcomes diversity.