Overview
Job Title: Big Data Architect (AWS)
Job Description: The client is seeking a strong candidate for this role. Based on their interview feedback on earlier candidates, the job description has been revised accordingly. Please source candidates with 10+ years of experience and share profiles on a priority basis.
Ideal Candidate Profile:
Expertise in Big Data Frameworks:
- Strong proficiency in widely used big data technologies, including Apache Hadoop, Apache Spark, and Apache Hudi.
- Proven experience in architecting and implementing complex data processing workflows using these frameworks.
Extensive Experience with AWS Cloud Platform:
- Deep understanding of the AWS cloud ecosystem and hands-on experience in leveraging AWS services for large-scale data processing and storage.
- Extensive involvement in end-to-end big data projects hosted on AWS, from architecture design to implementation and optimization.
Hands-on Experience with Data Lakes on AWS:
- Expertise in Data Lake architecture, including designing, building, and managing data lakes on the AWS cloud platform.
- Familiarity with key AWS services like Amazon S3, AWS Glue, Amazon Redshift, and AWS Lake Formation for building scalable and efficient data lakes.
- Proficiency in handling unstructured, semi-structured, and structured data within data lakes, ensuring smooth data ingestion, storage, and access.
Comprehensive Knowledge of Security, Compliance, and Governance:
- In-depth awareness of security and data governance best practices within the AWS ecosystem.
- Experience in implementing AWS security features, including IAM (Identity and Access Management) roles and KMS (Key Management Service), to ensure data privacy, compliance, and governance in large-scale data environments.
- Knowledge of compliance standards such as GDPR and SOC 2, with proven ability to implement them through AWS services.
Architectural and Strategic Thinking:
- Ability to design and optimize data architectures that are scalable, secure, and cost-efficient on the AWS cloud platform.
- Proven track record in driving strategic technical decisions related to data architecture, cloud infrastructure, and big data management in cloud environments.
Key Skills and Expertise:
- Big Data Frameworks: Apache Hadoop, Apache Spark, Apache Hudi
- Cloud Platform: AWS (Amazon S3, AWS Glue, Amazon Redshift, AWS Lambda, AWS Lake Formation, Amazon EMR, etc.)
- Data Governance and Security: IAM, KMS, Compliance (GDPR, SOC 2)
- Architecture: Data Lake Design, Cloud Data Architecture, Big Data Solutions
- Programming: Python, PySpark, SQL
Job Details:
- Years of Experience: 10+ Years
- Working Hours: UK Hours
- Job Location: Remote
- Contract Duration: 12 Months (Extendable)
Job Type: Contractual / Temporary
Pay: ₹1,300,000.00 - ₹1,500,000.00 per month
Benefits:
- Work from home
Schedule:
- UK shift
Supplemental Pay:
- Performance bonus
Work Location: Remote
Application Deadline: 22/02/2025