
Overview
What You Will Do:
Design requirements for small systems, or for modules of medium-to-large systems, and produce the accompanying technical documentation.
Apply core principles of software engineering and follow established processes. Provide meaningful feedback on the release process, code reviews, and design reviews.
Absorb and apply new information quickly. Display a cooperative attitude and share knowledge.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing corporate product platforms.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit engineering team employing agile software development practices. Leverage automation within the scope of your work.
What Experience You Need:
Bachelor's degree or equivalent experience
6+ months of Java programming experience
6+ months of experience with cloud technology (GCP, AWS, or Azure)
What Could Set You Apart:
Strong Programming Skills: Proficiency in additional programming languages (Python, Scala, etc.) and frameworks that support data engineering tasks.
Big Data Technologies: Experience with other big data frameworks or tools (e.g., Hadoop, Kafka, Airflow) that complement data engineering efforts.
Cloud Certifications: Relevant GCP certifications (e.g., Google Cloud Professional Data Engineer) that demonstrate your commitment and expertise in the field.
Architectural Knowledge: Understanding of data architecture principles, including data lakes, data warehousing, and the concepts of batch and stream processing.
Active Participation in the Community: Contributions to open-source projects, speaking engagements at conferences, or involvement in data engineering forums can enhance your profile.
Business Acumen: Ability to translate technical requirements into actionable business solutions and insights.