Overview
About BillionApps: BillionApps is a technology company with 15+ years of experience, specializing in software development, Martech integrations, and mobile application development. We partner with marketing and advertising agencies, helping them implement cutting-edge technology solutions.
We are seeking a highly skilled Data Engineer with AI experience to join our team. The role focuses on designing, implementing, and optimizing scalable data pipelines and integrating artificial intelligence (AI) and machine learning (ML) models. The ideal candidate will have a strong background in data engineering, cloud computing, and AI-driven data processing solutions.
Responsibilities
● Design, build, and maintain robust data pipelines for AI and ML applications.
● Develop and optimize ETL (Extract, Transform, Load) workflows to ensure efficient and reliable data processing.
● Work closely with data scientists, analysts, and software engineers to deploy AI/ML models into production environments.
● Implement scalable data storage solutions using cloud platforms such as AWS, Azure, or GCP.
● Ensure data quality, security, and compliance with industry standards.
● Develop APIs and data services to support AI-driven applications.
● Utilize big data technologies such as Spark, Hadoop, and Kafka to process large datasets.
● Monitor, troubleshoot, and enhance data infrastructure to ensure high availability and performance.
● Document data engineering processes, best practices, and technical workflows.
Qualifications
● Strong written and verbal communication skills in English.
● Ability to work independently.
● Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
● 3+ years of experience in data engineering, with a focus on AI/ML applications.
● Proficiency in programming languages such as Python, Java, or Scala.
● Experience with cloud platforms like AWS (S3, Redshift, Lambda), Azure, or GCP.
● Strong expertise in SQL and NoSQL databases (PostgreSQL, MongoDB, etc.).
● Hands-on experience with data pipeline tools such as Apache Airflow, Luigi, or Prefect.
● Knowledge of AI/ML frameworks like TensorFlow, PyTorch, or Scikit-learn.
● Familiarity with containerization and orchestration tools like Docker and Kubernetes.
● Experience with dashboarding tools such as Power BI and Zoho Analytics.
● Excellent problem-solving skills and ability to work in a collaborative team environment.
Preferred Qualifications
● Experience with real-time data processing using Kafka or Flink.
● Exposure to MLOps and automated AI model deployment.
● Understanding of data governance, security, and compliance best practices.
● Strong analytical skills with a focus on performance optimization.
Job Type: Full-time
Pay: ₹200,000.00 - ₹250,000.00 per month
Benefits:
- Health insurance
- Paid sick time
- Provident Fund
Schedule:
- Monday to Friday
Supplemental Pay:
- Yearly bonus
Work Location: Remote