Overview
Job Title: Python Developer with PySpark Expertise
Location: Remote
Experience: 3-5 Years
Employment Type: Full-time
About Us:
Leapcodes is a technology-driven company specializing in software development, cloud solutions, and digital transformation. We work with leading enterprises across various industries, delivering cutting-edge solutions tailored to business needs.
Job Description:
We are looking for a skilled Python Developer with PySpark expertise to join our team. The ideal candidate has a strong foundation in Python, big data processing, and Apache Spark (PySpark), along with hands-on experience building and maintaining large-scale data processing workflows.
Key Responsibilities:
- Develop, optimize, and maintain data processing pipelines using PySpark and Python (a sample sketch follows this list).
- Work with large datasets and ensure efficient data transformation, cleaning, and processing.
- Design and implement ETL workflows for structured and unstructured data.
- Collaborate with data engineers, data scientists, and cloud architects to build scalable data solutions.
- Optimize Spark jobs for performance and scalability on distributed systems.
- Integrate with cloud platforms like AWS, Azure, or GCP for data storage and processing.
- Troubleshoot and resolve issues related to Spark job execution and cluster performance.
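To give a flavour of the day-to-day work, here is a minimal, illustrative PySpark ETL sketch of the kind described above. It is a sketch only: the bucket paths, column names, and app name are hypothetical placeholders, not our actual stack.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Illustrative sketch only: paths, columns, and app name are hypothetical.
    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw CSV files (header row; schema inference kept simple here).
    raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

    # Transform: deduplicate, drop bad rows, and normalise types.
    clean = (
        raw.dropDuplicates(["order_id"])
           .filter(F.col("amount").isNotNull())
           .withColumn("amount", F.col("amount").cast("double"))
           .withColumn("order_date", F.to_date("order_ts"))
    )

    # Load: write partitioned Parquet for downstream consumers.
    clean.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-bucket/curated/orders/"
    )

    spark.stop()

In a production pipeline you would add explicit schemas, data-quality checks, and monitoring, but the read-transform-write shape stays the same.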
Required Skills & Qualifications:
- 3-5 years of experience in Python development with expertise in PySpark.
- Strong understanding of Spark architecture, RDDs, DataFrames, and Spark SQL (see the example after this list).
- Hands-on experience with big data technologies such as Hadoop, HDFS, Hive, and Kafka.
- Experience in writing optimized and efficient PySpark scripts for batch and real-time data processing.
- Strong knowledge of SQL and database technologies, both relational (PostgreSQL, MySQL) and NoSQL.
- Experience with cloud services (AWS, Azure, or GCP) and data lake/data warehouse concepts.
- Familiarity with CI/CD pipelines and version control (Git, Bitbucket, etc.).
- Knowledge of containerization and orchestration tools such as Docker and Kubernetes is a plus.
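For reference, the level of DataFrame and Spark SQL fluency we look for is along these lines. Again, this is only a sketch: the tables, columns, and paths are made-up placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Illustrative sketch only: the tables and columns below are hypothetical.
    spark = SparkSession.builder.appName("spark_sql_demo").getOrCreate()

    orders = spark.read.parquet("s3://example-bucket/curated/orders/")
    customers = spark.read.parquet("s3://example-bucket/curated/customers/")

    # Broadcast the small dimension table so the join avoids a full shuffle.
    enriched = orders.join(F.broadcast(customers), on="customer_id", how="left")

    # The same data queried through Spark SQL via a temporary view.
    enriched.createOrReplaceTempView("enriched_orders")
    daily_revenue = spark.sql("""
        SELECT order_date, country, SUM(amount) AS revenue
        FROM enriched_orders
        GROUP BY order_date, country
    """)

    daily_revenue.show(10)
    spark.stop()

Knowing when to reach for a broadcast join, repartitioning, or caching is the kind of optimisation judgement the role requires.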
Preferred Qualifications:
- Experience with workflow orchestration tools like Apache Airflow.
- Exposure to ML/AI pipelines built on big data processing frameworks.
- Strong problem-solving and analytical skills.
- Ability to work in a collaborative and fast-paced environment.
Why Join Us?
- Exciting projects in big data and cloud computing.
- Competitive salary based on experience.
- Opportunities to work with cutting-edge technologies and industry experts.
- Growth-oriented culture with upskilling opportunities.
If you are passionate about big data and want to work in a dynamic environment, we’d love to hear from you!
How to Apply:
Send your updated resume to hr@leapcodes.com with the subject “Python Developer - PySpark”.
Job Type: Full-time
Pay: ₹50,000 - ₹150,000 per month
Benefits:
- Provident Fund
Schedule:
- UK shift
Work Location: Remote
Application Deadline: 28/02/2025
Expected Start Date: 28/03/2025