Overview
Company Profile:
DataNeuron is headquartered in San Francisco with offices in Delhi (NCR) and focuses on developing and scaling AI models. The company offers a data-centric, end-to-end solution for the Annotation, Training, and Management of Machine Learning models and LLMs. With DataNeuron, customers can quickly and easily produce prediction algorithms without writing any code.
Our team is made up of innovative data scientists, engineers, and program managers united by a passion for solving challenging problems in NLP at scale. We’re committed to ensuring that our components deliver the quality and performance that our customers expect.
Location - Noida, India
Experience - 2-3 years
Job Responsibilities:
- Collaborate seamlessly with cross-functional tech teams to architect, develop, and maintain our suite of innovative software products.
- Demonstrate prowess in coding, debugging, and troubleshooting, ensuring the delivery of high-quality solutions.
- Conduct research to identify innovative data science techniques and methodologies that align with the company’s product goals.
- Share ideas and perspectives to improve processes and deliver value to the product and the team.
- Efficiently manage deadlines and deliverables, exhibiting a results-oriented approach.
- Continuously enhance skills and knowledge to stay aligned with industry trends and company needs.
- Communicate ideas clearly and effectively within the team, contributing to a polished, well-crafted product.
Job Requirements:
- Expertise and hands-on experience in Python, including proficiency in related frameworks and libraries.
- Deep understanding and development of backend applications and APIs using Flask, Django, etc., with experience scaling APIs for high-performance needs.
- Solid understanding and knowledge of databases such as MongoDB, SQL, etc., with a focus on optimizing data structures and improving overall performance.
- Strong understanding of machine learning models and methodologies, with proficiency in libraries like Scikit-learn, TensorFlow, and PyTorch, and the ability to apply them effectively in real-world scenarios.
- Hands-on experience working with neural networks, computer vision, and natural language processing (NLP) techniques using libraries such as Keras, OpenCV, Hugging Face Transformers, and SpaCy to solve complex problems.
- Knowledge of and hands-on experience with large language models (LLMs) such as LLaMA and Mistral, along with experience fine-tuning LLMs.
- Comprehensive knowledge of asynchronous APIs, including their design, implementation, and optimization for enhanced performance.
- Good understanding of model training and deployment on cloud platforms like Azure, GCP, and AWS.
- Experience with distributed data processing frameworks like Apache Spark or Hadoop for working with large datasets.
Job Types: Full-time, Permanent
Schedule:
- Day shift
Ability to commute/relocate:
- Noida, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Required)
Experience:
- Total work: 2 years (Required)
- Python: 2 years (Required)
- SQL: 1 year (Required)
- MongoDB: 1 year (Required)
- Flask: 1 year (Required)
- Machine learning: 1 year (Required)
Location:
- Noida, Uttar Pradesh (Required)
Work Location: In person