Job Specifications
Job Title: AI/ML Engineer
Location: Newark, CA or MI
Duration: Long Term
Client: Lucid
Job Description & Skill Requirement:
We’re seeking a highly skilled AI/ML Engineer to join our engineering team in the USA. This role requires a strong foundation in computer science, statistics, and data science, along with hands-on experience in modern ML frameworks.
Required Experience and Exposure
Bachelor’s or Master’s degree in Computer Science, Machine Learning, Business Analytics, Data Science, or a related field
Proven experience in building and deploying ML models in real-world applications
Strong programming skills in Python (preferred) and Gherkin
Solid understanding of machine learning algorithms, deep learning, and statistical modeling
Experience with a cloud platform (e.g., AWS) and ML tools (Databricks, MLflow, etc.)
Job Duties and Responsibilities:
Develop AI-driven solutions and intelligent agents tailored to automotive applications, leveraging Python, Gherkin, C++, and modern AI/ML frameworks across both front-end and back-end systems.
Build and implement AI-powered tools to streamline and enhance various stages of automotive testing and validation.
Integrate and manage large language models (LLMs) such as Llama, ChatGPT, and Gemini using AWS Bedrock and Amazon SageMaker for inference, deployment, and fine-tuning.
Fine-tune LLMs and AI models to optimize performance and accuracy for domain-specific automotive use cases.
Deploy, monitor, and maintain AI systems in production environments, ensuring reliability, scalability, and compliance.
Collaborate closely with cross-functional teams—including data scientists, HIL testers, and feature owners—to embed AI capabilities into automotive system architectures.
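For context on the Bedrock-related duties above, invoking a hosted model from Python is typically done through boto3's `bedrock-runtime` client. The sketch below is illustrative only: the model ID, prompt, and helper name are placeholders, not details from this posting, and the payload is built separately from the network call so it can be inspected without AWS credentials.

```python
import json

def build_bedrock_request(prompt: str, max_tokens: int = 256) -> dict:
    # Hypothetical helper: assemble the keyword arguments that would be
    # passed to boto3's bedrock-runtime invoke_model() call. Keeping the
    # payload construction separate makes it testable offline.
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",  # illustrative version string
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })
    return {
        "modelId": "anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
        "contentType": "application/json",
        "accept": "application/json",
        "body": body,
    }

# With AWS credentials configured, the actual call would look like:
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.invoke_model(**build_bedrock_request("Summarize this HIL test log"))
#   result = json.loads(response["body"].read())
```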
Required Skills:
• Programming: Python, Exposure to C++
• Behavior Driven Development: Gherkin
• Cloud Platforms: AWS
• Cloud Services: AWS Bedrock and Amazon SageMaker
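For candidates unfamiliar with the Gherkin requirement above: Gherkin is the plain-text Given/When/Then syntax used in behavior-driven development tools such as Cucumber and behave. A minimal, hypothetical sketch of an automotive-style scenario (the feature and steps are illustrative, not from this posting):

```gherkin
Feature: Low battery warning
  Scenario: Warning is raised below the charge threshold
    Given the vehicle is in Drive
    And the battery state of charge is 9%
    When the instrument cluster refreshes
    Then a low-battery warning is displayed
```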
Preferred Skills:
• ML Frameworks: (Example: TensorFlow, PyTorch, Scikit-learn)
• Data Management and Analysis: (Example: SQL, Spark, Pandas)
• DevOps/Deployment: exposure to Docker or Kubernetes, CI/CD pipelines, etc.