- Company Name: Open Systems Inc.
- Job Title: AI Applications / DevOps Engineer (Entry Level)
- Job Description:
Role Summary
Design, implement, and maintain AI-driven systems centered on large language models (LLMs) for automotive engineering workflows. Lead the creation of agentic pipelines, retrieval‑augmented generation (RAG) solutions, and end‑to‑end automation built on orchestration frameworks such as LangChain, LlamaIndex, Haystack, or CrewAI.
Expectations
- Deliver high‑quality AI products within a 12‑month contract, aiming for long‑term engagement.
- Apply best practices in LLMOps, vector store management, and pipeline automation.
- Collaborate cross‑functionally with engineering, data science, security, and compliance teams.
Key Responsibilities
- Architect and deploy LLM solutions (Azure OpenAI, Claude, Llama, Mistral, Gemini, and open‑source models) for text generation, summarization, and reasoning.
- Build agentic workflows using multi‑agent architectures, chain‑of‑thought reasoning, and orchestration frameworks.
- Design and implement RAG systems: vector database integration (Pinecone, Weaviate, FAISS, ChromaDB, Elasticsearch), chunking, indexing, and retrieval pipelines.
- Automate human‑in‑the‑loop processes, chaining LLMs, expert systems, and external tools.
- Fine‑tune, deploy, and monitor LLMs on private datasets, ensuring compliance with data protection standards.
- Evaluate and integrate emerging AI tools, APIs, and infrastructure (LLMOps, prompt management, guardrails).
- Optimize models for cost, latency, and scalability in production.
- Maintain documentation, monitor system performance, and iterate on feedback loops.
Required Skills
- Strong Python programming skills and experience with Hugging Face Transformers, FastAPI, Azure OpenAI, and related libraries.
- Deep familiarity with LLM prompt engineering, fine‑tuning, model selection, and evaluation.
- Hands‑on experience building agentic pipelines and workflow automation using LangChain, LlamaIndex, Semantic Kernel, Haystack, or similar.
- Proven expertise in RAG system design: vector database selection, chunking strategies, retrieval optimization, and search engine integration.
- Knowledge of multimodal data handling, knowledge graphs, and semantic search techniques.
- Experience with AI safety, guardrails, evaluation, and synthetic data integration.
- Ability to orchestrate CI/CD for AI pipelines and to work with containerization (Docker) and cloud services (Azure ML, AWS SageMaker, GCP Vertex AI).
- Strong analytical, research, communication, and documentation skills.
Required Education & Certifications
- Bachelor’s, Master’s, or PhD in Computer Science, Artificial Intelligence, or a related discipline.
- No mandatory certifications, but familiarity with data privacy and security standards (e.g., GDPR, CCPA) is advantageous.