**Job Title:** Senior AI/ML Backend Developer
**Role Summary:**
Design, build, and scale production‑grade AI agent systems and retrieval‑augmented generation (RAG) pipelines. Deliver high‑throughput, low‑latency backend services on AWS, integrating foundation models via Bedrock and ensuring robust observability, governance, and compliance.
**Expectations:**
- 6–10+ years of backend engineering experience, with ≥3 years delivering AI/ML applications in production.
- Proven ability to architect and operationalize complex AI agents and RAG workflows.
- Strong ownership of code quality, performance, security, and cost efficiency.
- Collaborative mindset across product, data, and security teams to meet SLAs and compliance standards.
**Key Responsibilities:**
- Architect and implement AI agents using AgentCore runtime and Strands (or equivalent) for task orchestration and multi‑step reasoning.
- Design, deploy, and monitor RAG pipelines (chunking, embeddings, vector stores, ranking) with comprehensive observability.
- Develop scalable Python microservices (FastAPI, asyncio) and async data processing pipelines (ETL/ELT) with queuing and back‑pressure handling.
- Integrate AWS Bedrock foundation models, implementing prompt orchestration, safeguard mechanisms, and cost controls.
- Productionize services via CI/CD, IaC (Terraform/CDK), and container orchestration on Amazon EKS/ECS/Fargate.
- Establish evaluation and governance frameworks covering accuracy, latency, hallucination rate, model drift, and PII handling.
- Work with cross‑functional teams to ensure performance, security, and regulatory compliance.
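As a flavor of the work, the "async data processing pipelines ... with queuing and back-pressure handling" responsibility can be sketched with a bounded `asyncio.Queue`, where a full queue naturally throttles the producer. This is a minimal stdlib-only illustration; all function names and the `.upper()` transform are hypothetical stand-ins for real pipeline stages.

```python
import asyncio

async def producer(queue: asyncio.Queue, items: list[str]) -> None:
    # put() suspends when the bounded queue is full: this is the back-pressure.
    for item in items:
        await queue.put(item)
    await queue.put(None)  # sentinel signals end of stream

async def consumer(queue: asyncio.Queue, results: list[str]) -> None:
    while True:
        item = await queue.get()
        if item is None:
            break
        results.append(item.upper())  # stand-in for a real transform step

async def run_pipeline(items: list[str]) -> list[str]:
    # maxsize=2 keeps at most two in-flight items between stages.
    queue: asyncio.Queue = asyncio.Queue(maxsize=2)
    results: list[str] = []
    await asyncio.gather(producer(queue, items), consumer(queue, results))
    return results

print(asyncio.run(run_pipeline(["a", "b", "c"])))  # → ['A', 'B', 'C']
```

In production the same shape would typically sit behind SQS or Kinesis rather than an in-process queue, but the throttling principle is identical.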
**Required Skills:**
- Python (FastAPI, asyncio, typing, pydantic) and automated testing (pytest).
- Experience with agent frameworks (AgentCore, Strands) or comparable systems.
- Deep expertise in RAG and vector databases (Pinecone, OpenSearch k-NN, Milvus, pgvector).
- AWS cloud services: EKS/ECS, Lambda, SQS/Kinesis, Step Functions; familiarity with AWS Bedrock.
- Scalable data processing (Spark, Flink, Dask) and feature/log pipeline construction.
- Infrastructure as Code (Terraform, CDK) and CI/CD pipelines.
- Security and compliance knowledge: PII handling, IAM, key management, audit trails.
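The RAG/vector-database expertise above boils down to ranking document embeddings by similarity to a query embedding. A minimal stdlib sketch of cosine-similarity top-k retrieval (the embeddings here are toy 2-D vectors; real systems use a vector store and learned embeddings):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity: dot product over the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query: list[float], docs: dict[str, list[float]], k: int = 2) -> list[str]:
    # Rank document ids by similarity to the query embedding, highest first.
    ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
    return ranked[:k]
```

A vector database such as pgvector or Pinecone replaces the linear scan in `top_k` with an approximate nearest-neighbor index, which is what makes this tractable at scale.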
**Required Education & Certifications:**
- Bachelor’s degree in Computer Science, Software Engineering, or related field (Master’s preferred).
- Relevant certifications (e.g., AWS Certified Solutions Architect, AWS Certified Machine Learning – Specialty) are a plus but not mandatory.