**Company Name**: Kaizen Technologies
**Job Title**: Artificial Intelligence Engineer
**Job Description**
**Role Summary**
AI Engineer focused on natural language processing (NLP) and large language model (LLM) implementation. Responsible for designing, training, evaluating, and deploying production-grade LLMs and traditional ML models; managing model pipelines on AWS infrastructure; and ensuring robust performance on large structured datasets, ideally including equities data.
**Expectations**
- Deliver end‑to‑end LLM solutions that meet business use cases.
- Optimize models for latency, throughput, and cost on AWS.
- Maintain high model quality, reproducibility, and compliance.
- Collaborate with cross‑functional teams to integrate AI outputs into production systems.
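Latency and throughput targets like those above are usually verified with a simple benchmark harness before and after optimization. A minimal sketch in plain Python, where `predict` is a hypothetical stand-in for a real model call, times repeated invocations and reports p50/p95 latency:

```python
import time
import statistics

def predict(payload: str) -> str:
    """Hypothetical stand-in for a real model inference call."""
    time.sleep(0.001)  # simulate ~1 ms of model work
    return payload.upper()

def benchmark(fn, payload: str, n_calls: int = 50) -> dict:
    """Time n_calls invocations and summarize latency in milliseconds."""
    latencies = []
    for _ in range(n_calls):
        start = time.perf_counter()
        fn(payload)
        latencies.append((time.perf_counter() - start) * 1000.0)
    latencies.sort()
    return {
        "p50_ms": statistics.median(latencies),
        # Index into the sorted list for an approximate 95th percentile.
        "p95_ms": latencies[int(0.95 * (len(latencies) - 1))],
        "throughput_rps": 1000.0 / statistics.mean(latencies),
    }

stats = benchmark(predict, "hello")
```

Tracking p95 rather than the mean matters in production, since tail latency is what users and downstream systems actually experience.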
**Key Responsibilities**
- Design, train, and fine‑tune LLMs (e.g., GPT, BERT, domain‑specific architectures).
- Build and maintain end‑to‑end machine‑learning pipelines on AWS (S3, SageMaker, EMR, Glue, Lambda).
- Deploy models to production Linux servers, expose APIs, and monitor performance.
- Implement and manage model versioning and continuous integration/continuous deployment (CI/CD) pipelines.
- Evaluate traditional ML models on structured, large‑scale datasets and translate insights into actionable recommendations.
- Work with data engineering teams to source, clean, and preprocess domain data (preferably equities).
- Conduct A/B testing, performance benchmarking, and model drift analysis.
- Document architecture, code, and best practices.
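One common way to quantify the model drift analysis mentioned above is the Population Stability Index (PSI), which compares a baseline score distribution to a recent one. A rough sketch in plain Python; the bin edges, sample values, and thresholds are illustrative assumptions, not prescribed by this role:

```python
import math

def psi(baseline, recent, bin_edges):
    """Population Stability Index between two samples over fixed bins.
    Rule of thumb: < 0.1 stable, 0.1-0.25 moderate shift, > 0.25 major shift."""
    def bin_fractions(values):
        counts = [0] * (len(bin_edges) - 1)
        for v in values:
            for i in range(len(bin_edges) - 1):
                if bin_edges[i] <= v < bin_edges[i + 1]:
                    counts[i] += 1
                    break
        total = max(sum(counts), 1)
        # Small floor avoids log(0) when a bin is empty.
        return [max(c / total, 1e-6) for c in counts]

    b = bin_fractions(baseline)
    r = bin_fractions(recent)
    return sum((rf - bf) * math.log(rf / bf) for bf, rf in zip(b, r))

edges = [0.0, 0.25, 0.5, 0.75, 1.0001]  # score bins, illustrative
stable = psi([0.1, 0.4, 0.6, 0.9], [0.1, 0.4, 0.6, 0.9], edges)
shifted = psi([0.1, 0.1, 0.2, 0.2], [0.8, 0.9, 0.9, 1.0], edges)
```

In practice a check like this runs on a schedule against fresh scoring data, with alerts firing when PSI crosses the chosen threshold.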
**Required Skills**
- Deep expertise in NLP and LLM engineering (tokenization, embeddings, prompt engineering).
- Practical experience with large‑scale model training and inference on AWS (SageMaker, EC2, EKS, ECS).
- Strong Linux system administration skills.
- Familiarity with classic supervised/unsupervised ML algorithms (scikit‑learn, XGBoost, LightGBM).
- Proficiency in Python, PyTorch/TensorFlow, and related ML libraries.
- Experience with data pipelines, ETL, and big‑data technologies (Spark, Hive).
- Knowledge of security best practices, model governance, and compliance.
- Excellent problem‑solving and communication skills.
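The tokenization and embedding fundamentals listed above reduce to mapping text to integer token IDs and then to vectors. A deliberately minimal pure-Python sketch; the vocabulary and 4-dimensional embedding table are toy assumptions, whereas production LLMs use learned subword tokenizers (BPE/WordPiece) with vocabularies of tens of thousands of entries:

```python
import random

# Toy vocabulary: word -> integer token ID (illustrative, not a real tokenizer).
vocab = {"<unk>": 0, "the": 1, "market": 2, "moved": 3, "up": 4}

def tokenize(text: str) -> list[int]:
    """Whitespace tokenizer; unknown words map to <unk> (ID 0)."""
    return [vocab.get(w, 0) for w in text.lower().split()]

# Toy embedding table: one 4-dimensional vector per token ID.
random.seed(42)
embedding_table = [[random.uniform(-1, 1) for _ in range(4)] for _ in vocab]

def embed(token_ids: list[int]) -> list[list[float]]:
    """Look up a vector for each token ID."""
    return [embedding_table[i] for i in token_ids]

ids = tokenize("The market moved up")
vectors = embed(ids)
```

A real embedding table is a trained model parameter, but the lookup mechanics are exactly this: token ID in, dense vector out.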
**Required Education & Certifications**
- Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, Data Science, or a related field.
- Relevant certifications (e.g., AWS Certified Machine Learning – Specialty, TensorFlow Developer Certificate, or equivalent) are a plus.