AIM

aim.vision

1 Job

99 Employees

About the Company

Unlock the ultimate level of safety and productivity in your operations

The AIM platform provides a rugged plug-and-play solution for a wide range of heavy equipment fleets in the field today. We take customers from their current mode of operation to autonomous operation with a rigorous 3-step process.

This solution enables mining and earthmoving equipment to run continuously at peak performance in all weather conditions, unlocking operational and CapEx value through fuel savings, improved duty cycles, greater fleet availability, AI-optimized site planning, and more. The result is maximally safe zero-entry sites, where all ground staff are out of harm's way.

AIM technology was built by a team of earthworks managers and operators, together with engineers who developed autonomous products at Waymo, SpaceX, Google, and Tesla. We bring deep combined expertise across mining, earthmoving, robotics, hardware, and advanced AI deployed at scale.

Listed Jobs

Company Name
AIM
Job Title
Perception & SLAM ML Engineer
Job Description
**Job Title:** Perception & SLAM ML Engineer

**Role Summary**

Design, develop, and deploy machine-learning perception and SLAM systems for autonomous heavy-equipment vehicles operating in dynamic, harsh environments. Advance sensor-fusion algorithms, ensuring high-performance, real-time perception for 2D/3D depth sensing and localization.

**Expectations**

- Deliver production-grade models with robustness, low latency, and safety compliance.
- Maintain a rigorous research-to-deployment pipeline, from algorithm prototyping to field validation.
- Collaborate cross-functionally with hardware, firmware, and operations teams.

**Key Responsibilities**

- Architect and implement perception algorithms and multi-sensor fusion (camera, lidar, radar).
- Develop SLAM, localization, and depth-estimation pipelines suitable for large-scale, real-time deployment.
- Conduct data-driven research to introduce state-of-the-art machine-learning solutions.
- Calibrate, validate, and fine-tune sensor models; perform geometry-based corrections.
- Deploy and monitor models on edge hardware; perform A/B testing and performance tuning.
- Write clean, maintainable code in Python; produce unit and integration tests.
- Maintain documentation of models, calibration procedures, and performance metrics.
- Iterate on models based on field data, feedback loops, and rigorous benchmarking.

**Required Skills**

- Proven track record of deploying ML-based perception systems on autonomous vehicles.
- Strong foundation in machine-learning theory and mathematics.
- Expertise in 2D/3D perception, depth sensing, and point-cloud processing.
- Advanced knowledge of SLAM, localization, and sensor-fusion algorithms.
- Hands-on experience with TensorFlow or PyTorch.
- Proficiency in Python programming; familiarity with C++/CUDA optional.
- Experience shipping production models, handling edge-deployment constraints, and continuous integration.
- Additional skills (preferred): reinforcement learning, robotics, radar perception, perception-sensor calibration techniques.

**Required Education & Certifications**

- Bachelor's degree in Computer Science, Electrical Engineering, Robotics, or a related field (Master's or PhD preferred).
- No mandatory certifications required.
Seattle, United States
Hybrid
23-02-2026