**Company Name:** Aker Systems
**Job Title:** Lead Software Engineer (Scala & Spark)
**Job Description:**
**Role Summary:**
Lead the design, development, and maintenance of secure, high-performance, cloud-based data infrastructure. Own solutions end to end in Scala and Apache Spark, ensuring they remain performant, scalable, and compliant with enterprise security standards.
**Expectations:**
- Design and architect distributed data pipelines and services that meet strict security and performance benchmarks (a minimal batch-pipeline sketch in Scala follows this list).
- Deliver production‑grade Scala/Spark code within defined timelines.
- Collaborate with product, data science, and DevOps teams to align technical solutions with business objectives.
- Mentor junior engineers and champion best practices in coding, testing, and deployment.
- Monitor system health, troubleshoot, and optimize for latency, throughput, and cost.
- Participate in code reviews, security audits, and continuous improvement initiatives.
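To make the Scala/Spark expectations above concrete, here is a minimal sketch of the kind of batch pipeline this role owns: a pure `transform` function kept separate from I/O so it can be unit-tested, writing partitioned Parquet. All paths, column names, and the schema are illustrative placeholders rather than Aker Systems specifics.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

// Minimal batch pipeline sketch: read raw order events, aggregate revenue per
// customer and day, write partitioned Parquet. Paths and columns are placeholders.
object DailyRevenueJob {

  // Pure transformation, kept free of I/O so it can be unit-tested in isolation.
  def transform(events: DataFrame): DataFrame =
    events
      .filter(col("status") === "COMPLETED")
      .groupBy(col("customer_id"), to_date(col("event_ts")).as("event_date"))
      .agg(sum("amount").as("daily_revenue"), count("*").as("order_count"))

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-revenue-job")
      .getOrCreate()

    val events = spark.read.parquet("s3://example-bucket/raw/orders/") // placeholder input

    transform(events)
      .repartition(col("event_date"))      // one task (and file set) per output partition
      .write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/curated/daily_revenue/")           // placeholder output

    spark.stop()
  }
}
```

Keeping the transformation pure is also what makes the unit-test sketch at the end of this posting straightforward.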
**Key Responsibilities:**
- Develop scalable batch and streaming data pipelines in Scala using Spark (see the streaming sketch after this list).
- Implement secure data handling, encryption, and access controls in accordance with enterprise policies.
- Architect cloud deployments (AWS, Azure, or GCP) leveraging managed services, container orchestration, and infrastructure as code (IaC).
- Optimize Spark jobs for memory usage, shuffle performance, and overall runtime (see the tuning sketch after this list).
- Ensure code quality through automated unit, integration, and performance tests.
- Deploy and monitor services via CI/CD pipelines, observability tools, and alerting systems.
- Conduct architectural reviews, propose enhancements, and stay current with emerging data‑engineering technologies.
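For the streaming and secure-data-handling bullets above, a rough Structured Streaming sketch follows: it reads JSON events from Kafka, pseudonymises a PII column before it reaches storage, and appends to a checkpointed sink. The broker address, topic, schema, and paths are assumptions, the spark-sql-kafka connector is assumed to be on the classpath, and the `sha2` hash stands in for whatever encryption or tokenisation enterprise policy actually mandates.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

// Structured Streaming sketch: Kafka -> parse JSON -> mask PII -> Parquet sink.
// Topic, schema, broker, and paths are illustrative placeholders.
object CustomerEventsStream {

  private val eventSchema = new StructType()
    .add("customer_email", StringType)
    .add("event_type", StringType)
    .add("amount", DoubleType)
    .add("event_ts", TimestampType)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("customer-events-stream")
      .getOrCreate()

    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092") // placeholder
      .option("subscribe", "customer-events")           // placeholder topic
      .load()

    val parsed = raw
      .select(from_json(col("value").cast("string"), eventSchema).as("e"))
      .select("e.*")
      // Pseudonymise PII before persistence; a real deployment would apply the
      // enterprise-approved encryption or tokenisation scheme instead of a bare hash.
      .withColumn("customer_email", sha2(col("customer_email"), 256))

    val query = parsed.writeStream
      .format("parquet")
      .option("path", "s3://example-bucket/curated/customer_events/")                   // placeholder
      .option("checkpointLocation", "s3://example-bucket/checkpoints/customer_events/") // placeholder
      .outputMode("append")
      .start()

    query.awaitTermination()
  }
}
```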
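The optimisation bullet above usually comes down to a handful of levers. The sketch below shows three common ones on placeholder tables: broadcasting a small dimension table so the large side of a join is not shuffled, repartitioning by the aggregation key to control shuffle sizing, and caching a DataFrame that is reused by more than one action. Table locations, column names, and the partition count of 200 are illustrative only.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Common Spark tuning levers, shown on placeholder tables.
object TuningSketch {

  def run(spark: SparkSession): Unit = {
    val orders    = spark.read.parquet("s3://example-bucket/curated/orders/")    // large fact table (placeholder)
    val customers = spark.read.parquet("s3://example-bucket/curated/customers/") // small dimension (placeholder)

    val enriched = orders
      .join(broadcast(customers), Seq("customer_id")) // broadcast join: the large side is not shuffled
      .repartition(200, col("customer_id"))           // explicit shuffle sizing ahead of the aggregations
      .cache()                                        // reused by both aggregations below

    enriched.groupBy("customer_id")
      .agg(sum("amount").as("total_spend"))
      .write.mode("overwrite")
      .parquet("s3://example-bucket/marts/spend_by_customer/")   // placeholder output

    enriched.groupBy("country")
      .agg(count("*").as("order_count"))
      .write.mode("overwrite")
      .parquet("s3://example-bucket/marts/orders_by_country/")   // placeholder output

    enriched.unpersist()
  }
}
```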
**Required Skills:**
- Deep expertise in Scala and Apache Spark (both batch and streaming).
- Strong background in cloud data services (e.g., AWS Glue, Amazon EMR, Databricks on AWS or Azure, or GCP equivalents such as Dataproc and BigQuery).
- Proficiency with distributed system concepts (partitioning, fault tolerance, reconciliation).
- Secure coding practices, data privacy, and compliance awareness.
- Experience with CI/CD, containerization (Docker, Kubernetes), and IaC (Terraform or equivalent).
- Familiarity with CI pipelines, testing frameworks, and performance-profiling tools (a unit-test sketch follows this list).
- Excellent debugging, profiling, and tuning capabilities.
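To illustrate the testing expectation, the sketch below unit-tests the `transform` function from the earlier batch-pipeline sketch against a local, in-process SparkSession. ScalaTest's `AnyFunSuite` is an assumption (any JVM test framework would do), and the input rows and expected values are made up for the example.

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

// Unit test for a Spark transformation using a local SparkSession.
// Input data and expectations are illustrative.
class DailyRevenueJobSpec extends AnyFunSuite {

  private lazy val spark = SparkSession.builder()
    .master("local[2]")
    .appName("daily-revenue-job-test")
    .getOrCreate()

  test("completed orders are aggregated per customer and day") {
    import spark.implicits._

    val events = Seq(
      ("c1", "COMPLETED", 10.0, "2024-01-01 10:00:00"),
      ("c1", "COMPLETED", 15.0, "2024-01-01 12:00:00"),
      ("c1", "CANCELLED", 99.0, "2024-01-01 13:00:00")
    ).toDF("customer_id", "status", "amount", "event_ts")

    val result = DailyRevenueJob.transform(events).collect()

    assert(result.length == 1)                                  // cancelled order excluded, one customer-day group
    assert(result.head.getAs[Double]("daily_revenue") == 25.0)
    assert(result.head.getAs[Long]("order_count") == 2L)
  }
}
```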
**Required Education & Certifications:**
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
- Professional certifications such as Certified Spark Developer, AWS Big Data – Specialty, Azure Data Engineer, or equivalent are a plus.