**Company Name**
Zeta Global
**Job Title**
Lead Software Engineer – Data Connectivity
**Role Summary**
Lead design, development, and maintenance of scalable data‑connectivity systems for a marketing platform. Own architecture of Python microservices, high‑throughput ETL pipelines, and connector frameworks that integrate external marketing data sources into the core datastore. Drive engineering excellence, mentor team members, and collaborate across product, security, and platform functions to deliver reliable, high‑performance data pipelines and APIs.
**Expectations**
* 8+ years of backend development experience, including 2+ years in technical leadership roles.
* Deliver enterprise-grade services that meet SLAs, RPO/RTO targets, and data-quality requirements.
* Foster a culture of observability, performance optimization, and continuous improvement.
* Mentor junior engineers and facilitate technical reviews.
**Key Responsibilities**
* Design, implement, and operate Python‑based microservices and ETL/ELT pipelines.
* Embed observability: structured logging, metrics, distributed tracing; define SLOs, error budgets, incident runbooks.
* Conduct design/code reviews and optimize performance and cost.
* Build reusable connector frameworks for integrations with ad tech platforms (e.g., Facebook Ads, Google Ads) and enterprise systems.
* Ensure modularity, versioning, and backward compatibility in connector design (a minimal interface sketch follows this list).
* Collaborate with Product, SecOps, and Platform teams to deliver solutions on Kubernetes, message buses, and workflow orchestrators.
* Evaluate design proposals from team members.
* Mentor and provide technical guidance to junior engineers.
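For context on the connector responsibilities above, here is a minimal sketch of what a versioned, backward-compatible connector contract might look like. All names (`BaseConnector`, `FacebookAdsConnector`, `Record`) are hypothetical illustrations for this posting, not Zeta Global's actual framework.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Any, Iterator


@dataclass
class Record:
    """A single normalized record pulled from an external source."""
    source: str
    payload: dict[str, Any]


class BaseConnector(ABC):
    """Hypothetical base class: every integration implements the same
    contract, so pipelines can treat all sources uniformly."""

    #: Bump on breaking schema changes; consumers can pin a major version.
    schema_version: str = "1.0"

    @abstractmethod
    def fetch(self, since: str) -> Iterator[Record]:
        """Yield records changed since an ISO-8601 timestamp."""


class FacebookAdsConnector(BaseConnector):
    schema_version = "2.1"

    def fetch(self, since: str) -> Iterator[Record]:
        # A real connector would page through the Marketing API here.
        yield Record(source="facebook_ads", payload={"since": since})


def run(connector: BaseConnector, since: str) -> int:
    """Drive any connector through the shared contract."""
    return sum(1 for _ in connector.fetch(since))
```

Exposing an explicit `schema_version` is one way downstream consumers can detect breaking changes instead of failing silently.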
**Required Skills**
* Strong proficiency in Python; experience in Go or Java acceptable.
* Distributed systems knowledge (Kafka, Kinesis, etc.).
* Deep understanding of ETL/ELT, data modeling, and connector architecture.
* Cloud expertise: AWS (Lambda, ECS/EKS, S3), GCP, or Azure.
* REST/gRPC API design, OAuth, webhook integrations.
* Experience building self-healing, high-throughput pipelines (see the retry sketch after this list).
* Familiarity with query engines (Trino, Athena, Spark).
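As an illustration of the "self-healing pipeline" and observability skills listed above, here is a small sketch of retry-with-backoff plus structured logging. It assumes a hypothetical `fetch` callable and uses only the standard library; it is one common pattern, not a prescribed implementation.

```python
import logging
import random
import time

# Structured-ish JSON log lines so a log aggregator can parse severity.
logging.basicConfig(format='{"level": "%(levelname)s", "msg": "%(message)s"}')
log = logging.getLogger("pipeline")


def fetch_with_retry(fetch, max_attempts: int = 5, base_delay: float = 0.5):
    """Retry a flaky fetch with exponential backoff and jitter, a common
    building block of self-healing pipelines."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except (ConnectionError, TimeoutError) as exc:
            if attempt == max_attempts:
                log.error("giving up after %d attempts: %s", attempt, exc)
                raise
            # Exponential backoff with jitter avoids thundering herds.
            delay = base_delay * 2 ** (attempt - 1) * (1 + random.random())
            log.warning("attempt %d failed (%s); retrying in %.1fs",
                        attempt, exc, delay)
            time.sleep(delay)
```

In production this would typically be paired with dead-letter queues and circuit breakers rather than in-process sleeps alone.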
**Nice to Have**
* Experience with lakehouse technologies (Iceberg, Delta Lake).
* Temporal or similar workflow orchestration tools.
**Required Education & Certifications**
* Bachelor’s degree in Computer Science, Software Engineering, or related field, or equivalent professional experience.
* No specific certifications required, but relevant cloud or distributed systems certifications are a plus.
San Francisco, United States
On-site
Senior
24-12-2025