- Company Name: Doximity, Inc.
- Job Title: Software Engineer (Python), Data Platform
- Job Description:
**Job Title:** Software Engineer (Python), Data Platform
**Role Summary:**
Develop and maintain standardized tools, frameworks, and libraries that support a horizontally scalable data stack. Work closely with data analysts, data engineers, and machine learning engineers to design and implement solutions that reduce technical bottlenecks and enable faster data-driven innovation.
**Expectations:**
- Advanced experience in Python programming with a strong grasp of language internals.
- Proficiency in SQL and analytical database concepts.
- Deep knowledge of containerized development workflows (Docker, Kubernetes, Podman) and cloud services (AWS, Snowflake).
- Experience with data orchestration and streaming platforms such as Apache Airflow and Kafka.
- Proven track record of delivering high‑quality, automatically tested code in a fast‑moving environment.
- Strong self‑management, prioritization, and communication skills.
**Key Responsibilities:**
1. Design, build, and evolve internal tooling and automation libraries used by the entire data organization.
2. Collaborate with cross‑functional teams to understand current challenges and future needs, translating them into technical specifications.
3. Implement solutions that maximize the time data professionals spend on analytics and model development, while minimizing unrelated technical work.
4. Champion and enforce software engineering best practices, including code reviews, automated testing, CI/CD, and documentation.
5. Serve as the last line of defense for diagnosing and resolving development, performance, and stability issues affecting data pipelines and analytics workloads.
6. Continuously improve the engineering culture within the data organization through mentorship, knowledge sharing, and process improvements.
7. Deploy and manage infrastructure using container orchestration (Kubernetes), cloud services (AWS), and data platforms (Snowflake, Airflow, Kafka).
**Required Skills:**
- Python (expert level)
- SQL and familiarity with analytical databases (e.g., Snowflake, Redshift)
- Containerization and orchestration: Docker, Kubernetes, Podman
- Cloud platforms: AWS (EC2, S3, RDS, etc.)
- Data orchestration and streaming: Apache Airflow, Kafka
- Automated testing (unit, integration) and CI/CD pipelines
- Strong understanding of software engineering principles and best practices
- Excellent written and verbal communication
- Self‑driven, goal‑oriented, and able to manage multiple priorities
**Required Education & Certifications:**
- Bachelor’s degree or higher in Computer Science, Software Engineering, or a related technical field.
- Relevant certifications (AWS Certified Developer, Certified Kubernetes Administrator, Snowflake Certified Technical Specialist, etc.) are a plus but not mandatory.
San Francisco, United States
On-site
28-11-2025