**Company:** Beacon Hill
**Job Title:** Integration Developer
**Role Summary:**
Seasoned software engineer responsible for designing, building, and maintaining scalable data integration solutions, APIs, and microservices in a fully remote, contract‑to‑hire position. Works within an Agile, cross‑functional team to deliver cloud‑based data pipelines and support production operations for healthcare and retail applications.
**Expectations:**
- 5+ years of integration/software engineering experience.
- U.S. citizen or Green Card holder (no sponsorship).
- Fully remote work, Monday‑Friday business hours (flexible).
- Contract‑to‑hire (90‑day trial) with potential transition to full‑time ($125K‑$135K target salary).
- Ability to independently manage deliverables, stakeholder relationships, and mentor junior engineers.
**Key Responsibilities:**
- Develop integration processes and APIs (e.g., with FastAPI) using Python or a comparable scripting language.
- Build microservices and streaming solutions across multiple languages.
- Design, implement, and maintain high‑volume ETL/ELT pipelines into data warehouses or data lakes.
- Implement data validation, quality checks, and MongoDB‑based storage solutions.
- Apply CI/CD, observability, automated testing, and containerization (Docker/Kubernetes) to production workloads.
- Document workflows, identify efficiency improvements, and contribute to AI‑powered automation tools.
- Participate in code reviews, architecture discussions, Agile ceremonies, and cross‑team collaboration.
- Provide technical guidance to junior engineers and align solutions with product requirements.
**Required Skills:**
- Python scripting for API integration and data processing (strong proficiency).
- SQL and NoSQL (MongoDB) data modeling and query optimization.
- Designing and operating high‑throughput data pipelines (ETL/ELT).
- Experience with big‑data and cloud platforms (GCP, AWS, or Azure).
- Real‑time/streaming technologies (e.g., Kafka, Pub/Sub).
- Container orchestration (Docker, Kubernetes) and DevOps practices (Git, CI/CD).
- Requirement gathering, stakeholder management, and independent delivery.
- Familiarity with reporting/visualization tools (Tableau, Power BI) and analytics concepts.
- Preferred: machine‑learning libraries (scikit‑learn, TensorFlow, PyTorch), GCP data engineering services, vulnerability scanning, SAFe/Agile methodologies.
**Required Education & Certifications:**
- Bachelor’s degree in Computer Science, Software Engineering, Information Systems, or related field (or equivalent professional experience).
- Relevant certifications (e.g., Google Cloud Professional Data Engineer, AWS Certified Solutions Architect, Azure Data Engineer) are a plus but not mandatory.