- Company Name
- Centraprise
- Job Title
- AWS Cloud Engineer
- Job Description
-
Role Summary:
Design, develop, and maintain scalable data solutions on AWS, encompassing ingestion, transformation, storage, and analytics. Utilize AWS data services, Databricks, and automation tools to build efficient ETL/ELT pipelines, ensuring data quality, performance, and compliance. Collaborate with cross-functional teams to enable data-driven insights.
Expectations:
* Build and optimize AWS-based data pipelines for peak performance.
* Deliver secure, cost-effective, and compliant data infrastructure.
* Automate repetitive tasks using IaC and scripting.
* Maintain high availability with robust monitoring and alerting.
* Work in a CI/CD, Agile environment.
Key Responsibilities:
* Design, develop, and tune ETL/ELT pipelines using S3, Glue, Redshift, Athena, Lambda, Step Functions, Kinesis, and Apache Airflow.
* Create and manage data models and schemas in relational (RDS) and NoSQL (DynamoDB) databases.
* Integrate data from relational sources, APIs, and streaming feeds, ensuring consistency and quality.
* Build and manage data infrastructure with Terraform, AWS CloudFormation, and other IaC.
* Automate deployment and operations using Python, Bash, and CI/CD tools (GitLab).
* Implement security controls, encryption, and compliance with GDPR, CCPA, ITAR, etc.
* Monitor pipeline health, performance, and cost; set up alerts and troubleshoot incidents.
* Collaborate with data scientists, analysts, and engineers to translate business requirements into technical solutions.
Required Skills:
* Proficient with AWS data services: S3, Glue, Redshift, Athena, Lambda, Step Functions, Kinesis, RDS, DynamoDB.
* Strong programming skills in Python, Scala, or PySpark for data processing and automation.
* Expertise in SQL; experience with relational and NoSQL databases.
* Skilled in data pipeline design, performance tuning, and cost optimization.
* Knowledge of Databricks Unity Catalog, PySpark, and Apache Airflow.
* Experience with IaC (Terraform, CloudFormation), scripting (Python, Bash), and CI/CD pipelines (GitLab).
* Familiarity with monitoring (CloudWatch, Grafana, ELK) and alerting frameworks.
* Understanding of security best practices, encryption, and regulatory compliance.
* Strong communication and collaboration in Agile teams.
Required Education & Certifications:
* Bachelor’s degree in Computer Science, Engineering, or related field.
* AWS Certified Solutions Architect – Associate or equivalent cloud certification preferred.
* Experience with CI/CD, Agile methodologies, and data engineering tools.