Job Specifications
Hybrid in Charlotte, NC, 2-3 days/week on site
1-year contract with potential for extension or full-time conversion
W-2 only (C2C/1099 is not possible for this role)
REQUIRED EXPERIENCE:
5+ years of AWS experience
AWS services: S3, EMR, Glue Jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions, QuickSight
Experience with Kafka/Messaging, preferably Confluent Kafka
Experience with AWS data stores and catalog services such as the Glue Data Catalog, Lake Formation, Redshift, DynamoDB, and Aurora
Experience with AWS data warehousing tools such as Amazon Redshift and Amazon Athena
Proven track record in the design and implementation of data warehouse solutions using AWS
Skilled in data modeling and executing ETL processes tailored for data warehousing
Competence in developing and refining data pipelines within AWS
Proficient in handling both real-time and batch data processing tasks
Extensive understanding of database management fundamentals
Expertise in creating alerts and automated solutions for handling production problems
Tools and Languages: Python, Spark, PySpark, and Pandas
Infrastructure as Code technology: Terraform/CloudFormation
Experience with secrets management platforms such as HashiCorp Vault and AWS Secrets Manager
Experience with event-driven architecture
DevOps pipeline (CI/CD): Bitbucket; Concourse
Experience with RDBMS platforms and strong proficiency with SQL
Experience with REST APIs and API Gateway
Deep knowledge of IAM roles and policies
Experience using AWS monitoring services such as CloudWatch, CloudTrail, and CloudWatch Events
Deep understanding of networking fundamentals: DNS, TCP/IP, and VPNs
Experience with AWS workflow orchestration tools such as Airflow or Step Functions
RESPONSIBILITIES:
Where applicable, collaborate with Lead Developers (Data Engineers, Software Engineers, Data Scientists, Technical Test Leads) to understand requirements and use cases, outline the technical scope, and lead delivery of the technical solution
Confirm required developers and skill sets specific to the product
Collaborate with Data and Solution architects on key technical decisions
Develop data pipelines with a focus on long-term reliability and high data quality
Design data warehousing solutions with the end-user in mind, ensuring ease of use without compromising on performance
Manage and resolve issues in production data warehouse environments on AWS
CORE EXPERIENCE AND ABILITIES:
Ability to perform hands-on development and peer review for certain components/tech stack on the product
Stand up development instances and define migration paths (with required security and access/roles)
Develop components and related processes (e.g., data pipelines and associated ETL processes, workflows)
Ability to build new data pipelines, identify existing data gaps, and provide automated solutions to deliver analytical capabilities and enriched data to applications
Ability to implement data pipelines with appropriate attention to durability and data quality
Implement data warehousing products with the end user's experience in mind (ease of use with the right performance)
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state, and local laws.
About the Company
Here at Brooksource, relationships are at the center of everything we do. Since 2000, we have established and maintained lasting relationships with our clients, consultants, and internal employees to create an unparalleled experience.
Brooksource is a trusted provider of Engineering & Technology solutions for Fortune 500 organizations, specializing in Experience-Driven Staffing, Professional Services, and our innovative Workforce Transformation program, Elevate. Leveraging our partnerships with Salesforce, AWS, Microsoft, ...