Job Specifications
Job Title: GCP Engineer
Location: Tampa, FL; Dallas, TX; Basking Ridge, NJ
Contract Duration: 12+ Months
Key Skills: GCP, Google BigQuery, Informatica, and SSIS
Job Summary:
We are looking for a highly skilled GCP Engineer with expertise in BigQuery and other GCP services to design, implement, and optimize data solutions on Google Cloud Platform. The ideal candidate will have strong experience in data engineering, cloud computing, and large-scale data processing, along with hands-on experience using ETL tools such as SSIS and/or Informatica for migration and integration projects.
Key Responsibilities:
• Design, develop, and optimize BigQuery data warehouses, ensuring high performance and cost efficiency.
• Build and maintain scalable ETL pipelines using GCP services such as Cloud Dataflow, Cloud Dataproc, and Cloud Composer (Airflow).
• Extract, analyze, and migrate existing ETL logic from SSIS and/or Informatica to GCP-native solutions.
• Implement data ingestion, transformation, and processing workflows using Python, SQL, and Spark.
• Work with Cloud Storage, Pub/Sub, and Cloud Functions to facilitate real-time and batch data processing.
• Ensure data security, governance, and compliance using IAM, encryption, and audit logging.
• Collaborate with Data Scientists and Analysts to optimize query performance and enable advanced analytics.
• Troubleshoot performance issues, optimize queries, and manage BigQuery cost and resource utilization.
• Implement CI/CD pipelines for data workflows using Terraform and Cloud Build.
• Monitor and maintain GCP infrastructure, ensuring reliability and scalability.
Required Skills & Experience:
• 8+ years of experience working as a GCP Data Engineer or similar role.
• Strong expertise in Google BigQuery – performance tuning, partitioning, clustering, and cost optimization.
• Hands-on experience with SSIS and/or Informatica for ETL development and migration to cloud platforms.
• Hands-on experience with ETL tools and frameworks, including Cloud Dataflow (Apache Beam), Cloud Dataproc (Spark), and Cloud Composer (Airflow).
• Proficiency in SQL, Python, and Shell scripting for data transformation and automation.
• Experience with Google Cloud Storage, Pub/Sub, and Cloud Functions.
• Strong understanding of GCP IAM, security best practices, and data governance.
• Familiarity with Terraform, Cloud Build, and CI/CD for infrastructure automation.
• Ability to work with large-scale datasets and real-time streaming data pipelines.
• Experience in data modeling, schema design, and optimization techniques.
• Strong analytical and problem-solving skills.
Preferred Qualifications:
• GCP Professional Data Engineer Certification is a plus.
• Experience migrating legacy ETL workloads (SSIS/Informatica) to GCP is highly desirable.
• Experience with Machine Learning workflows on GCP (Vertex AI, AutoML, TensorFlow on AI Platform) is a plus.
• Experience with Kubernetes (GKE) and containerized deployments is a plus.