Job Description

Role & Responsibilities
Lead the GCP pillar within the Data Engineering CoE, establishing technical standards, best practices, and reusable accelerators for Google Cloud Platform data implementations. This role is critical for supporting high-value client engagements and other GCP-focused opportunities.

Key Responsibilities
- Develop architecture patterns and implementation accelerators for GCP data platforms
- Establish best practices for BigQuery, Dataflow, Dataproc, and other GCP data services
- Support pre-sales activities for GCP-based opportunities
- Design migration pathways from legacy systems to GCP
- Create technical documentation and playbooks for GCP implementations
- Mentor junior team members on GCP best practices
- Work with cloud-agnostic platforms (Databricks, Snowflake) in GCP environments
- Build deep expertise in enterprise-scale GCP deployments
- Collaborate with other pillar architects on cross-platform solutions
- Represent the company's GCP capabilities in client engagements

Qualifications
- 10+ years of data engineering experience, with at least 5 years focused on GCP
- Deep expertise in BigQuery, Dataflow, Dataproc, and Cloud Storage
- Experience implementing enterprise-scale data lakes on GCP
- Strong knowledge of data integration patterns and ETL/ELT frameworks in GCP
- Experience with migration from legacy systems to GCP
- Google Cloud Professional Data Engineer certification required
- Experience developing reusable templates and accelerators
- Strong coding skills in Python, Spark, and SQL
- Experience with Terraform, Deployment Manager, or other infrastructure-as-code tools
- Excellent communication skills with the ability to explain complex technical concepts
Job Title
GCP Data Engineering Architect