About the role
At the forefront of the AI revolution, you will architect and build a next-generation, enterprise-wide data platform on Google Cloud. This role moves beyond traditional data engineering to create an intelligent, automated system that serves as the foundation for critical AI and analytics initiatives.
Consulting · Hybrid · Engineering, AI & Data
Key Responsibilities
- Design and implement a modern, scalable, and cost-optimized cloud data platform on Google Cloud Platform (GCP).
- Develop robust data pipelines to power self-service analytics, Agentic AI, and advanced machine learning use cases.
- Utilize advanced analytics and cognitive technologies to uncover hidden insights and solve critical business challenges.
- Engineer automated data governance, lineage, and quality frameworks to ensure data reliability.
- Guide the technical direction of the platform, champion best practices, and mentor engineering team members.
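To illustrate the kind of automated data-quality framework the responsibilities describe, here is a minimal sketch. The rule names, fields, and helper functions are hypothetical, not taken from this posting; in a real platform, rules would typically be stored as metadata and executed by an orchestrator such as Cloud Composer, with failures surfaced to monitoring rather than collected in a list.

```python
from typing import Any, Callable, Dict, List, Tuple

# A quality rule is a name plus a row-level predicate.
Rule = Tuple[str, Callable[[Dict[str, Any]], bool]]

# Hypothetical example rules; a production framework would load these
# from a metadata store rather than hard-coding them.
RULES: List[Rule] = [
    ("non_null_id", lambda row: row.get("id") is not None),
    ("positive_amount",
     lambda row: isinstance(row.get("amount"), (int, float))
     and row["amount"] > 0),
]

def check_rows(rows: List[Dict[str, Any]]) -> List[Tuple[int, str]]:
    """Apply every rule to every row; return (row_index, rule_name)
    pairs for failures instead of raising, so a scheduler can report
    them as a batch."""
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in RULES:
            if not predicate(row):
                failures.append((i, name))
    return failures
```

Keeping rules declarative like this is what makes governance "automated": new checks become data changes rather than pipeline code changes.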
Requirements
- Proven experience designing and building enterprise-scale, cloud-native data platforms.
- Deep expertise in GCP is required; strong AWS experience is also valued.
- Mastery of building robust, metadata-driven ingestion pipelines for diverse data types (Batch, Streaming, API, IoT/Geospatial) using Python.
- Strong expertise in data modeling (Dimensional/Star Schema, 3NF) and building performant data marts.
- Hands-on experience with core GCP services (Google BigQuery, Dataflow, Cloud Composer, Pub/Sub, Dataplex).
- Experience implementing enterprise data governance frameworks including automated lineage and access controls.
- Proficiency with FinOps principles for cost-efficient architecture.
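As a sketch of the metadata-driven ingestion style named in the requirements: each source is described by a small config record, and one registry dispatches it to the right loader. All names here (config fields, table names, loader functions) are illustrative assumptions, not from this posting.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class SourceConfig:
    """Hypothetical source definition; in a real platform this metadata
    would live in a catalog (e.g. a BigQuery table or Dataplex)."""
    name: str
    kind: str          # "batch", "streaming", or "api"
    target_table: str

def load_batch(cfg: SourceConfig) -> str:
    return f"batch load {cfg.name} -> {cfg.target_table}"

def load_streaming(cfg: SourceConfig) -> str:
    return f"subscribe {cfg.name} -> {cfg.target_table}"

def load_api(cfg: SourceConfig) -> str:
    return f"poll API {cfg.name} -> {cfg.target_table}"

# One registry maps source kinds to loaders, so onboarding a new
# source is a metadata change, not a new hand-written pipeline.
LOADERS: Dict[str, Callable[[SourceConfig], str]] = {
    "batch": load_batch,
    "streaming": load_streaming,
    "api": load_api,
}

def run_pipelines(configs: List[SourceConfig]) -> List[str]:
    return [LOADERS[cfg.kind](cfg) for cfg in configs]
```

The dispatch pattern is the point: batch, streaming, and API sources share one control plane, which is what lets a small team operate many pipelines.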