About the role
Senior Associate Data Engineer role at EY's AI & Data consulting practice in Asia-Pacific, focusing on designing and implementing scalable data pipelines on Databricks across public clouds. The role involves coding complex data platforms, integrating with governance tools, and collaborating with stakeholders to deliver end-to-end data solutions.
Business · Full-time · General
Key Responsibilities
- Design data pipelines with scalability, high performance, and resilience built in
- Code complex requirements as part of large data platforms; develop, refine, and implement high-quality deliveries
- Deliver project deployments on public clouds and in highly secure environments
- Integrate with data governance tools and ecosystem products across the cloud landscape
- Understand business requirements and translate them into high-level and low-level designs that meet the business objectives
- Collaborate with stakeholders, presenting findings to a non-technical audience
- Bring best practices for coding and build modular, reusable data engineering code
- Stay current with technical and industry developments and standards, including AI integrations, to ensure effective and advanced application of data analysis techniques and methodologies
Requirements
- Minimum of a Bachelor's degree in Computer Science, Mathematics, Engineering, Statistics, or a related field
- Databricks and Azure data engineering certifications
- At least 4 years of experience with data platform solutions across data and analytics, spanning the entire SDLC
- At least 2 years' experience designing and mapping requirements to technical capabilities of data pipelines on Databricks across Azure, AWS, or GCP
- Proficiency in Python, Spark, Scala, and other programming languages
- Understanding of end-to-end data analytics domain and insights delivery
- Excellent understanding of data modelling techniques and leading trends on data platforms including AI
- Possess strong problem-solving, analytical and strategic thinking abilities
- Proficiency with agile boards such as Jira and other Atlassian products
- Strong hands-on knowledge and project experience in technical delivery of data platform projects on Databricks across any of Azure, Google Cloud, or AWS
- Must have hands-on experience with ELT, data pipeline and orchestration tools.
- Experience in end-to-end technical project delivery following agile delivery methodologies
- Understanding of data modelling techniques and data governance tools is an added advantage
- Experience in estimation, infrastructure sizing for data platform solutions