SMRT

Senior/Executive, Ops Analysis & Technology

Public Transport & Rail Operations · Singapore, SG · Onsite · Posted 4 weeks ago

About the role


This senior role focuses on data engineering and business intelligence for railway infrastructure maintenance. The incumbent will design and manage Azure-based data pipelines, develop Power BI dashboards and data models, and collaborate with engineering teams to support predictive and condition-based maintenance. Responsibilities include ensuring data quality, optimising system performance, mentoring junior staff, and translating operational needs into analytics solutions.


Key Responsibilities

  • Design, implement, and operate production-grade data pipelines on the Azure platform (ADF, Synapse/Fabric, ADLS, Functions)
  • Take full ownership of existing data solutions (including CDE integrations, SharePoint/REST ingestions, and scheduled jobs), maintaining comprehensive documentation, runbooks, and recovery procedures
  • Enhance and manage Power BI data models and dashboards, providing timely and accurate information to engineering and operational teams
  • Establish rigorous monitoring, alerting, and capacity planning for all datasets and pipelines to achieve ≥99% on-time refresh for Tier-1 reports
  • Collaborate with Railway Infrastructure Engineers to support fault diagnostics and mine reliability trends to inform maintenance programs
  • Translate operational needs into actionable dashboards, alerts, and self-service analytics tools
  • Work closely with operations teams to understand their challenges, with future opportunities for cross-training and shadowing to gain deep domain expertise
  • Manage projects end-to-end including requirements gathering, design, build, testing, and handover with clear timelines and stakeholder communication
  • Collaborate with various teams to align initiatives with sustainability and compliance goals and research emerging maintenance and green technologies
  • Support the implementation of sustainability projects by developing data models and dashboards to monitor performance metrics

Requirements

  • A degree in Engineering (Mechanical, Electrical, Civil), Computer Science, Information Systems, or equivalent professional experience
  • 3–6 years of hands-on experience in data engineering or analytics engineering, with proven ownership of production pipelines and BI models
  • Prior experience in maintenance, rail, asset-intensive, or other industrial operations is strongly preferred
  • Advanced proficiency in SQL, including performance tuning, partitioning, indexing, and complex queries
  • Strong understanding of dimensional modelling concepts, such as star schemas and slowly changing dimensions (SCDs)
  • Proficient in Python for data engineering, including libraries such as pandas, PySpark, and data quality frameworks
  • Expert-level skills in DAX, Power Query (M), composite models, incremental refresh, RLS/OLS, deployment pipelines, and gateway administration
  • Comfortable working with Azure data services, including Azure Data Factory (ADF), Synapse/Fabric (SQL Warehouse or Lakehouse), ADLS, Key Vault, and Azure Functions/containers
  • Experience with version control (Git) and CI/CD principles (e.g., YAML pipelines)
  • Excellent documentation and communication skills, with the ability to create clear diagrams, runbooks, and concise status updates
  • Proficiency in Microsoft Excel for ad-hoc analysis and stakeholder reporting