Micron Technology

SR ENGINEER, DATA ENGINEERING, SMAI

Integrated Device Manufacturing · Singapore, Singapore · Full-time · 3 weeks ago

About the role

Senior Data Engineer responsible for building and maintaining data pipelines, ETL processes, and data structures to support Smart Manufacturing AI solutions at Micron. Role involves working with cloud platforms (Snowflake, GCP), big data technologies, and AI-enabled analytics to extract insights from expanding data streams. Position requires strong technical background in data engineering, analytics, and software development with ability to work in a fast-paced, collaborative environment.

Key Responsibilities

  • Build and maintain data and solution pipelines that ingest data and adapt to new requirements, with AI support to accelerate data onboarding and quality checks
  • Work in a technical team to develop, deploy, and optimize Micron's methods and systems for extracting new insights from expanding data streams, driving AI-enabled analytics at scale
  • Develop, automate, and orchestrate an ecosystem of ETL processes for varying volumes of data
  • Design and optimize data structures in data management systems (Snowflake and Google Cloud Platform) to enable Smart Manufacturing AI solutions, supporting the performance of AI-driven workloads
  • Determine transformation requirements and develop processes to bring structured and unstructured data from the source to a new physical data model
  • Build custom software components and analytics applications, with AI support to enhance intelligence and automation
  • Create and maintain CI/CD pipelines for data engineering solutions in the cloud, driving AI-informed testing, monitoring, and release automation
  • Use AI tools to work more efficiently, helping AI applications and teams achieve faster, smarter outcomes
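To illustrate the kind of ETL work described above, here is a minimal, self-contained extract–transform–load sketch in Python. The record fields, range check, and dict-based "warehouse" are hypothetical illustrations only; production pipelines of this kind would typically target tools such as Apache NiFi, Snowflake, or BigQuery.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    tool_id: str
    value: float

def extract(raw_rows):
    """Extract: parse raw CSV-like rows into typed records."""
    return [SensorReading(tool_id=r[0], value=float(r[1])) for r in raw_rows]

def transform(readings, max_value=100.0):
    """Transform: drop out-of-range readings (a simple data-quality check)."""
    return [r for r in readings if 0.0 <= r.value <= max_value]

def load(readings, warehouse):
    """Load: append validated records to a target table (a plain dict here)."""
    warehouse.setdefault("readings", []).extend(readings)
    return warehouse

raw = [("etch01", "42.5"), ("etch02", "999.0"), ("litho03", "17.2")]
wh = load(transform(extract(raw)), {})
print(len(wh["readings"]))  # → 2 (the out-of-range reading is filtered out)
```

In a real pipeline each stage would be a separately scheduled, monitored task (e.g., in an orchestrator), but the extract/transform/load separation shown here is the core structure.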

Requirements

  • Experience developing, delivering, and/or supporting data engineering, advanced analytics, or business intelligence solutions
  • Ability to work with multiple operating systems (e.g., Windows, Unix, Linux)
  • Experience developing ETL/ELT processes using Apache NiFi and Snowflake, GCP BigQuery, or an equivalent data warehouse
  • Significant experience with big data processing and/or developing applications and data sources via Hadoop, YARN, Hive, Pig, Sqoop, MapReduce, HBase, Flume, etc.
  • Understanding of how distributed systems work
  • Familiarity with software architecture (data structures, data schemas, etc.)
  • Strong working knowledge of databases (Oracle, MSSQL, etc.) including SQL and NoSQL
  • Strong mathematics background and strong analytical, problem-solving, and organizational skills
  • Knowledge of building APIs for application integration
  • Experience with continuous integration/continuous delivery (CI/CD) tools (Jenkins, Git, Docker, Kubernetes)
  • Outstanding analytical thinking, interpersonal, oral and written communication skills
  • Ability to prioritize and meet critical project timelines in a fast-paced environment
  • Self-motivated and team oriented
  • B.S. degree in Computer Science, Software Engineering, Electrical Engineering, Applied Mathematics or related field of study. M.S. degree preferred
  • Minimum of 3 years' experience in any of the following: at least one high-level, object-oriented language (e.g., C#, C++, Java, Python, Perl); one or more web programming languages (e.g., PHP, Python, Perl, JavaScript, ASP); one or more data extraction tools (e.g., SSIS, Informatica, Apache NiFi, or equivalent)
  • Software development skills and the desire to work on cutting-edge development in a cloud environment
  • Ability to travel as needed