About the role
This is a data engineering internship focused on ETL processes and Python development at a technology company. The intern will assist in building and maintaining data pipelines, performing data transformations, and ensuring data quality.
Industrial · Full-time · Engineering Services
Key Responsibilities
- Design, develop, and maintain ETL pipelines to extract, transform, and load data from various sources into data warehouses.
- Write efficient Python scripts for data processing and automation tasks.
- Collaborate with data engineers and analysts to understand data requirements and implement solutions.
- Perform data validation and quality checks to ensure accuracy and consistency.
- Document data pipelines, processes, and technical specifications.
- Assist in optimizing existing data workflows for performance and scalability.
- Participate in code reviews and team meetings to contribute to best practices.
Requirements
- Currently pursuing a Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
- Strong proficiency in Python programming.
- Solid understanding of SQL and relational databases.
- Familiarity with ETL concepts and data warehousing principles.
- Excellent problem-solving and analytical skills.
- Strong written and verbal communication skills.
- Ability to work independently and as part of a team.
- Prior internship or project experience in data engineering is a plus.
- Knowledge of cloud platforms (e.g., AWS, GCP, Azure) is preferred.
- Experience with version control systems (e.g., Git) is a plus.