Description:
AECOM is seeking a talented Junior Data Engineer to join our growing team of data professionals.
As a Junior Data Engineer, your primary focus will be on supporting Senior and Principal Engineers throughout design and the wider solution lifecycle.
Role responsibilities:
- Work alongside senior engineers to support the delivery of data solutions and data pipelines.
- Collaborate with other data professionals (Data Analysts, Data Scientists) and contribute to requirements-gathering sessions with internal stakeholders across the organisation and with external clients.
- Support our quality assurance and refinement processes through reviews and hands-on investigations.
- Support workflow orchestration development using cloud tools such as Apache Airflow or Azure Functions (a minimal example follows this list).
- Assist in solution and system health monitoring: track key metrics and troubleshoot deployed solutions and infrastructure resources using logging tools.
- Contribute to user and technical documentation for the solutions and tools we develop and deploy.
- Support compliance with our company-wide data governance activities such as metadata management and access controls.
- Maintain continuous learning and keep up to date with emerging tools, cloud technologies, and industry practices.
- Contribute to team discussions and solution-shaping efforts with a proactive, learning-oriented mindset.
- Participate in technical reviews, learning how to identify and resolve issues.
- Support prototyping and proof-of-concept development of data solutions and tools for external clients.
- Provide general operational assistance to fellow data teams.
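To make the orchestration bullet above concrete, here is a minimal sketch of what an Airflow pipeline definition looks like, assuming Airflow 2.x; the DAG ID, task names, and function bodies are hypothetical placeholders, not an AECOM pipeline:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system.
    print("extracting")


def load():
    # Placeholder: write transformed records to the warehouse.
    print("loading")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # load runs only after extract succeeds
```

In this role you would typically extend and review pipelines like this alongside a senior engineer rather than owning them end to end.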
Qualifications
Minimum requirements:
- Education: Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field (or equivalent experience).
- Basic proficiency in at least one programming language (e.g., Python, Java, JavaScript, Scala, or C#), with a keen desire to continue developing.
- Good working knowledge of SQL for querying and managing relational databases.
- Good understanding of core data principles and methodologies: relational and non-relational data models, dimensional modelling (star/snowflake schemas), ETL vs. ELT, data warehouses vs. data lakes, etc. (a short worked example follows this list).
- Basic familiarity with API concepts and REST principles.
- Familiarity with version control tools such as Git, and a basic understanding of CI/CD practices.
- Some practical experience supporting analytics, data manipulation, or systems development, gained through internships, academic work, or early professional roles.
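To illustrate the SQL and dimensional-modelling expectations above, here is a minimal, self-contained sketch using Python's built-in sqlite3 module; the star-schema tables (dim_date, fact_sales) and their columns are hypothetical examples, not a real AECOM schema:

```python
import sqlite3

# Hypothetical star schema: one fact table keyed to a date dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date(date_key), amount REAL);
    INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
    INSERT INTO fact_sales VALUES (20240101, 100.0), (20240101, 50.0), (20240201, 75.0);
""")

# Typical analytical query: join the fact table to its dimension, then aggregate.
for year, month, total in conn.execute("""
    SELECT d.year, d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year, d.month
"""):
    print(year, month, total)
```

The pattern to recognise is the star schema: a central fact table joined to one or more dimension tables, then aggregated.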
Preferred qualifications:
- Hands-on experience with at least one major cloud platform (AWS, Azure, or Google Cloud Platform).
- Familiarity with cloud-native data services (e.g., Databricks, Synapse Analytics, Azure Data Factory, Google BigQuery).
- Some experience with data processing frameworks such as Apache Spark or Kafka (a minimal Spark sketch follows below).
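For context on the preferred Spark experience, a minimal PySpark sketch (assuming pyspark is installed locally); the data and column names are illustrative only:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example").getOrCreate()

# Toy DataFrame standing in for records read from a lake or warehouse.
df = spark.createDataFrame(
    [("alice", 3), ("bob", 5), ("alice", 2)],
    ["user", "events"],
)

# A typical transformation: aggregate events per user.
df.groupBy("user").agg(F.sum("events").alias("total_events")).show()

spark.stop()
```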