Description:
Role: Data Engineer
Location: UK (Remote)
Salary: Up to £55k + benefits
Start date: ASAP
Key Responsibilities:
- Apply best practices in data architecture using methodologies such as Kimball, Data Vault, and Data Mesh in an enterprise data warehouse environment.
- Design and develop robust dimensional data models, including facts, dimensions, and hierarchies to support analytical reporting.
- Build and maintain scalable cloud-based data warehouse architecture to ensure seamless data ingestion, transformation, and delivery to connected systems.
- Develop and support integrations between cloud and on-premises systems, ensuring reliable data flow across the ecosystem.
- Architect a future-proof platform foundation that balances performance, cost-efficiency, and resilience.
- Define and implement end-to-end data lifecycle management processes — from ingestion and storage to archival and deletion.
- Collaborate closely with Product Managers, Business Analysts, and engineering teams to translate business needs into technical solutions.
- Design and implement efficient ETL/ELT pipelines, selecting optimal data formats and storage mechanisms based on use case.
- Monitor system performance and proactively resolve issues to maintain established SLAs.
- Recommend scalable solutions to reduce technical debt and streamline long-term maintenance.
- Establish and execute robust testing strategies to validate data quality and integrity across integrated sources.
- Maintain detailed documentation including configurations, test cases, reporting specs, and data integration processes.
- Champion continuous process improvement initiatives within the data engineering team.
- Foster innovation through a collaborative and open-minded problem-solving approach.
- Provide guidance and mentorship to peers and new adopters of the platform.
Required Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related technical field.
- 4+ years of hands-on experience with advanced SQL across databases such as Oracle, PostgreSQL, and MSSQL.
- 3+ years of scripting experience with Python and/or PowerShell.
- Proficient in building integrations using APIs (SOAP, REST) with at least 3 years of practical experience.
- 2+ years of experience working with dbt (data build tool).
- Demonstrated expertise in data modeling with at least 3 years of experience designing complex data models.
- Working knowledge of the Snowflake data platform.
- Familiarity with Agile development methodologies and practices.
- Proficient with version control and CI/CD tooling such as Git and Jenkins.
- Experience with automated testing, including unit and integration tests.
- Solid understanding of cloud computing concepts and services.
- Self-starter with strong analytical thinking and problem-solving skills.
- Consultative mindset with the ability to assess business needs and recommend effective data solutions.
- Ability to clearly articulate technical design choices and their trade-offs.
- Strong communication, leadership, and teamwork skills.
- Proven ability to take initiative and drive projects to completion.