Location: London, UK
Contract: Permanent, full-time; hybrid/remote options may apply
Visa/Sponsorship: Candidates must have the right to work in the UK
Job Purpose
Design, develop, and maintain scalable data pipelines and architectures to support analytics, reporting, and business intelligence initiatives. Ensure high-quality, reliable, and secure data flow across systems to enable data-driven decision-making.
Key Responsibilities
- Build and maintain efficient, reliable, and scalable data pipelines for structured and unstructured data.
- Design, implement, and optimise data models, ETL processes, and workflows.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and security across multiple systems.
- Develop automated data validation and monitoring processes.
- Troubleshoot, debug, and resolve data-related issues in a timely manner.
- Document data processes, workflows, and system architecture.
- Stay up to date with emerging data technologies and best practices.
Requirements
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field.
- 3+ years of experience in data engineering, data warehousing, or analytics engineering.
- Proficiency in SQL and experience with relational and non-relational databases.
- Experience with ETL tools and data pipeline frameworks (e.g., Apache Airflow, Talend, dbt).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and big data technologies (e.g., Spark, Hadoop).
- Strong programming skills in Python, Java, or Scala.
- Understanding of data governance, security, and privacy standards.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.