Lead Data Engineer
Job Details
Published: 27-Oct-2025
Salary: £170,000.00 - £200,000.00 Annual
Location: Oxford
Category: Permanent
Sector: Data
Reference: 4782
Work Model: Hybrid
Description
About the Organisation
Our client is a global research and technology organisation dedicated to addressing humanity’s most pressing challenges. By combining advances in data science, artificial intelligence, and cloud computing, they are developing transformative solutions in areas such as health, sustainability, and climate resilience.
The organisation brings together engineers, scientists, and technologists to turn research breakthroughs into practical, high-impact outcomes for global benefit.
The Role
We are seeking a Lead Data Engineer to take ownership of designing and building the next generation of a large-scale, cloud-based data platform supporting cutting-edge science and analytics.
As a Lead, you will be a hands-on technical expert — someone who thrives on building complex systems from the ground up, shaping architecture through implementation, and setting high standards for engineering quality. You will collaborate closely with data architects, software engineers, and scientists to deliver performant, reliable, and secure data solutions that enable advanced analytics and AI.
This is a role for someone who loves solving deep technical challenges and wants to have a tangible impact on systems that drive global research and innovation.
Key Responsibilities
- Take ownership of designing and implementing scalable, high-performance data pipelines and infrastructure.
- Build and optimise data ingestion, transformation, and storage systems across large, heterogeneous datasets.
- Develop and deploy solutions in modern cloud environments using infrastructure-as-code and containerisation.
- Shape technical direction by driving hands-on architectural decisions and proving out new technologies through prototypes and proofs of concept.
- Ensure data reliability, integrity, and security across the platform lifecycle.
- Collaborate closely with architects, scientists, and analytics teams to understand data needs and deliver production-grade solutions.
- Contribute to defining engineering standards, tools, and best practices.
Essential Skills & Experience
- Deep experience building complex data platforms or large-scale data pipelines using cloud technologies (AWS, Azure, GCP, or similar).
- Advanced proficiency in Python and SQL, including optimisation for performance and scalability.
- Expertise with workflow orchestration (Airflow, Prefect, or similar).
- Strong knowledge of data modelling, ETL/ELT, and distributed data storage systems.
- Experience with data lakes, lakehouses, or data mesh architectures (e.g., Apache Iceberg, Delta Lake).
- Understanding of streaming and event-driven systems (Kafka, Kinesis, etc.).
- Familiarity with containerisation (Docker, Kubernetes) and CI/CD workflows.
- Strong problem-solving skills with a focus on elegant, maintainable, and performant solutions.
Desirable
- Experience with scientific or healthcare data, or other complex, high-dimensional data domains.
- Understanding of data security, governance, and compliance in regulated environments.
- Interest in exploring and adopting emerging data technologies and architectures.
Key Attributes
- Highly technical and hands-on — enjoys solving deep engineering problems.
- Self-starter who takes ownership from concept to implementation.
- Collaborative and curious, thrives in multidisciplinary environments.
- Pragmatic problem-solver with strong attention to detail and a focus on quality.
- Motivated by building systems that have tangible global impact.