Senior DataOps Engineer

Leeds
2 weeks ago

At Peregrine, we’re constantly seeking Specialist Talent who offer the ideal mix of skills, experience, and attitude to place with our vast array of clients. From Project Change Professionals in large government organisations to Software Developers in the private sector – we are always in search of the best talent to place.

How Specialist Talent Works

At Peregrine, we find the best talent for our clients. Unlike traditional contracting, where you are hired by the client, you remain a permanent employee of Peregrine, with access to all our standard benefits:

A Permanent Position

Life Assurance

5% annual bonus

Pension Scheme – Employer matched to 5%

Voluntary Benefits – Health Cash Plan, Dental, Will Writing, etc.

Annual Leave – 23 days rising to 27 with length of service

Sick Pay – Increasing with length of service

The Role: DataOps Engineer

Job Description

We are seeking a Senior DataOps Engineer to serve as our first DataOps specialist in a growing team of Data Engineers and DevOps professionals. In this pivotal role, you will focus on operationalising and automating our data lifecycle to ensure that data workflows perform with reliability and efficiency. You will integrate CI/CD data pipelines, streamline deployment processes, enforce robust data governance, and optimise operational costs within our Microsoft Azure environment. Your work will be centred on proactive system monitoring, error resolution, and continuous improvement, while mentoring and guiding colleagues.

Role Requirements:

The role will:

Oversee and automate the operational processes that support data workflows developed by the Data Engineering team while ensuring seamless coordination with the DevOps group.

Spearhead the development, integration, and maintenance of CI/CD data pipelines for automated deployments.

Integrate best practices for monitoring and observability to proactively detect, analyse, and resolve issues.

Enforce robust data governance and security protocols through tools like Azure Key Vault, ensuring compliance with GDPR and other regulatory frameworks.

Collaborate closely with Data Engineering, Data Science, Analytics, and DevOps teams to align operational strategies with technical and business requirements.

Optimise operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage.

Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous process improvements.

Evaluate and adopt emerging technologies and methodologies to further enhance operational efficiency and automation within our data ecosystem.

Minimum Criteria

To work confidently in this role, you will have:

Demonstrable experience in DataOps, Data Engineering, DevOps, or related roles focused on managing data operations in complex, data-centric environments.

Proven experience working with agile teams and driving automation of data workflows within the Microsoft Azure ecosystem.

Hands-on expertise with Azure Data Platform components including Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview and APIM.

Proficiency in developing CI/CD data pipelines and strong programming skills in Python, SQL, Bash, and PySpark for automation.

Strong aptitude for data pipeline monitoring and an understanding of data security practices such as RBAC and encryption.

Experience implementing data and pipeline observability dashboards, ensuring high data quality, and improving the efficiency of data workflows.

Experience ensuring compliance with regulatory frameworks and implementing robust data governance measures.

Demonstrated ability to implement Infrastructure as Code using Terraform to provision and manage data pipelines and associated resources.

Essential Criteria

In addition to your technical skills, you will also:

Enjoy sharing knowledge and training / upskilling others.

Have excellent communication and collaboration skills, with both technical and business colleagues.

Be motivated to learn, upskill and improve yourself in both technical and relevant complementary soft skills.

Desirable Criteria

It would be beneficial to already have:

Proven experience in optimising and scaling high-volume data operations within enterprise Azure environments.

Familiarity with Azure services that enhance data operational efficiency and support complex analytics.

Exposure to Spark-based processing and advanced analytics techniques to further empower data-driven decision-making.

A track record of successfully mentoring teams and implementing strategic improvements in data operations.

Demonstrated expertise in cost optimisation and performance tuning within an Azure-based data infrastructure.

Relevant certifications such as Azure Data Engineer Associate, Azure DevOps, Databricks Data Engineer Professional, or equivalent credentials.

Ability to build and manage DataOps processes in a newly formed team that is expected to continue expanding over the coming months and years.

If you are an experienced DataOps Engineer with the skills and experience to hit the ground running, please apply to find out more about this exciting opportunity.

About us: 

At Peregrine, we see beyond the immediate and look to the horizon. We build lasting, meaningful partnerships with our clients, and deliver flexible solutions for every resourcing need, both now and in the future. Together, we help our clients to engage, develop and harness the skills they need to achieve and grow the workforce they want. 

Our culture: 

At Peregrine we embrace fresh ideas, and we love learning fast. Our solutions are trusted and established, so we have the confidence of knowing we have a solid foundation. We rely on openness and honesty, and we’re always ready to help each other out. And we believe that our work can benefit society – whether it’s finding the digital talent of the future or being a driver for social mobility. 

Our commitment to diversity:  

At Peregrine, we’re proudly committed to championing diversity and inclusion, with company-wide initiatives to drive greater social mobility and reduce our environmental impact. Our teams represent a huge breadth of cultures, languages, and ethnicities, and over 20 different nationalities. We also employ candidates from a range of educational and socioeconomic backgrounds. Our partnerships with numerous charities ensure that we stay well-informed and continue to improve our practices for the future. This is reflected in the way we recruit for our clients as we assist them in becoming more diverse.

