
Integrator Eng

London
1 week ago

Senior Integrator Eng

London - Hybrid (2 days/week, usually Tues & Fri)

12-month contract

Day rate: £800 (umbrella)

About the Role

Join a critical data remediation programme within the Surveillance Team/Lab, part of the Markets Platform area of a major financial services business.

You will lead the design and delivery of stable, scalable, and performant data solutions within a complex GCP architecture, driving simplification and improving pipeline automation to enable the timely, secure delivery of data to surveillance systems.

Key Responsibilities

Engineer stable, scalable, performant, accessible, testable, and secure data products aligned with endorsed technologies and best practices
Lead the design, build, and optimisation of GCP-based data pipelines for critical surveillance and remediation projects
Apply common build patterns to minimise technical debt and adhere to group policies and frameworks for build and release
Participate actively in design and implementation reviews, Agile ceremonies, and planning to set clear goals and priorities
Identify opportunities to automate repetitive tasks and promote reuse of components
Engage with technical communities to share knowledge and advance shared capabilities
Lead incident root-cause analysis and promote active application custodianship to drive continuous improvements
Invest in your ongoing development of technical and Agile skills
Collaborate closely with architecture, security, and compliance teams to ensure solutions meet regulatory requirements
Manage and enhance CI/CD automated pipelines supporting data engineering workflows
Support potential transition planning to vendor surveillance products
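To make the pipeline responsibilities above concrete, here is a minimal, hypothetical sketch of the kind of "stable, testable" transform step the role describes. Everything in it (the record shape, field names, and validation rules) is invented for illustration; in practice logic like this would typically run inside a GCP Dataflow (Apache Beam) job reading from Pub/Sub and writing to BigQuery, which the advert names as core components.

```python
import json
from datetime import datetime, timezone

# Hypothetical record shape for illustration only -- the real
# surveillance schema is not described in the advert.
REQUIRED_FIELDS = {"trade_id", "timestamp", "instrument", "quantity"}

def transform(raw: str):
    """Parse and normalise one raw event.

    Returns (record, None) on success or (None, reason) on failure,
    so the caller can route bad events to a dead-letter sink instead
    of crashing the pipeline -- one common way to keep a streaming
    job "stable" in the sense the advert uses.
    """
    try:
        record = json.loads(raw)
    except json.JSONDecodeError as exc:
        return None, f"bad json: {exc}"

    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        return None, f"missing fields: {sorted(missing)}"

    # Normalise the epoch timestamp to UTC ISO-8601 for downstream queries.
    ts = datetime.fromtimestamp(record["timestamp"], tz=timezone.utc)
    record["timestamp"] = ts.isoformat()
    return record, None

good, err = transform(
    '{"trade_id": "T1", "timestamp": 0, "instrument": "GBPUSD", "quantity": 100}'
)
bad, reason = transform('{"trade_id": "T2"}')
```

Because the function is pure and returns failures as data rather than raising, it is straightforward to unit-test and to reuse across batch and streaming pipelines, which is the kind of component reuse the responsibilities list calls for.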

Core Skills & Experience

Proven hands-on experience with Google Cloud Platform (GCP) and components such as Dataflow, Pub/Sub, Cloud Storage
Deep expertise in BigQuery and Big Data processing
Strong programming skills in Python or Java used extensively in surveillance data integration
Solid SQL expertise for querying, transforming, and troubleshooting data
Experience building robust ETL pipelines from diverse source systems
Familiarity with CI/CD pipeline automation and enhancement
Understanding of Agile delivery frameworks, including Jira and sprint planning
Knowledge of Terraform for infrastructure as code provisioning
Experience with containerization technologies to deploy and manage environments
Ability to simplify complex architectures and communicate effectively across teams
Strong stakeholder management and influencing skills
Experience in financial services or surveillance system data engineering
Understanding of data governance and data quality best practices
Exposure to vendor-based surveillance solutions
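The data governance and data quality expectations in the list above might look something like the following in practice. This is a generic standard-library sketch with made-up column names, checks, and thresholds, not the team's actual controls: a small gate that runs named checks over a batch of rows and reports which ones failed.

```python
from typing import Callable

Rows = list[dict]

def completeness(rows: Rows, column: str) -> bool:
    """Every row must have a non-null value in `column`."""
    return all(r.get(column) is not None for r in rows)

def uniqueness(rows: Rows, column: str) -> bool:
    """No duplicate values in `column`."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def run_checks(rows: Rows,
               checks: list[tuple[str, Callable[[Rows], bool]]]) -> list[str]:
    """Return the names of failed checks (empty list = batch passes)."""
    return [name for name, check in checks if not check(rows)]

# Invented sample batch: the second row fails completeness on "venue".
rows = [
    {"trade_id": "T1", "venue": "LSE"},
    {"trade_id": "T2", "venue": None},
]
failures = run_checks(rows, [
    ("trade_id_unique", lambda r: uniqueness(r, "trade_id")),
    ("venue_complete", lambda r: completeness(r, "venue")),
])
```

Wiring a gate like this into a CI/CD pipeline (so a batch with failures never reaches the surveillance systems) is one plain way to connect the quality, automation, and CI/CD points in the skills list.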

Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive. If you require reasonable adjustments at any stage, please let us know and we will be happy to support you.

