Data Engineer

Birchgrove
Cobham
3 days ago

Job Title: Data Engineer
Location: Hybrid or remote
Term: 3-month fixed-term contract
Remuneration: £12,500 for 3 months

About Birchgrove

Birchgrove is the only build-to-rent operator in the UK exclusively for older adults. Our mission is to enrich the lives of our neighbours and add healthy years to their lives. We operate neighbourhoods rather than care homes, placing independence, dignity and community at the heart of what we do. We're a forward-thinking organisation using data to improve neighbour wellbeing, operational performance and long-term decision-making.

The Opportunity

We're looking for an experienced Data Engineer to join Birchgrove on a 3-month contract to deliver several clearly defined, high-impact data integration projects.

This is a hands-on, delivery-focused role. You'll design, build and document reliable, production-grade ETL/ELT pipelines that integrate operational systems into our cloud data warehouse, enabling improved reporting and analytics across the business. You'll be joining at an exciting stage in our data journey, helping us move from early foundations to a more connected, scalable and dependable data platform.

Key Project Deliverables

During the contract, you will deliver the following priority projects:

1) Fall detection system integration
• Ingest data from a fall detection platform using APIs and webhooks
• Land and model the data in Snowflake
• Implement reliability best practices: monitoring, alerting, logging, retries, and clear documentation

2) Resident management system integration
• Extract and ingest data from our resident management system
• Design robust data models to support reporting on neighbour wellbeing and operations
• Ensure maintainable transformations and clear data definitions

3) Facilities management systems integration
• Design and build an API-based integration between two facilities management systems
• Enable joined-up reporting across maintenance, safety and operational data
• Deliver clean, consistent datasets suitable for analytics and dashboards

4) Marketing automation platform integration
• Ingest data from our marketing platform using APIs
• Land and model the data in Snowflake

These projects will directly support improved insight, faster decision-making and better outcomes for our neighbours and team.

Tools & Technology Stack

You'll work with and help establish best practice around the following tools:
• Snowflake (cloud data warehouse)
• Fivetran (managed ingestion)
• Airbyte (custom & API-based integrations)
• dbt (transformations, testing and documentation)
• Power BI (analytics and dashboards)

We're particularly keen to speak with candidates who are highly confident with:
• API-driven pipeline design (authentication, pagination, rate limiting, incremental loads)
• Webhook ingestion patterns and event-driven data capture
• Building reliable, well-monitored pipelines with clear documentation and ownership

About You

• Proven experience as a Data Engineer, delivering pipelines end-to-end in modern cloud stacks
• Strong hands-on skills with APIs, webhooks, and pipeline-based ETL/ELT
• Confident using Python for data integration and automation
• Comfortable implementing practical reliability patterns (e.g., idempotency, retries, monitoring, alerting)
• Strong data modelling and transformation experience (ideally with dbt)
• Able to work independently, but collaborate closely with non-technical stakeholders
• Motivated by purpose-driven work and using data to improve real lives

How to Apply

If you're an experienced Data Engineer looking for a short-term contract where you can deliver meaningful work with real-world impact, we'd love to hear from you.

REF-
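To illustrate the kind of pipeline work the role describes, here is a minimal sketch of cursor-based pagination with retries and exponential backoff. The `{"items": ..., "next_cursor": ...}` response shape and the `fetch_page` callable are hypothetical, not any specific vendor's API; real integrations would also handle authentication and rate-limit headers.

```python
import time
from typing import Callable, Iterator, Optional

def fetch_all(fetch_page: Callable[[Optional[str]], dict],
              max_retries: int = 3,
              backoff_s: float = 0.01) -> Iterator[dict]:
    """Walk a cursor-paginated API, retrying transient failures.

    Assumes a hypothetical response shape:
    {"items": [...], "next_cursor": str | None}
    """
    cursor = None
    while True:
        for attempt in range(max_retries):
            try:
                page = fetch_page(cursor)
                break  # success: stop retrying this page
            except ConnectionError:
                if attempt == max_retries - 1:
                    raise  # exhausted retries: surface the failure
                time.sleep(backoff_s * 2 ** attempt)  # exponential backoff
        yield from page["items"]
        cursor = page.get("next_cursor")
        if cursor is None:
            return  # no more pages
```

Yielding items page by page keeps memory flat for incremental loads, and bounding the retries means a persistent outage fails loudly rather than looping forever.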
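The webhook and idempotency requirements above can also be sketched briefly. Webhook providers typically redeliver events on timeout, so landing logic must tolerate duplicates; deduplicating on a unique event id is one common pattern. The `"id"` field name is an assumption for illustration; real payloads vary by vendor, and a production pipeline would persist the seen-id set (e.g. in the warehouse) rather than hold it in memory.

```python
from typing import Iterable, List, Set

def dedupe_events(events: Iterable[dict], seen: Set[str]) -> List[dict]:
    """Drop webhook deliveries that have already been processed.

    Keying on a unique event id makes the landing step idempotent:
    replaying the same delivery a second time lands nothing new.
    """
    fresh = []
    for event in events:
        if event["id"] in seen:
            continue  # duplicate delivery: already landed
        seen.add(event["id"])
        fresh.append(event)
    return fresh
```

With this in place, a retried webhook batch can be passed through the same code path safely, which is the practical meaning of "idempotency" in the requirements list.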


Industry Insights


How Many Data Engineering Tools Do You Need to Know to Get a Data Engineering Job?

If you’re aiming for a career in data engineering, it can feel like you’re staring at a never-ending list of tools and technologies — SQL, Python, Spark, Kafka, Airflow, dbt, Snowflake, Redshift, Terraform, Kubernetes, and the list goes on. Scroll job boards and LinkedIn, and it’s easy to conclude that unless you have experience with every modern tool in the data stack, you won’t even get a callback.

Here’s the honest truth most data engineering hiring managers will quietly agree with: 👉 They don’t hire you because you know every tool — they hire you because you can solve real data problems with the tools you know. Tools matter, but only in service of outcomes. Jobs are won by candidates who know why a technology is used, when to use it, and how to explain their decisions.

So how many data engineering tools do you actually need to know to get a job? For most job seekers, the answer is far fewer than you think — but you do need them in the right combination and order. This article breaks down what employers really expect, which tools are core, which are role-specific, and how to focus your learning so you look capable and employable rather than overwhelmed.

What Hiring Managers Look for First in Data Engineering Job Applications (UK Guide)

If you’re applying for data engineering jobs in the UK, the first thing to understand is this: Hiring managers don’t read every word of your CV. They scan it. They look for signals of relevance, credibility, delivery and collaboration — and if they don’t see the right signals quickly, your application may never get a second look. In data engineering, hiring managers are especially focused on whether you can build and operate reliable, scalable data systems, handle real-world data challenges and work effectively with analytics, BI, data science and engineering teams. This guide breaks down exactly what they look at first in your application — and how to shape your CV, portfolio and cover letter so you stand out.

The Skills Gap in Data Engineering Jobs: What Universities Aren’t Teaching

Data engineering has quietly become one of the most critical roles in the modern technology stack. While data science and AI often receive the spotlight, data engineers are the professionals who design, build and maintain the systems that make data usable at scale. Across the UK, demand for data engineers continues to rise. Organisations in finance, retail, healthcare, government, media and technology all report difficulty hiring candidates with the right skills. Salaries remain strong, and experienced professionals are in short supply. Yet despite this demand, many graduates with degrees in computer science, data science or related disciplines struggle to secure data engineering roles. The reason is not academic ability. It is a persistent skills gap between university education and real-world data engineering work. This article explores that gap in depth: what universities teach well, what they consistently miss, why the gap exists, what employers actually want, and how jobseekers can bridge the divide to build successful careers in data engineering.