Data Engineer

  • SW Group
  • Southampton, United Kingdom
  • Full-Time
  • £65,000 - £74,000
  • Published: May 6, 2026
Data Engineer: our view in three lines...

  • The Role: Build and maintain Microsoft Fabric data pipelines and warehouse structures to provide reliable data for reporting and analytics across the organisation.
  • The Person: A data engineer able to design, implement, and operate end-to-end ETL/ELT pipelines and Lakehouse/Warehouse structures in Microsoft Fabric, with a focus on data quality, master data management, and support for Power BI semantic models.
  • Requirements: Experience with Microsoft Fabric, Data Factory, advanced SQL, Python/PySpark, ETL/ELT patterns, dimensional modelling, CI/CD and Power BI integration is required.

Company Description

At S&W, we help our clients thrive by simplifying the complex, illuminating new paths, and shaping solutions that make a difference. As one of the UK’s top 10 fastest-growing accountancy firms, we have been a trusted partner since 1881—helping businesses and individuals meet challenges and seize opportunities across generations.

Built on expertise and driven by ambition, we provide a comprehensive range of services, including tax and accountancy, advisory and assurance, corporate finance, and restructuring. We are defined by our purpose—to help navigate challenges, unlock potential, and achieve the extraordinary.

Job Description

The Data Engineer designs, builds, and maintains scalable data solutions that bring together data from across the organisation. They ensure data is well structured, reliable, and easy to use for reporting, analytics, and decision‑making. Working as part of the data platform team, they help create a high‑quality, future‑ready data environment that supports business needs now and as the organisation grows.

Primary Responsibilities

Design, Build, and Manage End‑to‑End Data Pipelines

  • Architect, develop, and maintain scalable ETL/ELT pipelines to ingest, process, and transform data from multiple internal and external sources.
  • Optimise data workflows for performance, reliability, and cost‑efficiency, ensuring alignment with enterprise data architecture standards.
  • Implement robust data quality, validation, and monitoring mechanisms to ensure accuracy and consistency throughout the pipeline lifecycle.
  • Collaborate with solution architects, analysts, and platform engineers to ensure pipeline solutions integrate effectively with wider systems and data products.
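
As an illustration only (not part of the role description), the extract-transform-load shape these responsibilities describe can be sketched in plain Python; in practice the work would be done in Fabric Data Factory pipelines or PySpark notebooks, and every function and field name below is hypothetical:

```python
# Minimal ETL sketch: extract -> transform (with a data-quality gate) -> load.
# All names are invented for illustration; this is not the firm's actual stack.

def extract(source_rows):
    """Pull raw records from a source system (stubbed as an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Standardise field names and types, dropping records that fail validation."""
    out = []
    for row in rows:
        if row.get("client_id") is None:
            continue  # basic data-quality gate: reject rows without a key
        out.append({
            "client_id": int(row["client_id"]),
            "name": row.get("name", "").strip().title(),
        })
    return out

def load(rows, target):
    """Append cleaned rows to the target store (stubbed as a list)."""
    target.extend(rows)
    return len(rows)

warehouse = []
raw = [
    {"client_id": "1", "name": "  acme ltd "},
    {"client_id": None, "name": "orphan"},  # rejected by the quality gate
]
loaded = load(transform(extract(raw)), warehouse)
```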

Lakehouse and Warehouse Architecture

  • Design, implement, and maintain Lakehouse and Warehouse structures in Fabric, ensuring optimal layout, partitioning, governance, and performance.
  • Define and enforce data modelling standards across semantic layers—supporting enterprise-wide reporting and analytics.

Master Data Management (MDM)

  • Define, implement, and maintain a master data structure within the Fabric platform to ensure a client can be referenced in a single view, combining data from across multiple applications.
  • Work with the Platform Integration team to ensure the correct client data flows seamlessly from one application to another.
  • Ensure that master datasets are reflected accurately in Fabric semantic models, powering unified reporting across business domains.
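
The "single client view" idea behind these MDM duties can be sketched as a small merge routine. This is a generic illustration with invented field names, not the firm's actual master data design:

```python
# Records for the same client arrive from several applications and are merged
# under one master key, preferring the newest non-empty value for each field.

def build_master_record(records):
    """Merge per-application records into one master view; later (newer)
    non-empty values win over earlier ones."""
    master = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field == "updated":
                continue
            if value not in (None, ""):
                master[field] = value
    return master

# Hypothetical source-system records for the same client:
crm = {"client_id": "C-100", "email": "old@example.com", "updated": 1}
tax_app = {"client_id": "C-100", "email": "new@example.com", "phone": "01234", "updated": 2}
master = build_master_record([crm, tax_app])
```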

Production Support for Fabric Ingestion and Transformation

  • Provide day‑to‑day operational support for Microsoft Fabric data ingestion, transformation processes, and dataflows.
  • Monitor Fabric workloads, proactively identify performance issues, and implement corrective actions to maintain seamless operations.
  • Troubleshoot ingestion failures, pipeline latency, and transformation logic issues, ensuring timely resolution to minimise business disruption.
  • Optimise Fabric configurations, capacity usage, and performance tuning for scalable data operations.

Data Quality, Validation & Issue Resolution

  • Identify and resolve data issues such as inconsistencies, duplication, incomplete datasets, schema mismatches, and integrity anomalies.
  • Partner with data owners and source system teams to address root causes, implement preventive measures, and improve end‑to‑end data health.
  • Establish automated data validation rules, checks, and alerting to ensure continuous monitoring of data quality across critical domains.
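
Automated validation rules of this kind can be sketched as simple checks that return the offending rows so they can be alerted on; the rules and field names below are illustrative assumptions:

```python
# Two example data-quality checks: duplicate keys and incomplete records.
# Each returns the failing rows rather than raising, so callers can alert.

def check_duplicates(rows, key):
    """Return rows whose key value has already been seen."""
    seen, dupes = set(), []
    for row in rows:
        k = row[key]
        if k in seen:
            dupes.append(row)
        seen.add(k)
    return dupes

def check_completeness(rows, required):
    """Return rows missing (or blank in) any required field."""
    return [r for r in rows if any(r.get(f) in (None, "") for f in required)]

rows = [
    {"client_id": 1, "name": "Acme"},
    {"client_id": 1, "name": "Acme"},  # duplicate key
    {"client_id": 2, "name": ""},      # incomplete record
]
dupes = check_duplicates(rows, "client_id")
incomplete = check_completeness(rows, ["client_id", "name"])
```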

Gold Layer Development & Dimensional Modelling

  • Design, develop, and maintain Gold (business‑ready) data layers that provide trusted, consistent, and performant datasets for enterprise reporting and analytics.
  • Apply dimensional data modelling techniques (including star and snowflake schemas) to support clearly defined KPIs, reusable measures, and scalable analytics solutions.
  • Translate business requirements into curated Gold datasets aligned to agreed business definitions, metrics, governance standards, and reporting needs.
  • Collaborate with BI developers, analysts, and business stakeholders to ensure Gold layer datasets effectively support Power BI semantic models and long‑term platform scalability.
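
A star schema of the kind a Gold layer exposes can be sketched with SQLite standing in for a Fabric Warehouse; all table and column names are invented for the example:

```python
# One fact table joined to two dimension tables, plus a KPI-style measure
# (total fees per client for a year). Illustrative schema only.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_client (client_key INTEGER PRIMARY KEY, client_name TEXT);
CREATE TABLE dim_date   (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_fees  (client_key INTEGER, date_key INTEGER, fee_amount REAL);

INSERT INTO dim_client VALUES (1, 'Acme Ltd'), (2, 'Globex');
INSERT INTO dim_date   VALUES (20260101, 2026), (20251231, 2025);
INSERT INTO fact_fees  VALUES (1, 20260101, 500.0),
                              (1, 20251231, 300.0),
                              (2, 20260101, 200.0);
""")

# The KPI query: sum fact measures, filtered and labelled via the dimensions.
rows = con.execute("""
    SELECT c.client_name, SUM(f.fee_amount) AS total_fees
    FROM fact_fees f
    JOIN dim_client c ON c.client_key = f.client_key
    JOIN dim_date d   ON d.date_key   = f.date_key
    WHERE d.year = 2026
    GROUP BY c.client_name
    ORDER BY c.client_name
""").fetchall()
```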

Secondary Responsibilities

Support Business Intelligence Development & Semantic Model Design

  • Collaborate with BI developers to design, build, and maintain enterprise‑level semantic models that support analytics, reporting, and self‑service capabilities.
  • Prepare optimised datasets and curated layers to enable efficient Power BI model performance and governance.
  • Ensure semantic models adhere to data governance standards, definitions, naming conventions, and security requirements.

Documentation & Knowledge Management

  • Create and maintain comprehensive documentation covering data pipelines, metadata, data dictionaries, transformation logic, and architecture diagrams.
  • Ensure documentation is kept current and accessible, supporting knowledge sharing and enabling effective onboarding of new team members.
  • Contribute to developing best‑practice guidelines, reusable patterns, and standard operating procedures for data engineering workflows.

Qualifications

  • Experience with Microsoft Fabric including Lakehouse, Data Engineering, Data Factory, and semantic modelling components.
  • Advanced SQL and proficiency with Python/PySpark.
  • Strong understanding of relational, dimensional, and semantic data modelling principles.
  • Strong understanding of ETL/ELT patterns, optimisation, and automation.
  • Experience with performance tuning and monitoring.
  • Familiarity with CI/CD, version control, and GitHub practices.
  • Knowledge of data quality, lineage, and governance.
  • Experience working in Agile Scrum delivery environments.

Competencies

  • Strong troubleshooting and problem-solving skills
  • Strong communication and collaboration
  • Adaptability to support the evolution and longevity of our data platform
  • Active participation in design conversations
  • Ability to build strong, trusted relationships with business stakeholders
  • Ability to articulate technical information to non-technical colleagues
  • Highly organised and methodical
  • Strong prioritisation and planning skills
  • Strong attention to detail

Additional Information

As a colleague here at S&W you will have access to benefits that include:

  • Competitive salary
  • Private medical insurance
  • Life assurance
  • Pension contribution
  • Hybrid working model (role dependent)
  • Generous holiday package
  • Option to purchase additional holiday
  • Shared parental leave
  • Cycle to work scheme
  • Season ticket loan
  • Eye care support

We are proud to value the differences that a diverse workforce brings, representative of society and our clients. At S&W we have a wide range of highly active employee resource groups and we’re delivering multiple diversity, equity and inclusion initiatives across the organisation. It is our commitment to provide a workplace where all colleagues, regardless of identity, background, or circumstance, feel respected as individuals and feel that they can achieve their full potential and work in a safe, supportive, and inclusive environment.

We are happy to make any reasonable adjustments to accommodate for your needs throughout the application process. Please let your Recruiter know.
