Senior Full Stack Engineer – Data Platform

Hirehangar
Published: March 24, 2026
Location: Ukraine, United Kingdom
Job Description

Join Hire Hangar and work with fast-growing global companies while building a long-term, remote career.

Job Title: Senior Full Stack Engineer – Data Platform

Location: Remote

Time Zone: Flexible / Aligned to HQ Time Zone

Role Overview

We are building the data backbone for the next generation of AI-powered automation — and we need a Senior Full Stack Engineer who can operate across the full stack: from data ingestion pipelines to the polished frontend dashboards our enterprise customers rely on.

This role is ideal for an engineer who thrives where data engineering meets product engineering. You will own the interfaces and services that make complex data meaningful and actionable — and you will work alongside automation engineers to ensure that RPA agents have the structured, reliable data they need to function intelligently.

Key Responsibilities

  • Build and maintain frontend experiences for our data platform, including dashboards, data explorers, and workflow builders

  • Design and implement RPA agent integrations that interact with structured and unstructured data sources

  • Develop APIs and services that expose platform data to both internal consumers and customer-facing products

  • Contribute to the semantic layer that contextualises data for AI agents — including tagging, categorisation, and metadata enrichment

  • Collaborate with data engineers to ensure frontend and API layers align with underlying data models

  • Drive performance and scalability improvements across frontend and backend components

Required Qualifications

  • Senior-level full stack experience with strong frontend depth in React and TypeScript

  • Experience building data-platform or BI-adjacent frontends, including dashboards, query interfaces, and real-time data views

  • Familiarity with RPA tooling and automation workflow design

  • Understanding of data modelling, ETL patterns, and API-first architecture

  • Experience with cloud data services such as Snowflake, BigQuery, or Redshift

  • Strong attention to UX detail and the ability to translate complex data into clear, usable interfaces

  • Must have prior remote work experience and proven ability to work independently within a distributed team

Preferred Qualifications

  • Experience with semantic metadata layers or data cataloguing tools

  • Background in enterprise SaaS product development

  • Knowledge of streaming data technologies such as Kafka or Kinesis

  • Exposure to dbt, Airflow, or similar data pipeline tooling

Tools & Technology

  • React / TypeScript

  • Python / Node.js

  • Cloud data warehouses (Snowflake, BigQuery, or similar)

  • RPA platforms (UiPath, Power Automate, or similar)

  • REST and event-driven APIs

  • Slack, Zoom, Google Workspace

Please Note

It is crucial that you complete the application form in full. As part of the application process, you will be required to complete a short technical assessment. If your application is successful, you will receive an email confirming next steps. Applications that are not completed in full will not be considered for any open roles.

We connect top talent with vetted employers, competitive pay, and real growth opportunities.

Key Skills
TypeScript, Airflow, dbt, Snowflake, Redshift, BigQuery, UX, Kafka, Kinesis, Data Pipeline, React, Node.js, Python, ETL, REST
