Geospatial Engineer

Role info
Consultant
Full Time
UK
Competitive

Introduction

Joining nxzen’s rapidly growing geospatial capability means becoming part of a team redefining how organisations use location intelligence across critical infrastructure sectors. As a Geospatial Data Engineer, you will play a key role in building reliable spatial data pipelines, automated workflows, APIs, and transformation processes that underpin our next generation of geospatial solutions.

Reporting to the UK GIS Lead, you will support day-to-day delivery across data-engineering workstreams, contribute to solution implementation, and help ensure the smooth operation of our geospatial data workflows. This role combines hands-on engineering with opportunities to influence technical approaches, helping to improve the reliability, automation, and performance of our pipelines.

You will build ETL processes, transform and validate geospatial datasets, support API and service implementations, contribute to CI/CD workflows, and enhance observability across our geospatial data operations.

The role will be responsible for:

• Developing and maintaining end-to-end spatial ETL pipelines in collaboration with consultants, data engineers, architects, and platform teams.

• Implementing geospatial data services and APIs that reliably expose processed datasets for analytical, operational, and mapping use cases.

• Optimising query performance, indexing strategies, and storage patterns across spatial databases and cloud-hosted datasets.

• Contributing to improvements in CI/CD processes that automate testing, validation, and deployment of geospatial data workflows.

• Enhancing observability through structured logging, data-quality checks, lineage tracking, and pipeline health indicators.


The role

Key Responsibilities

Data Pipeline Delivery:

• Build and maintain geospatial ETL pipelines using Python, SQL, FME, ArcGIS tools, or similar technologies.

• Prepare, transform, validate, and load spatial datasets from multiple structured and unstructured sources.

• Implement workflow scheduling, automation, and repeatable processing patterns.

• Diagnose issues, contribute to root-cause analysis, and implement stabilisation fixes.

• Support performance optimisation including indexing, partitioning, caching, and schema refinement.

Data Services & API Implementation:

• Implement geospatial APIs and services to publish processed datasets.

• Configure secure access patterns, schema-aware endpoints, and version-controlled data outputs.

• Support integrations with upstream and downstream systems.

Solution Implementation:

• Contribute to geospatial data‑engineering components of larger solutions.

• Configure data storage, ETL environments, and processing layers.

• Assist with deployment scripts, infrastructure configurations, and environment setup.

Data & Analysis:

• Support data preparation, transformation, and migration activities using tools like ArcGIS Pro, FME, Python, or ModelBuilder.

• Develop repeatable processes for data quality, validation, and structured workflows.

Observability, Quality & Reliability:

• Add structured logging, validation checks, and lineage tracking into pipelines.

• Contribute to dashboards monitoring pipeline health, reliability, and data‑quality metrics.

• Apply testing and engineering discipline to improve predictability and reduce defects.

Documentation & Governance:

• Produce clear documentation describing pipeline behaviour, data flows, dependencies, and operational expectations.

• Maintain structured coding practices, version-control discipline, and consistent naming/metadata conventions.

Practice Support & Collaboration:

• Work closely with peers and senior engineers to adopt best-practice engineering patterns.

• Participate in peer reviews and knowledge-sharing sessions.

• Provide input into continuous improvement of internal engineering methods and processes.


Requirements

Essential Skills & Experience:

• Competent with Python, SQL, PostgreSQL, and PostGIS to develop, optimise, and operate reliable geospatial data workflows.

• Strong hands-on experience with geospatial data processing, ETL workflows, and spatial transformations.

• Practical expertise with geospatial data formats (e.g., Shapefile, GPKG, GeoJSON, FGDB, GeoTIFF, GeoParquet), coordinate systems, and spatial data standards.

• Experience with cloud platforms (Azure, AWS, or GCP) for data engineering workloads.

• Experience using ESRI tools, FME, GDAL/OGR, or equivalent technologies.

• Understanding of spatial indexing, query optimisation, schema design, and data-model evolution.

• Experience building repeatable workflows for data validation, quality checks, logging, and structured processing.

• Ability to communicate technical details clearly and collaborate effectively across multidisciplinary teams.

• Strong problem‑solving skills and a methodical approach to debugging and optimisation.

Desirable (but not essential) Skills & Experience:

• Experience in utilities, infrastructure, or other asset‑intensive sectors.

• Experience with ArcGIS Utility Network is highly desirable.

• Familiarity with CI/CD pipelines, DevOps practices, and infrastructure-as-code.

• Exposure to APIs, microservices, or event-driven data architectures.

• Knowledge of ArcGIS Enterprise, ArcGIS Online, or other geospatial platforms.

• Scripting or workflow automation experience beyond core data engineering.

• Exposure to open-source geospatial tools and libraries.