A Field Guide to the GeoRust Ecosystem
The GeoRust ecosystem has quietly become one of Rust’s most coherent and well-maintained domain-specific ecosystems. If you’re working with geospatial data in Rust — whether vector geometries, raster files, coordinate transforms, or spatial indexes — the odds are good that a mature, well-documented crate already exists for your problem. This post is a map of that territory as of mid-2026.
The Core: geo and geo-types
The gravitational centre of the whole ecosystem is the geo crate (~1.8k stars, 280 reverse dependencies). It provides the planar geometry types you’d expect from any serious GIS library:
use geo::{polygon, ConvexHull};

let poly = polygon![
    (x: 0.0, y: 0.0),
    (x: 4.0, y: 0.0),
    (x: 4.0, y: 1.0),
    (x: 1.0, y: 1.0),
    (x: 1.0, y: 4.0),
    (x: 0.0, y: 4.0),
    (x: 0.0, y: 0.0),
];

// Convex hull of the L-shaped polygon above
let hull = poly.convex_hull();
geo is built on top of geo-types, which defines the primitive types: Point, Line, LineString, Polygon, MultiPoint, MultiLineString, MultiPolygon, GeometryCollection, and Rect. These types implement the OpenGIS Simple Feature Access standard, meaning they’re interoperable with JTS (Java), GEOS (C++), and any other SFA-compliant library.
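Constructing the primitives directly is simple; here's a minimal sketch using geo-types (the same types geo re-exports):
use geo_types::{LineString, Point};

// A point and a line string built from plain coordinate tuples
let p = Point::new(152.0, -27.5);
let line: LineString<f64> = vec![(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)].into();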
The algorithm coverage in geo is surprisingly broad:
- Topological predicates (DE-9IM): contains, intersects, touches, crosses, within, equals
- Spatial analysis: convex_hull, triangulation (earcut and Delaunay)
- Boolean set operations: union, difference, intersection, symmetric_difference (via the i_overlay crate)
- Buffer / offset: buffer, buffer_clockwise
- Clustering: DBSCAN and k-means
- Distance: Euclidean, Haversine, geodesic
- Affine transforms: rotate, scale, translate, skew
- CRS projection (via proj bindings)
All of this is pure Rust — no C library dependencies required. For the heavy boolean overlay operations, the i_overlay crate is the underlying engine. The proj feature pulls in PROJ bindings for coordinate reference system work.
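For a taste of the predicate and overlay APIs, here's a minimal sketch using the Contains, Intersects, and BooleanOps traits:
use geo::{polygon, point, BooleanOps, Contains, Intersects};

let a = polygon![(x: 0.0, y: 0.0), (x: 2.0, y: 0.0), (x: 2.0, y: 2.0), (x: 0.0, y: 2.0)];
let b = polygon![(x: 1.0, y: 1.0), (x: 3.0, y: 1.0), (x: 3.0, y: 3.0), (x: 1.0, y: 3.0)];

// DE-9IM predicates are traits implemented on the geometry types
assert!(a.intersects(&b));
assert!(a.contains(&point!(x: 0.5, y: 0.5)));

// Boolean overlay (backed by i_overlay) returns a MultiPolygon
let union = a.union(&b);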
Spatial Indexing: rstar and geo-index
For spatial queries, rstar (528 stars) is the go-to R*-tree implementation in the Rust ecosystem. It’s no_std-compatible and doesn’t allocate at query time:
use rstar::{RTree, AABB};
use rstar::primitives::GeomWithData;

// Assumes locations: Vec<([f64; 2], u64)> — [lon, lat] coordinates paired with an id
let points: Vec<GeomWithData<[f64; 2], u64>> = locations
    .iter()
    .map(|(coords, id)| GeomWithData::new(*coords, *id))
    .collect();
let tree = RTree::bulk_load(points);

let search_box = AABB::from_corners([145.0, -35.0], [146.0, -34.0]);
let results: Vec<_> = tree.locate_in_envelope(&search_box).collect();
geo re-exports rstar behind the rstar feature, and ships a BallTree for point-cloud nearest-neighbour queries specifically — useful for point-in-polygon enrichment or KNN lookups on large point datasets.
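For nearest-neighbour lookups, rstar's own KNN API covers the common case; a small sketch against the tree built above:
// Single nearest neighbour, and the five nearest via the distance-ordered iterator
let nearest = tree.nearest_neighbor(&[145.5, -34.5]);
let five_nearest: Vec<_> = tree
    .nearest_neighbor_iter(&[145.5, -34.5])
    .take(5)
    .collect();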
There’s also geo-index (packed, immutable, zero-copy spatial indexes — Rust ports of the flatbush/kdbush packed Hilbert R-tree layouts), purpose-built for static datasets where you want serialisable, query-only indexes without the bulk-loading overhead of rstar.
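A sketch of the packed-index flow, assuming geo-index's builder API (bounding boxes in, Hilbert-sorted tree out; check the crate docs, as the exact generics and return types may differ across versions):
use geo_index::rtree::{RTreeBuilder, RTreeIndex};
use geo_index::rtree::sort::HilbertSort;

// Build a packed index over three bounding boxes (minx, miny, maxx, maxy)
let mut builder = RTreeBuilder::<f64>::new(3);
builder.add(0.0, 0.0, 2.0, 2.0);
builder.add(1.0, 1.0, 3.0, 3.0);
builder.add(2.0, 2.0, 4.0, 4.0);
let tree = builder.finish::<HilbertSort>();

// Query returns the insertion indexes of the matching boxes
let hits = tree.search(0.5, 0.5, 1.5, 1.5);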
Coordinate Reference Systems: proj and geodesy
Working with geospatial data almost always involves coordinate reference systems. EPSG codes, PROJ strings, WKT — the plumbing is everywhere but often hidden. Two crates handle this:
proj — official PROJ bindings (~173 stars). This is the gold standard CRS library used by GDAL, QGIS, PostGIS, and every serious geospatial tool. The Rust bindings are straightforward:
use proj::Proj;

// WGS84 (EPSG:4326) to UTM zone 55S (EPSG:32755)
let to_utm = Proj::new_known_crs("EPSG:4326", "EPSG:32755", None).unwrap();
let (easting, northing) = to_utm.convert((152.0, -27.5)).unwrap();
geodesy — a pure-Rust geodesy platform (~74k downloads, MIT/Apache 2.0). Think of it as a research-oriented, Rust-native alternative to PROJ: it supports 30+ transformation primitives and 40+ ellipsoidal operations, has no native dependencies, and is useful when you need custom geodetic chains — though it is not a production drop-in for PROJ. It integrates with geo via geo-geodesy and crs-definitions.
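A minimal sketch of geodesy's operator-pipeline style, following the pattern in its README (a Minimal context, an operator built from a text definition, and forward application):
use geodesy::prelude::*;

// Minimal context; define a UTM operator from its text definition
let mut ctx = Minimal::new();
let utm55 = ctx.op("utm zone=55")?;

// Coor2D::geo takes (latitude, longitude) in degrees
let mut data = [Coor2D::geo(-27.5, 152.0)];
ctx.apply(utm55, Fwd, &mut data)?;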
Data Formats: The geozero Abstraction
If geo is the geometry engine, geozero (~453 stars) is the format abstraction layer. It defines traits for zero-copy reading and writing of geospatial data formats — no intermediate type representations required:
use geozero::geojson::GeoJson;
use geozero::{CoordDimensions, ToWkb};

// Convert GeoJSON straight to EWKB (PostGIS's wire format) — the geometry
// streams through geozero's event API with no intermediate geometry type
let geojson = GeoJson(r#"{"type": "Point", "coordinates": [152.0, -27.5]}"#);
let ewkb = geojson.to_ewkb(CoordDimensions::xy(), Some(4326))?;
The format support matrix is broad:
| Format | Read | Write | Notes |
|---|---|---|---|
| GeoJSON | ✅ | ✅ | |
| WKB/WKT | ✅ | ✅ | PostGIS, SQLx, Diesel, GeoPackage |
| Shapefile | ✅ | ❌ | |
| GPX | ✅ | ❌ | |
| FlatGeobuf | ✅ | ✅ | Via flatgeobuf crate |
| GeoArrow | ✅ | ✅ | Via geoarrow crate |
| GeoParquet | ✅ | ✅ | Via geoarrow crate |
| MVT (Mapbox Vector Tiles) | ✅ | ✅ | |
| SVG | ❌ | ✅ | |
This is particularly powerful because you can read from one format and write to another without any intermediate allocations — the data flows through the processing pipeline as zero-copy views.
Vector Format Crates
Built on top of geozero and geo-types, the individual format crates handle specific formats:
- geojson (332 stars) — the web mapping standard. Parses and serialises GeoJSON with full feature / feature collection support.
- wkt / wkb — Well-Known Text and Well-Known Binary, the fundamental interchange formats for database drivers and file formats.
- gpx — GPS exchange format. Reads waypoints, tracks, and routes.
- kml — Keyhole Markup Language (Google Earth).
- topojson — Topological JSON, where shared edges between polygons are stored once.
- tilejson — TileJSON client and server support.
- osm — OpenStreetMap PBF file reading.
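For instance, parsing a GeoJSON string with the geojson crate is a one-liner; a small sketch using its FromStr implementation:
use geojson::{GeoJson, Geometry, Value};

// GeoJson implements FromStr, so .parse() does the heavy lifting
let raw = r#"{ "type": "Point", "coordinates": [152.0, -27.5] }"#;
let parsed: GeoJson = raw.parse()?;
if let GeoJson::Geometry(Geometry { value: Value::Point(coords), .. }) = parsed {
    println!("lon {}, lat {}", coords[0], coords[1]);
}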
Raster I/O: gdal and Friends
For rasters, gdal (437 stars) is the dominant crate — it wraps the full GDAL library (GEOS, PROJ, and all raster/vector format drivers):
use gdal::Dataset;

let ds = Dataset::open("scene.tif")?;
let band = ds.rasterband(1)?;
let (cols, rows) = band.size();

// Read the full band; read_as returns a Buffer<f32>
// (the ndarray feature can convert it to an Array2)
let data = band.read_as::<f32>((0, 0), (cols, rows), (cols, rows), None)?;
This is what powers eorst under the hood — GDAL handles the actual raster I/O and CRS reprojection, while eorst provides the block-based parallel processing layer on top.
For GeoTIFF specifically, geotiff (from the georust org) provides a Rust-native reader for basic GeoTIFF access without the full GDAL dependency. And async-tiff (used in eorst’s S3 path) builds on object_store for direct cloud access.
Cloud-Native Raster Access: async-tiff and zarr
Two crates handle cloud-native raster formats directly:
async-tiff (by Development Seed, ~7k downloads) is an async, read-only TIFF/GeoTIFF reader built for object storage. It decouples I/O from decompression so that network-bound reads and CPU-bound decoding can be pipelined:
use std::env::current_dir;
use std::sync::Arc;

use async_tiff::{TIFF, TiffMetadataReader};
use async_tiff::reader::ObjectReader;
use object_store::local::LocalFileSystem;

let store = Arc::new(LocalFileSystem::new_with_prefix(current_dir()?)?);
let reader = ObjectReader::new(store, "tiled-jpeg-rgb-u8.tif".into());

let mut meta = TiffMetadataReader::try_open(&reader).await?;
let ifds = meta.read_all_ifds(&reader).await?;
let tiff = TIFF::new(ifds, meta.endianness());

// Fetch and decode a single tile without loading the whole file
let tile = tiff.ifds()[0].fetch_tile(0, 0, &reader).await?;
Supported compressions: Deflate, LERC, LERC+Deflate, LERC+ZSTD, LZMA, LZW, JPEG, JPEG2000, WebP, ZSTD. The ndarray feature gives you decoded tiles as ndarray::Array3. The GeoTIFF metadata module reads georeferencing tags directly. Tile request merging prevents redundant reads when workers request overlapping regions — crucial when many rayon threads are hitting S3 simultaneously.
zarr (v0.0.1 placeholder, by aschampion) defines the trait layer for Zarr tensor storage — the cloud-optimised chunked-array format favoured by cloud-native raster processing systems. The landscape here is still maturing: the core trait definitions exist, but expect active development. For now, the practical paths for Zarr access in Rust are the actively developed zarrs crate or GDAL’s Zarr driver.
Binary Vector Formats: flatgeobuf
The most practical binary vector formats in Rust are handled by geozero-compatible crates:
flatgeobuf (6.0.1, BSD-2-Clause) is a flatbuffers-based binary format with a built-in packed Hilbert R-tree for fast spatial filtering. It’s read and written via geozero:
use std::fs::File;
use std::io::BufReader;

use flatgeobuf::*;
use geozero::ToJson;

let mut filein = BufReader::new(File::open("countries.fgb")?);
let mut fgb = FgbReader::open(&mut filein)?.select_all()?;
while let Some(feature) = fgb.next()? {
    println!("{}", feature.property::<String>("name")?);
    println!("{}", feature.to_json()?);
}
HTTP access is first-class — HttpFgbReader supports range requests so you can query a bounding box on a remote FGB without downloading the whole file. And geo_traits integration gives you zero-copy random access to coordinates without a streaming iterator:
use flatgeobuf::*;
use geo_traits::{GeometryTrait, GeometryType};

// Zero-copy coordinate access via the geo_traits API
let geometry = feature.geometry_trait()?.expect("feature has a geometry");
match geometry.as_type() {
    GeometryType::MultiPolygon(_) => { /* read rings and coordinates lazily */ }
    _ => {}
}
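And the HTTP path looks like this — a sketch following the flatgeobuf README, where select_bbox drives range requests against the remote packed R-tree:
use flatgeobuf::HttpFgbReader;
use geozero::ToJson;

// Only the header, the relevant index pages, and the matching
// features are fetched over HTTP range requests
let mut fgb = HttpFgbReader::open("https://flatgeobuf.org/test/data/countries.fgb")
    .await?
    .select_bbox(145.0, -35.0, 146.0, -34.0)
    .await?;
while let Some(feature) = fgb.next().await? {
    println!("{}", feature.to_json()?);
}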
Elevation Data: srtm
srtm reads NASA’s SRTM HGT files (90m and 30m resolution) directly:
use srtm::{Tile, Resolution};
let tile = Tile::new(Resolution::Srtm1, 35, 145)?; // lat 35S, lon 145E
let elevation = tile.get_elevation(lat, lon)?;
No API keys, no network calls — just read the HGT file and get the elevation. Useful for terrain analysis, hillshading, or as a DEM input for hydrological models in whitebox-tools.
GIS Toolkits: whitebox-tools
The heavyweight GIS toolkit:
whitebox-tools (1.2k stars, 195 forks, MIT) is the big one. Written in Rust by Prof. John Lindsay at the University of Guelph, it’s a complete geospatial analysis platform covering:
- Raster analysis: cost-distance, buffering, reclassification, image enhancement, filters, classification
- Hydrology: flow accumulation, watershed delineation, stream networks, sink detection
- Terrain analysis: slope, aspect, curvature, wetness index, hillshading, multi-scale topographic position
- LiDAR: point cloud processing, outlier detection, interpolation to DEMs, ground-point classification
It’s a standalone binary with a Python interface (whitebox_tools.py) and a plugin system. You can write your own Rust plugins to add tools. The Rust source is fully available and compilable with cargo build. Pre-built binaries exist for Windows, macOS, and Linux. If you need an operation that eorst doesn’t have (e.g. a specific flow-routing algorithm), whitebox-tools is the next port of call.
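Driving it from Rust is just a process call; a hedged sketch (the --run/--wd/--dem/--output flags follow the WhiteboxTools manual's Slope example — check them against your installed version):
use std::process::Command;

// Shell out to the whitebox_tools binary to compute a slope raster
let status = Command::new("whitebox_tools")
    .args([
        "--run=Slope",
        "--wd=/data",
        "--dem=dem.tif",
        "--output=slope.tif",
    ])
    .status()?;
assert!(status.success(), "whitebox_tools failed");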
Our Crate: eorst
eorst (Earth Observation and Remote Sensing Toolkit) is the raster processing crate built at JRSRP for this ecosystem. Where geo handles vectors and gdal handles low-level I/O, eorst sits above them to provide the pipeline layer for satellite imagery:
- RasterDataset — virtual dataset from files or STAC queries, with block-based decomposition
- apply() — parallel block worker, writing directly to GeoTIFF via rayon
- apply_reduction() — reduce a dimension (e.g. mean over time) and write the result
- apply_with_mask() — pair two datasets block-by-block for masked computations
- Select trait — semantic layer and time selection by name (block.select_layers(&["red", "nir"])?)
- rss_core integration — STAC catalog queries (DEA, Planetary Computer, Element84) with canonical band resolution and file caching
The typical eorst pipeline: query via rss_core → build RasterDataset from files → apply() a worker function in parallel → write results. It’s designed for single-machine, multi-core processing — no cluster required.
use std::path::PathBuf;

use eorst::{types::BlockSize, RasterDatasetBuilder, RasterDataBlock};
use ndarray::Array4;

fn ndvi_worker(block: &RasterDataBlock<u16>) -> anyhow::Result<Array4<i16>> {
    // Select bands by semantic name, promote to f32 for the ratio
    let red = block.select_layers(&["red"])?.mapv(|v| v as f32);
    let nir = block.select_layers(&["nir"])?.mapv(|v| v as f32);
    let ndvi = (&nir - &red) / (&nir + &red + 1e-10_f32);
    // Scale to i16 for compact storage in the output GeoTIFF
    Ok(ndvi.mapv(|v| (v * 10000.0) as i16))
}

let rds = RasterDatasetBuilder::<u16>::from_sources(&scene_files)
    .block_size(BlockSize { cols: 2048, rows: 2048 })
    .build();
rds.apply::<i16>(ndvi_worker, 8, &PathBuf::from("ndvi_output.tif"))?;
See eorst docs for the full API, and the blog series for end-to-end tutorials.
The Arrow-Native Stack: geoarrow and geoparquet
The most exciting recent development in GeoRust is the Arrow-native stack. Apache Arrow’s columnar format has become the lingua franca of analytical data processing (Polars, DuckDB, DataFusion, Parquet). GeoRust has followed:
geoarrow (8k downloads) provides Arrow-native geometry arrays — GeoParquet-compatible and built for vectorised, columnar execution. The key insight is that a geometry column from a GeoParquet file is decoded once into a GeoArrow array and then flows through the Arrow ecosystem without per-feature object conversion:
use geoarrow::array::PolygonArray;
use geoarrow::io::parquet::read_geo_metadata;

// The geometry column comes back as an Arrow-native array
let metadata = read_geo_metadata("districts.parquet")?;
let geometries: PolygonArray = metadata.geometry_column(0)?;

// Planar areas computed straight off the columnar data
let areas: Vec<f64> = geometries.iter()
    .map(|poly| poly.map(|p| p.unsigned_area()).unwrap_or(0.0))
    .collect();
This opens up integration with the broader Arrow ecosystem: geodatafusion (Apache DataFusion with spatial extensions, 388k downloads) lets you run SQL queries with PostGIS-style spatial predicates directly on GeoParquet:
use datafusion::prelude::*;
use geodatafusion::*;

let ctx = SessionContext::new();
ctx.register_parquet("districts", "districts.parquet", ParquetReadOptions::default())
    .await?;

let sql = r#"
    SELECT name, ST_Area(geom) AS area
    FROM districts
    WHERE ST_Intersects(geom, ST_GeomFromText('POINT(152.0 -27.5)', 4326))
"#;
let results = ctx.sql(sql).await?.collect().await?;
This is the emergent pattern for large-scale geospatial analytics: store data as GeoParquet, query it with DataFusion + geodatafusion, and use eorst only for the heavy per-pixel lifting that needs its own worker function.
Putting It Together
Here’s how these pieces fit together in a typical pipeline:
- Query — the stac crate (used by rss_core) queries cloud STAC APIs to find scenes
- Download / access — gdal or async-tiff reads cloud-optimised GeoTIFFs from S3
- Vector operations — geo + rstar for spatial joins and index lookups
- CRS transforms — proj or geodesy for coordinate reprojection
- Format conversion — geozero + flatgeobuf to read/write formats without intermediate types
- Elevation data — srtm for SRTM HGT data, whitebox-tools for hydrological/terrain analysis
- Large-scale analytics — geodatafusion + GeoParquet for SQL spatial queries on columnar data
- Per-pixel processing — eorst’s block-based parallel workers for satellite imagery analysis
The ergonomics have improved dramatically over the past two years. The geo crate adopted the Rust 2024 edition in its latest releases, and MSRV has been creeping upward (now ~1.85 for geo v0.32) as the ecosystem uses newer language features, so plan on a reasonably current toolchain even in a conservative setup. Most crates have good documentation, serde support, no_std variants where possible, and feature flags to manage dependency trees.
The Discord community is active, and the maintainers are responsive. If you’ve been put off by Rust’s “rewriting everything in Rust” reputation, the GeoRust ecosystem is a counterexample: it builds on battle-tested native libraries (PROJ, GDAL, GEOS) where appropriate, and re-implements in pure Rust where it makes sense (geometry algorithms, spatial indexes). That’s the pragmatic approach that actually ships.
Further Reading
- georust.org — the ecosystem homepage
- Geospatial Rust book — lighthearted intro to Rust + geospatial
- geo crate docs — geometry primitives and algorithms
- geozero docs — zero-copy format conversion
- geodatafusion on crates.io — spatial SQL on Arrow data
- async-tiff docs — async cloud-native TIFF/GeoTIFF reader
- whitebox-tools on GitHub — full GIS toolkit (1.2k stars)
- flatgeobuf docs — binary vector format with packed R-tree
- srtm crate — read NASA SRTM HGT elevation files