I’ve had nothing but problems using GeoJSON. The specification has limitations everywhere and doesn’t even support z and m values at the same time.
But thankfully there is also the SQLite-backed GeoPackage, which is not only more flexible but also much smaller. It takes some extra steps to get testing teams working due to its binary nature, but other than that it is the best format for geospatial data analysis.
Long live SQLite!
We used this extensively when I worked in this space (2010 - 2014). My favorite addition was using https://github.com/topojson/topojson to add arcs. That cut down quite a bit on the number of points needed to represent curves.
I’ve applied GeoJSON (among many other GIS technologies) for mapping and monitoring tens of thousands of warehouse robots. It works great as long as you squint just a bit, ignoring that it generally calls for long,lat and is designed with the assumption of a world CRS.
The dangerous part is that some tools fully assume this and will completely screw up calculations if you’re assuming a flat, planar CRS. So you’ve got to be careful about checking and setting those parameters.
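To make the pitfall concrete, here's a rough sketch (my own, not from the thread) of how treating lon/lat degrees as planar x/y diverges from a proper great-circle distance; the haversine formula below assumes a spherical Earth:

```typescript
// Naive "flatland" distance: treats lon/lat degrees as planar x/y.
// Even if you scale degrees to meters afterwards, this breaks away from
// the equator, because one degree of longitude shrinks with latitude.
function planarDegrees(a: [number, number], b: [number, number]): number {
  return Math.hypot(b[0] - a[0], b[1] - a[1]);
}

// Haversine great-circle distance in meters, assuming a spherical Earth.
function haversineMeters(a: [number, number], b: [number, number]): number {
  const R = 6371000; // mean Earth radius, meters
  const rad = (d: number) => (d * Math.PI) / 180;
  const dLat = rad(b[1] - a[1]);
  const dLon = rad(b[0] - a[0]);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(rad(a[1])) * Math.cos(rad(b[1])) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// One degree of longitude at the equator vs at 60°N: identical in
// "planar degrees", but roughly half the true distance up north.
const atEquator = haversineMeters([0, 0], [1, 0]);   // ~111 km
const at60North = haversineMeters([0, 60], [1, 60]); // ~56 km
```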
One nice thing is that the structure of GeoJSON works incredibly well in TypeScript. The "type" field gives you discriminated unions essentially for free, so you can walk entire geodatasets in a pretty comfortable way.
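A minimal sketch of what that looks like (the type definitions here are hand-rolled for illustration, not from any particular library):

```typescript
// Minimal GeoJSON-style geometry types; the "type" field is the discriminant.
type Position = number[]; // [lon, lat] with an optional third z element

type Geometry =
  | { type: "Point"; coordinates: Position }
  | { type: "LineString"; coordinates: Position[] }
  | { type: "Polygon"; coordinates: Position[][] };

// Inside each case, TypeScript narrows the union, so `coordinates`
// has the right shape without any casts.
function countPositions(g: Geometry): number {
  switch (g.type) {
    case "Point":
      return 1;
    case "LineString":
      return g.coordinates.length;
    case "Polygon":
      // Sum positions across all rings (outer ring plus any holes).
      return g.coordinates.reduce((n, ring) => n + ring.length, 0);
  }
}
```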
> It works great as long as you squint just a bit, ignoring that it generally calls for long,lat and is designed with the assumption of a world CRS.
I thought the spec allowed you to specify the CRS, but I just checked the RFC: the crs member was removed in the 2016 specification (RFC 7946), and WGS84 is mandated. It does allow alternative CRSs by prior arrangement between the parties, but like you said, that requires a lot of care.
> tens of thousands of warehouse robots
Sounds like Amazon
Definitely not Amazon. Yuck.
GeoJSON is super useful. At Getcho (delivery, logistics) we use zip code GeoJSON encodings to draw polygons on zone maps and quickly generate rates. This had been a persistently annoying thing to do until we discovered this format. If you're curious, someone made a repo with all the 2010 census zips a while back [0].
[0] https://github.com/OpenDataDE/State-zip-code-GeoJSON/blob/ma... although you can generate newer versions from the last census.
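As a rough sketch of the zone-lookup idea (the zone shape and rate table here are hypothetical, and real code would use a library like turf.js), a polygon's outer ring can be tested with classic even-odd ray casting:

```typescript
type Ring = [number, number][]; // [lon, lat] pairs, as in a GeoJSON ring

// Classic even-odd ray casting against a polygon's outer ring.
// Ignores holes and points exactly on the boundary; illustration only.
function pointInRing(pt: [number, number], ring: Ring): boolean {
  const [x, y] = pt;
  let inside = false;
  for (let i = 0, j = ring.length - 1; i < ring.length; j = i++) {
    const [xi, yi] = ring[i];
    const [xj, yj] = ring[j];
    if (yi > y !== yj > y && x < ((xj - xi) * (y - yi)) / (yj - yi) + xi) {
      inside = !inside;
    }
  }
  return inside;
}

// Hypothetical zone table: each zone carries its outer ring and a rate.
interface Zone { zip: string; ring: Ring; rate: number }

function rateFor(pt: [number, number], zones: Zone[]): number | null {
  const zone = zones.find((z) => pointInRing(pt, z.ring));
  return zone ? zone.rate : null;
}
```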
About 25% of ZIP codes don't have a corresponding Census Bureau ZCTA, for example 10118. Do you end up needing special handling for those cases? Or has it not yet come up in practice?
Excellent question; it certainly does come up. Practically speaking, the more populous zip codes are all accounted for, and that’s where the vast majority of deliveries go. For example, I took the census zip code data for 150 miles (as the crow flies) around Philly and found virtually 100% coverage.
For missing ones you have to fall back to distance-based estimates, and in my business that means your quote may be off and you’re exposed.
Have been using GeoJSON, very handy and human-readable, but we recently switched to GeoPackage files, as they allow for different layers, each with a different schema for additional data.
GeoPackages also allow you to set a proper CRS, which is not as easy in GeoJSON IIRC.
Getting your CRSes wrong is fun...
Also https://postgis.net/
And with PostgREST [0], you can automatically convert any PostGIS table (with a geometry or geography column) to GeoJSON by sending an "Accept: application/geo+json" header in the request.
[0] https://docs.postgrest.org/en/v14/how-tos/working-with-postg...
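A sketch of what such a request looks like from a client, assuming a PostgREST instance at a made-up local URL exposing a hypothetical "parcels" table (and configured with a geo+json media handler, per the linked how-to):

```typescript
// Made-up base URL for a local PostgREST instance; adjust for your setup.
const BASE_URL = "http://localhost:3000";

// Build the request parameters for fetching a table as GeoJSON.
// The Accept header asks PostgREST to serve the rows as a GeoJSON
// FeatureCollection rather than plain JSON.
function geoJsonRequest(table: string): { url: string; headers: Record<string, string> } {
  return {
    url: `${BASE_URL}/${table}`,
    headers: { Accept: "application/geo+json" },
  };
}

// Usage with fetch (not executed here, since it needs a running server):
// const { url, headers } = geoJsonRequest("parcels");
// const featureCollection = await (await fetch(url, { headers })).json();
```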
Also https://github.com/timescale/timescaledb
I've found it very useful for storing geospatial data over time.
Nice and simple, great. But because it's JSON, most parsers are horribly inefficient, which is tough, because a lot of geodata is massive.
This is nice. I haven't worked with GIS data for ages but I really like the idea of a well-understood plain text container for it. Much nicer than wrangling with binary formats like shapefiles, especially when something goes wrong and you're not sure if it's your code (well more precisely your usage of whatever library you've got for it) or the data.
Dunno whose website this is, but the format itself is great, and it allows for a relatively compact and relatively human-readable presentation.
A few weeks ago I (vibe)coded mxmap.be and if not for the ubiquity of geojson, it would have taken me significantly more effort.
Nice work with mxmap. It's a very good way to appreciate the extent to which the EU depends on the US - email providers being just one of several dimensions.
The properties key is plural but contains a dictionary. Does the schema allow for this to be a list?
Nope, properties must be an object (dictionary or null). Which means each property can only appear once.
The spec doesn't say what type the value of a property can be, though. Page 6 of the RFC shows a nested object, for example. So you could probably put a list in there if you want to store multiple values under the same key, provided that your decoder knows what to do with such values. (GeoJSON is often converted to and from WKB/WKT, and unorthodox values may be lost in the conversion.)
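A quick illustration of the point (the Feature type and property names below are my own, for the example): an array value under a single key survives a JSON round trip, even though `properties` itself must stay an object.

```typescript
// A minimal Feature shape; "properties" must be an object or null,
// but its values can be anything JSON allows, including arrays.
interface Feature {
  type: "Feature";
  geometry: { type: "Point"; coordinates: number[] };
  properties: Record<string, unknown> | null;
}

const feature: Feature = {
  type: "Feature",
  geometry: { type: "Point", coordinates: [4.35, 50.85] },
  properties: {
    name: "Brussels",
    // Multiple values under one key: store them as an array.
    tags: ["capital", "eu"],
  },
};

// Round-trips through JSON unchanged; whether downstream tools
// (e.g. WKT converters) preserve the array is a separate question.
const roundTripped = JSON.parse(JSON.stringify(feature)) as Feature;
```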
vega-lite supports rendering of GeoJSON via 'geoshape'
https://vega.github.io/vega-lite/docs/geoshape.html
Recently I got into cartography software for a bit, and the horrors of the data formats in this industry are real. It feels like everyone under the sun has their own.
All that said, GeoJSON was a great change of pace, I enjoyed using it. While I'm no professional and have no idea what the professional needs are, it was very good for my hobbyist needs.