What I currently do:

- Read the `.shp` file using `GeoPandas` (Python), parse it to `GeoJSON`, and write the data into `MongoDB`.
- (Create a 2dsphere index on the collection for efficient querying.)
- Query a subset of the data via HTTP (Python `FastAPI`) from a `React` frontend application.
- Visualize the `GeoJSON` data on a `Leaflet` map.
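The ingest step can be sketched roughly as follows. A tiny in-memory `GeoDataFrame` stands in for the real shapefile, the MongoDB part is shown only as comments, and all file, database, and collection names are made up:

```python
import json

import geopandas as gpd
from shapely.geometry import Polygon

# In the real pipeline this would be: gdf = gpd.read_file("areas.shp")
gdf = gpd.GeoDataFrame(
    {"name": ["a", "b"]},
    geometry=[
        Polygon([(0, 0), (1, 0), (1, 1)]),
        Polygon([(2, 2), (3, 2), (3, 3)]),
    ],
    crs="EPSG:4326",  # MongoDB's 2dsphere index expects WGS84 lon/lat
)

# Each row becomes a GeoJSON Feature document ready for MongoDB
features = json.loads(gdf.to_json())["features"]

# With pymongo, the documents would then be inserted and indexed:
#   from pymongo import MongoClient, GEOSPHERE
#   coll = MongoClient()["gis"]["areas"]          # placeholder names
#   coll.insert_many(features)
#   coll.create_index([("geometry", GEOSPHERE)])  # the 2dsphere index
```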
Problem:

- The amount of data that has to be transported is big (it can exceed 1 GB depending on the area), which makes the user experience a nightmare.
- I try to reduce the size of the data using the `GeoPandas`/`Shapely` `simplify()` function. That works to some degree, but not as well as hoped. I assume this is because the `MultiPolygon` consists of many tiny polygons which probably cannot be simplified any further (they might consist of only four points each).
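The simplification attempt might look like the following sketch, where a `MultiPolygon` of tiny triangles stands in for the real data and the tolerance value is arbitrary:

```python
import geopandas as gpd
from shapely.geometry import MultiPolygon, Polygon

# Stand-in for the real data: a MultiPolygon made of tiny triangles,
# each already at its minimum number of vertices.
tiny = MultiPolygon(
    [Polygon([(i, 0), (i + 0.1, 0), (i + 0.1, 0.1)]) for i in range(5)]
)
gdf = gpd.GeoDataFrame(geometry=[tiny], crs="EPSG:4326")

# Douglas-Peucker simplification; tolerance is in CRS units (degrees here)
simplified = gdf.geometry.simplify(tolerance=0.05, preserve_topology=True)

# Triangles cannot lose any vertices, so the data barely shrinks -
# this mirrors the behaviour described above.
print(simplified.iloc[0].geom_type)
```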
The Question:

- Is there a way to combine multiple tiny polygons into bigger ones, making it more likely that the `simplify()` function will have a meaningful effect? To achieve this, I already tried to use the `explode()` function, which gives me a `DataFrame` of `Polygon`s instead of a few `MultiPolygon`s, and then applied `unary_union()`. Sadly, this gives roughly the same result as the original data. That leads me to the assumption that the polygons are so fragmented that they cannot be combined, since they do not share common edges.
- Is there any other way to reduce the size of the geometries / `GeoJSON` data or to improve transport?
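For reference, the `explode()`/`unary_union()` attempt from the first question can be sketched like this, with made-up squares illustrating that the union only dissolves polygons that actually touch:

```python
import geopandas as gpd
from shapely.geometry import Polygon
from shapely.ops import unary_union

touching = [
    Polygon([(0, 0), (1, 0), (1, 1), (0, 1)]),
    Polygon([(1, 0), (2, 0), (2, 1), (1, 1)]),  # shares an edge with the first
]
apart = [Polygon([(5, 5), (6, 5), (6, 6), (5, 6)])]  # isolated square

gdf = gpd.GeoDataFrame(geometry=touching + apart)
exploded = gdf.explode(index_parts=False)  # one Polygon per row
merged = unary_union(list(exploded.geometry))

# The two touching squares dissolve into one polygon, but the isolated
# square stays separate - fragmented polygons without common edges
# come out essentially unchanged, as observed above.
print(merged.geom_type)
```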