Last time, we created and rendered the world map data.
With polygon data and properties for the whole world held as a Mapbox layer, there is plenty of room for creativity at render time. On the other hand, the initial data acquisition inevitably means fetching and downloading a large file, which delays rendering. The experience degrades noticeably where network bandwidth is limited, such as on smartphones.
We implemented several measures to improve this experience.
Optimise data and processing
Optimising meant keeping the transfer volume small while acquiring the data in parallel and optimising the code.
Specifically, smooth map display was achieved through the following three measures.
Reduce the volume of data
Fundamentally, the most important thing is to reduce the amount of data transferred; in fact, this was the biggest obstacle.
Unfortunately, the data itself could not be changed, so this option had to be abandoned this time, even though it is the most effective one.
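Just for illustration, if editing the data had been an option, one common way to slim a GeoJSON payload is to round coordinate precision (a hypothetical sketch, not something applied here; roundCoords is a made-up helper, and four decimal places keeps roughly 10 m accuracy):

// Hypothetical sketch: shrink a GeoJSON payload by rounding coordinates.
function roundCoords(coords, digits = 4) {
  return Array.isArray(coords[0])
    ? coords.map(c => roundCoords(c, digits))      // recurse into nested rings
    : coords.map(n => Number(n.toFixed(digits)));  // round a [lng, lat] pair
}

const slimmedGeoJSON = {
  ...geoJsonData,
  features: geoJsonData.features.map(f => ({
    ...f,
    geometry: { ...f.geometry, coordinates: roundCoords(f.geometry.coordinates) },
  })),
};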
In the original code, the GeoJSON data was retrieved with fetch() at runtime:
fetch('./polygons.json')
  .then(response => response.json())
  .then(polygonGeoJSON => {
    // Layers and events are processed after data acquisition.
  });
Preload the data
In this case we could not reduce the amount of data transferred, so we turned to preloading (rel="preload") so that the download could start early and run asynchronously.
Preload declares a resource that the current page will need immediately.
Use it deliberately: preloading data that is too heavy can delay the loading of other resources.
Ideally we wanted to serve the data through an API or as a lighter payload. A bitter pill to swallow…
<link rel="preload" href="./polygons.json" as="fetch" type="application/json" crossorigin="anonymous">
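For reference, the later fetch() can then reuse the already-downloaded response instead of hitting the network again. A sketch of the pairing (note that the fetch request's CORS/credentials settings must match the crossorigin attribute above, otherwise the browser downloads the file twice):

// With crossorigin="anonymous" on the <link>, a default fetch() request
// should match the preloaded one and reuse its response.
fetch('./polygons.json')
  .then(response => response.json())
  .then(polygonGeoJSON => {
    // Ideally the transfer has already finished by the time this runs,
    // leaving only parsing and layer setup.
  });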
Reduce and optimise unnecessary loops
Next, we reviewed the loops that run after data acquisition.
The goal was to extract from the world map GeoJSON (geoJsonData) only the features matching country_data and display them on the map. The processed data (polygonGeoJSON) is then applied to the layer inside map.on('load', () => {}).
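For context, applying the result inside the load handler looks roughly like the following (a minimal sketch, not the actual project code: the source id 'countries', layer id 'countries-fill', and the paint options are made up for illustration):

map.on('load', () => {
  // Register the processed GeoJSON as a source...
  map.addSource('countries', { type: 'geojson', data: polygonGeoJSON });
  // ...and render it as a fill layer.
  map.addLayer({
    id: 'countries-fill',
    type: 'fill',
    source: 'countries',
    paint: { 'fill-color': '#4a90d9', 'fill-opacity': 0.5 },
  });
});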
This is the original process.
Inside the fetch's then() callback, country_data holds the countries to extract. For the features property, filter keeps only the matching features from geoJsonData.features, and map then replaces their properties with the country data.
const country_data = [
  { "SU_A3": "XXX", "country": "country_name", "lng": "-1.0000", "lat": "12.0000", "attr": [{ "name": "country_name" }] },
];

const polygonGeoJSON = {
  ...geoJsonData,
  features: geoJsonData.features
    // Keep only features whose SU_A3 appears somewhere in country_data.
    .filter(feature => country_data.some(country => country.SU_A3 === feature.properties.SU_A3))
    // Then look the country up again to replace the feature's properties.
    .map(feature => {
      const country = country_data.find(c => c.SU_A3 === feature.properties.SU_A3);
      if (country) {
        return {
          ...feature,
          properties: {
            ...country
          }
        };
      }
      return feature;
    })
};
However, with these redundant loops and large arrays, processing can become heavy: the some() inside filter and the find() inside map each scan country_data for every feature, so the cost is roughly O(n × m).
So we improved performance as follows.
- Creation of countryDataMap:
  - country_data.reduce(…) loops through the country_data array once.
  - This builds a map (a plain JavaScript object) with SU_A3 as the key and the corresponding country_data object as the value.
- Creation of processedFeatures:
  - geoJsonData.features.reduce(…) loops through the geoJsonData.features array once.
  - For each feature, the country information is retrieved via countryDataMap[countryKey]. Access by object key is very fast.
The reduce pushes a new feature onto acc only when countryInfo exists, so it effectively performs both the filter and the map in a single loop.
The difference is especially noticeable when country_data or geoJsonData.features is large: the pre-processing turns each lookup into a constant-time key access, reducing the overall complexity from roughly O(n × m) to O(n + m).
In terms of loops, the refactored code consists of two independent loops (pre-processing and main processing), whereas the previous version effectively behaved like a nested loop, so the work inside each iteration is now much lighter.
const country_data = [
  { "SU_A3": "XXX", "country": "country_name", "lng": "-1.0000", "lat": "12.0000", "attr": [{ "name": "country_name" }] },
];

// Pre-processing: build a lookup table keyed by SU_A3 in one pass.
const countryDataMap = country_data.reduce((acc, country) => {
  acc[country.SU_A3] = country;
  return acc;
}, {});

// Main processing: one pass over the features, filtering and mapping together.
const processedFeatures = geoJsonData.features.reduce((acc, feature) => {
  const countryKey = feature.properties.SU_A3;
  const countryInfo = countryDataMap[countryKey]; // O(1) lookup
  if (countryInfo) {
    acc.push({
      ...feature,
      properties: {
        // ...feature.properties, // uncomment to also keep the original properties
        ...countryInfo
      }
    });
  }
  return acc;
}, []);

const polygonGeoJSON = {
  ...geoJsonData,
  features: processedFeatures
};
On the other hand, a single reduce can hurt readability when complex logic or many conditions are involved, so depending on how much processing speed matters, filter + map may still be the better choice.
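A sketch of that more readable variant, keeping the countryDataMap lookup so the cost remains O(n + m) (illustrative only; it would replace the reduce version above):

// Filtering and mapping as separate, easier-to-read steps.
const processedFeatures = geoJsonData.features
  .filter(feature => countryDataMap[feature.properties.SU_A3])
  .map(feature => ({
    ...feature,
    properties: { ...countryDataMap[feature.properties.SU_A3] },
  }));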
Summary
Used as-is, the world map data requires a large amount of traffic, so it has to be optimised on the code side.
If the data itself can be optimised, start there: reducing the transfer volume is by far the most effective improvement.