import { collection, endAt, getDocs, orderBy, query, startAt } from 'firebase/firestore';
import * as geofire from 'geofire-common';

// Find cities within 50km of London. 'db' is an already-initialized Firestore instance.
const center = [51.5074, -0.1278];
const radiusInM = 50 * 1000;
// Each item in 'bounds' represents a startAt/endAt pair. We have to issue
// a separate query for each pair. There can be up to 9 pairs of bounds
// depending on overlap, but in most cases there are 4.
const bounds = geofire.geohashQueryBounds(center, radiusInM);
const promises = [];
for (const b of bounds) {
  const q = query(
    collection(db, 'cities'),
    orderBy('geohash'),
    startAt(b[0]),
    endAt(b[1]));
  promises.push(getDocs(q));
}
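
For context, the docs example then merges the per-bound results and filters out false positives, since the geohash ranges can match documents slightly outside the circle. Roughly like this (inside an async function, and assuming each document stores its coordinates in lat/lng fields):

const snapshots = await Promise.all(promises);

const matchingDocs = [];
for (const snap of snapshots) {
  for (const doc of snap.docs) {
    // Drop false positives: documents whose geohash falls inside a bound
    // but that are actually farther away than the requested radius.
    const distanceInKm = geofire.distanceBetween([doc.get('lat'), doc.get('lng')], center);
    if (distanceInKm * 1000 <= radiusInM) {
      matchingDocs.push(doc);
    }
  }
}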
Does the radius affect the performance of a geo query? (The code example is taken from the Firebase docs.) I am particularly interested in the performance of locating documents, not of transferring them to the client, because I am thinking about running these operations in a Firebase Function and then transferring only three or five documents to the client. So, in short: does locating documents also take more time when I increase the radius? I read somewhere that a Firestore query takes the same time regardless of database size. Also, do you think moving the fetching part to a Firebase Function would increase performance?
Cloud Firestore has a pretty simple but unique performance guarantee: the performance of any query depends on the number (and size) of the results it returns, not on the number of documents it has to consider.
So if a larger region leads to Firestore returning more documents, that query will be slower than one that returns fewer documents. But the region size itself has no impact.
In other words, a query on a collection with a 10m radius that returns 100 documents will take just as long as a query on that same collection with a 1000m radius that also returns 100 documents.
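
If you do want to ship only the nearest few documents to the client, the same lookup can run server-side so that just those results cross the network. A minimal sketch of a callable Cloud Function, assuming the Admin SDK, the geofire-common package, and lat/lng fields on each city document (the function name nearestCities is only illustrative):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
const geofire = require('geofire-common');
admin.initializeApp();

exports.nearestCities = functions.https.onCall(async (data) => {
  const center = [data.lat, data.lng];
  const radiusInM = data.radiusInM;
  const bounds = geofire.geohashQueryBounds(center, radiusInM);

  // One range query per geohash bound, exactly as in the client-side example.
  const snapshots = await Promise.all(bounds.map(([start, end]) =>
    admin.firestore().collection('cities')
      .orderBy('geohash').startAt(start).endAt(end).get()));

  // Flatten, drop geohash false positives, sort by distance, keep the closest 5.
  return snapshots
    .flatMap((snap) => snap.docs)
    .map((doc) => ({
      id: doc.id,
      ...doc.data(),
      distanceInM: geofire.distanceBetween([doc.get('lat'), doc.get('lng')], center) * 1000,
    }))
    .filter((city) => city.distanceInM <= radiusInM)
    .sort((a, b) => a.distanceInM - b.distanceInM)
    .slice(0, 5);
});

Note that this does not change how long Firestore takes to locate the documents; the range queries still return every document inside the geohash bounds to the function. What it saves is the bandwidth and payload of sending all of those matches to the client.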