
Realtime updates with lazy loading in React Native with Firestore


I want to lazy-load a list, and at the same time there should be real-time updates to the data that has already been fetched.

First batch:

const querySnapshot = await query.get(); // one-time fetch of the first batch
const customers = querySnapshot.docs.map(doc => ({
  id: doc.id,
  ...doc.data()
}));

For the 2nd and subsequent batches, how can this same listener be used? Otherwise, for each batch I have to create a new listener, which I think is not a good strategy. Or is it OK to have many listeners? Every user of the app will create many listeners.


Solution

  • I want to lazy-load a list, and at the same time there should be real-time updates to the data that has already been fetched.

    When you use a get() call in your code, like in the line below:

    const querySnapshot = await query.get();
    

    You are reading the data from Firestore only once; you aren't listening for any real-time updates. If you need to be notified in real time when something in your result set changes, then you should use a persistent listener.
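    As a minimal sketch, assuming query is the same Firestore Query you already use for the first batch, a persistent listener could look like this:

    const unsubscribe = query.onSnapshot(querySnapshot => {
      const customers = querySnapshot.docs.map(doc => ({
        id: doc.id,
        ...doc.data()
      }));
      // Update your component state with customers here; this callback
      // fires again whenever a document in the result set changes.
    });

    // Later, for example when the screen unmounts, detach the listener:
    unsubscribe();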

    For the 2nd and subsequent batches, how can this same listener be used? Otherwise, for each batch I have to create a new listener, which I think is not a good strategy.

    Why would you think that creating listeners is not a good strategy? It's actually a good one. You can create as many listeners as you need, but only as long as you detach them when they are no longer needed.
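    For instance, here is a sketch of detaching in a functional React Native component; queriesForLoadedPages is a hypothetical array holding one Firestore Query per batch loaded so far, not something from your code:

    import { useEffect } from 'react';

    useEffect(() => {
      // One listener per loaded batch.
      const unsubscribers = queriesForLoadedPages.map(pageQuery =>
        pageQuery.onSnapshot(snapshot => {
          // Merge snapshot.docs into your component state here.
        })
      );
      // Detach every listener when the component unmounts.
      return () => unsubscribers.forEach(unsubscribe => unsubscribe());
    }, [queriesForLoadedPages]);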

    Every user of the app will create many listeners.

    There would be no problem with that. There is no limitation on the number of listeners you can use, and listeners are cheap. The single thing you should worry about is the number of documents you read.

    Since we cannot see how your query object is defined, I assume you have already limited the number of documents using a limit(n) call. If you haven't added such a limitation, I recommend you do. If you have a large collection of documents, a wiser solution would be to use pagination and load the data in smaller chunks.
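    Below is a hedged sketch of such pagination with one listener per page; db, the customers collection, and the name ordering field are assumptions, not taken from your code:

    const pageSize = 10;    // assumed page size
    let lastVisible = null; // cursor: last document of the previously loaded page

    function listenToNextPage(onPage) {
      let pageQuery = db.collection('customers').orderBy('name').limit(pageSize);
      if (lastVisible) {
        pageQuery = pageQuery.startAfter(lastVisible); // continue after the previous page
      }
      // Each page gets its own persistent listener; keep the returned
      // unsubscribe function so it can be detached when no longer needed.
      return pageQuery.onSnapshot(snapshot => {
        if (!snapshot.empty) {
          lastVisible = snapshot.docs[snapshot.docs.length - 1];
        }
        onPage(snapshot.docs.map(doc => ({ id: doc.id, ...doc.data() })));
      });
    }

    Calling listenToNextPage again loads the next chunk, and each previously loaded page keeps receiving real-time updates through its own listener.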