javascript, node.js, gatsby, isomorphic-fetch-api

Multiple fetch requests, create nodes using gatsby-node (async/await)


Below I have two fetch requests. The first is an OAuth request that returns an authentication token, which I then use in the second request to fetch content (GraphQL) from my headless CMS (Squidex).

Currently this second request only works with one endpoint, as the CMS can only query one schema's contents at a time. How can I refactor this single request so I can have multiple requests, each fetching data from a different schema and each creating a Gatsby node?

Something like:

const endpoints = ['endpoint1', 'endpoint2', 'endpoint3'];

endpoints.map(endpoint => {
  // do all the fetches in here and build a gatsby node for each of them
});
Here is my current code:

const path = require('path');
require('dotenv').config({
  path: `.env.${process.env.NODE_ENV}`,
});

require('es6-promise').polyfill();
require('isomorphic-fetch');

const crypto = require('crypto');
const qs = require('qs');

exports.sourceNodes = async ({ actions }) => {
  const { createNode } = actions;
  // This is my first request
  let response = await fetch(process.env.TOKEN_URI, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/x-www-form-urlencoded',
    },
    body: qs.stringify({
      grant_type: 'client_credentials',
      client_id: process.env.CLIENT_ID,
      client_secret: process.env.CLIENT_SECRET,
      scope: 'squidex-api',
    }),
  });

  let json = await response.json();

  // I have to wait for this first request before I can run the next one

  response = await fetch(`${process.env.API_URI}${process.env.END_POINT}`, {
    method: 'GET',
    headers: {
      Authorization: `${json.token_type} ${json.access_token}`,
    },
  });

// I want to create a loop here and pass an array of different END_POINTS, each doing a fetch, returning a response and building a gatsby node like the one below.

  json = await response.json();


  // Process json into nodes.
  json.items.map(async datum => {
    const { id, createdBy, lastModifiedBy, data, isPending, created, lastModified, status, version, children, parent } = datum;

    const type = (str => str.charAt(0).toUpperCase() + str.slice(1))(process.env.END_POINT);

    const internal = {
      type,
      contentDigest: crypto.createHash('md5').update(JSON.stringify(datum)).digest('hex'),
    };

    const node = {
      id,
      createdBy,
      lastModifiedBy,
      isPending,
      created,
      lastModified,
      status,
      version,
      children,
      parent,
      internal,
    };

    const keys = Object.keys(data);
    keys.forEach(key => {
      node[key] = data[key].iv;
    });

    await createNode(node);
  });
};

This code was taken from a gatsby-source-squidex plugin which is no longer on GitHub. I realise this is a unique problem, but most of my trouble comes from chaining fetch requests. Please be gentle, SO.


Solution

  • First, as an aside: response.json() itself returns a promise, so that await is still needed; awaiting the fetch only resolves once the response headers arrive, not once the body has been parsed.
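
    For example (url here is just a placeholder for whatever you are fetching):

    const response = await fetch(url);   // resolves once the response headers arrive
    const data = await response.json();  // resolves once the body has been read and parsed as JSON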

    If I understand your question correctly, you want to run a bunch of these requests and then go over their results.

    I would probably create a promise array and Promise.all() that array, like this:

    const endpoints = [/* endpoint1, endpoint2 ... endpointN */];
    const promiseArray = endpoints.map(endpoint => fetch(`${process.env.API_URI}${endpoint}`, {
      method: 'GET',
      headers: {
        Authorization: `${json.token_type} ${json.access_token}`,
      },
    }));
    
    const promiseResults = await Promise.all(promiseArray); // resolves to an array of all your responses, and rejects the whole thing if any one of the promises rejects.
    

    Or if you need to examine the promise results one by one as they come in, you can do something like this:

    for await (const result of promiseArray) {
      console.log(await result.json()); // this is each parsed response body
    }
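
    If it helps, here is a rough sketch of how the multi-endpoint version of sourceNodes could fit together. It assumes json still holds the token response from your first fetch, that every endpoint returns an items array like your current one does, and that the node-building logic stays exactly as you already have it; the endpoint names are placeholders.

    const endpoints = ['endpoint1', 'endpoint2', 'endpoint3']; // placeholder endpoint names

    const results = await Promise.all(
      endpoints.map(endpoint =>
        fetch(`${process.env.API_URI}${endpoint}`, {
          method: 'GET',
          headers: {
            Authorization: `${json.token_type} ${json.access_token}`,
          },
        }).then(response => response.json()),
      ),
    );

    results.forEach((result, index) => {
      // Derive the node type from the endpoint instead of process.env.END_POINT
      const endpoint = endpoints[index];
      const type = endpoint.charAt(0).toUpperCase() + endpoint.slice(1);

      result.items.forEach(datum => {
        // ...build the node exactly as in your single-endpoint version,
        // using the type above, then call createNode(node)
      });
    });

    The only real change from your current code is that the endpoint (and therefore the node type) comes from the array rather than from a single END_POINT environment variable.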
    

    Hope this makes sense and answers your question.