Tags: javascript, firebase, react-native, google-cloud-firestore, google-vision

Google Vision API is not working after uploading an image to Firebase


I built an image detection mobile app (e.g. Plastic Bottle, Aluminum Can, Milk Jug, etc.) with React Native, using the Google Vision API.

It worked well before and returned responses successfully.

But after I added a Firebase image upload function to store the image, the Google Vision API stopped working.

My guess is that the Firebase image upload and the Google Vision API call conflict and are not compatible with each other.

Or there may be an error in my image upload function, but I am still not sure what the issue is. The following is my code.

  const takePicture = async () => {
    if (this.camera) {
      const options = { quality: 0.5, base64: true };
      const data = await this.camera.takePictureAsync(options);
      setScannedURI(data.uri)
      imageUploadToFirebase(data)
      // callGoogleVisionApi(data.base64)  // ============> If I comment out the image upload call above and call the Vision API here instead, it works well.
      setIsLoading(true)
    }
  };

  const imageUploadToFirebase = (imageData) => {
    const Blob = RNFetchBlob.polyfill.Blob;    //firebase image upload
    const fs = RNFetchBlob.fs;
    window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
    window.Blob = Blob;
    const Fetch = RNFetchBlob.polyfill.Fetch
    window.fetch = new Fetch({
      auto: true,
      binaryContentTypes: [
        'image/',
        'video/',
        'audio/',
        'foo/',
      ]
    }).build()
    let uploadBlob = null;
    var path = Platform.OS === "ios" ? imageData.uri.replace("file://", "") : imageData.uri
    var newItemKey = Firebase.database().ref().child('usersummary').push().key;
    var _name = newItemKey + 'img.jpg';
    setIsLoading(true)
    fs.readFile(path, "base64")
      .then(data => {
        let mime = "image/jpg";
        return Blob.build(data, { type: `${mime};BASE64` });
      })
      .then(blob => {
        uploadBlob = blob;
        Firebase.storage()
          .ref("scannedItems/" + _name)
          .put(blob)
          .then(() => {
            uploadBlob.close();
            return Firebase.storage()
              .ref("scannedItems/" + _name)
              .getDownloadURL();
          })
          .then(async uploadedFile => {
            setFirebaseImageURL(uploadedFile)
            // callGoogleVisionApi(imageData.base64)  // ============> If I call the Vision API here, it does not work.
          })
          .catch(error => {
            console.log({ error });
          });
      });
  }

This is my callGoogleVisionApi function.

  const callGoogleVisionApi = async (base64) => {
    let googleVisionRes = await fetch(config.googleCloud.api + config.googleCloud.apiKey, {
      method: 'POST',
      body: JSON.stringify({
        "requests": [{
          "image": { "content": base64 },
          features: [
            { type: "LABEL_DETECTION", maxResults: 30 },
            { type: "WEB_DETECTION", maxResults: 30 }
          ],
        }]
      })
    })
      .catch(err => { console.log('Network error=>: ', err) })
    await googleVisionRes.json()
      .then(googleResp => {
        if (googleResp) {
          let responseArray = googleResp.responses[0].labelAnnotations
          responseArray.map((item, index) => {
            if (item.description != "" && item.description != undefined && item.description != null) {
              newArr.push(item.description)
            }
          })
        } 
      }).catch((error) => {console.log(error)})
  }
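
(For reference, config.googleCloud holds the Vision annotate endpoint and an API key; the sketch below shows the assumed shape, with placeholder values.)

  // Assumed shape of the config used above (the key is a placeholder)
  const config = {
    googleCloud: {
      api: 'https://vision.googleapis.com/v1/images:annotate?key=',
      apiKey: 'YOUR_GOOGLE_CLOUD_VISION_API_KEY',
    },
  };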

Note: If I upload an image to Firebase after getting the result from the Google Vision API, the second call to the Vision API does not work either.

I added my callGoogleVisionApi function above. (It works well without the Firebase image upload function.)

What could be the solution to this issue?


Solution

  • I found the reason, though I am still curious why it behaves this way. The RNFetchBlob fetch polyfill and the Google Vision call seem to conflict with each other: the upload function replaces the global window.fetch and window.Blob with RNFetchBlob's polyfills, which most likely breaks the later fetch request to the Vision API. I changed my Firebase image upload function so that it no longer installs the polyfill, and it worked well. (A sketch of an alternative workaround is shown after the code below.)

    Following is my modified Firebase image upload function.

    const imageUploadToFirebase = async () => {
      var path = Platform.OS === 'ios' ? scannedURI.replace('file://', '') : scannedURI;
      // Build the blob with the built-in fetch instead of the RNFetchBlob polyfill
      const response = await fetch(path);
      const blob = await response.blob();
      var newItemKey = Firebase.database()
        .ref()
        .child('usersummary')
        .push().key;
      var _name = newItemKey + 'img.jpg';
      Firebase.storage()
        .ref(_name)
        .put(blob)
        .then(() => {
          return Firebase.storage()
            .ref(_name)
            .getDownloadURL();
        })
        .then(async uploadedFile => {
          let image = selectImage(sendItem.name?.toLowerCase());
          sendItem.image = image;
          sendItem.scannedURI = uploadedFile;
          AsyncStorage.getItem('@scanedItemList')
            .then(res => {
              if (res != null && res != undefined && res != '') {
                let result = `${res}#${JSON.stringify(sendItem)}`;
                AsyncStorage.setItem('@scanedItemList', result);
              } else {
                AsyncStorage.setItem(
                  '@scanedItemList',
                  JSON.stringify(sendItem),
                );
              }
            })
            .catch(err => console.log(err));
        })
        .catch(error => {
          console.log({error});
        });
    };
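
    For completeness: if you want to keep using the RNFetchBlob polyfill for uploads, a minimal sketch of a workaround (an assumption, not something I shipped) is to save the built-in fetch and Blob before installing the polyfill and restore them before calling the Vision API. The installPolyfill/restoreGlobals helpers below are hypothetical names.

    // Save the built-in globals once, at module scope
    const nativeFetch = window.fetch;
    const nativeBlob = window.Blob;

    // Install RNFetchBlob's polyfills only for the duration of the upload
    const installPolyfill = () => {
      window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
      window.Blob = RNFetchBlob.polyfill.Blob;
      window.fetch = new RNFetchBlob.polyfill.Fetch({
        auto: true,
        binaryContentTypes: ['image/'],
      }).build();
    };

    // Hypothetical helper: put the originals back before calling the Vision API
    const restoreGlobals = () => {
      window.fetch = nativeFetch;
      window.Blob = nativeBlob;
    };

    // Usage: call installPolyfill() before the RNFetchBlob upload,
    // then restoreGlobals() before callGoogleVisionApi(base64).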