Google Vision API is not working after uploading an image to Firebase

I built an image-detection mobile app (for plastic bottles, aluminum cans, milk cartons, etc.) using the Google Vision API and React Native.

It worked well before and returned successful responses.

But after I added a Firebase image-upload feature to store the picture, the Google Vision API stopped working.

My guess is that the Firebase image upload and the Google Vision API somehow conflict and are incompatible with each other.

Or maybe there is a bug in my image-upload function, but I am still not sure what the problem is. Here is my code.

  const takePicture = async () => {
    if (this.camera) {
      const options = { quality: 0.5, base64: true };
      const data = await this.camera.takePictureAsync(options);
      setScannedURI(data.uri)
      imageUploadToFirebase(data)
      // callGoogleVisionApi(data.base64)  //============> If I comment out the image-upload call above and call the Vision API here instead, it works well.
      setIsLoading(true)
    }
  };

  const imageUploadToFirebase = (imageData) => {
    const Blob = RNFetchBlob.polyfill.Blob;    //firebase image upload
    const fs = RNFetchBlob.fs;
    window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
    window.Blob = Blob;
    const Fetch = RNFetchBlob.polyfill.Fetch
    window.fetch = new Fetch({
      auto: true,
      binaryContentTypes: [
        'image/',
        'video/',
        'audio/',
        'foo/',
      ]
    }).build()
    let uploadBlob = null;
    var path = Platform.OS === "ios" ? imageData.uri.replace("file://", "") : imageData.uri
    var newItemKey = Firebase.database().ref().child('usersummary').push().key;
    var _name = newItemKey + 'img.jpg';
    setIsLoading(true)
    fs.readFile(path, "base64")
      .then(data => {
        let mime = "image/jpg";
        return Blob.build(data, { type: `${mime};BASE64` });
      })
      .then(blob => {
        uploadBlob = blob;
        Firebase.storage()
          .ref("scannedItems/" + _name)
          .put(blob)
          .then(() => {
            uploadBlob.close();
            return Firebase.storage()
              .ref("scannedItems/" + _name)
              .getDownloadURL();
          })
          .then(async uploadedFile => {
            setFirebaseImageURL(uploadedFile)
            // callGoogleVisionApi(imageData.base64)  //============> If I call the Vision API here instead, it does not work.
          })
          .catch(error => {
            console.log({ error });
          });
      });
  }

This is the function where I call the Google Vision API.

  const callGoogleVIsionApi = async (base64) => {
    let googleVisionRes = await fetch(config.googleCloud.api + config.googleCloud.apiKey, {
      method: 'POST',
      body: JSON.stringify({
        "requests": [{
          "image": { "content": base64 },
          features: [
            { type: "LABEL_DETECTION", maxResults: 30 },
            { type: "WEB_DETECTION", maxResults: 30 }
          ],
        }]
      })
    })
      .catch(err => { console.log('Network error=>: ', err) })
    await googleVisionRes.json()
      .then(googleResp => {
        if (googleResp) {
          let responseArray = googleResp.responses[0].labelAnnotations
          responseArray.map((item, index) => {
            if (item.description != "" && item.description != undefined && item.description != null) {
              newArr.push(item.description)
            }
          })
        } 
      }).catch((error) => {console.log(error)})
  }

Note: If I upload an image to Firebase after getting the result from the Google Vision API, the second call to the Vision API does not work either.

I have added my callGoogleVIsionApi function above. (It works fine when the Firebase image-upload feature is not there.)

What is the solution to this problem?

I am not sure if you are using the @google-cloud/vision package (in the callGoogleVisionApi() function), but as far as I know that package is meant to be used on the server side and authenticated with a service account. As an alternative to this approach, you can use Cloud Storage Triggers for Cloud Functions, which fire a function whenever a new file is uploaded, and then call the Cloud Vision API from there.
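For illustration, here is a minimal sketch of that approach, assuming a Node.js Cloud Function deployed with the firebase-functions and @google-cloud/vision packages (the function name annotateUpload and the logging are placeholders of mine, not from the question):

const functions = require('firebase-functions');
const vision = require('@google-cloud/vision');

const client = new vision.ImageAnnotatorClient();

// Fires whenever a new file finishes uploading to the default Storage bucket.
exports.annotateUpload = functions.storage.object().onFinalize(async (object) => {
  // The Vision client can read the image directly from Cloud Storage.
  const gcsUri = `gs://${object.bucket}/${object.name}`;
  const [result] = await client.labelDetection(gcsUri);
  const labels = (result.labelAnnotations || []).map(label => label.description);
  console.log(`Labels for ${object.name}:`, labels);
});

Running the detection server-side like this also keeps the credentials (API key or service account) out of the mobile app.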

The Google Vision API can work with a base64-encoded image, a publicly accessible HTTP URI, or a blob in Google Cloud Storage.

To use an HTTP URI, you should change the JSON payload in your callGoogleVisionAPI function from:

{
        "requests": [{
          "image": { "content": base64 },
          features: [
            { type: "LABEL_DETECTION", maxResults: 30 },
            { type: "WEB_DETECTION", maxResults: 30 }
          ],
        }]
      }

to this:

{
        "requests": [{
          "image": { "source": {"imageUri": 'https://PUBLIC_URI_FOR_THE_IMAGE' }  },
          features: [
            { type: "LABEL_DETECTION", maxResults: 30 },
            { type: "WEB_DETECTION", maxResults: 30 }
          ],
        }]
      }

There is a better explanation here: Make a Vision API request
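Putting this together, a minimal sketch of such a request using the download URL returned by getDownloadURL() (the function name callGoogleVisionApiWithUri is hypothetical; config.googleCloud.api and config.googleCloud.apiKey are taken from the question's code):

const callGoogleVisionApiWithUri = async (downloadUrl) => {
  const response = await fetch(config.googleCloud.api + config.googleCloud.apiKey, {
    method: 'POST',
    body: JSON.stringify({
      requests: [{
        // The Firebase Storage download URL is a publicly reachable HTTP URI.
        image: { source: { imageUri: downloadUrl } },
        features: [
          { type: 'LABEL_DETECTION', maxResults: 30 },
          { type: 'WEB_DETECTION', maxResults: 30 },
        ],
      }],
    }),
  });
  const json = await response.json();
  return (json.responses?.[0]?.labelAnnotations || []).map(item => item.description);
};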

I found the cause, although I am still curious about the why: RNFetchBlob and Google Vision seem to conflict with each other (most likely because my original upload code replaces window.fetch and window.Blob with RNFetchBlob's polyfills, which then affects the plain fetch() call used by callGoogleVIsionApi). I modified the Firebase image-upload function and now it works well.

Here is my modified Firebase image-upload function:

const imageUploadToFirebase = async () => {   // must be async because it awaits fetch() below
      var path = Platform.OS === 'ios' ? scannedURI.replace('file://', '') : scannedURI;
      const response = await fetch(path)
      const blob = await response.blob();
      var newItemKey = Firebase.database()
        .ref()
        .child('usersummary')
        .push().key;
      var _name = newItemKey + 'img.jpg';
      Firebase.storage()
        .ref(_name)
        .put(blob)
        .then(() => {
          return Firebase.storage()
            .ref(_name)
            .getDownloadURL();
        })
        .then(async uploadedFile => {
          let image = selectImage(sendItem.name?.toLowerCase());
          sendItem.image = image;
          sendItem.scannedURI = uploadedFile;
          AsyncStorage.getItem('@scanedItemList')
            .then(res => {
              if (res != null && res != undefined && res != '') {
                let result = `${res}#${JSON.stringify(sendItem)}`;
                AsyncStorage.setItem('@scanedItemList', result);
              } else {
                AsyncStorage.setItem(
                  '@scanedItemList',
                  JSON.stringify(sendItem),
                );
              }
            })
            .catch(err => console.log(err));
        })
        .catch(error => {
          console.log({error});
        });
}
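With the fetch/Blob polyfills gone, both calls can be made from takePicture again. A rough sketch of that flow (hypothetical wiring: here the URI is passed to the upload function as a parameter instead of reading the scannedURI state, which has not been updated yet at this point in the same call):

const takePicture = async () => {
  if (this.camera) {
    const options = { quality: 0.5, base64: true };
    const data = await this.camera.takePictureAsync(options);
    setScannedURI(data.uri);
    setIsLoading(true);
    // Works again because window.fetch is no longer replaced by the RNFetchBlob polyfill.
    await callGoogleVIsionApi(data.base64);
    // Assumes imageUploadToFirebase is adapted to accept the image URI directly.
    await imageUploadToFirebase(data.uri);
    setIsLoading(false);
  }
};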