I'm getting a "NoSuchMethodError" on camera preview, is this because I'm not loading the dart file into a physical device?

When I run a debug session through Visual Studio Code on a virtual Google Pixel, I get "NoSuchMethodError: invalid member on null: value" after the app opens. The debug console in VS Code says it's about the camera preview. Is this caused by running on a virtual device, or how else can I fix this? Here is the code for the file:

import 'dart:async';
import 'package:demo_2/main.dart';
import 'package:flutter/material.dart';
import 'package:flutter/widgets.dart';
import 'package:camera/camera.dart';
import 'package:speech_recognition/speech_recognition.dart';

// Virtual Therapist hosted through unity?
// Hard animate VT for demo and then post through unity --> then Unity to Flutter

// Camera needs data output

// Microphone needs data output

class VirtualTherapist extends StatefulWidget {
  @override
  _VirtualTherapistState createState() => _VirtualTherapistState();
}

class _VirtualTherapistState extends State<VirtualTherapist> {
  SpeechRecognition _speechRecognition;
  bool _isListening = false;
  // ignore: unused_field
  bool _isAvailable = false;
  CameraController _controller;
  Future<void> _initCamFuture;

  String resultText = "";

  @override
  void initState() {
    super.initState();
    _initApp();
  }

  _initApp() async {
    final cameras = await availableCameras();
    // select another camera here
    final frontCam = cameras[1];
    _controller = CameraController(
      frontCam,
      ResolutionPreset.medium,
    );
    _initCamFuture = _controller.initialize();
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  void initSpeechRecognizer() {
    _speechRecognition = SpeechRecognition();
    _speechRecognition.setAvailabilityHandler(
      (bool result) => setState(() => _isAvailable = result),
    );
    _speechRecognition.setRecognitionStartedHandler(
      () => setState(() => _isListening = true),
    );

    _speechRecognition.setRecognitionResultHandler(
      (String speech) => setState(() => resultText = speech),
    );

    // This was setRecognitionStartedHandler a second time, which overwrote
    // the handler above; the completion handler is what should reset the flag.
    _speechRecognition.setRecognitionCompleteHandler(
      () => setState(() => _isListening = false),
    );

    _speechRecognition.activate().then(
          (result) => setState(() => _isAvailable = result),
        );
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
        appBar: AppBar(
          title: Text("Therapist"),
        ),
        body: Column(
            mainAxisAlignment: MainAxisAlignment.center,
            crossAxisAlignment: CrossAxisAlignment.center,
            children: <Widget>[
              FutureBuilder<void>(
                future: _initCamFuture,
                builder: (context, snapshot) {
                  return CameraPreview(_controller);
                },
                //could change to future builder
                // if (_isAvailable && !_isListening)
                // _speechRecognition
                // .listen(locale: "en_US")
                // .then((result) => print('$result'))
              ),
              Row(
                  mainAxisAlignment: MainAxisAlignment.center,
                  children: <Widget>[
                    FloatingActionButton.extended(
                      backgroundColor: Colors.blue,
                      hoverColor: Colors.green,
                      label: const Text(
                        "Here's what we think could help you.",
                        style: TextStyle(
                            color: Colors.white,
                            fontFamily: 'Netflix',
                            fontSize: 15),
                      ),
                      onPressed: () async {
                        if (_isListening)
                          _speechRecognition.stop().then(
                                (result) =>
                                    setState(() => _isListening = result),
                              );
                        Navigator.push(
                            context,
                            MaterialPageRoute(
                                builder: (context) => ThirdScreen()));
                      },
                    )
                  ])
            ]));
  }
// Unity
}


I don't think it's necessary to post my main.dart file. This code is for the second page of the app. It loads an interactive Unity object that interacts with the user through the camera and microphone. Should I create a separate container for these?

The issue is with your _controller object: it is null when CameraPreview(_controller) is called. To fix this, remove _initApp() from your initState function and supply it as the future of your FutureBuilder, like

future: _initApp(),
builder: (context, snapshot) {
  if (snapshot.connectionState == ConnectionState.done) {
    return CameraPreview(_controller);
  }
  // else show a loading indicator
  return Center(child: CircularProgressIndicator());
},

and return _controller.initialize(); from your _initApp method.
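
Putting those pieces together, a minimal sketch of the corrected state class could look like this (camera setup only; the speech-recognition wiring is omitted for brevity, and the camera-index guard is an extra precaution I've added for emulators that expose a single camera):

```dart
import 'dart:async';

import 'package:camera/camera.dart';
import 'package:flutter/material.dart';

class VirtualTherapist extends StatefulWidget {
  @override
  _VirtualTherapistState createState() => _VirtualTherapistState();
}

class _VirtualTherapistState extends State<VirtualTherapist> {
  CameraController _controller;

  // Creates the controller and returns its initialization future,
  // so the FutureBuilder below can wait on the whole setup.
  Future<void> _initApp() async {
    final cameras = await availableCameras();
    // Index 1 is usually the front camera; fall back to the first
    // camera on devices (e.g. some emulators) that only expose one.
    final frontCam = cameras.length > 1 ? cameras[1] : cameras.first;
    _controller = CameraController(frontCam, ResolutionPreset.medium);
    return _controller.initialize();
  }

  @override
  void dispose() {
    _controller?.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return FutureBuilder<void>(
      future: _initApp(),
      builder: (context, snapshot) {
        if (snapshot.connectionState == ConnectionState.done) {
          // _controller is guaranteed non-null and initialized here.
          return CameraPreview(_controller);
        }
        return Center(child: CircularProgressIndicator());
      },
    );
  }
}
```

One caveat: calling _initApp() directly in build re-creates the controller on every rebuild. In practice you may want to cache the future once, e.g. assign _initCamFuture = _initApp(); in initState and pass _initCamFuture to the FutureBuilder instead.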