Tags: flutter, unity-game-engine, dart, camera, microphone

I'm getting a "NoSuchMethodError" on the camera preview — is this because I'm not running the app on a physical device?


When I run a debug session through Visual Studio Code on my virtual Google Pixel, I get a "NoSuchMethodError: invalid member on null: value". After opening the debug console in VS Code, I can see the error refers to the camera preview. Is this caused by running on an emulator, or how else would I fix it? Here is the code for the file:

import 'dart:async';
import 'package:demo_2/main.dart';
import 'package:flutter/material.dart';
import 'package:flutter/widgets.dart';
import 'package:camera/camera.dart';
import 'package:speech_recognition/speech_recognition.dart';

// Virtual Therapist hosted through unity?
// Hard animate VT for demo and then post through unity --> then Unity to Flutter

// Camera needs data output

// Microphone needs data output

class VirtualTherapist extends StatefulWidget {
  @override
  _VirtualTherapistState createState() => _VirtualTherapistState();
}

class _VirtualTherapistState extends State<VirtualTherapist> {
  SpeechRecognition _speechRecognition;
  bool _isListening = false;
  // ignore: unused_field
  bool _isAvailable = false;
  CameraController _controller;
  Future<void> _initCamFuture;

  String resultText = "";

  @override
  void initState() {
    super.initState();
    _initApp();
  }

  _initApp() async {
    final cameras = await availableCameras();
    // select another camera here
    final frontCam = cameras[1];
    _controller = CameraController(
      frontCam,
      ResolutionPreset.medium,
    );
    _initCamFuture = _controller.initialize();
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  void initSpeechRecognizer() {
    _speechRecognition = SpeechRecognition();
    _speechRecognition.setAvailabilityHandler(
      (bool result) => setState(() => _isAvailable = result),
    );
    _speechRecognition.setRecognitionStartedHandler(
      () => setState(() => _isListening = true),
    );

    _speechRecognition.setRecognitionResultHandler(
      (String speech) => setState(() => resultText = speech),
    );

    _speechRecognition.setRecognitionCompleteHandler(
      () => setState(() => _isListening = false),
    );

    _speechRecognition.activate().then(
          (result) => setState(() => _isAvailable = result),
        );
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
        appBar: AppBar(
          title: Text("Therapist"),
        ),
        body: Column(
            mainAxisAlignment: MainAxisAlignment.center,
            crossAxisAlignment: CrossAxisAlignment.center,
            children: <Widget>[
              FutureBuilder<void>(
                future: _initCamFuture,
                builder: (context, snapshot) {
                  return CameraPreview(_controller);
                },
                //could change to future builder
                // if (_isAvailable && !_isListening)
                // _speechRecognition
                // .listen(locale: "en_US")
                // .then((result) => print('$result'))
              ),
              Row(
                  mainAxisAlignment: MainAxisAlignment.center,
                  children: <Widget>[
                    FloatingActionButton.extended(
                      backgroundColor: Colors.blue,
                      hoverColor: Colors.green,
                      label: const Text(
                        "Here's what we think could help you.",
                        style: TextStyle(
                            color: Colors.white,
                            fontFamily: 'Netflix',
                            fontSize: 15),
                      ),
                      onPressed: () async {
                        if (_isListening)
                          _speechRecognition.stop().then(
                                (result) =>
                                    setState(() => _isListening = result),
                              );
                        Navigator.push(
                            context,
                            MaterialPageRoute(
                                builder: (context) => ThirdScreen()));
                      },
                    )
                  ])
            ]));
  }
// Unity
}


I didn't think it was necessary to post my main.dart file. This code is for the second page of the app, which loads an interactive Unity object that uses the user's camera and microphone. Should I just create a separate container for these issues?


Solution

  • The issue is related to your _controller object: it is still null when CameraPreview(_controller) is called, because initState does not wait for the async _initApp() to finish. To solve this, remove the _initApp() call from initState and instead pass it as the future of your FutureBuilder:

    future: _initApp(),
    builder: (context, snapshot) {
      if (snapshot.connectionState == ConnectionState.done) {
        return CameraPreview(_controller);
      }
      // else show a loading indicator
    },
    

    Then return _controller.initialize(); from your _initApp method, so the FutureBuilder has a future to wait on.
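    Putting the fix together, the state class might look like the sketch below. This is a minimal, untested outline that keeps the question's widget names and assumes the same `camera` package API (availableCameras, CameraController, CameraPreview); the speech-recognition code is omitted for brevity, and the single-camera guard is an extra suggestion, not part of the original answer.

    ```dart
    import 'dart:async';
    import 'package:flutter/material.dart';
    import 'package:camera/camera.dart';

    class _VirtualTherapistState extends State<VirtualTherapist> {
      CameraController _controller;

      // Returns the initialization future so the FutureBuilder can await it.
      Future<void> _initApp() async {
        final cameras = await availableCameras();
        // Guard against emulators that expose only one camera.
        final frontCam = cameras.length > 1 ? cameras[1] : cameras.first;
        _controller = CameraController(frontCam, ResolutionPreset.medium);
        return _controller.initialize();
      }

      @override
      void dispose() {
        _controller?.dispose();
        super.dispose();
      }

      @override
      Widget build(BuildContext context) {
        return FutureBuilder<void>(
          future: _initApp(),
          builder: (context, snapshot) {
            if (snapshot.connectionState == ConnectionState.done) {
              // _controller is non-null and initialized here.
              return CameraPreview(_controller);
            }
            // Show a spinner until the controller is ready.
            return const Center(child: CircularProgressIndicator());
          },
        );
      }
    }
    ```

    One caveat: calling _initApp() directly inside build re-runs the initialization on every rebuild. Caching the future once (e.g. assigning `_initCamFuture = _initApp();` in initState and passing that field to the FutureBuilder) avoids repeated camera setup while still fixing the null-controller error.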