Android: Null reference on getMaxAmplitude();

A new bug has appeared in my Android app: a null reference error when calling...

"currentAmplitude = mRecorder.getMaxAmplitude();" 

in the following code:

public void useHandler() {
        //setCurrentAmplitude();
        handler = new Handler();
        handler.postDelayed(runnable, 100);
    }

private Runnable runnable = new Runnable() {
        @Override
        public void run() {
            currentAmplitude = mRecorder.getMaxAmplitude();
            Toast.makeText(getApplicationContext(), "Testing handler",
                    Toast.LENGTH_LONG).show();
            TextView txtPowerLevel = (TextView) findViewById(R.id.txtPowerLevel);
            txtPowerLevel.setText(Integer.toString(currentAmplitude));
            handler.postDelayed(runnable, 100);
        }
    };

I declared currentAmplitude at the top of the class as:

private int currentAmplitude;

I'm still new to Java and Android, and I'm having some difficulty understanding scope and where things should be declared. So I can't help thinking I've made a really obvious mistake, but I've already spent hours trying to solve this before asking for help.

Mostly I've tried commenting out lines here and there to see what happens. If I comment out the offending line, the program runs fine, just without the functionality I want.

What I'm trying to accomplish right now is to display the current amplitude in a text view. This is mainly to test that the audio is working, but I'll use that text view as a kind of counter later. My camera preview displays fine on my texture view, and there are no errors when preparing, starting, or stopping my audio capture. I just can't seem to get it to display the amplitude.

Here is the full class (below). If you can spot what I'm doing wrong, please let me know. I'd also really appreciate any extra explanation of why it doesn't work, to help me understand the code better so I can solve it myself next time.

    public class DBZPowerUp extends AppCompatActivity implements TextureView.SurfaceTextureListener {

    private MediaRecorder mRecorder = null;
    private Handler handler = new Handler();
    private Camera mCamera;
    private int currentAmplitude;


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.dbzpowerup);

        // Define the textureview in XML and apply the surface listener for camera preview
        TextureView mTextureView = (TextureView) findViewById(R.id.textureView1);
        mTextureView.setSurfaceTextureListener(this);

        // Run methods to start audio capture
        startAudioCapture();
        // getAmplitude();

        // Input the current amplitude level into the power level textview
        //setCurrentAmplitude();
        useHandler();

    }

    public void useHandler() {
        //setCurrentAmplitude();
        handler = new Handler();
        handler.postDelayed(runnable, 100);
    }

    private Runnable runnable = new Runnable() {
        @Override
        public void run() {
            currentAmplitude = mRecorder.getMaxAmplitude();
            Toast.makeText(getApplicationContext(), "Testing handler",
                    Toast.LENGTH_LONG).show();
            TextView txtPowerLevel = (TextView) findViewById(R.id.txtPowerLevel);
            txtPowerLevel.setText(Integer.toString(currentAmplitude));
            handler.postDelayed(runnable, 100);
        }
    };

    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        mCamera = Camera.open();

        try {
            mCamera.setPreviewTexture(surface);
            mCamera.startPreview();
        } catch(IOException e) {
            Log.e("DBZ_", "Camera broke");
        }
        mCamera.setDisplayOrientation(90);
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
        // Ignored, Camera does all the work for us
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        mCamera.stopPreview();
        mCamera.release();
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {
        // Invoked every time there's a new Camera preview frame
    }

    public void startAudioCapture() {
        if (mRecorder == null) {
            MediaRecorder mRecorder = new MediaRecorder();
            mRecorder.setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION);
            mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            mRecorder.setAudioEncoder(MediaRecorder.OutputFormat.AMR_NB);
            mRecorder.setOutputFile("/dev/null");
            try {
                mRecorder.prepare();
            } catch (IOException e) {
                e.printStackTrace();
            }
            mRecorder.start();
        }
    }

    public void stopAudioCapture() {
        if (mRecorder != null) {
            mRecorder.stop();
            mRecorder.release();
            mRecorder = null;
        }
    }

/*    private void setCurrentAmplitude() {
        if (mRecorder.getMaxAmplitude() > 0) {
            currentAmplitude = mRecorder.getMaxAmplitude();
        } else {
            currentAmplitude = 1;
        }
    }*/

    @Override
    protected void onDestroy() {
        super.onDestroy();
        handler.removeCallbacks(runnable);
        stopAudioCapture();
    }
}

Your mRecorder object is null.

In this code, you are creating a new MediaRecorder object:

    public void startAudioCapture() {
        if (mRecorder == null) {
            MediaRecorder mRecorder = new MediaRecorder();
            ...
        }
    }

Because you re-declare the type here, this line creates a new local variable named mRecorder that shadows the field, so the field itself stays null. Instead, assign the newly created MediaRecorder object to the globally declared mRecorder variable.

Like this:

    public void startAudioCapture() {
        if (mRecorder == null) {
            mRecorder = new MediaRecorder();
            ...
        }
    }
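
The effect of the bug is plain Java variable shadowing, independent of Android. A minimal, self-contained sketch (the names here are illustrative, not from the posted code):

```java
public class ShadowDemo {
    static String recorder = null;   // plays the role of the mRecorder field

    static void startShadowed() {
        String recorder = "created"; // BUG: declares a new local that shadows the field
    }

    static void startCorrect() {
        recorder = "created";        // assigns to the field itself
    }

    public static void main(String[] args) {
        startShadowed();
        System.out.println(recorder); // the field is still null
        startCorrect();
        System.out.println(recorder); // now "created"
    }
}
```

Separately, note that the posted code passes MediaRecorder.OutputFormat.AMR_NB to setAudioEncoder; the matching constant for that call is MediaRecorder.AudioEncoder.AMR_NB.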

You can follow the official Samsung technical doc for this:

The Android AudioRecord class allows you to read captured audio samples into a buffer. In this example, we store each sample using 16-bit encoding, in a short array buffer of two bytes per sample. We calculate the amplitude using the root mean square (RMS), where the calculated value is equal to the square root of the mean of the squares of the sample values. Then, to display the result, we create a progress bar with a green-to-red gradient showing the volume. When the recording is complete, the application saves the raw data in PCM Wave format, in a .wav file.
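
The RMS calculation that doc describes can be sketched in plain Java (the class and method names are mine, not from the Samsung doc):

```java
public class RmsDemo {
    // Root mean square of 16-bit PCM samples:
    // the square root of the mean of the squared sample values.
    static double rms(short[] samples) {
        double sumOfSquares = 0;
        for (short s : samples) {
            sumOfSquares += (double) s * s;
        }
        return Math.sqrt(sumOfSquares / samples.length);
    }

    public static void main(String[] args) {
        short[] buffer = {3, -4, 3, -4}; // stand-in for a captured audio buffer
        System.out.println(rms(buffer)); // sqrt((9 + 16 + 9 + 16) / 4)
    }
}
```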