How to record audio on Android Q using PlaybackCapture APIs?

I am trying to use the Playback Capture API to record audio on Android 10 (Q). Since the Playback Capture API only allows capturing sound tagged with USAGE_GAME, USAGE_MEDIA or USAGE_UNKNOWN, I downloaded the Uamp sample, which sets USAGE_MEDIA while playing songs. I also added android:allowAudioPlaybackCapture="true" to its AndroidManifest.xml. Then I launched Uamp, started playing a song and left it running in the background.
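
For context, the playing app's output has to be tagged with one of those capturable usages. A minimal sketch of my own (using a plain MediaPlayer rather than Uamp's actual player setup) of what that tagging looks like:

import android.media.AudioAttributes
import android.media.MediaPlayer

// Hypothetical player-side setup: output tagged with USAGE_MEDIA is eligible
// for playback capture (together with allowAudioPlaybackCapture="true").
fun buildCapturablePlayer(path: String): MediaPlayer =
    MediaPlayer().apply {
        setAudioAttributes(
            AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)              // capturable usage
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build()
        )
        setDataSource(path)
        prepare()
    }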

I developed a CaptureAudio project with targetSdk 29 and installed it on my OnePlus 7 Pro, which runs Android 10. The UI has two buttons to start and stop the capture. When the app starts capturing, the read function fills the buffer with all zeros.
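
For illustration, a standalone sketch of that check (a hypothetical readAndInspect helper, same idea as the loop in startRecordingIntoFile below): AudioRecord.read() returns the number of shorts read or a negative error code, so an all-zero buffer together with a positive return value means the capture is delivering silence rather than failing.

import android.media.AudioRecord
import android.util.Log

// read() returns the number of shorts actually read, or a negative error code
// such as AudioRecord.ERROR_INVALID_OPERATION.
fun readAndInspect(record: AudioRecord, buffer: ShortArray): Int {
    val read = record.read(buffer, 0, buffer.size)
    when {
        read < 0 -> Log.w("CaptureAudio", "read() failed with code $read")
        buffer.take(read).all { it == 0.toShort() } ->
            Log.d("CaptureAudio", "Read $read samples, all zero")
        else -> Log.d("CaptureAudio", "Read $read samples with non-zero audio")
    }
    return read
}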

To use Playback Capture in the project, I set it up as follows:

1. Manifest:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    package="com.example.captureaudio">

    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.CAPTURE_AUDIO_OUTPUT" />
    <uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

    <application
        android:allowBackup="false"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/AppTheme"
        tools:ignore="GoogleAppIndexingWarning">
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

        <service
            android:name=".services.MediaProjectionService"
            android:enabled="true"
            android:exported="false"
            android:foregroundServiceType="mediaProjection"
            tools:targetApi="q" />
    </application>

</manifest>

2. MainActivity:

class MainActivity : AppCompatActivity() {

    companion object {
        private const val REQUEST_CODE_CAPTURE_INTENT = 1
        private const val TAG = "CaptureAudio"
        private const val RECORDER_SAMPLE_RATE = 48000
        private const val RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO
        //or AudioFormat.CHANNEL_IN_BACK
        private const val RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT
        //  AudioFormat.ENCODING_PCM_16BIT
    }

    private var audioRecord: AudioRecord? = null
    private val mediaProjectionManager by lazy { (this@MainActivity).getSystemService(Context.MEDIA_PROJECTION_SERVICE) as MediaProjectionManager }
    private val rxPermissions by lazy { RxPermissions(this) }
    private val minBufferSize by lazy {
        AudioRecord.getMinBufferSize(
            RECORDER_SAMPLE_RATE,
            RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING
        )
    }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        val intent = Intent(this, MediaProjectionService::class.java)
        startForegroundService(intent)
        getPermissions()
    }

    private fun getPermissions() {
        rxPermissions
            .request(
                Manifest.permission.RECORD_AUDIO,
                Manifest.permission.FOREGROUND_SERVICE,
                Manifest.permission.WRITE_EXTERNAL_STORAGE
            )
            .subscribe {
                log("Permission result: $it")
                if (it) { // Always true pre-M
                    val captureIntent = mediaProjectionManager.createScreenCaptureIntent()
                    startActivityForResult(captureIntent, REQUEST_CODE_CAPTURE_INTENT)
                } else {
                    getPermissions()
                }
            }
    }

    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == REQUEST_CODE_CAPTURE_INTENT && data != null) {
            val mediaProjection = mediaProjectionManager.getMediaProjection(resultCode, data)
            val playbackConfig = AudioPlaybackCaptureConfiguration.Builder(mediaProjection)
                .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
                .addMatchingUsage(AudioAttributes.USAGE_UNKNOWN)
                .addMatchingUsage(AudioAttributes.USAGE_GAME)
                .build()
            audioRecord = AudioRecord.Builder()
                .setAudioPlaybackCaptureConfig(playbackConfig)
                .setBufferSizeInBytes(minBufferSize * 2)
                .setAudioFormat(
                    AudioFormat.Builder()
                        .setEncoding(RECORDER_AUDIO_ENCODING)
                        .setSampleRate(RECORDER_SAMPLE_RATE)
                        .setChannelMask(RECORDER_CHANNELS)
                        .build()
                )
                .build()
        }
    }

    fun startCapture(view: View) {
        audioRecord?.apply {
            startRecording()
            log("Is stopped: $state $recordingState")
            startRecordingIntoFile()
        }
        stopRecBtn.visibility = View.VISIBLE
        startRecBtn.visibility = View.INVISIBLE
    }

    private fun AudioRecord.startRecordingIntoFile() {
        val file = File(
            getExternalFilesDir(Environment.DIRECTORY_MUSIC),
            "temp.wav"
            //System.currentTimeMillis().toString() + ".wav"
        )
        if (!file.exists())
            file.createNewFile()

        GlobalScope.launch {
            val out = file.outputStream()
            audioRecord.apply {
                while (recordingState == AudioRecord.RECORDSTATE_RECORDING) {

                    val buffer = ShortArray(minBufferSize)//ByteBuffer.allocate(MIN_BUFFER_SIZE)
                    val result = read(buffer, 0, minBufferSize)

                    // Checking if I am actually getting something in a buffer
                    val b: Short = 0
                    var nonZeroValueCount = 0
                    for (i in 0 until minBufferSize) {
                        if (buffer[i] != b) {
                            nonZeroValueCount += 1
                            log("Value: ${buffer[i]}")
                        }
                    }
                    if (nonZeroValueCount != 0) {

                        // Record the non-zero values in the file..
                        log("Result $nonZeroValueCount")
                        when (result) {
                            AudioRecord.ERROR -> showToast("ERROR")
                            AudioRecord.ERROR_INVALID_OPERATION -> showToast("ERROR_INVALID_OPERATION")
                            AudioRecord.ERROR_DEAD_OBJECT -> showToast("ERROR_DEAD_OBJECT")
                            AudioRecord.ERROR_BAD_VALUE -> showToast("ERROR_BAD_VALUE")
                            else -> {
                                log("Appending $buffer into ${file.absolutePath}")
                                out.write(shortToByte(buffer))
                            }
                        }
                    }
                }
            }
            out.close()
        }
    }

    private fun shortToByte(shortArray: ShortArray): ByteArray {
        val byteOut = ByteArray(shortArray.size * 2)
        ByteBuffer.wrap(byteOut).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().put(shortArray)
        return byteOut
    }

    private fun showToast(msg: String) {
        runOnUiThread {
            log("Toast: $msg")
            Toast.makeText(this@MainActivity, msg, Toast.LENGTH_LONG).show()
        }
    }

    fun stopCapture(view: View) {
        audioRecord?.apply {
            stop()
            log("Is stopped: $state $recordingState")
        }
        stopRecBtn.visibility = View.INVISIBLE
        startRecBtn.visibility = View.VISIBLE
    }

    private fun log(msg: String) {
        Log.d(TAG, msg)
    }

    override fun onDestroy() {
        super.onDestroy()
        audioRecord?.stop()
        audioRecord?.release()
        audioRecord = null
    }
}

3. MediaProjectionService:

class MediaProjectionService : Service() {

    companion object {
        private const val CHANNEL_ID = "ForegroundServiceChannel"
    }

    override fun onBind(intent: Intent?): IBinder? {
        return null
    }

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {

        createNotificationChannel()
        val notificationIntent = Intent(this, MainActivity::class.java)
        val pendingIntent = PendingIntent.getActivity(
            this,
            0, notificationIntent, 0
        )

        val notification = NotificationCompat.Builder(this, CHANNEL_ID)
            .setContentTitle("Foreground Service")
            .setContentText("Call Recording Service")
//            .setSmallIcon(R.drawable.ic_stat_name)
            .setContentIntent(pendingIntent)
            .build()

        startForeground(1, notification)
        return START_NOT_STICKY
    }

    private fun createNotificationChannel() {
        val serviceChannel = NotificationChannel(
            CHANNEL_ID,
            "Foreground Service Channel",
            NotificationManager.IMPORTANCE_DEFAULT
        )

        val manager = getSystemService(NotificationManager::class.java)
        manager!!.createNotificationChannel(serviceChannel)
    }
}

The problems are:

1. The file /storage/emulated/0/Android/data/com.example.captureaudio/files/Music/temp.wav gets created, but it contains only zeros. I also checked it with xxd /storage/emulated/0/Android/data/com.example.captureaudio/files/Music/temp.wav, which shows:

OnePlus7Pro:/sdcard # xxd /storage/emulated/0/Android/data/com.example.captureaudio/files/Music/temp.wav | head
00000000: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000010: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000020: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000030: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000040: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000050: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000060: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000070: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000080: 0000 0000 0000 0000 0000 0000 0000 0000  ................
00000090: 0000 0000 0000 0000 0000 0000 0000 0000  ................

2. Playing it from the device fails with the error "Couldn't play the track you requested".

Any help or suggestions on what I am missing?

I think something goes wrong when you write the audio data into your .wav file.
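
A raw PCM stream written straight into a file named temp.wav has no RIFF/WAVE header, so most players will refuse it. If you want to keep the .wav approach instead of recording to .pcm and encoding afterwards, you have to prepend a 44-byte header yourself. A rough sketch for PCM 16-bit mono at 48 kHz (the addWavHeader helper name is made up):

import java.io.File
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Writes wavFile = 44-byte WAV header + the raw PCM bytes from pcmFile.
// The parameters must match the recording format (here: PCM 16-bit, mono, 48 kHz).
fun addWavHeader(pcmFile: File, wavFile: File, sampleRate: Int = 48000, channels: Int = 1, bitsPerSample: Int = 16) {
    val pcm = pcmFile.readBytes()
    val byteRate = sampleRate * channels * bitsPerSample / 8
    val blockAlign = channels * bitsPerSample / 8
    val header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN).apply {
        put("RIFF".toByteArray()); putInt(36 + pcm.size)
        put("WAVE".toByteArray())
        put("fmt ".toByteArray()); putInt(16)                    // PCM fmt chunk size
        putShort(1.toShort()); putShort(channels.toShort())      // format 1 = PCM, channel count
        putInt(sampleRate); putInt(byteRate)
        putShort(blockAlign.toShort()); putShort(bitsPerSample.toShort())
        put("data".toByteArray()); putInt(pcm.size)
    }
    wavFile.outputStream().use { out ->
        out.write(header.array())
        out.write(pcm)
    }
}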

Here is my example app that records audio on Android 10 (Q) using the Playback Capture API. In this app I write the audio data into a .pcm file and then decode it into an .mp3 audio file that you can listen to with any player.

Warning!

The QRecorder app uses the NDK to integrate the lame library.

If you don't want to spend time importing the lame library into your project, you can use the PCM-Decoder library to decode the recorded .pcm file.