Android Camera2 displays black and distorted JPEG image on TextureView?
I'm building a test app for a friend on a Samsung S20.
The Samsung S20 has a rear-facing ToF (Time of Flight) camera.
I want to display the ToF preview and the regular camera preview side by side on TextureViews.
I'm able to acquire the ToF sensor, convert its raw output into a visual output with a color mask, and display the depth ranges visually (red for the farthest, then orange, and so on); see the screenshot:
The relevant code follows:
<?xml version="1.0" encoding="utf-8"?>
<androidx.coordinatorlayout.widget.CoordinatorLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <com.google.android.material.appbar.AppBarLayout
        android:id="@+id/appBarLayout"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:theme="@style/AppTheme.AppBarOverlay">

        <androidx.appcompat.widget.Toolbar
            android:id="@+id/toolbar"
            android:layout_width="match_parent"
            android:layout_height="?attr/actionBarSize"
            android:background="?attr/colorPrimary"
            app:popupTheme="@style/AppTheme.PopupOverlay" />

        <androidx.constraintlayout.widget.ConstraintLayout
            android:layout_width="match_parent"
            android:layout_height="619dp"
            android:background="#FFFFFFFF">

            <TextureView
                android:id="@+id/regularBackCamera"
                android:layout_width="320dp"
                android:layout_height="240dp"
                android:layout_marginEnd="44dp"
                app:layout_constraintBottom_toBottomOf="parent"
                app:layout_constraintEnd_toEndOf="parent"
                app:layout_constraintTop_toTopOf="parent"
                app:layout_constraintVertical_bias="0.899" />

            <TextView
                android:id="@+id/textView3"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:text="Raw ToF Data"
                android:textColor="@android:color/primary_text_light"
                app:layout_constraintEnd_toEndOf="@+id/rawData"
                app:layout_constraintStart_toStartOf="@+id/rawData"
                app:layout_constraintTop_toBottomOf="@+id/rawData" />

            <TextureView
                android:id="@+id/rawData"
                android:layout_width="320dp"
                android:layout_height="240dp"
                android:layout_marginStart="44dp"
                app:layout_constraintBottom_toTopOf="@+id/regularBackCamera"
                app:layout_constraintStart_toStartOf="parent"
                app:layout_constraintTop_toTopOf="parent"
                app:layout_constraintVertical_bias="0.485" />

            <TextView
                android:id="@+id/textView5"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:layout_marginStart="120dp"
                android:text="Back Camera"
                android:textColor="@android:color/primary_text_light"
                app:layout_constraintStart_toStartOf="@+id/regularBackCamera"
                app:layout_constraintTop_toBottomOf="@+id/regularBackCamera" />

        </androidx.constraintlayout.widget.ConstraintLayout>
    </com.google.android.material.appbar.AppBarLayout>
</androidx.coordinatorlayout.widget.CoordinatorLayout>
MainActivity class:
/* This is an example of getting and processing ToF data
*/
public class MainActivity extends AppCompatActivity implements DepthFrameVisualizer, RegularCameraFrameVisualizer {
    private static final String TAG = MainActivity.class.getSimpleName();
    public static final int CAM_PERMISSIONS_REQUEST = 0;

    private TextureView rawDataView;
    private TextureView regularImageView;
    private Matrix ToFBitmapTransform;
    private Matrix regularBackCameraBitmapTransform;
    private BackToFCamera backToFCamera;
    private RegularBackCamera regularBackCamera;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        rawDataView = findViewById(R.id.rawData);
        regularImageView = findViewById(R.id.regularBackCamera);
        checkCamPermissions();
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (backToFCamera != null) {
            backToFCamera.getCamera().close();
            backToFCamera = null;
        }
        if (regularBackCamera != null) {
            regularBackCamera.getCamera().close();
            regularBackCamera = null;
        }
    }

    @Override
    protected void onResume() {
        super.onResume();
        backToFCamera = new BackToFCamera(this, this);
        String tofCameraId = backToFCamera.openCam(null);
        regularBackCamera = new RegularBackCamera(this, this);
        //pass in tofCameraId to avoid opening it again, since both the regular cam & the ToF camera are back facing
        regularBackCamera.openCam(tofCameraId);
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
    }

    private void checkCamPermissions() {
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA}, CAM_PERMISSIONS_REQUEST);
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    }

    @Override
    public void onRawDataAvailable(Bitmap bitmap) {
        renderBitmapForToFToTextureView(bitmap, rawDataView);
    }

    @Override
    public void onRegularImageAvailable(Bitmap bitmap) {
        renderBitmapToTextureView(bitmap, regularImageView);
    }

    /* We don't want a direct camera preview since we want to get the frames of data directly
       from the camera and process them.
       This takes a converted bitmap and renders it onto the surface, with a basic rotation
       applied.
    */
    private void renderBitmapForToFToTextureView(Bitmap bitmap, TextureView textureView) {
        if (bitmap != null && textureView != null) {
            Canvas canvas = textureView.lockCanvas();
            if (canvas != null) { // lockCanvas() can return null if the surface is unavailable
                canvas.drawBitmap(bitmap, ToFBitmapTransform(textureView), null);
                textureView.unlockCanvasAndPost(canvas);
            }
        }
    }

    private void renderBitmapToTextureView(Bitmap bitmap, TextureView textureView) {
        if (bitmap != null && textureView != null) {
            Canvas canvas = textureView.lockCanvas();
            if (canvas != null) {
                canvas.drawBitmap(bitmap, regularBackCamBitmapTransform(textureView), null);
                textureView.unlockCanvasAndPost(canvas);
            }
        }
    }

    private Matrix ToFBitmapTransform(TextureView view) {
        if (view != null) {
            if (ToFBitmapTransform == null || view.getWidth() == 0 || view.getHeight() == 0) {
                int rotation = getWindowManager().getDefaultDisplay().getRotation();
                Matrix matrix = new Matrix();
                int centerX = view.getWidth() / 2;
                int centerY = view.getHeight() / 2;
                int bufferWidth = DepthFrameAvailableListener.SAMSUNG_S20_TOF_WIDTH;
                int bufferHeight = DepthFrameAvailableListener.SAMSUNG_S20_TOF_HEIGHT;
                RectF bufferRect = new RectF(0, 0, bufferWidth, bufferHeight);
                RectF viewRect = new RectF(0, 0, view.getWidth(), view.getHeight());
                matrix.setRectToRect(bufferRect, viewRect, Matrix.ScaleToFit.CENTER);
                Log.i(TAG, " rotation:" + rotation);
                if (Surface.ROTATION_90 == rotation) {
                    matrix.postRotate(270, centerX, centerY);
                } else if (Surface.ROTATION_270 == rotation) {
                    matrix.postRotate(90, centerX, centerY);
                } else if (Surface.ROTATION_180 == rotation) {
                    matrix.postRotate(180, centerX, centerY);
                } else {
                    //strange but works!
                    matrix.postRotate(90, centerX, centerY);
                }
                ToFBitmapTransform = matrix;
            }
        }
        return ToFBitmapTransform;
    }

    private Matrix regularBackCamBitmapTransform(TextureView view) {
        if (view != null) {
            if (regularBackCameraBitmapTransform == null || view.getWidth() == 0 || view.getHeight() == 0) {
                int rotation = getWindowManager().getDefaultDisplay().getRotation();
                Matrix matrix = new Matrix();
                RectF bufferRect = new RectF(0, 0, MAX_PREVIEW_WIDTH, MAX_PREVIEW_HEIGHT);
                RectF viewRect = new RectF(0, 0, view.getWidth(), view.getHeight());
                matrix.setRectToRect(bufferRect, viewRect, Matrix.ScaleToFit.CENTER);
                float centerX = viewRect.centerX();
                float centerY = viewRect.centerY();
                Log.i(TAG, " rotation:" + rotation);
                if (Surface.ROTATION_90 == rotation) {
                    matrix.postRotate(270, centerX, centerY);
                } else if (Surface.ROTATION_270 == rotation) {
                    matrix.postRotate(90, centerX, centerY);
                } else if (Surface.ROTATION_180 == rotation) {
                    matrix.postRotate(180, centerX, centerY);
                } else {
                    //strange but works!
                    matrix.postRotate(90, centerX, centerY);
                }
                regularBackCameraBitmapTransform = matrix;
            }
        }
        return regularBackCameraBitmapTransform;
    }
}
The listener that notifies when a frame is available for display; see the function publishOriginalBitmap():
import static com.example.opaltechaitestdepthmap.RegularBackCamera.MAX_PREVIEW_HEIGHT;
import static com.example.opaltechaitestdepthmap.RegularBackCamera.MAX_PREVIEW_WIDTH;

public class BackCameraFrameAvailableListener implements ImageReader.OnImageAvailableListener {
    private static final String TAG = BackCameraFrameAvailableListener.class.getSimpleName();

    private RegularCameraFrameVisualizer regularCameraFrameVisualizer;

    public BackCameraFrameAvailableListener(RegularCameraFrameVisualizer regularCameraFrameVisualizer) {
        this.regularCameraFrameVisualizer = regularCameraFrameVisualizer;
    }

    @Override
    public void onImageAvailable(ImageReader reader) {
        try {
            Image image = reader.acquireNextImage();
            if (image != null && image.getFormat() == ImageFormat.JPEG) {
                publishOriginalBitmap(image);
            }
        } catch (Exception e) {
            Log.e(TAG, "Failed to acquireNextImage: " + e.getMessage());
        }
    }

    private void publishOriginalBitmap(final Image image) {
        if (regularCameraFrameVisualizer != null) {
            new Thread() {
                public void run() {
                    Bitmap bitmap = returnBitmap(image);
                    if (bitmap != null) {
                        regularCameraFrameVisualizer.onRegularImageAvailable(bitmap);
                        bitmap.recycle();
                    }
                }
            }.start();
        }
    }

    private Bitmap returnBitmap(Image image) {
        Bitmap bitmap = null;
        // hardcoded to the ImageReader's configured size: width=1920, height=1080
        int width = 1920;
        int height = 1080;
        if (image != null) {
            Log.i(TAG, "returnBitmap,CONSTANT MAX width:" + MAX_PREVIEW_WIDTH + ",MAX height:" + MAX_PREVIEW_HEIGHT);
            Log.i(TAG, "BEFORE returnBitmap,image.width:" + width + ",height:" + height);
            Image.Plane[] planes = image.getPlanes();
            if (planes != null && planes.length > 0) {
                ByteBuffer buffer = planes[0].getBuffer();
                image.close();
                Log.i(TAG, "buffer size:" + buffer.capacity());
                float currentBufferSize = buffer.capacity();
                float jpegReportedArea = width * height;
                if (currentBufferSize >= jpegReportedArea) {
                    Log.i(TAG, "currentBufferSize >= jpegReportedArea");
                    float quotient = jpegReportedArea / currentBufferSize;
                    float f_width = width * quotient;
                    width = (int) Math.ceil(f_width);
                    float f_height = height * quotient;
                    height = (int) Math.ceil(f_height);
                } else {
                    Log.i(TAG, "currentBufferSize < jpegReportedArea");
                    float quotient = currentBufferSize / jpegReportedArea;
                    float f_width = (width * quotient);
                    width = (int) Math.ceil(f_width);
                    float f_height = (height * quotient);
                    height = (int) Math.ceil(f_height);
                }
                Log.i(TAG, "AFTER width:" + width + ",height:" + height);
                //***here bitmap is black
                bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.RGB_565);
                buffer.rewind();
                if (bitmap != null) {
                    bitmap.copyPixelsFromBuffer(buffer);
                }
            }
        }
        return bitmap;
    }
}
The interface the listener uses to signal that an image is ready:
package com.example.opaltechaitestdepthmap;

import android.graphics.Bitmap;

public interface RegularCameraFrameVisualizer {
    void onRegularImageAvailable(Bitmap bitmap);
}
Handling the camera state:
public class RegularBackCamera extends CameraDevice.StateCallback {
    private static final String TAG = RegularBackCamera.class.getSimpleName();
    private static int FPS_MIN = 15;
    private static int FPS_MAX = 30;
    public static final int MAX_PREVIEW_WIDTH = 1920;
    public static final int MAX_PREVIEW_HEIGHT = 1080;

    private Context context;
    private CameraManager cameraManager;
    private ImageReader RawSensorPreviewReader;
    private CaptureRequest.Builder previewBuilder;
    private BackCameraFrameAvailableListener imageAvailableListener;
    private String cameraId;
    private CameraDevice camera;

    public RegularBackCamera(Context context, RegularCameraFrameVisualizer frameVisualizer) {
        this.context = context;
        cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        imageAvailableListener = new BackCameraFrameAvailableListener(frameVisualizer);
    }

    // Open the back camera and start sending frames
    public String openCam(String idToExclude) {
        this.cameraId = getBackCameraID(idToExclude);
        Size size = openCamera(this.cameraId);
        //Tried this, DID NOT WORK: Size smallerPreviewSize = chooseSmallerPreviewSize();
        RawSensorPreviewReader = ImageReader.newInstance(MAX_PREVIEW_WIDTH,
                MAX_PREVIEW_HEIGHT, ImageFormat.JPEG, 2);
        Log.i(TAG, "ImageFormat.JPEG, width:" + size.getWidth() + ", height:" + size.getHeight());
        RawSensorPreviewReader.setOnImageAvailableListener(imageAvailableListener, null);
        return this.cameraId;
    }

    private String getBackCameraID(String idToExclude) {
        String cameraId = null;
        CameraManager cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        try {
            if (idToExclude != null) {
                for (String camera : cameraManager.getCameraIdList()) {
                    //avoid returning the same camera twice, as one sensor can only be accessed once
                    if (!camera.equalsIgnoreCase(idToExclude)) {
                        CameraCharacteristics chars = cameraManager.getCameraCharacteristics(camera);
                        final int[] capabilities = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
                        boolean facingBack = chars.get(CameraCharacteristics.LENS_FACING) == CameraMetadata.LENS_FACING_BACK;
                        if (facingBack) {
                            cameraId = camera;
                            // Note that the sensor size is much larger than the available capture size
                            SizeF sensorSize = chars.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
                            Log.i(TAG, "Sensor size: " + sensorSize);
                            // Since the sensor size doesn't actually match the capture size, and because it
                            // reports an extremely wide aspect ratio, this FoV is bogus
                            float[] focalLengths = chars.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS);
                            if (focalLengths.length > 0) {
                                float focalLength = focalLengths[0];
                                double fov = 2 * Math.atan(sensorSize.getWidth() / (2 * focalLength));
                                Log.i(TAG, "Calculated FoV: " + fov);
                            }
                        }
                    }
                }
            } else {
                for (String camera : cameraManager.getCameraIdList()) {
                    //avoid returning the same camera twice, as one sensor can only be accessed once
                    CameraCharacteristics chars = cameraManager.getCameraCharacteristics(camera);
                    final int[] capabilities = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
                    boolean facingBack = chars.get(CameraCharacteristics.LENS_FACING) == CameraMetadata.LENS_FACING_BACK;
                    if (facingBack) {
                        cameraId = camera;
                        // Note that the sensor size is much larger than the available capture size
                        SizeF sensorSize = chars.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
                        Log.i(TAG, "Sensor size: " + sensorSize);
                        // Since the sensor size doesn't actually match the capture size, and because it
                        // reports an extremely wide aspect ratio, this FoV is bogus
                        float[] focalLengths = chars.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS);
                        if (focalLengths.length > 0) {
                            float focalLength = focalLengths[0];
                            double fov = 2 * Math.atan(sensorSize.getWidth() / (2 * focalLength));
                            Log.i(TAG, "Calculated FoV: " + fov);
                        }
                    }
                }
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        return cameraId;
    }

    //opens the camera based on ID & returns the optimal size, capped at the maximum size per the docs
    private Size openCamera(String cameraId) {
        Size size = null;
        try {
            int permission = ContextCompat.checkSelfPermission(context, Manifest.permission.CAMERA);
            if (PackageManager.PERMISSION_GRANTED == permission) {
                if (cameraManager != null && cameraId != null) {
                    cameraManager.openCamera(cameraId, this, null);
                    CameraCharacteristics characteristics
                            = cameraManager.getCameraCharacteristics(cameraId);
                    StreamConfigurationMap map = characteristics.get(
                            CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                    size = Collections.max(
                            Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
                            new CompareSizeByArea());
                    if (size.getWidth() > MAX_PREVIEW_WIDTH || size.getHeight() > MAX_PREVIEW_HEIGHT) {
                        size = new Size(MAX_PREVIEW_WIDTH, MAX_PREVIEW_HEIGHT);
                    }
                    List<Size> sizes = Arrays.asList(map.getOutputSizes(ImageFormat.JPEG));
                    for (int i = 0; i < sizes.size(); i++) {
                        Log.i(RegularBackCamera.class.toString(), "JPEG sizes, width=" + sizes.get(i).getWidth() + "," + "height=" + sizes.get(i).getHeight());
                    }
                }
            } else {
                Log.e(TAG, "Permission not available to open camera");
            }
        } catch (CameraAccessException | IllegalStateException | SecurityException e) {
            Log.e(TAG, "Opening Camera has an Exception " + e);
            e.printStackTrace();
        }
        return size;
    }

    @Override
    public void onOpened(@NonNull CameraDevice camera) {
        try {
            this.camera = camera;
            previewBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            previewBuilder.set(CaptureRequest.JPEG_ORIENTATION, 0);
            Range<Integer> fpsRange = new Range<>(FPS_MIN, FPS_MAX);
            previewBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, fpsRange);
            previewBuilder.addTarget(RawSensorPreviewReader.getSurface());
            List<Surface> targetSurfaces = Arrays.asList(RawSensorPreviewReader.getSurface());
            camera.createCaptureSession(targetSurfaces,
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            onCaptureSessionConfigured(session);
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                            Log.e(TAG, "!!! Creating Capture Session failed due to internal error ");
                        }
                    }, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private void onCaptureSessionConfigured(@NonNull CameraCaptureSession session) {
        Log.i(TAG, "Capture Session created");
        previewBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
        try {
            session.setRepeatingRequest(previewBuilder.build(), null, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onDisconnected(@NonNull CameraDevice camera) {
        camera.close();
    }

    @Override
    public void onError(@NonNull CameraDevice camera, int error) {
        Log.e(TAG, "onError,cameraID:" + camera.getId() + ",error:" + error);
        camera.close();
    }

    protected Size chooseSmallerPreviewSize() {
        CameraManager cm = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        CameraCharacteristics cc = null;
        try {
            cc = cm.getCameraCharacteristics(this.cameraId);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        StreamConfigurationMap streamConfigs = cc.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        Size[] sizes = streamConfigs.getOutputSizes(ImageFormat.JPEG);
        Size smallerPreviewSize = chooseVideoSize(sizes);
        return smallerPreviewSize;
    }

    //References:
    protected Size chooseVideoSize(Size[] choices) {
        List<Size> smallEnough = new ArrayList<>();
        for (Size size : choices) {
            if (size.getWidth() == size.getHeight() * 4 / 3 && size.getWidth() <= 1080) {
                smallEnough.add(size);
            }
        }
        if (smallEnough.size() > 0) {
            return Collections.max(smallEnough, new CompareSizeByArea());
        }
        return choices[choices.length - 1];
    }

    public CameraDevice getCamera() {
        return camera;
    }
}
Helper for sorting preview sizes:
public class CompareSizeByArea implements Comparator<Size> {
    @Override
    public int compare(Size lhs, Size rhs) {
        return Long.signum((long) lhs.getWidth() * lhs.getHeight() -
                (long) rhs.getWidth() * rhs.getHeight());
    }
}
I've only included the regular camera code, since the regular camera is the one that isn't displaying; the code for acquiring the ToF camera and its listener is exactly the same, apart from the ToF-specific logic.
I don't see any exceptions or errors in the app's logs, but the system log shows:
E/CHIUSECASE: [ERROR ] chxusecase.cpp:967 ReturnFrameworkResult() ChiFrame: 0 App Frame: 0 - pResult contains more buffers (1) than the expected number of buffers (0) to return to the framework!
E/CamX: [ERROR][CORE ] camxnode.cpp:4518 CSLFenceCallback() Node::FastAECRealtime_IFE0 : Type:65536 Fence 3 handler failed in node fence handler
E/CamX: [ERROR][SENSOR ] camxsensornode.cpp:9279 GetSensorMode() Sensor name: s5k2la
E/CamX: [ERROR][SENSOR ] camxsensornode.cpp:9302 GetSensorMode() W x H : 4032, 3024
E//vendor/bin/hw/vendor.samsung.hardware.camera.provider@3.0-service_64: vendor/qcom/proprietary/commonsys-intf/adsprpc/src/fastrpc_apps_user.c:750: Error 0xe08132b8: remote_handle_invoke failed
E/CamX: [ERROR][ISP ] camxispiqmodule.h:1871 IsTuningModeDataChanged() Invalid pointer to current tuning mode parameters (0x0)
E/CamX: [ERROR][PPROC ] camxipenode.cpp:9529 GetFaceROI() Face ROI is not published
**1) How do I correctly display the regular back camera as a bitmap on the TextureView?
- And save that bitmap in JPEG or PNG format to internal storage?**
Thanks a million!
I don't fully understand what you're trying to achieve, but maybe I can push you in the right direction.
JPG is a compressed file format, so you can't use it for the camera preview. You generally want to let the Camera draw directly onto the TextureView without any compression in between.
You did leave a comment saying you need to do some kind of processing first, but if that processing has to happen in real time while the preview is showing, have you tried a different format? Any kind of compressed image format will usually lead to poor performance.
You could also show the preview directly and only occasionally save a compressed JPG/PNG to storage. You can do that with Camera2, although CameraX has an easier way to do it through its use cases.
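For the occasional save, something along these lines would do. This is a minimal sketch, assuming you already have the Bitmap; saveBitmap, the file name, and the quality value are my own placeholders:

// imports: android.content.Context, android.graphics.Bitmap, android.util.Log,
//          java.io.File, java.io.FileOutputStream, java.io.IOException
private void saveBitmap(Context context, Bitmap bitmap) {
    // getFilesDir() is the app's internal storage; "preview.jpg" is an arbitrary name
    File file = new File(context.getFilesDir(), "preview.jpg");
    try (FileOutputStream out = new FileOutputStream(file)) {
        // JPEG at quality 90; use Bitmap.CompressFormat.PNG instead for lossless output
        bitmap.compress(Bitmap.CompressFormat.JPEG, 90, out);
    } catch (IOException e) {
        Log.e("SaveBitmap", "Failed to save bitmap", e);
    }
}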
If you actually want to convert a JPEG image into a Bitmap, you can't just copy the bytes over, the way this code does:
Log.i(TAG,"AFTER width:"+width+",height:"+height);
//***here bitmap is black
bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.RGB_565);
buffer.rewind();
if (bitmap!=null) {
bitmap.copyPixelsFromBuffer(buffer);
}
You need to actually decode the compressed JPEG, for example with BitmapFactory.decodeByteArray. You will have to create a byte[] from the plane[0] ByteBuffer first, but then it simply produces a Bitmap from the image contents.
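Roughly like this; a sketch that would replace your returnBitmap() logic (the helper name decodeJpeg is mine):

// imports: android.graphics.Bitmap, android.graphics.BitmapFactory,
//          android.media.Image, java.nio.ByteBuffer
private Bitmap decodeJpeg(Image image) {
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);   // copy the compressed JPEG bytes out of the plane
    image.close();       // only close the Image after the bytes are copied
    // decodeByteArray reads the width/height from the JPEG header itself,
    // so none of the buffer-size arithmetic is needed
    return BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
}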
However, you really don't want to be capturing JPEGs here; they tend to be slow and won't give you a very good frame rate. Unless you have a strong reason, just use the TextureView's SurfaceTexture as a target for the camera (by creating a Surface from the SurfaceTexture). That passes the data in an efficient, device-specific format, and you don't have to do any copying (though you still have to handle the scaling).
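That path looks roughly like this. It's only a sketch: TAG, textureView, previewSize, and camera stand in for your own fields, and it assumes the TextureView is already laid out:

// Sketch: stream the camera straight into the TextureView, no ImageReader involved.
try {
    SurfaceTexture texture = textureView.getSurfaceTexture();
    texture.setDefaultBufferSize(previewSize.getWidth(), previewSize.getHeight());
    final Surface previewSurface = new Surface(texture);
    final CaptureRequest.Builder builder =
            camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    builder.addTarget(previewSurface);
    camera.createCaptureSession(Arrays.asList(previewSurface),
            new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    try {
                        // frames now flow to the TextureView with no copies in app code
                        session.setRepeatingRequest(builder.build(), null, null);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                    Log.e(TAG, "Capture session configuration failed");
                }
            }, null);
} catch (CameraAccessException e) {
    e.printStackTrace();
}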
And if you need to modify the preview data before drawing it, use the YUV_420_888 format; it's just as efficient and will run at 30fps. It takes more effort to draw to the screen, though, because you have to convert it to RGB.
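If you go that route, one common shortcut packs the planes into NV21 and lets YuvImage handle the conversion. This is only a sketch, assuming the typical plane layout (chroma pixelStride of 2 and rowStride equal to the width); fully general code must walk the planes honoring each plane's rowStride/pixelStride:

// imports: android.graphics.*, android.media.Image,
//          java.io.ByteArrayOutputStream, java.nio.ByteBuffer
private Bitmap yuvToBitmap(Image image) {
    int w = image.getWidth();
    int h = image.getHeight();
    ByteBuffer yPlane = image.getPlanes()[0].getBuffer();
    ByteBuffer vPlane = image.getPlanes()[2].getBuffer(); // V comes first in NV21
    byte[] nv21 = new byte[w * h * 3 / 2];
    yPlane.get(nv21, 0, w * h);
    // With pixelStride 2, the V plane already contains interleaved VUVU...,
    // which is exactly NV21's chroma layout (minus its final U byte).
    vPlane.get(nv21, w * h, vPlane.remaining());
    image.close();
    YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, w, h, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, w, h), 90, out);
    byte[] jpeg = out.toByteArray();
    return BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
}

The detour through an in-memory JPEG is not free; it's just the shortest reasonably correct path without writing your own YUV-to-RGB loop or using RenderScript.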