Optimization in bitmap creation

I'm writing an application that renders a sequence of pictures, received in real time over a remote TCP connection, into an ImageView element. The stream consists of single frames encoded in PGM format and sent at 9 Hz; each frame is decoded into a Bitmap and delivered to my MainActivity.

Here is my VideoService (I'm only posting the run() method, since I believe it is the only relevant one):

    public void run() {
        InetAddress serverAddr = null;

        try {
            serverAddr = InetAddress.getByName(VIDEO_SERVER_ADDR);
        } catch (UnknownHostException e) {
            Log.e(getClass().getName(), e.getMessage());
            e.printStackTrace();
            return;
        }

        Socket socket = null;
        BufferedReader reader = null;

        do {
            try {
                socket = new Socket(serverAddr, VIDEO_SERVER_PORT);

                reader = new BufferedReader(new InputStreamReader(socket.getInputStream()));

                boolean frameStart = false;

                LinkedList<String> frameList = new LinkedList<>();

                while (keepRunning) {
                    final String message = reader.readLine();

                    if (!frameStart && message.startsWith("F"))
                        frameStart = true;
                    else if (frameStart && message.startsWith("EF")) {
                        frameStart = false;

                        final Bitmap bitmap = Bitmap.createBitmap(IR_FRAME_WIDTH, IR_FRAME_HEIGHT, Bitmap.Config.ARGB_8888);
                        final Canvas canvas = new Canvas(bitmap);

                        final String[] data = frameList.toArray(new String[frameList.size()]);

                        canvas.drawBitmap(bitmap, 0, 0, null);

                        //Log.d(this.getClass().getName(), "IR FRAME COLLECTED");

                        if ((data.length - 6) == IR_FRAME_HEIGHT) {
                            float grayScaleRatio = Float.parseFloat(data[2].trim()) / 255.0f;

                            for (int y = 0; y < IR_FRAME_HEIGHT; y++) {
                                final String line = data[y + 3];
                                final String[] points = line.split("\\s+");

                                if (points.length == IR_FRAME_WIDTH) {
                                    for (int x = 0; x < IR_FRAME_WIDTH; x++) {
                                        final float grayLevel = Float.parseFloat(points[x]) / grayScaleRatio;

                                        Paint paint = new Paint();

                                        paint.setStyle(Paint.Style.FILL);

                                        final int level = (int)grayLevel;

                                        paint.setColor(Color.rgb(level, level, level));

                                        canvas.drawPoint(x, y, paint);
                                    }
                                } else
                                    Log.d(this.getClass().getName(), "Malformed line");
                            }

                            final Intent messageIntent = new Intent();

                            messageIntent.setAction(VIDEO_BROADCAST_KEY);

                            ByteArrayOutputStream stream = new ByteArrayOutputStream();

                            bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream);
                            bitmap.recycle();
                            messageIntent.putExtra(VIDEO_MESSAGE_KEY, stream.toByteArray());
                            stream.close();
                            sendBroadcast(messageIntent);
                        } else
                            Log.d(this.getClass().getName(), "Malformed data");

                        frameList.clear();
                    } else if (frameStart)
                        frameList.add(message);
                }

                Thread.sleep(VIDEO_SERVER_RESPAWN);

            } catch (Throwable e) {
                Log.e(getClass().getName(), e.getMessage());
                e.printStackTrace();
            }
        } while (keepRunning);

        if (socket != null) {
            try {
                socket.close();
            } catch (Throwable e) {
                Log.e(getClass().getName(), e.getMessage());
                e.printStackTrace();
            }
        }
    }

message is one line of text like the following:

F
P2
160 120
1226
193 141 158 152 193 186 171 177 186 160 195 182 ... (160 times)
                         .
                         . (120 lines)
                         .
278 248 253 261 257 284 310 304 304 272 227 208 ... (160 times)


EF
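
For reference, this is how the lines collected between F and EF map onto the indices used in run(); the helper below is only an illustrative sketch of the plain-PGM (P2) layout, not part of the actual service:

    // Illustrative sketch: the header fields behind data[2] and data[y + 3] in run().
    static void describeFrame(java.util.List<String> frameLines) {
        String magic  = frameLines.get(0);                          // "P2", plain-PGM magic number
        String[] dims = frameLines.get(1).trim().split(" ");        // "160 120"
        int width     = Integer.parseInt(dims[0]);                  // IR_FRAME_WIDTH
        int height    = Integer.parseInt(dims[1]);                  // IR_FRAME_HEIGHT
        int maxGray   = Integer.parseInt(frameLines.get(2).trim()); // "1226", feeds grayScaleRatio
        // frameLines.get(3) .. frameLines.get(3 + height - 1) hold one row of 'width' gray values each
    }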

In the MainActivity I handle it with this code:

class VideoReceiver extends BroadcastReceiver {
    final public Queue<Bitmap> imagesQueue = new LinkedList<>();

    @Override
    public void onReceive(Context context, Intent intent) {

        try {
            //Log.d(getClass().getName(), "onReceive() called");

            final byte[] data = intent.getByteArrayExtra(VideoService.VIDEO_MESSAGE_KEY);

            final Bitmap bitmap = BitmapFactory.decodeByteArray(data,0,data.length);

            imagesQueue.add(bitmap);

            runOnUiThread(updateVideoTask);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}

The updateVideoTask task is defined as follows:

    updateVideoTask = new Runnable() {
        public void run() {
            if (videoReceiver == null) return;

            if (!videoReceiver.imagesQueue.isEmpty())
            {
                final Bitmap image = videoReceiver.imagesQueue.poll();

                if (image == null) return;

                videoView.setImageBitmap(image);

                Log.d(this.getClass().getName(), "Images to spool: " + videoReceiver.imagesQueue.size());
            }
        }
    };

Unfortunately, when I run the application I see a very low frame rate and a very high latency, and I can't figure out what is going on. The only hints I get from logcat are these lines:

2019-05-20 16:37:08.817 29566-29580/it.tux.gcs I/art: Background sticky concurrent mark sweep GC freed 88152(3MB) AllocSpace objects, 3(52KB) LOS objects, 22% free, 7MB/10MB, paused 3.937ms total 111.782ms
2019-05-20 16:37:08.832 29566-29587/it.tux.gcs D/skia: Encode PNG Singlethread :      13003 us, width=160, height=120

Even summing up all these delays (140 ms), the application should sustain a frame rate above 5 Hz, yet I'm getting 0.25 Hz or even worse.

After some investigation I found that moving:

Paint paint = new Paint();
paint.setStyle(Paint.Style.FILL);

outside of the nested loops prevents the GC from being invoked so often. I then found another major source of delay in this line:

final String[] points = line.split("\\s+");

It burns about 2 ms on every call, so I settled for something less clever but faster:

final String[] points = line.split(" ");
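
For illustration, split() could also be avoided entirely by parsing the integers in place; a rough sketch of that idea (the helper name parseRow is just a placeholder):

    // Sketch of a split()-free parser: reuses a caller-provided buffer to avoid per-line allocations.
    // Returns the number of values written into 'out'.
    static int parseRow(String line, int[] out) {
        int count = 0;
        int value = 0;
        boolean inNumber = false;
        for (int i = 0, n = line.length(); i < n; i++) {
            char c = line.charAt(i);
            if (c >= '0' && c <= '9') {
                value = value * 10 + (c - '0');   // accumulate the current number
                inNumber = true;
            } else if (inNumber) {                // separator reached: emit the number
                out[count++] = value;
                value = 0;
                inNumber = false;
            }
        }
        if (inNumber)                             // last number on the line
            out[count++] = value;
        return count;
    }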

Anyway, even with these changes it is still not enough. The code between:

canvas.drawBitmap(bitmap, 0, 0, null);

sendBroadcast(messageIntent);

still takes more than 200 ms... How can I do better?

I'm pretty sure there is a more efficient way to collect a sequence of frames of this size and rate from a TCP server and display them in an ImageView.

Of course it may be a matter of software architecture rather than just optimization of the code itself. I'm open to any new approach, except native code (which I'm not familiar with).

Update (March 11, 2019):

Activity side:

public class MainActivity extends AppCompatActivity implements FrameReadyCallBack {
    private Intent videoServiceIntent;
    private VideoService videoService;
    private boolean bound = false;
    private ImageView surfaceView_video = null;
    private String videoPort = "5002";
    private String videoServerAddr = "192.168.10.107";
    private ServiceConnection serviceConnection = null;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        surfaceView_video = findViewById(R.id.surfaceView_video);

        serviceConnection = new ServiceConnection() {
            @Override
            public void onServiceConnected(ComponentName className, IBinder service) {
                VideoService.VideoServiceBinder binder = (VideoService.VideoServiceBinder) service;
                videoService = binder.getService();
                bound = true;
                videoService.registerCallBack(MainActivity.this); // register
            }

            @Override
            public void onServiceDisconnected(ComponentName arg0) {
                bound = false;
            }
        };

        startVideoService();
    }

    @Override
    public void frameReady(byte[] image_data) {
        //TODO: create image and update surfaceView_video
    }

    public void startVideoService()
    {
        videoServiceIntent = new Intent(this, VideoService.class);

        videoServiceIntent.putExtra(VideoService.LOCAL_PORT_KEY, videoPort);
        videoServiceIntent.putExtra(VideoService.LOCAL_VIDEOSERVER_ADDR_KEY, videoServerAddr);

        startService(videoServiceIntent);
    }

    @Override
    protected void onStart() {
        super.onStart();
        bindService();
    }

    @Override
    protected void onStop() {
        super.onStop();
        unbindService();
    }

    private void bindService() {
        bindService(videoServiceIntent, serviceConnection, Context.BIND_AUTO_CREATE);
    }

    private void unbindService(){
        if (bound) {
            videoService.registerCallBack(null); // unregister
            unbindService(serviceConnection);
            bound = false;
        }
    }
}
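
For completeness, here is a minimal sketch of what the frameReady() TODO could look like, reusing the decode-and-post pattern from the original BroadcastReceiver (this assumes image_data carries one complete encoded frame):

    @Override
    public void frameReady(final byte[] image_data) {
        // Sketch only: decode the frame, then post to the UI thread, since
        // frameReady() is invoked from the VideoReceiver worker thread.
        final Bitmap bitmap = BitmapFactory.decodeByteArray(image_data, 0, image_data.length);
        if (bitmap == null) return; // malformed frame

        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                surfaceView_video.setImageBitmap(bitmap);
            }
        });
    }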

Service side:

public class VideoService extends Service {
    public static final String LOCAL_PORT_KEY = "video_port";
    public static final String LOCAL_VIDEOSERVER_ADDR_KEY = "video_server_addr";
    private static final int DEFAULT_VIDEO_PORT = 5002;
    private static final int VIDEO_SERVER_RESPAWN = 2000;

    private volatile FrameReadyCallBack frameReadyCallBack = null;
    private VideoReceiver videoReceiver = null;
    private IBinder videoServiceBinder = new VideoServiceBinder();

    @Nullable
    @Override
    public IBinder onBind(Intent intent) {
        return videoServiceBinder;
    }

    @Override
    public boolean onUnbind(Intent intent) {
        videoReceiver.kill();
        return super.onUnbind(intent);
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        final int localVideoPort = intent.getIntExtra(LOCAL_PORT_KEY, DEFAULT_VIDEO_PORT);
        final String videoServerAddr = intent.getStringExtra(LOCAL_VIDEOSERVER_ADDR_KEY);

        videoReceiver = new VideoReceiver(videoServerAddr, localVideoPort);
        videoReceiver.start();

        return Service.START_NOT_STICKY;
    }

    public void registerCallBack(FrameReadyCallBack frameReadyCallBack) {
        this.frameReadyCallBack = frameReadyCallBack;
    }

    public class VideoServiceBinder extends Binder {

        public VideoService getService() {
            return VideoService.this;
        }
    }

    private class VideoReceiver extends Thread {
        private boolean keepRunning = true;
        private int VIDEO_SERVER_PORT;
        private String VIDEO_SERVER_ADDR;
        private int bad_frames;
        private int frames;
        private int link_respawn;
        private FrameDecodingStatus status;

        public VideoReceiver(String addr, int listen_port) {
            VIDEO_SERVER_PORT = listen_port;
            VIDEO_SERVER_ADDR = addr;
        }

        public void run() {
            InetAddress serverAddr;
            link_respawn = 0;

            try {
                serverAddr = InetAddress.getByName(VIDEO_SERVER_ADDR);
            } catch (UnknownHostException e) {
                Log.e(getClass().getName(), e.getMessage());
                e.printStackTrace();
                return;
            }

            Socket socket = null;
            DataInputStream stream;

            do {
                bad_frames = 0;
                frames = 0;
                status = FrameDecodingStatus.Idle;

                try {
                    socket = new Socket(serverAddr, VIDEO_SERVER_PORT);

                    stream = new DataInputStream(new BufferedInputStream(socket.getInputStream()));

                    final byte[] _data = new byte[PACKET_SIZE];
                    final byte[] _image_data = new byte[IMAGE_SIZE];
                    int _data_index = 0;

                    while (keepRunning) {
                        if (stream.read(_data, 0, _data.length) == 0)
                            continue;

                        for (byte _byte : _data) {
                            if (status == FrameDecodingStatus.Idle) {
                                //Wait SoM
                            } else if (status == FrameDecodingStatus.Data) {
                                //Collect data
                            } else {
                                frameReadyCallBack.frameReady(_image_data);
                                status = FrameDecodingStatus.Idle;
                            }
                        }
                    }

                    link_respawn++;
                    Thread.sleep(VIDEO_SERVER_RESPAWN);
                    Log.d(getClass().getName(), "Link respawn: " + link_respawn);
                } catch (Throwable e) {
                    Log.e(getClass().getName(), e.getMessage());
                    e.printStackTrace();
                }
            } while (keepRunning);

            if (socket != null) {
                try {
                    socket.close();
                } catch (Throwable e) {
                    Log.e(getClass().getName(), e.getMessage());
                    e.printStackTrace();
                }
            }
        }

        public void kill() {
            keepRunning = false;
        }
    }
}

First of all, for some reason you are delivering each new image result through a BroadcastReceiver. You can improve the overall speed significantly by removing this logic and replacing the communication with a bound Service:

    // Bind to LocalService
    Intent intent = new Intent(this, LocalService.class);
    bindService(intent, connection, Context.BIND_AUTO_CREATE);

Then receive the connection:

/** Defines callbacks for service binding, passed to bindService() */
private ServiceConnection connection = new ServiceConnection() {

    @Override
    public void onServiceConnected(ComponentName className,
            IBinder service) {
        // We've bound to LocalService, cast the IBinder and get LocalService instance
        LocalBinder binder = (LocalBinder) service;
        mService = binder.getService();
        mBound = true;
    }

    @Override
    public void onServiceDisconnected(ComponentName arg0) {
        mBound = false;
    }
};

Then subscribe the Activity using the Service's binder instance, and use a callback in the Service to post the new data bytes to it.
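
A minimal sketch of that wiring, reusing the FrameReadyCallBack and registerCallBack names that already appear in the updated code above (the exact interface shape and the publishFrame helper name are assumptions of mine):

    // Assumed shape of the callback interface that the Activity implements.
    public interface FrameReadyCallBack {
        void frameReady(byte[] image_data);
    }

    // In the Service: called from the receiver thread once a full frame has been assembled.
    private void publishFrame(byte[] image_data) {
        final FrameReadyCallBack cb = frameReadyCallBack; // volatile field, may be null while unbound
        if (cb != null) {
            cb.frameReady(image_data);                    // hand the raw frame bytes straight to the Activity
        }
    }

The Activity registers itself in onServiceConnected() via videoService.registerCallBack(MainActivity.this) and unregisters with registerCallBack(null) before unbinding, exactly as in the updated code above.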