'Color' must correspond to the number of input points when using pointCloud in MATLAB
I want to create a 3D map from a stereo camera. To test this, I used the two MATLAB examples given here:
- https://de.mathworks.com/help/vision/ref/estimatecameraparameters.html ("Stereo Camera Calibration")
- https://de.mathworks.com/help/vision/examples/depth-estimation-from-stereo-video.html
I merged the two scripts into the following one:
% load left and right images
leftImages = imageDatastore(fullfile(toolboxdir('vision'),'visiondata', ...
'calibration','stereo','left'));
rightImages = imageDatastore(fullfile(toolboxdir('vision'),'visiondata', ...
'calibration','stereo','right'));
% calculate image points
[imagePoints,boardSize] = ...
detectCheckerboardPoints(leftImages.Files,rightImages.Files);
% calculate world points
squareSize = 108; % square size in millimeters
worldPoints = generateCheckerboardPoints(boardSize,squareSize);
% calculate camera parameters
I = readimage(leftImages,1);
imageSize = [size(I,1),size(I,2)];
stereoParams = estimateCameraParameters(imagePoints,worldPoints, ...
'ImageSize',imageSize);
% get left and right image
frameLeftGray = imread(leftImages.Files{1});
frameRightGray = imread(rightImages.Files{1});
[frameLeftRect, frameRightRect] = ...
rectifyStereoImages(frameLeftGray, frameRightGray, stereoParams);
% get disparity map
disparityMap = disparity(frameLeftRect, frameRightRect);
figure;
imshow(disparityMap, [0, 128]);
title('Disparity Map');
colormap jet
colorbar
% reconstruct the 3-D scene
points3D = reconstructScene(disparityMap, stereoParams);
% Convert to meters and create a pointCloud object
points3D = points3D ./ 1000;
% This will fail
ptCloud = pointCloud(points3D, 'Color', frameLeftRect);
% Create a streaming point cloud viewer
player3D = pcplayer([-3, 3], [-3, 3], [0, 8], 'VerticalAxis', 'y', ...
'VerticalAxisDir', 'down');
% Visualize the point cloud
view(player3D, ptCloud);
However, when I run it I get the following error message:
Error using pointCloud/set.Color (line 545)
'Color' must correspond to the number of input points.

Error in pointCloud (line 151)
this.Color = C;

Error in DepthEstimation (line 45)
ptCloud = pointCloud(points3D, 'Color', frameLeftRect);
Examples 1) and 2) work fine when I try them separately. I think the problem is related to the image size itself; however, resizing the image results in wrong camera parameters (see the quick size check below).
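For reference, this is the kind of size check I mean (just a quick sketch, using the variables from the script above):

% compare the grid of reconstructed points with the image passed as 'Color'
size(points3D)       % returned by reconstructScene as an M-by-N-by-3 array
size(frameLeftRect)  % size of the rectified left image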
Is there another way to fix the error with the 'Color' parameter?
Thanks in advance
You are using a grayscale image as the color input, so it does not match the point array: the points are an M-by-N-by-3 array, so pointCloud expects an M-by-N-by-3 RGB image for 'Color', while the rectified grayscale frame is only M-by-N. Create an RGB image from the grayscale image and use that instead.
% replicate the grayscale channel of the rectified left image (the image the disparity map and points refer to)
rgb = cat(3, frameLeftRect, frameLeftRect, frameLeftRect);
ptCloud = pointCloud(points3D, 'Color', rgb);
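Equivalently (an untested sketch of the same idea), repmat replicates the channel in one call, and a quick check makes the size requirement explicit:

% same idea with repmat: stack the single grayscale channel three times
rgb = repmat(frameLeftRect, [1 1 3]);
% the color image must cover the same M-by-N grid as the M-by-N-by-3 points
assert(size(rgb,1) == size(points3D,1) && size(rgb,2) == size(points3D,2), ...
    'Color image must match the point grid');
ptCloud = pointCloud(points3D, 'Color', rgb);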