Commit f7bd6838 authored by gineshidalgo99

Stereo-camera images can be read from disk - offline processing

Parent fcadd274
......@@ -72,12 +72,12 @@ In order to verify that the camera parameters introduced by the user are sorted
## Installing the OpenPose 3-D Reconstruction Module
Check the [doc/quick_start.md#3-d-reconstruction](./quick_start.md#3-d-reconstruction) for basic examples.
Check the [doc/installation.md#openpose-3d-reconstruction-module](./installation.md#openpose-3d-reconstruction-module) for installation steps.
## Quick Start
You can copy and modify the OpenPose 3-D demo to use any camera brand by modifying the frames producer. To do so, you need to provide your own code to retrieve synchronized images from your cameras, as well as their intrinsic and extrinsic camera parameters (see the producer sketch below).
Check the [doc/quick_start.md#3-d-reconstruction](./quick_start.md#3-d-reconstruction) for basic examples.
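For illustration, here is a minimal sketch (editor's addition, not part of the original documentation; the header path and all folder names are assumed placeholders) of how the stereo image-directory producer introduced in this commit could be created from the C++ API:
```
// Sketch only: wiring the stereo image-directory producer, similar to what the demos
// do when `--image_dir` and `--image_dir_stereo` are set. Paths are placeholders.
#include <memory>
#include <openpose/utilities/flagsToOpenPose.hpp>

std::shared_ptr<op::Producer> createStereoImageDirectoryProducer()
{
    return op::flagsToProducer("output_folder_path/",           // imageDirectory with the synchronized images
                               "", "", -1, false,               // video, IP camera, webcam and FLIR inputs (unused here)
                               "1280x720", 30.,                 // webcam resolution/fps (unused here)
                               "models/cameraParameters/flir/", // folder with one XML file per camera
                               3u);                             // imageDirectoryStereo: 3 images read per iteration
}
```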
......
......@@ -148,7 +148,7 @@ Each flag is divided into flag name, default value, and description.
- DEFINE_int32(frame_rotate, 0, "Rotate each frame, 4 possible values: 0, 90, 180, 270.");
- DEFINE_bool(frames_repeat, false, "Repeat frames when finished.");
- DEFINE_bool(process_real_time, false, "Enable to keep the original source frame rate (e.g. for video). If the processing time is too long, it will skip frames. If it is too fast, it will slow it down.");
- DEFINE_string(camera_parameter_folder, "models/cameraParameters/", "String with the folder where the camera parameters are located.");
- DEFINE_string(camera_parameter_folder, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located.");
3. OpenPose
- DEFINE_string(model_folder, "models/", "Folder path (absolute or relative) where the models (pose, face, ...) are located.");
......
......@@ -12,7 +12,7 @@ OpenPose - Installation
8. [Uninstallation](#uninstallation)
9. [Optional Settings](#optional-settings)
1. [MPI Model](#mpi-model)
2. [OpenPose 3D Reconstruction Module and Demo](#openpose-3d-reconstruction-module-and-demo)
2. [OpenPose 3D Reconstruction Module](#openpose-3d-reconstruction-module)
3. [Compiling without cuDNN](#compiling-without-cudnn)
4. [Custom Caffe (Ubuntu Only)](#custom-caffe-ubuntu-only)
5. [Custom OpenCV (Ubuntu Only)](#custom-opencv-ubuntu-only)
......@@ -99,6 +99,18 @@ The instructions in this section describe the steps to build OpenPose using CMak
4. Windows - **Microsoft Visual Studio (VS) 2015 Enterprise Update 3**:
- If **Visual Studio 2017 Community** is desired, we do not officially support it, but it might compile by first [enabling CUDA 8.0 in VS2017](https://stackoverflow.com/questions/43745099/using-cuda-with-visual-studio-2017?answertab=active#tab-top), or by using **VS2017 with CUDA 9** (check the `.vcxproj` file and change the necessary paths from CUDA 8 to 9).
- VS 2015 Enterprise Update 1 will give some compiler errors and VS 2015 Community has not been tested.
5. Windows - **Caffe, OpenCV, and Caffe prerequisites**:
- CMake automatically downloads all the Windows DLLs. Alternatively, you might prefer to download them manually:
- Models:
- [COCO model](http://posefs1.perception.cs.cmu.edu/OpenPose/models/pose/coco/pose_iter_440000.caffemodel): download in `models/pose/coco/`.
- [MPI model](http://posefs1.perception.cs.cmu.edu/OpenPose/models/pose/mpi/pose_iter_160000.caffemodel): download in `models/pose/mpi/`.
- [Face model](http://posefs1.perception.cs.cmu.edu/OpenPose/models/face/pose_iter_116000.caffemodel): download in `models/face/`.
- [Hands model](http://posefs1.perception.cs.cmu.edu/OpenPose/models/hand/pose_iter_102000.caffemodel): download in `models/hand/`.
- Dependencies:
- Note: Leave the zip files in `3rdparty/windows/` so that CMake does not try to download them again.
- [Caffe](http://posefs1.perception.cs.cmu.edu/OpenPose/3rdparty/windows/caffe_2018_01_18.zip): Unzip as `3rdparty/windows/caffe/`.
- [Caffe dependencies](http://posefs1.perception.cs.cmu.edu/OpenPose/3rdparty/windows/caffe3rdparty_2017_07_14.zip): Unzip as `3rdparty/windows/caffe3rdparty/`.
- [OpenCV 3.1](http://posefs1.perception.cs.cmu.edu/OpenPose/3rdparty/windows/opencv_310.zip): Unzip as `3rdparty/windows/opencv/`.
......@@ -201,7 +213,7 @@ By default, the body MPI model is not downloaded. You can download it by turning
#### OpenPose 3D Reconstruction Module and Demo
#### OpenPose 3D Reconstruction Module
You can include the 3D reconstruction module by:
1. Install the FLIR camera software, Spinnaker SDK. It is proprietary software, so we cannot provide a direct download link. Note: You can skip this step if you intend to use the 3-D OpenPose module with a different camera brand.
......
......@@ -101,16 +101,21 @@ const auto& poseBodyPartMappingMpi = getPoseBodyPartMapping(PoseModel::MPI_15);
### Heatmap Ordering
For the **heat map storing format**, instead of saving each of the 57 heat maps (18 body parts + background + 2 x 19 PAFs) individually, the library concatenates them into a huge (width x #heat maps) x (height) matrix (i.e., concatenated by columns). E.g., columns [0, individual heat map width] contain the first heat map, columns [individual heat map width + 1, 2 * individual heat map width] contain the second heat map, etc. Note that some image viewers cannot display the resulting images due to their size. However, Chrome and Firefox can open them properly.
The saving order is body parts + background + PAFs. Any of them can be disabled with program flags. If background is disabled, then the final image will be body parts + PAFs. The body parts and background follow the order of `getPoseBodyPartMapping(const PoseModel poseModel)`, while the PAFs follow the order specified on `getPosePartPairs(const PoseModel poseModel)`:
The saving order is body parts + background + PAFs. Any of them can be disabled with program flags. If background is disabled, then the final image will be body parts + PAFs. The body parts and background follow the order of `getPoseBodyPartMapping(const PoseModel poseModel)`.
The PAFs follow the order specified in `getPosePartPairs(const PoseModel poseModel)` together with `getPoseMapIndex(const PoseModel poseModel)`. E.g., assuming COCO (see the example code below), the PAF channels start at 19 (the smallest number in `getPoseMapIndex`, equal to #body parts + 1) and end at 56 (the highest one). Each pair of channels can then be matched with its body-part pair from `getPosePartPairs`. For instance, channels 19 (x-channel) and 20 (y-channel) in `getPoseMapIndex` correspond to the PAF from body part 1 to 8; 21 and 22 correspond to the x,y channels of the limb from body part 8 to 9, etc. Note that if the smallest channel is odd (19), then all the x-channels are odd and all the y-channels even. If the smallest channel is even, the opposite holds.
```
// C++ API call
#include <openpose/pose/poseParameters.hpp>
const auto& posePartPairsCoco = getPosePartPairs(PoseModel::COCO_18);
const auto& posePartPairsMpi = getPosePartPairs(PoseModel::MPI_15);
// POSE_COCO_PAIRS
// getPosePartPairs(PoseModel::COCO_18) result
// Each index is the key value corresponding to each body part in `getPoseBodyPartMapping`. E.g., 1 for "Neck", 2 for "RShoulder", etc.
// 1,2, 1,5, 2,3, 3,4, 5,6, 6,7, 1,8, 8,9, 9,10, 1,11, 11,12, 12,13, 1,0, 0,14, 14,16, 0,15, 15,17, 2,16, 5,17
// getPoseMapIndex(PoseModel::COCO_18) result
// 31,32, 39,40, 33,34, 35,36, 41,42, 43,44, 19,20, 21,22, 23,24, 25,26, 27,28, 29,30, 47,48, 49,50, 53,54, 51,52, 55,56, 37,38, 45,46
```
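The following sketch (editor's addition; `getSingleHeatMap` is a hypothetical helper name, and OpenCV is assumed available) extracts one heat map from the concatenated image and prints which channels hold each COCO PAF, following the mappings above:
```
// Sketch only, not part of the original documentation.
#include <iostream>
#include <opencv2/opencv.hpp>
#include <openpose/pose/poseParameters.hpp>

// Extract heat map `index` from the concatenated (width x #heat maps) x (height) image
cv::Mat getSingleHeatMap(const cv::Mat& concatenated, const int numberHeatMaps, const int index)
{
    const auto singleWidth = concatenated.cols / numberHeatMaps;
    // Columns [index*singleWidth, (index+1)*singleWidth) contain heat map `index`
    return concatenated(cv::Rect{index * singleWidth, 0, singleWidth, concatenated.rows}).clone();
}

int main()
{
    const auto& partPairs = op::getPosePartPairs(op::PoseModel::COCO_18);
    const auto& mapIndexes = op::getPoseMapIndex(op::PoseModel::COCO_18);
    const auto& bodyParts = op::getPoseBodyPartMapping(op::PoseModel::COCO_18);
    // Both vectors store 2 values per limb: (partA, partB) and (x-channel, y-channel)
    for (auto limb = 0u; limb < partPairs.size() / 2u; limb++)
        std::cout << bodyParts.at(partPairs[2*limb]) << " -> " << bodyParts.at(partPairs[2*limb+1])
                  << ": x-channel " << mapIndexes[2*limb] << ", y-channel " << mapIndexes[2*limb+1]
                  << std::endl;
    return 0;
}
```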
......
......@@ -131,7 +131,7 @@ build\x64\Release\OpenPoseDemo.exe --flir_camera --3d --number_people_max 1 --fa
./build/examples/openpose/openpose.bin --flir_camera --3d --number_people_max 1 --write_json output_folder_path/
```
3. Saving stereo camera images fast (without keypoint detection) for later post-processing
3. Fast stereo camera image saving (without keypoint detection) for later post-processing
```
# Ubuntu (same flags for Windows version)
# Note: saving in PNG rather than JPG will improve image quality, but slow down FPS (depending on hard disk writing speed and camera number)
......@@ -142,7 +142,8 @@ build\x64\Release\OpenPoseDemo.exe --flir_camera --3d --number_people_max 1 --fa
```
# Ubuntu (same flags for Windows version)
# Optionally add `--face` and/or `--hand` to include face and/or hands
./build/examples/openpose/openpose.bin --image_dir output_folder_path/ --3d --number_people_max 1
# Assuming 3 cameras
./build/examples/openpose/openpose.bin --image_dir output_folder_path/ --image_dir_stereo 3 --3d --number_people_max 1
```
......
......@@ -186,7 +186,8 @@ OpenPose Library - Release Notes
14. Removed old `windows/` version. CMake is the only Windows version available.
15. Camera parameters (flir camera) are read from disk at runtime rather than being compiled.
16. 3-D reconstruction module can be implemented with different camera brands or custom image sources.
16. Flag `--write_json` includes 3-D keypoints.
17. Flag `--write_json` includes 3-D keypoints.
18. Flag `--image_dir_stereo` added to allow `--image_dir` to load stereo images.
2. Functions or parameters renamed:
1. Flag `no_display` renamed as `display`, able to select between `NoDisplay`, `Display2D`, `Display3D`, and `DisplayAll`.
2. 3-D reconstruction demo is now inside the OpenPose demo binary.
......
......@@ -52,6 +52,10 @@ DEFINE_string(video, "", "Use a video file instea
" example video.");
DEFINE_string(image_dir, "", "Process a directory of images. Use `examples/media/` for our default example folder with 20"
" images. Read all standard formats (jpg, png, bmp, etc.).");
DEFINE_int32(image_dir_stereo, 1, "Complementary option to `--image_dir`. OpenPose will read this number of images per iteration,"
" allowing tasks such as stereo camera processing. Note that `--camera_parameter_folder`"
" must be set. OpenPose must find as many `xml` files in the parameter folder as this"
" number indicates.");
DEFINE_bool(flir_camera, false, "Whether to use FLIR (Point-Grey) stereo camera.");
DEFINE_string(ip_camera, "", "String with the IP camera URL. It supports protocols like RTSP and HTTP.");
DEFINE_uint64(frame_first, 0, "Start on desired frame number. Indexes are 0-based, i.e. the first frame has index 0.");
......@@ -62,7 +66,7 @@ DEFINE_int32(frame_rotate, 0, "Rotate each frame, 4 po
DEFINE_bool(frames_repeat, false, "Repeat frames when finished.");
DEFINE_bool(process_real_time, false, "Enable to keep the original source frame rate (e.g. for video). If the processing time is"
" too long, it will skip frames. If it is too fast, it will slow it down.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/", "String with the folder where the camera parameters are located.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located.");
// OpenPose
DEFINE_string(model_folder, "models/", "Folder path (absolute or relative) where the models (pose, face, ...) are located.");
DEFINE_string(output_resolution, "-1x-1", "The image resolution (display and output). Use \"-1x-1\" to force the program to use the"
......@@ -236,7 +240,8 @@ int openPoseDemo()
// producerType
const auto producerSharedPtr = op::flagsToProducer(FLAGS_image_dir, FLAGS_video, FLAGS_ip_camera, FLAGS_camera,
FLAGS_flir_camera, FLAGS_camera_resolution, FLAGS_camera_fps,
FLAGS_camera_parameter_folder);
FLAGS_camera_parameter_folder,
(unsigned int) FLAGS_image_dir_stereo);
// poseModel
const auto poseModel = op::flagsToPoseModel(FLAGS_model_pose);
// JSON saving
......
......@@ -61,6 +61,10 @@ DEFINE_string(video, "", "Use a video file instea
" example video.");
DEFINE_string(image_dir, "", "Process a directory of images. Use `examples/media/` for our default example folder with 20"
" images. Read all standard formats (jpg, png, bmp, etc.).");
DEFINE_int32(image_dir_stereo, 1, "Complementary option to `--image_dir`. OpenPose will read this number of images per iteration,"
" allowing tasks such as stereo camera processing. Note that `--camera_parameter_folder`"
" must be set. OpenPose must find as many `xml` files in the parameter folder as this"
" number indicates.");
DEFINE_bool(flir_camera, false, "Whether to use FLIR (Point-Grey) stereo camera.");
DEFINE_string(ip_camera, "", "String with the IP camera URL. It supports protocols like RTSP and HTTP.");
DEFINE_uint64(frame_first, 0, "Start on desired frame number. Indexes are 0-based, i.e. the first frame has index 0.");
......@@ -71,7 +75,7 @@ DEFINE_int32(frame_rotate, 0, "Rotate each frame, 4 po
DEFINE_bool(frames_repeat, false, "Repeat frames when finished.");
DEFINE_bool(process_real_time, false, "Enable to keep the original source frame rate (e.g. for video). If the processing time is"
" too long, it will skip frames. If it is too fast, it will slow it down.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/", "String with the folder where the camera parameters are located.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located.");
// OpenPose
DEFINE_string(model_folder, "models/", "Folder path (absolute or relative) where the models (pose, face, ...) are located.");
DEFINE_string(output_resolution, "-1x-1", "The image resolution (display and output). Use \"-1x-1\" to force the program to use the"
......@@ -238,7 +242,8 @@ int openPoseTutorialWrapper4()
// producerType
const auto producerSharedPtr = op::flagsToProducer(FLAGS_image_dir, FLAGS_video, FLAGS_ip_camera, FLAGS_camera,
FLAGS_flir_camera, FLAGS_camera_resolution, FLAGS_camera_fps,
FLAGS_camera_parameter_folder);
FLAGS_camera_parameter_folder,
(unsigned int) FLAGS_image_dir_stereo);
// poseModel
const auto poseModel = op::flagsToPoseModel(FLAGS_model_pose);
// JSON saving
......
......@@ -38,11 +38,15 @@ DEFINE_string(video, "", "Use a video file instea
" example video.");
DEFINE_string(image_dir, "", "Process a directory of images. Use `examples/media/` for our default example folder with 20"
" images. Read all standard formats (jpg, png, bmp, etc.).");
DEFINE_int32(image_dir_stereo, 1, "Complementary option to `--image_dir`. OpenPose will read this number of images per iteration,"
" allowing tasks such as stereo camera processing. Note that `--camera_parameter_folder`"
" must be set. OpenPose must find as many `xml` files in the parameter folder as this"
" number indicates.");
DEFINE_bool(flir_camera, false, "Whether to use FLIR (Point-Grey) stereo camera.");
DEFINE_string(ip_camera, "", "String with the IP camera URL. It supports protocols like RTSP and HTTP.");
DEFINE_bool(process_real_time, false, "Enable to keep the original source frame rate (e.g. for video). If the processing time is"
" too long, it will skip frames. If it is too fast, it will slow it down.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/", "String with the folder where the camera parameters are located.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located.");
// OpenPose
DEFINE_string(output_resolution, "-1x-1", "The image resolution (display and output). Use \"-1x-1\" to force the program to use the"
" input image resolution.");
......@@ -65,7 +69,8 @@ int openPoseTutorialThread1()
// producerType
const auto producerSharedPtr = op::flagsToProducer(FLAGS_image_dir, FLAGS_video, FLAGS_ip_camera, FLAGS_camera,
FLAGS_flir_camera, FLAGS_camera_resolution, FLAGS_camera_fps,
FLAGS_camera_parameter_folder);
FLAGS_camera_parameter_folder,
(unsigned int) FLAGS_image_dir_stereo);
const auto displayProducerFpsMode = (FLAGS_process_real_time
? op::ProducerFpsMode::OriginalFps : op::ProducerFpsMode::RetrievalFps);
producerSharedPtr->setProducerFpsMode(displayProducerFpsMode);
......
......@@ -39,11 +39,15 @@ DEFINE_string(video, "", "Use a video file instea
" example video.");
DEFINE_string(image_dir, "", "Process a directory of images. Use `examples/media/` for our default example folder with 20"
" images. Read all standard formats (jpg, png, bmp, etc.).");
DEFINE_int32(image_dir_stereo, 1, "Complementary option to `--image_dir`. OpenPose will read this number of images per iteration,"
" allowing tasks such as stereo camera processing. Note that `--camera_parameter_folder`"
" must be set. OpenPose must find as many `xml` files in the parameter folder as this"
" number indicates.");
DEFINE_bool(flir_camera, false, "Whether to use FLIR (Point-Grey) stereo camera.");
DEFINE_string(ip_camera, "", "String with the IP camera URL. It supports protocols like RTSP and HTTP.");
DEFINE_bool(process_real_time, false, "Enable to keep the original source frame rate (e.g. for video). If the processing time is"
" too long, it will skip frames. If it is too fast, it will slow it down.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/", "String with the folder where the camera parameters are located.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located.");
// OpenPose
DEFINE_string(output_resolution, "-1x-1", "The image resolution (display and output). Use \"-1x-1\" to force the program to use the"
" input image resolution.");
......@@ -99,7 +103,8 @@ int openPoseTutorialThread2()
// producerType
const auto producerSharedPtr = op::flagsToProducer(FLAGS_image_dir, FLAGS_video, FLAGS_ip_camera, FLAGS_camera,
FLAGS_flir_camera, FLAGS_camera_resolution, FLAGS_camera_fps,
FLAGS_camera_parameter_folder);
FLAGS_camera_parameter_folder,
(unsigned int) FLAGS_image_dir_stereo);
const auto displayProducerFpsMode = (FLAGS_process_real_time
? op::ProducerFpsMode::OriginalFps : op::ProducerFpsMode::RetrievalFps);
producerSharedPtr->setProducerFpsMode(displayProducerFpsMode);
......
......@@ -51,6 +51,10 @@ DEFINE_string(video, "", "Use a video file instea
" example video.");
DEFINE_string(image_dir, "", "Process a directory of images. Use `examples/media/` for our default example folder with 20"
" images. Read all standard formats (jpg, png, bmp, etc.).");
DEFINE_int32(image_dir_stereo, 1, "Complementary option to `--image_dir`. OpenPose will read this number of images per iteration,"
" allowing tasks such as stereo camera processing. Note that `--camera_parameter_folder`"
" must be set. OpenPose must find as many `xml` files in the parameter folder as this"
" number indicates.");
DEFINE_bool(flir_camera, false, "Whether to use FLIR (Point-Grey) stereo camera.");
DEFINE_string(ip_camera, "", "String with the IP camera URL. It supports protocols like RTSP and HTTP.");
DEFINE_uint64(frame_first, 0, "Start on desired frame number. Indexes are 0-based, i.e. the first frame has index 0.");
......@@ -61,7 +65,7 @@ DEFINE_int32(frame_rotate, 0, "Rotate each frame, 4 po
DEFINE_bool(frames_repeat, false, "Repeat frames when finished.");
DEFINE_bool(process_real_time, false, "Enable to keep the original source frame rate (e.g. for video). If the processing time is"
" too long, it will skip frames. If it is too fast, it will slow it down.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/", "String with the folder where the camera parameters are located.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located.");
// OpenPose
DEFINE_string(model_folder, "models/", "Folder path (absolute or relative) where the models (pose, face, ...) are located.");
DEFINE_string(output_resolution, "-1x-1", "The image resolution (display and output). Use \"-1x-1\" to force the program to use the"
......@@ -313,7 +317,8 @@ int openPoseTutorialWrapper1()
// producerType
const auto producerSharedPtr = op::flagsToProducer(FLAGS_image_dir, FLAGS_video, FLAGS_ip_camera, FLAGS_camera,
FLAGS_flir_camera, FLAGS_camera_resolution, FLAGS_camera_fps,
FLAGS_camera_parameter_folder);
FLAGS_camera_parameter_folder,
(unsigned int) FLAGS_image_dir_stereo);
// poseModel
const auto poseModel = op::flagsToPoseModel(FLAGS_model_pose);
// JSON saving
......
#ifndef OPENPOSE_PRODUCER_IMAGE_DIRECTORY_READER_HPP
#define OPENPOSE_PRODUCER_IMAGE_DIRECTORY_READER_HPP
#include <openpose/3d/cameraParameterReader.hpp>
#include <openpose/core/common.hpp>
#include <openpose/producer/producer.hpp>
......@@ -18,8 +19,13 @@ namespace op
* Constructor of ImageDirectoryReader. It sets the image directory path from which the images will be loaded
* and generates a std::vector<std::string> with the list of images on that directory.
* @param imageDirectoryPath const std::string parameter with the folder path containing the images.
* @param imageDirectoryStereo const unsigned int parameter with the number of images per iteration (>1 would
* represent stereo processing).
* @param cameraParameterPath const std::string parameter with the folder path containing the camera
* parameters (only required if imageDirectoryStereo > 1).
*/
explicit ImageDirectoryReader(const std::string& imageDirectoryPath);
explicit ImageDirectoryReader(const std::string& imageDirectoryPath, const unsigned int imageDirectoryStereo = 1,
const std::string& cameraParameterPath = "");
std::vector<cv::Mat> getCameraMatrices();
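// Illustrative usage (editor's sketch, not part of the original header; the folder
// names are placeholders). Reading 3 synchronized images per iteration, with one
// XML parameter file per camera in the given folder:
//     ImageDirectoryReader reader{"output_folder_path/", 3u, "models/cameraParameters/flir/"};
//     const auto cameraMatrices = reader.getCameraMatrices();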
......@@ -41,7 +47,9 @@ namespace op
private:
const std::string mImageDirectoryPath;
const unsigned int mImageDirectoryStereo;
const std::vector<std::string> mFilePaths;
CameraParameterReader mCameraParameterReader;
Point<int> mResolution;
long long mFrameNameCounter;
......
......@@ -25,7 +25,8 @@ namespace op
const bool flirCamera = false,
const std::string& webcamResolution = "1280x720",
const double webcamFps = 30.,
const std::string& cameraParameterPath = "models/cameraParameters/");
const std::string& cameraParameterPath = "models/cameraParameters/",
const unsigned int imageDirectoryStereo = 1);
OP_API std::vector<HeatMapType> flagsToHeatMaps(const bool heatMapsAddParts = false,
const bool heatMapsAddBkg = false,
......
......@@ -6,7 +6,7 @@ namespace op
{
FlirReader::FlirReader(const std::string& cameraParametersPath) :
Producer{ProducerType::FlirCamera},
mSpinnakerWrapper{cameraParametersPath + "flir/"},
mSpinnakerWrapper{cameraParametersPath},
mFrameNameCounter{0}
{
try
......
......@@ -30,19 +30,45 @@ namespace op
}
}
ImageDirectoryReader::ImageDirectoryReader(const std::string& imageDirectoryPath) :
ImageDirectoryReader::ImageDirectoryReader(const std::string& imageDirectoryPath,
const unsigned int imageDirectoryStereo,
const std::string& cameraParameterPath) :
Producer{ProducerType::ImageDirectory},
mImageDirectoryPath{imageDirectoryPath},
mImageDirectoryStereo{imageDirectoryStereo},
mFilePaths{getImagePathsOnDirectory(imageDirectoryPath)},
mFrameNameCounter{0}
mFrameNameCounter{0ll}
{
try
{
// Read camera parameters from the XML files named after each camera serial number (SN)
auto serialNumbers = getFilesOnDirectory(cameraParameterPath, ".xml");
// Security check
if (serialNumbers.size() != mImageDirectoryStereo && mImageDirectoryStereo > 1)
error("Found different number of camera parameter files than the number indicated by"
" `--image_dir_stereo` ("
+ std::to_string(serialNumbers.size()) + " vs. "
+ std::to_string(mImageDirectoryStereo) + "). Make them equal or add"
+ " `--image_dir_stereo 1`",
__LINE__, __FUNCTION__, __FILE__);
// Get serial numbers
for (auto& serialNumber : serialNumbers)
serialNumber = getFileNameNoExtension(serialNumber);
// Get camera parameters
if (mImageDirectoryStereo > 1)
mCameraParameterReader.readParameters(cameraParameterPath, serialNumbers);
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
}
}
std::vector<cv::Mat> ImageDirectoryReader::getCameraMatrices()
{
try
{
return {};
return mCameraParameterReader.getCameraMatrices();
}
catch (const std::exception& e)
{
......@@ -87,7 +113,10 @@ namespace op
{
try
{
return std::vector<cv::Mat>{getRawFrame()};
std::vector<cv::Mat> rawFrames;
for (auto i = 0u ; i < mImageDirectoryStereo ; i++)
rawFrames.emplace_back(getRawFrame());
return rawFrames;
}
catch (const std::exception& e)
{
......
......@@ -137,7 +137,8 @@ namespace op
std::shared_ptr<Producer> flagsToProducer(const std::string& imageDirectory, const std::string& videoPath,
const std::string& ipCameraPath, const int webcamIndex,
const bool flirCamera, const std::string& webcamResolution,
const double webcamFps, const std::string& cameraParameterPath)
const double webcamFps, const std::string& cameraParameterPath,
const unsigned int imageDirectoryStereo)
{
try
{
......@@ -145,7 +146,8 @@ namespace op
const auto type = flagsToProducerType(imageDirectory, videoPath, ipCameraPath, webcamIndex, flirCamera);
if (type == ProducerType::ImageDirectory)
return std::make_shared<ImageDirectoryReader>(imageDirectory);
return std::make_shared<ImageDirectoryReader>(imageDirectory, imageDirectoryStereo,
cameraParameterPath);
else if (type == ProducerType::Video)
return std::make_shared<VideoReader>(videoPath);
else if (type == ProducerType::IPCamera)
......