Welcome to the world of IterMVS! This learning-based Multi-View Stereo (MVS) method, presented at CVPR 2022, offers high efficiency together with competitive reconstruction quality. This guide walks you through installation, reproducing results, evaluating on different datasets, and training models. Follow along as we explore how to get the most out of this powerful tool!
Installation
Before jumping in, you’ll need to ensure that your setup is ready for IterMVS. Here’s what you need:
- Python 3.6
- CUDA 10.1
To install the necessary requirements, run:
pip install -r requirements.txt
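Once the requirements are installed, it helps to confirm that PyTorch can see your GPU before moving on. The short check below is only a sanity-check sketch, not part of the IterMVS codebase:

import sys
import torch

print("Python:", sys.version.split()[0])            # the authors use Python 3.6
print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))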
Reproducing Results
To reproduce results efficiently, follow these steps:
1. Download Pre-Processed Datasets
Start by downloading the pre-processed datasets for evaluation, then organize them as follows:
root_directory
├── scan1 (scene_name1)
├── scan2 (scene_name2)
│   ├── images
│   ├── cams_1
│   └── pair.txt
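Each scan folder should contain its own images, cams_1, and pair.txt. You can verify this quickly with the snippet below; it is a minimal sketch based on the structure above, and the check_dataset helper and example path are illustrative rather than part of the repository:

from pathlib import Path

def check_dataset(root_directory):
    # Each scan folder should hold images/, cams_1/ and pair.txt, as in the tree above.
    root = Path(root_directory)
    for scan in sorted(p for p in root.iterdir() if p.is_dir()):
        missing = [name for name in ("images", "cams_1", "pair.txt")
                   if not (scan / name).exists()]
        print(f"{scan.name}: {'OK' if not missing else 'missing ' + ', '.join(missing)}")

check_dataset("root_directory")  # replace with your actual dataset path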
2. Camera Parameters
Each view's cam.txt file (stored in cams_1) contains its camera parameters:
extrinsic
E00 E01 E02 E03
E10 E11 E12 E13
E20 E21 E22 E23
E30 E31 E32 E33

intrinsic
K00 K01 K02
K10 K11 K12
K20 K21 K22

DEPTH_MIN DEPTH_MAX
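If you want to inspect these files yourself, a small parser is easy to write. The function below is a minimal sketch that assumes exactly the layout shown above; it is not the loader used inside IterMVS:

import numpy as np

def read_cam_file(path):
    # Parse one cam.txt: 4x4 extrinsic, 3x3 intrinsic, then DEPTH_MIN and DEPTH_MAX.
    with open(path) as f:
        lines = [line.strip() for line in f if line.strip()]
    extrinsic = np.array([lines[i].split() for i in range(1, 5)], dtype=np.float32)
    intrinsic = np.array([lines[i].split() for i in range(6, 9)], dtype=np.float32)
    depth_tokens = " ".join(lines[9:]).split()   # works whether the two values share a line or not
    return extrinsic, intrinsic, float(depth_tokens[0]), float(depth_tokens[1])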
The pair.txt file stores the view-selection result: for each reference image, its ten best source views.
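A matching sketch for pair.txt is shown below. It assumes the common MVSNet-style layout (first line: total number of views; then, for each reference view, its id on one line followed by a line with the source-view count and id/score pairs), so double-check it against your own files:

def read_pair_file(path):
    # Returns {reference_view_id: [source_view_ids]} for each reference image.
    pairs = {}
    with open(path) as f:
        num_views = int(f.readline())
        for _ in range(num_views):
            ref_id = int(f.readline())
            tokens = f.readline().split()
            pairs[ref_id] = [int(tokens[i]) for i in range(1, len(tokens), 2)]  # skip scores
    return pairs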
Evaluation on Different Datasets
Evaluating on various datasets can be accomplished with respective scripts:
DTU Evaluation
Follow these steps for the DTU evaluation:
- Download processed camera parameters from here.
- Unzip the archive and replace the old camera files in the cams_1 folders.
- Update the eval_dtu.sh script with the correct directories.
- Run the evaluation script:
bash eval_dtu.sh
Tanks & Temples Evaluation
For Tanks & Temples:
- Modify eval_tanks.sh to set the correct directories.
- Run:
bash eval_tanks.sh
ETH3D Evaluation
For ETH3D:
- Adjust the eval_eth.sh script.
- Execute with:
bash eval_eth.sh
Custom Dataset Evaluation
Custom dataset evaluations can utilize COLMAP results:
- Run colmap_input.py to convert the COLMAP results into the required format.
- Test with:
bash eval_custom.sh
Training Models
Let’s dive into training for DTU and BlendedMVS:
Training on DTU
Prepare the DTU dataset:
- Download the pre-processed DTU training set from here.
- Unzip it into the root_directory so that it contains the Cameras_1 folder.
- Edit train_dtu.sh for paths.
- Run:
bash train_dtu.sh
Training on BlendedMVS
Follow these steps to train on BlendedMVS:
- Download the dataset from here.
- Edit train_blend.sh as necessary.
- Execute:
bash train_blend.sh
Troubleshooting
If you encounter issues during installation or execution, try the following:
- Ensure Python and CUDA versions are compatible.
- Check your dataset paths in scripts for accuracy.
- If the evaluation scripts fail, ensure dependencies are installed as per requirements.
- Restart your machine and verify the GPU settings if you face runtime errors.
For additional assistance or insights, feel free to connect with others on the subject or visit our collaborative platform at fxis.ai.
Conclusion
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.