Welcome to an exciting journey through NeO 360, a cutting-edge implementation for synthesizing 360° views of outdoor scenes from only a few input images using neural fields.
Getting Started: Setting Up the Environment
Before diving into the code, let’s prepare our environment. Think of it as setting the stage before the main performance. Here’s how you can do it:
- First, move into your cloned NeO-360 repository and create a Python 3.7 virtual environment using the following commands:

```bash
cd NeO-360   # root of your NeO-360 checkout
conda create -n neo360 python=3.7
conda activate neo360
pip install --upgrade pip
pip install -r requirements.txt
pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 -f https://download.pytorch.org/whl/torch_stable.html
export neo360_rootdir=$PWD
```
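Before moving on, it’s worth confirming that the pinned PyTorch build can actually see your GPU. Here is a minimal sanity check; the expected version string is taken from the install command above:

```python
# Minimal environment sanity check for the setup above.
import torch

print(torch.__version__)           # expected: 1.11.0+cu113
print(torch.cuda.is_available())   # should print True on a CUDA 11.3-capable machine
```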
Dataset: Downloading and Organizing
Next up, download the NERDS 360 Multi-View dataset, which provides the training and test scenes used throughout this guide. Like collecting the perfect ingredients for a recipe, each part of the data enhances your project:
- Download the datasets for training and testing using the following links:
- 🏠 NERDS360 Training Set – 75 Scenes (19.5 GB)
- 📝 NERDS360 Test Set – 5 Scenes (2.1 GB)
- ⚙️ NERDS360 Colmap – 10 Scenes (2.1 GB)
- Extract the archives under your data directory and ensure the structure looks like $neo360_rootdir/data/PDMultiObjv6 (you can verify the layout with the quick check below).
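A small script like the following can confirm the expected layout. This is a minimal sketch; the train and test split directory names are assumptions based on the download links above, so adjust them if your extraction differs:

```python
# Layout check for the extracted dataset (split directory names assumed).
import os

root = os.path.join(os.environ["neo360_rootdir"], "data", "PDMultiObjv6")
for split in ("train", "test"):
    path = os.path.join(root, split)
    print(f"{path}: {'found' if os.path.isdir(path) else 'MISSING'}")
```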
Visualizing Your Dataset
To visualize the dataset’s point clouds and camera annotations, run:
```bash
python visualize/visualize_nerds360.py --base_dir PDMultiObjv6/train/SF_GrantAndCalifornia10
```
Loading Data for Training
Think of the dataloaders as the trusted chefs who will prepare your ingredients for cooking. Here’s how to set them up:
- We provide two powerful dataloaders for different scenarios:
  - The single-scene overfitting dataloader, located in datasets/nerds360.py.
  - The generalizable training dataloader, found in datasets/nerds360_ae.py.
- Either can be used with any NeRF implementation via the read_poses function (see the sketch below).
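To give a feel for how this plugs into your own pipeline, here is a hypothetical usage sketch. The import path follows the file locations above, but the argument and return value of read_poses are assumptions; check datasets/nerds360.py for the real signature:

```python
# Hypothetical sketch; see datasets/nerds360.py for the actual signature.
from datasets.nerds360 import read_poses

# Load camera annotations for one scene (scene path taken from the
# visualization command earlier in this guide).
poses = read_poses("data/PDMultiObjv6/train/SF_GrantAndCalifornia10")
print(type(poses))  # inspect the return value before wiring it into your NeRF
```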
Inference: Running Your Model
After setting everything up, it’s time for the show! Here’s how to run inference:
- Download the validation split and pre-trained checkpoint from the provided links, and extract them under the project folder.
- Visualize renderings from a few source views with the following command:
```bash
python run.py --dataset_name nerds360_ae \
  --exp_type triplanar_nocs_fusion_conv_scene --exp_name multi_map_tp_CONV_scene \
  --encoder_type resnet --batch_size 1 --img_wh 320 240 \
  --eval_mode vis_only --render_name 5viewtest_novelobj30_SF0_360_LPIPS \
  --ckpt_path finetune_lpips_epoch=30.ckpt \
  --root_dir $neo360_rootdir/data/neo360_val/split/test_novelobj
```
Generalizable Training: Training and Fine-tuning
Training NeO 360 is like honing a skill through practice: a first pass to learn the broad strokes, then fine-tuning to polish the details. Here’s how:
- Stage 1 trains with a photometric loss for several epochs; stage 2 then fine-tunes from the resulting checkpoint with an additional LPIPS perceptual loss (a sketch of the combined objective follows the commands):

```bash
# Stage 1: train with photometric loss
python run.py --dataset_name nerds360_ae --root_dir $neo360_rootdir/data/PDMultiObjv6/train \
  --exp_type triplanar_nocs_fusion_conv_scene --exp_name multi_map_tp_CONV_scene \
  --encoder_type resnet --batch_size 1 --img_wh 320 240 --num_gpus 8

# Stage 2: fine-tune with LPIPS from the stage-1 checkpoint
python run.py --dataset_name nerds360_ae --root_dir $neo360_rootdir/data/PDMultiObjv6/train \
  --exp_type triplanar_nocs_fusion_conv_scene --exp_name multi_map_tp_CONV_scene \
  --encoder_type resnet --batch_size 1 --img_wh 320 240 --num_gpus 8 \
  --ckpt_path epoch=29.ckpt --finetune_lpips
```
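To make the two stages concrete, here is a minimal sketch of what such a combined objective could look like. This is an illustration rather than the repository’s actual training code; the weighting term lambda_lpips and the [0, 1] input range are assumptions:

```python
# Illustrative two-stage objective (not the repo's exact implementation).
import torch
import lpips  # pip install lpips

photometric = torch.nn.MSELoss()
perceptual = lpips.LPIPS(net="vgg")  # LPIPS expects inputs scaled to [-1, 1]

def training_loss(pred_rgb, gt_rgb, finetune_lpips=False, lambda_lpips=0.1):
    """pred_rgb, gt_rgb: NCHW tensors in [0, 1] (range assumed for this sketch)."""
    loss = photometric(pred_rgb, gt_rgb)   # stage 1: photometric term only
    if finetune_lpips:                     # stage 2: add the perceptual term
        loss = loss + lambda_lpips * perceptual(
            pred_rgb * 2 - 1, gt_rgb * 2 - 1).mean()
    return loss
```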
Troubleshooting Tips
Don’t fret if things don’t go as planned—here are some handy troubleshooting ideas:
- If you find the compute requirements too high, consider the memory-optimization strategies discussed in the project’s GitHub thread.
- If you’re facing compatibility issues, make sure your Python and PyTorch versions match the requirements; the quick check below prints the relevant versions.
- For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
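For the version check mentioned above, something like this will print the values that matter; the expected versions in the comments come from the install commands earlier in this guide:

```python
# Print the versions relevant to the pinned requirements.
import sys
import torch

print(sys.version.split()[0])  # expected: 3.7.x
print(torch.__version__)       # expected: 1.11.0+cu113
print(torch.version.cuda)      # expected: 11.3
```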
Conclusion
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.

