Welcome to your guide to Collaborative Neural Rendering (CoNR) with anime character sheets! This method lets you turn hand-drawn anime character sheets into animated videos, and we’ll walk you through every step you need to get started, addressing potential bumps along the way.
Introduction
The CoNR project, accepted to the Special Track of IJCAI 2023, presents a unique approach to generating vivid dancing videos from hand-drawn anime character sheets. Detailed demos showcasing the process are available on BiliBili and YouTube.
Prerequisites
- NVIDIA GPU + CUDA + CUDNN
- Python 3.6
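Before installing anything, it helps to confirm that the GPU driver and Python version are visible from your shell. This is only a quick sanity check, assuming nvidia-smi (and optionally the CUDA toolkit's nvcc) are on your PATH:

```bash
# Confirm the NVIDIA driver can see the GPU and report its memory
nvidia-smi

# If the CUDA toolkit is installed, report the compiler version
nvcc --version

# Confirm the Python interpreter meets the version requirement
python --version
```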
Installation Instructions
Follow these steps to install CoNR:
1. Clone the Repository
```bash
git clone https://github.com/megvii-research/CoNR
```
2. Install Dependencies
- Navigate into the CoNR directory and install the required packages:
```bash
cd CoNR
pip install -r requirements.txt
```
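If you prefer to keep the dependencies isolated from your system Python, you can create a virtual environment first and run the install inside it. This is an optional sketch using Python's built-in venv module; the environment name conr-env is just an example:

```bash
# Create and activate an isolated environment (the name is arbitrary)
python -m venv conr-env
source conr-env/bin/activate

# Install the project dependencies inside the environment
pip install -r requirements.txt
```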
3. Download Weights
Next, you’ll need to download the weights:
```bash
mkdir weights
cd weights
gdown https://drive.google.com/uc?id=1M1LEpx70tJ72AIV2TQKr6NE_7mJ7tLYxg
gdown https://drive.google.com/uc?id=1YvZy3NHkJ6gC3pq_j8agcbEJymHCwJy0g
gdown https://drive.google.com/uc?id=1AOWZxBvTo9nUf2_9Y7Xe27ZFQuPrnx9i
gdown https://drive.google.com/uc?id=19jM1-GcqgGoE1bjmQycQw_vqD9C5e-Jm
```
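Once the downloads finish, it is worth confirming that all four checkpoint files landed in the weights directory before moving on (the exact file names depend on what the Google Drive links serve), then return to the repository root:

```bash
# You should see the four downloaded checkpoint files listed with sensible sizes
ls -lh

# Go back to the repository root for the next steps
cd ..
```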
Prepare Your Inputs
Now that you have everything set up, it’s time to prepare your inputs:
- Download the Ultra-Dense Pose (UDP) sequences for two characters, which you’ll find in the CoNR_Dataset.
- You can also create custom character sheets by drawing your own designs; save them as PNG files with the background removed (a rough trimming sketch follows this list).
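If your drawings still sit on a solid background, one way to produce trimmed, transparent PNGs is with ImageMagick. This is only a rough sketch: the file names are placeholders, and the 10% fuzz against a white background is an assumption you may need to tune (careful manual matting in an image editor usually gives cleaner sheets):

```bash
# Make a (near-)white background transparent, then crop away the empty border
# Adjust -fuzz and the background color to match your scans
convert character_raw.png -fuzz 10% -transparent white -trim +repage character_sheet.png
```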
Running CoNR
You have two ways to run CoNR:
Method 1: Using Web UI
```bash
streamlit run streamlit.py --server.port=8501
```
Then, open your browser and navigate to localhost:8501 to follow the on-screen instructions for generating your video.
Method 2: Via Terminal
```bash
mkdir dir_to_save_result
python -m torch.distributed.launch --nproc_per_node=1 train.py --mode=test \
    --world_size=1 --dataloaders=2 \
    --test_input_poses_images=dir_to_poses \
    --test_input_person_images=dir_to_character_sheet \
    --test_output_dir=dir_to_save_result \
    --test_checkpoint_dir=dir_to_weights
ffmpeg -r 30 -y -i dir_to_save_result/%d.png -r 30 -c:v libx264 output.mp4 -r 30
```
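To keep the paths consistent across the two commands, you can wrap the terminal workflow in a small script. The directory names below are placeholders; point them at your own pose sequence, character sheet, and weights folders:

```bash
#!/usr/bin/env bash
set -e  # stop at the first error

# Placeholder paths: replace with your own directories
POSES=dir_to_poses
SHEET=dir_to_character_sheet
WEIGHTS=weights
OUT=dir_to_save_result

mkdir -p "$OUT"

# Render one frame per pose image into $OUT
python -m torch.distributed.launch --nproc_per_node=1 train.py --mode=test \
    --world_size=1 --dataloaders=2 \
    --test_input_poses_images="$POSES" \
    --test_input_person_images="$SHEET" \
    --test_output_dir="$OUT" \
    --test_checkpoint_dir="$WEIGHTS"

# Stitch the numbered frames into a 30 fps H.264 video
ffmpeg -r 30 -y -i "$OUT"/%d.png -r 30 -c:v libx264 output.mp4 -r 30
```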
Troubleshooting
If you encounter issues, consider the following (a few quick checks are sketched after this list):
- Ensure that your dependencies are correctly installed.
- Check that the paths to your directories are accurate.
- Verify you have sufficient GPU resources available.
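A few quick commands cover most of these checks; the PyTorch one-liner assumes the requirements are already installed:

```bash
# Confirm PyTorch was built with CUDA support and can see the GPU
python -c "import torch; print(torch.cuda.is_available(), torch.cuda.device_count())"

# Check how much GPU memory is free before launching
nvidia-smi --query-gpu=memory.used,memory.total --format=csv

# Make sure the directories you pass on the command line actually exist
ls dir_to_poses dir_to_character_sheet weights dir_to_save_result
```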
Don’t hesitate to reach out! For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
Conclusion
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.
Further Reading
For more information about the CoNR project, refer to the arXiv paper and the related poster.