In machine learning, and in deep neural networks in particular, robustness to distribution shift is crucial: a model that scores well on its training distribution can degrade sharply on corrupted or out-of-distribution inputs. CrossNorm (CN) and SelfNorm (SN), the two techniques introduced in our CNSN paper at ICCV 2021, aim to improve generalization robustness effectively and simply. In this blog, we’ll walk you through the installation, data preparation, and usage of these normalization techniques.
Step-by-Step Guide to Setting Up CrossNorm and SelfNorm
1. Install Dependencies
First, let’s make sure you have the necessary dependencies installed. We recommend using conda for package management. Follow the instructions below:
conda create --name cnsn python=3.7
conda activate cnsn
conda install numpy
conda install pytorch==1.2.0 torchvision==0.4.0 cudatoolkit=10.0 -c pytorch
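If you prefer a single reproducible spec, the same environment can be captured in a conda environment file. The environment.yml below is a hypothetical file equivalent to the commands above, not something shipped with the repository:

```yaml
# environment.yml -- equivalent to the conda commands above (hypothetical file)
name: cnsn
channels:
  - pytorch
  - defaults
dependencies:
  - python=3.7
  - numpy
  - pytorch==1.2.0
  - torchvision==0.4.0
  - cudatoolkit=10.0
```

You would create the environment with conda env create -f environment.yml and then activate it as before.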
2. Prepare Datasets
Next, you’ll need to download the necessary datasets. We will be using CIFAR-10-C, CIFAR-100-C, and ImageNet-C. Follow the commands below:
mkdir -p data
curl -O https://zenodo.org/record/2535967/files/CIFAR-10-C.tar
curl -O https://zenodo.org/record/3555552/files/CIFAR-100-C.tar
tar -xvf CIFAR-100-C.tar -C data
tar -xvf CIFAR-10-C.tar -C data
mkdir -p data/ImageNet-C
curl -O https://zenodo.org/record/2235448/files/blur.tar
curl -O https://zenodo.org/record/2235448/files/digital.tar
curl -O https://zenodo.org/record/2235448/files/noise.tar
curl -O https://zenodo.org/record/2235448/files/weather.tar
tar -xvf blur.tar -C data/ImageNet-C
tar -xvf digital.tar -C data/ImageNet-C
tar -xvf noise.tar -C data/ImageNet-C
tar -xvf weather.tar -C data/ImageNet-C
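Once extracted, the CIFAR-C files are NumPy arrays rather than image folders: each corruption file (e.g. gaussian_noise.npy) stacks five severity levels of the 10,000-image test set, alongside a shared labels.npy. As a small illustration (not part of the official scripts), a helper to pull out one severity level might look like this:

```python
import numpy as np

def severity_slice(images, severity):
    """Return the 10,000 images for one corruption severity (1-5).

    CIFAR-10-C / CIFAR-100-C .npy files stack five severity levels of the
    10,000-image test set, so level s occupies rows (s-1)*10000 : s*10000.
    """
    assert 1 <= severity <= 5
    return images[(severity - 1) * 10000 : severity * 10000]

# Example usage (paths assume the extraction commands above):
# images = np.load("data/CIFAR-10-C/gaussian_noise.npy")  # shape (50000, 32, 32, 3)
# labels = np.load("data/CIFAR-10-C/labels.npy")          # shape (50000,)
# level3 = severity_slice(images, 3)
```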
3. Usage
Now that your environment is set up, you can start using CrossNorm and SelfNorm. Sample scripts are provided for CIFAR-10, CIFAR-100, and ImageNet. For example, the CIFAR-100 WideResNet scripts are:

- cifar100-scripts/wideresnet/run-cn.sh
- cifar100-scripts/wideresnet/run-sn.sh
- cifar100-scripts/wideresnet/run-cnsn.sh
- cifar100-scripts/wideresnet/run-cnsn-consist.sh (CNSN with JSD consistency regularization)
- cifar100-scripts/wideresnet/run-cnsn-augmix.sh (CNSN with AugMix)
4. Pretrained Models
To skip training from scratch, you can also make use of our pretrained ResNet-50 ImageNet classifiers.
Understanding the Code: An Analogy
Think of CrossNorm (CN) and SelfNorm (SN) as two chefs in a bustling kitchen, each with a specialty. CN swaps the seasoning between dishes: it exchanges channel-wise mean and standard deviation between the feature maps of different examples, enlarging the variety of styles the network sees during training. SN, by contrast, refines each dish on its own terms: it uses a small attention module to recalibrate a feature map’s own mean and standard deviation, downplaying styles that do not generalize. Together, one broadens and the other focuses the styles the network learns, so that accuracy holds up under distribution shifts.
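Beyond the analogy, the core CrossNorm operation is simple enough to sketch. The following is a minimal NumPy illustration, not the authors' implementation (which operates on PyTorch feature maps inside the network, during training only): it normalizes one feature map with its own channel statistics, then re-styles it with another's.

```python
import numpy as np

def crossnorm(a, b, eps=1e-5):
    """Minimal CrossNorm sketch: transfer b's channel-wise statistics onto a.

    a, b: feature maps of shape (C, H, W). A simplified illustration of the
    idea, not the official CNSN implementation.
    """
    mu_a = a.mean(axis=(1, 2), keepdims=True)
    std_a = a.std(axis=(1, 2), keepdims=True) + eps
    mu_b = b.mean(axis=(1, 2), keepdims=True)
    std_b = b.std(axis=(1, 2), keepdims=True) + eps
    # Normalize a with its own stats, then re-style it with b's stats.
    return (a - mu_a) / std_a * std_b + mu_b
```

After this transform, each channel of the output carries b’s mean and (approximately) b’s standard deviation while keeping a’s content.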
Troubleshooting
If you encounter any issues during installation or while running your scripts, consider the following troubleshooting steps:
- Ensure that you have activated the conda environment using conda activate cnsn.
- Check that you have the necessary permissions to download files to the specified directories.
- If the datasets are missing or not extracted properly, re-run the download and extraction commands.
- Confirm that your CUDA toolkit version matches the PyTorch installation requirements.
For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
Conclusion
By leveraging CrossNorm and SelfNorm, you can significantly improve your model’s robustness against distribution shifts. At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.

