Generating adversarial examples for NLP models
The full TextAttack documentation is available on ReadTheDocs.
About
TextAttack is a Python framework for adversarial attacks, data augmentation, and model training in NLP. It provides a rich set of tools to understand and improve NLP models through adversarial examples and data manipulation.
If you’re looking for information about TextAttack’s menagerie of pre-trained models, you might want to check out the TextAttack Model Zoo.
Setup
Installation
- Ensure you are running Python 3.6+ to use this package.
- A CUDA-compatible GPU is optional but will greatly improve code speed.
- Install TextAttack using pip:
pip install textattack
- Once installed, you can run it via the command line or as a Python module:
textattack ...
python -m textattack ...
- Tip: TextAttack downloads files to ~/.cache/textattack by default, including pretrained models, dataset samples, and the configuration file config.yaml. To change the cache path, set the environment variable TA_CACHE_DIR (e.g., TA_CACHE_DIR=/tmp textattack attack ...); a Python equivalent is sketched below.
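If you drive TextAttack from Python rather than the shell, the same cache override can be applied in-process. Below is a minimal sketch; it assumes TA_CACHE_DIR is consulted when textattack is first imported, and the /tmp/textattack_cache path is purely illustrative.
```python
# Minimal sketch: redirecting TextAttack's cache from inside a Python script.
# Assumption: TA_CACHE_DIR is read when textattack is first imported, so the
# environment variable must be set before the import. The path is illustrative.
import os

os.environ["TA_CACHE_DIR"] = "/tmp/textattack_cache"

import textattack  # downloads (models, datasets, config.yaml) now use the path above
```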
Usage
Help: textattack --help
TextAttack’s main features can all be accessed via the textattack command. Common commands include:
textattack attack <args>
textattack augment <args>
For more information about commands, run:
textattack --help
To see help for a specific command, use:
textattack attack --help
Running Attacks
The easiest way to experiment with an attack is via the command line:
textattack attack
Tip: If your machine has multiple GPUs, you can distribute the attack across them using the --parallel option.
Examples of Running Attacks
TextAttack offers various built-in attack recipes. Here are several ways to try them out:
- TextFooler on BERT (MR sentiment classification dataset):
textattack attack --recipe textfooler --model bert-base-uncased-mr --num-examples 100
- DeepWordBug on DistilBERT (Quora Question Pairs):
textattack attack --model distilbert-base-uncased-cola --recipe deepwordbug --num-examples 100
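The same recipes can also be run programmatically. Below is a minimal sketch against TextAttack's Python API (assuming version 0.3+); the checkpoint textattack/bert-base-uncased-imdb and the imdb dataset are illustrative choices, and any compatible Hugging Face classifier and dataset should work the same way.
```python
# Minimal sketch of running an attack recipe from Python (assumes TextAttack 0.3+,
# transformers, and the textattack/bert-base-uncased-imdb checkpoint on the HF Hub).
import transformers

from textattack import Attacker, AttackArgs
from textattack.attack_recipes import TextFoolerJin2019
from textattack.datasets import HuggingFaceDataset
from textattack.models.wrappers import HuggingFaceModelWrapper

# Wrap a Hugging Face classifier so TextAttack can query it.
model = transformers.AutoModelForSequenceClassification.from_pretrained("textattack/bert-base-uncased-imdb")
tokenizer = transformers.AutoTokenizer.from_pretrained("textattack/bert-base-uncased-imdb")
model_wrapper = HuggingFaceModelWrapper(model, tokenizer)

# Build the TextFooler recipe against the wrapped model and pick a dataset.
attack = TextFoolerJin2019.build(model_wrapper)
dataset = HuggingFaceDataset("imdb", split="test")

# Attack 10 examples and print results as they come in.
attack_args = AttackArgs(num_examples=10)
Attacker(attack, dataset, attack_args).attack_dataset()
```
This is roughly the programmatic counterpart of the textattack attack --recipe textfooler command above.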
Design
TextAttack is model-agnostic and can analyze any model that outputs IDs, tensors, or strings. It also ships with pre-trained models for common NLP tasks and built-in loading for common benchmark datasets.
To formulate an attack, you need the following four components (sketched in Python after the list):
- Goal Function: Determines if the attack has succeeded.
- Constraints: Define which perturbations are valid.
- Transformation: Generates possible modifications given an input.
- Search Method: Explores the space of transformations.
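As a concrete illustration, the sketch below shows how those four components plug into an Attack object via TextAttack's Python API. The specific choices (untargeted classification goal, embedding-based word swaps, greedy word-importance search) are assumptions that roughly mirror the TextFooler recipe rather than the only valid combination, and model_wrapper stands for any wrapped model, such as the HuggingFaceModelWrapper used earlier.
```python
# Minimal sketch: assembling a custom attack from the four components.
# The component choices roughly mirror the TextFooler recipe and are
# illustrative, not prescriptive.
from textattack import Attack
from textattack.constraints.pre_transformation import (
    RepeatModification,
    StopwordModification,
)
from textattack.goal_functions import UntargetedClassification
from textattack.search_methods import GreedyWordSwapWIR
from textattack.transformations import WordSwapEmbedding


def build_simple_attack(model_wrapper):
    """Compose an Attack from a goal function, constraints, a transformation,
    and a search method, given any TextAttack model wrapper."""
    # Goal function: succeed when the model's predicted label changes.
    goal_function = UntargetedClassification(model_wrapper)

    # Constraints: never modify the same word twice, and skip stopwords.
    constraints = [RepeatModification(), StopwordModification()]

    # Transformation: swap words with nearest neighbors in embedding space.
    transformation = WordSwapEmbedding(max_candidates=20)

    # Search method: greedy search ordered by word importance ranking.
    search_method = GreedyWordSwapWIR()

    return Attack(goal_function, constraints, transformation, search_method)
```
The returned attack can then be handed to Attacker just like a built-in recipe.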
Troubleshooting
If you encounter any issues, consider the following:
- Ensure that Python and pip are updated to the latest versions.
- If using a GPU, verify that compatible CUDA drivers are installed.
- Verify the dataset paths and ensure they are correctly loaded.
For more insights and updates, or to collaborate on AI development projects, stay connected with fxis.ai.
Conclusion
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.


