Generated with:

from transformers import AutoTokenizer

# Load the tokenizer and confirm the fast (Rust-based) implementation is in use,
# so that a tokenizer.json file is written on save.
tokenizer = AutoTokenizer.from_pretrained("unitary/toxic-bert")
assert tokenizer.is_fast
tokenizer.save_pretrained("...")

This was generated for use in Bumblebee, which relies on the Rust-based tokenizer. For more information, see:

https://github.com/elixir-nx/bumblebee?tab=readme-ov-file#tokenizer-support
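As a quick sanity check (not part of the original export), the saved tokenizer.json can be loaded directly with the Rust-backed tokenizers library, which is the file Bumblebee consumes. The directory name below is a placeholder for wherever save_pretrained wrote its output:

from tokenizers import Tokenizer

# Hypothetical path: point it at the directory passed to save_pretrained above.
tok = Tokenizer.from_file("toxic-bert-tokenizer/tokenizer.json")

encoding = tok.encode("Bumblebee uses the Rust-based tokenizer.")
print(encoding.tokens)
print(encoding.ids)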
