Agent Skills: huggingface-tokenizers

Fast tokenizers optimized for research and production. Rust-based implementation tokenizes 1GB in <20 seconds. Supports BPE, WordPiece, and Unigram algorithms. Train custom vocabularies, track alignments, handle padding/truncation. Integrates seamlessly with transformers. Use when you need high-performance tokenization or custom tokenizer training.
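The workflow described above (train a custom vocabulary, then tokenize with alignment tracking) can be sketched with the `tokenizers` Python package. This is a minimal illustration, not part of the skill itself; the tiny in-memory corpus and the `vocab_size` value are stand-ins for real training data and settings.

```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.trainers import BpeTrainer
from tokenizers.pre_tokenizers import Whitespace

# Build a BPE tokenizer with an unknown-token fallback.
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()

# Train a small vocabulary from an in-memory corpus (an illustrative
# stand-in for real training files).
trainer = BpeTrainer(
    special_tokens=["[UNK]", "[CLS]", "[SEP]", "[PAD]"],
    vocab_size=200,
)
corpus = ["Fast tokenizers for research and production."] * 100
tokenizer.train_from_iterator(corpus, trainer=trainer)

# Encodings carry alignment info: each token maps back to
# (start, end) character offsets in the original input.
encoding = tokenizer.encode("Fast tokenizers for research")
print(encoding.tokens)
print(encoding.offsets)
```

For file-based corpora, `tokenizer.train(files=[...], trainer=trainer)` plays the same role, and `tokenizer.enable_padding()` / `tokenizer.enable_truncation()` cover the padding and truncation behavior mentioned above.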

Category: Uncategorized
ID: davila7/claude-code-templates/huggingface-tokenizers

Install this agent skill locally:

pnpm dlx add-skill https://github.com/davila7/claude-code-templates/huggingface-tokenizers
