Agent Skills: mamba-architecture

A state-space model with O(n) complexity, versus the Transformer's O(n²). It offers up to 5× faster inference, handles million-token sequences, and needs no KV cache, using a selective SSM with a hardware-aware design. Covers Mamba-1 (d_state=16) and Mamba-2 (d_state=128, multi-head), with pretrained models from 130M to 2.8B parameters on HuggingFace.
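To make the O(n) claim concrete, here is a minimal NumPy sketch of the selective state-space recurrence that Mamba is built on. The shapes, parameter names, and the simplified discretization (A_bar = exp(delta*A), B_bar = delta*B) are illustrative assumptions, not this skill's or the repository's actual code; the real models replace this Python loop with a fused, hardware-aware parallel scan.

    import numpy as np

    def selective_scan(x, A, B, C, delta):
        # x:     (L, D)     input sequence (L tokens, D channels)
        # A:     (D, N)     state matrix (diagonal, kept negative for stability)
        # B:     (L, D, N)  input-dependent ("selective") input projection
        # C:     (L, D, N)  input-dependent output projection
        # delta: (L, D)     input-dependent step size
        # Returns y: (L, D)
        L, D = x.shape
        N = A.shape[1]
        h = np.zeros((D, N))                  # fixed-size recurrent state: no KV cache
        y = np.zeros((L, D))
        for t in range(L):                    # one O(1) update per token => O(n) overall
            A_bar = np.exp(delta[t][:, None] * A)    # discretized state transition
            B_bar = delta[t][:, None] * B[t]         # simplified discretized input matrix
            h = A_bar * h + B_bar * x[t][:, None]    # state update
            y[t] = np.einsum("dn,dn->d", C[t], h)    # per-channel readout
        return y

    # Toy usage with d_state = 16, as in Mamba-1 (random, purely illustrative values):
    rng = np.random.default_rng(0)
    L, D, N = 8, 4, 16
    x = rng.standard_normal((L, D))
    A = -np.exp(rng.standard_normal((D, N)))
    B = rng.standard_normal((L, D, N))
    C = rng.standard_normal((L, D, N))
    delta = np.exp(0.1 * rng.standard_normal((L, D)))
    print(selective_scan(x, A, B, C, delta).shape)   # (8, 4)

Because B, C, and delta are computed from the input at each step, the model can selectively keep or forget state per token, which is what distinguishes Mamba's selective SSM from earlier fixed-parameter state-space layers.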

Category: Uncategorized
ID: davila7/claude-code-templates/mamba-architecture

Install this agent skill locally:

pnpm dlx add-skill https://github.com/davila7/claude-code-templates/mamba-architecture

Skill Files

Browse the full folder contents for mamba-architecture.
