
SpikingBrain 7B

Spiking Brain-inspired Large Models, integrating hybrid efficient attention, MoE modules, and spike encoding into its architecture.

License: MIT
Updated: Today

What it does

📄 Technical Report: Chinese | English
🚀 Arxiv: arXiv:2509.05276
🧩 Models: Available Models
🔗 Demo: OpenBayes

Inspired by brain mechanisms, SpikingBrain integrates hybrid efficient attention, MoE modules, and spike encoding into its architecture, supported by a universal conversion pipeline compatible with the open-source model ecosystem. This enables continual pre-training with less …
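The card mentions spike encoding as one of the architecture's ingredients. As a rough intuition for what spike encoding means, here is a minimal integrate-and-fire-style threshold encoder in Python. This is an illustrative toy only; the function name, threshold scheme, and everything else here are assumptions, not SpikingBrain's actual encoder (see the technical report for that).

```python
def spike_encode(values, threshold=1.0):
    """Toy integrate-and-fire spike encoder (illustrative, NOT SpikingBrain's).

    Accumulates each input value into a running potential and emits a
    binary spike (1) whenever the potential crosses the threshold,
    subtracting the threshold on each firing (soft reset).
    """
    potential = 0.0
    spikes = []
    for v in values:
        potential += v
        if potential >= threshold:
            spikes.append(1)
            potential -= threshold  # soft reset keeps the residual charge
        else:
            spikes.append(0)
    return spikes


# Dense activations become a sparse binary spike train:
print(spike_encode([0.4, 0.5, 0.3, 0.9]))  # → [0, 0, 1, 1]
```

The appeal of such encodings is that downstream computation can operate on sparse binary events instead of dense floating-point activations, which is part of the efficiency story the card alludes to.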

Getting Started

git clone https://github.com/BICLab/SpikingBrain-7B

Platforms

🪟 Windows · 🍎 Mac · 🐧 Linux

Install Difficulty

moderate

Built With

python
