SpikingBrain 7B
Spiking Brain-inspired Large Models, integrating hybrid efficient attention, MoE modules and spike encoding into its ...
What it does
- Technical Report: Chinese | English
- arXiv: arXiv:2509.05276
- Models: Available Models
- Demo: OpenBayes贝式计算

Inspired by brain mechanisms, SpikingBrain integrates hybrid efficient attention, MoE modules, and spike encoding into its architecture, supported by a universal conversion pipeline compatible with the open-source model ecosystem. This enables continual pre-training with less …
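To make the "spike encoding" idea concrete, here is a minimal, hypothetical sketch of integrate-and-fire style encoding: continuous activations are accumulated over discrete time steps, and a binary spike is emitted whenever the accumulated potential crosses a threshold. The function name, threshold, and step count are illustrative assumptions, not SpikingBrain's actual implementation.

```python
import numpy as np

def spike_encode(activations, threshold=1.0, steps=4):
    """Illustrative integrate-and-fire encoding (not the official scheme):
    accumulate activations over `steps` time steps and emit a binary
    spike whenever a neuron's membrane potential crosses `threshold`."""
    potential = np.zeros_like(activations, dtype=float)
    spikes = []
    for _ in range(steps):
        potential += activations          # integrate input each step
        fired = potential >= threshold    # neurons that cross threshold
        spikes.append(fired.astype(np.int8))
        potential[fired] -= threshold     # soft reset after firing
    return np.stack(spikes)               # shape: (steps, *activations.shape)

acts = np.array([0.3, 0.9, 1.5])
trains = spike_encode(acts)
print(trains.sum(axis=0))  # → [1 3 4]: spike counts track activation magnitude
```

Larger activations fire more often, so the spike count over the window approximates the original activation. This sparse, event-driven representation is what makes spiking architectures attractive for efficiency.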
Getting Started
```shell
git clone https://github.com/BICLab/SpikingBrain-7B
```