Recent research from Carnegie Mellon and Princeton introduces selective state space models (S6), a novel sequence modeling technique that achieves performance comparable to Transformers while scaling linearly in sequence length.
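To make the "selective" idea concrete, here is a minimal NumPy sketch of the per-step recurrence: the step size Δ and the matrices B and C are computed from the current input (that is the selection mechanism), and the hidden state is updated once per time step, which is where the linear scaling in sequence length comes from. Names like `selective_ssm`, `W_B`, `W_C`, and `W_delta` are illustrative assumptions, not the paper's implementation, which uses a hardware-aware parallel scan rather than this plain Python loop.

```python
import numpy as np

def selective_ssm(x, A, W_B, W_C, W_delta):
    """Toy selective SSM recurrence (illustrative sketch only).

    x:       (T, d) input sequence
    A:       (d, n) state-transition parameters (kept negative for stability)
    W_B, W_C: (d, n) projections making B and C input-dependent ("selective")
    W_delta: (d, d) projection producing the input-dependent step size Δ
    Runtime is O(T): one constant-cost state update per time step.
    """
    T, d = x.shape
    n = A.shape[1]
    h = np.zeros((d, n))                      # hidden state
    y = np.zeros((T, d))
    for t in range(T):
        xt = x[t]
        delta = np.log1p(np.exp(xt @ W_delta))[:, None]  # softplus Δ, shape (d, 1)
        B = (xt @ W_B)[None, :]                          # (1, n), depends on input
        C = xt @ W_C                                     # (n,),  depends on input
        A_bar = np.exp(delta * A)                        # discretized transition
        h = A_bar * h + delta * B * xt[:, None]          # state update
        y[t] = h @ C                                     # readout
    return y

# Toy usage with hypothetical sizes
rng = np.random.default_rng(0)
T, d, n = 16, 4, 8
y = selective_ssm(
    rng.normal(size=(T, d)),
    -np.abs(rng.normal(size=(d, n))),   # negative A keeps the recurrence stable
    0.1 * rng.normal(size=(d, n)),
    0.1 * rng.normal(size=(d, n)),
    0.1 * rng.normal(size=(d, d)),
)
print(y.shape)  # (16, 4)
```

Because B, C, and Δ vary with the input, the model can decide at each step how much of the incoming token to write into the state and how much of the state to read out, something a fixed (time-invariant) state space model cannot do.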