| Jamba | |
|---|---|
| Developer | AI21 Labs |
| Released | 28 March 2024 |
| Type | Large language model |
| License | Apache 2.0 License |
Jamba is an open-weights large language model (LLM) developed by AI21 Labs.[1] [2] It is built on a novel hybrid architecture that combines a Mamba-based state space model (SSM) with transformer layers.[3] [4] The model has 52 billion parameters in total and is trained using a mixture-of-experts (MoE) technique, with 12 billion parameters active per token. Jamba supports a context window of up to 256K tokens, of which up to 140K tokens fit on a single 80 GB GPU, and it is the largest Mamba-variant LLM created to date.
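Because the weights are openly released, the model can be loaded with standard open-source tooling. The following is a minimal sketch using the Hugging Face transformers library; the repository identifier ai21labs/Jamba-v0.1 and the required library version are assumptions not stated in this article.

```python
# Minimal sketch: loading the open-weights Jamba checkpoint with Hugging Face
# transformers. The repository id "ai21labs/Jamba-v0.1" and the version
# requirement (a recent transformers release with Jamba support) are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/Jamba-v0.1"  # assumed Hugging Face repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # spread the 52B-parameter weights across available GPUs
    torch_dtype="auto",  # use the checkpoint's native precision
)

# Only ~12B parameters are active per token due to MoE routing, but the full
# 52B parameters must still be held in memory.
inputs = tokenizer(
    "Jamba is a hybrid SSM-transformer model that", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```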
Jamba performs well on key measures such as throughput and efficiency, and it matches or outperforms other state-of-the-art models in its class on a wide range of benchmarks, while its significantly larger context limit enables use cases that require extended context. The model is released with open weights under the Apache 2.0 license.[5]
The company plans to release an instruct-tuned version in beta on the AI21 Platform in the near future.