MPT-7B and MPT-30B are a pair of models in MosaicML's Foundation Series. The developers state that MPT-7B, trained on 1T tokens, matches the performance of LLaMA while also being open source, and that MPT-30B outperforms the original GPT-3. In addition to the base models, the developers also offer MPT-Instruct, MPT-Chat, and MPT-7B-StoryWriter-65k+, the last of which is trained on a context length of 65K tokens.
Initial release: 2023-05-05
The StableLM series of language models is Stability AI's entry into the LLM space. The initial release comprised 3B- and 7B-parameter models trained on The Pile, with larger models on the way.
Initial release: 2023-04-19
| Products & Features | MPT | StableLM |
| --- | --- | --- |
| License | Apache 2.0 | CC BY-SA 4.0 |
| Model Sizes | 7B, 30B | 3B, 7B |
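
Since both families publish open weights, a minimal sketch of loading and sampling from either model with the Hugging Face transformers library follows. The repository IDs (mosaicml/mpt-7b, stabilityai/stablelm-base-alpha-7b) and the trust_remote_code detail are assumptions not stated above.

```python
# Minimal sketch: loading one of these open-weight models for text
# generation with Hugging Face transformers. The repo IDs below are
# assumptions about where the checkpoints are hosted on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mosaicml/mpt-7b"  # or "stabilityai/stablelm-base-alpha-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# MPT ships custom modeling code on the Hub, so trust_remote_code=True
# is needed for it; StableLM loads with the stock GPT-NeoX classes.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Open-source language models are"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```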