BLOOM was developed during the BigScience Workshop as an open-source alternative to GPT-3. It has since been superseded by more recent models based on Meta's LLaMA.
Initial release: 2022-07-06
MPT-7B is a family of models in MosaicML's Foundation Series. Trained on 1T tokens, MPT-7B is stated by its developers to match the performance of LLaMA while also being open source. Alongside the base model, MosaicML offers MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which is trained with a 65K-token context length.
Initial release: 2023-05-05
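For reference, MPT checkpoints are published on the Hugging Face Hub and can be loaded with the `transformers` library. The sketch below is not an official example: it assumes the `mosaicml/mpt-7b-instruct` repository id and that `trust_remote_code=True` is required (MPT ships custom modeling code); swap in the base, Chat, or StoryWriter variants as needed.

```python
# Minimal sketch: loading an MPT-7B variant via Hugging Face transformers.
# The repo id and trust_remote_code requirement are assumptions based on
# MosaicML's public model cards, not an official MosaicML example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mosaicml/mpt-7b-instruct"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # bf16 to reduce memory footprint
    trust_remote_code=True,       # MPT uses custom modeling code from the repo
)

prompt = "Summarize the difference between BLOOM and MPT-7B in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```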
| Products & Features | BLOOM | MPT-7B |
| --- | --- | --- |
| License | Open RAIL-M v1 | Apache 2.0 |
| Model Sizes | 1.1B, 1.7B, 3B, 7.1B, 176B | 7B |