Apple has released OpenELM, a family of eight open-source large language models designed to run on-device rather than through cloud servers.
Four of the models are pre-trained variants, trained with Apple's CoreNet library, and four are instruction-tuned versions. OpenELM uses a layer-wise scaling strategy that allocates parameters non-uniformly across transformer layers to improve accuracy and efficiency.
The company has also published the training code, training logs, and multiple checkpoints of the models for anyone to use; all of the releases are available on the Hugging Face Hub.
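As a minimal sketch of what pulling these releases from the Hugging Face Hub could look like: the snippet below enumerates the eight checkpoints and shows one way to load a model with the `transformers` library. The exact `apple/OpenELM-*` repository names and the need for `trust_remote_code` are assumptions of this sketch; verify them on the Hub before use.

```python
# Assumed repo naming on the Hugging Face Hub; check the apple/ namespace
# on the Hub itself, since these IDs are an assumption of this sketch.
SIZES = ["270M", "450M", "1_1B", "3B"]

# Four pre-trained models plus four instruction-tuned ("-Instruct") variants.
OPENELM_MODELS = [f"apple/OpenELM-{s}" for s in SIZES] + [
    f"apple/OpenELM-{s}-Instruct" for s in SIZES
]


def load_openelm(repo_id: str):
    """Load one checkpoint with Hugging Face transformers (downloads weights).

    trust_remote_code=True is assumed here because OpenELM ships custom
    modeling code alongside the weights.
    """
    from transformers import AutoModelForCausalLM  # third-party dependency

    return AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)


if __name__ == "__main__":
    # List the eight released checkpoints without downloading anything.
    for repo_id in OPENELM_MODELS:
        print(repo_id)
```

Calling `load_openelm("apple/OpenELM-270M")` would fetch the smallest checkpoint; the listing itself runs offline.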
Source: linux.org.ru