AI2’s open-source OLMo model gets a more diversified dataset, two-stage curriculum