Apple launches LLM: OpenELM
Apple just dropped their own large language model: OpenELM. Let’s talk about it…
What makes it different from other large language models (LLMs)?
1. Designed for single devices:
Apple’s OpenELM is designed to run on a single device (like an iPhone), whereas LLMs like ChatGPT and Gemini rely more heavily on cloud computing.
In other words, it’s meant to be faster, more accurate & more efficient for people using mobile devices.
2. Developer-Centric:
OpenELM is currently geared toward developers & researchers, specifically so they can build and customize new apps using its underlying technology.
While Apple may eventually use OpenELM's tech in future user-facing applications, the initial focus is on developer functionality.
By releasing it as open source, Apple gives researchers and developers more freedom to adapt the model, which could lead to a broader array of apps built on OpenELM technology for mobile device users.
As with any AI model, it's only as reliable as the data it's trained on.
3. Better Outputs on Less Training Data
Ok, so I just said that an AI is only as good as its training data. What gives?!
Generally, more data means more understanding of the patterns that exist within that data. But that’s assuming the quality of the data stays consistent as the dataset grows. Quality vs. Quantity.
If you feed an AI more data to train on but it’s filled with inconsistencies, it can actually make the output less reliable. A ton of dirty data is not as good as a smaller amount of clean data.
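To make that concrete, here’s a toy sketch (all numbers invented for illustration, nothing to do with OpenELM’s actual training data): fitting a straight line to a small clean sample versus a ten-times-larger sample whose labels are corrupted by heavy noise. The big dirty dataset produces the worse estimate.

```python
# Toy demo: a small clean dataset can beat a large noisy one.
# We fit y = a*x + b by least squares to samples of the true
# relationship y = 2x + 1, then compare slope estimates.
import random

def fit_slope(xs, ys):
    """Ordinary least-squares slope estimate."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

random.seed(0)  # deterministic for reproducibility

# 20 clean points: labels are exact.
clean_x = [i * 0.5 for i in range(20)]
clean_y = [2 * x + 1 for x in clean_x]

# 200 noisy points: labels corrupted with large random errors.
noisy_x = [random.uniform(0, 10) for _ in range(200)]
noisy_y = [2 * x + 1 + random.gauss(0, 10) for x in noisy_x]

err_clean = abs(fit_slope(clean_x, clean_y) - 2)
err_noisy = abs(fit_slope(noisy_x, noisy_y) - 2)
print(f"slope error, clean n=20: {err_clean:.4f}; noisy n=200: {err_noisy:.4f}")
```

Despite having 10x the data, the noisy fit lands further from the true slope. More data only helps if the quality holds up.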
The reason Apple’s OpenELM may be capable of better outputs with less data is simply because of the model architecture, or how information is processed. Kind of like gas mileage in cars, some LLMs can go further with the same amount of fuel.
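The specific architecture idea Apple’s paper highlights is called layer-wise scaling: instead of giving every transformer layer identical widths, parameters are allocated non-uniformly across depth. The sketch below uses invented layer sizes (not Apple’s actual configuration) purely to show how two models can spend a similar parameter budget very differently.

```python
# Toy illustration of layer-wise scaling: rather than every layer
# having the same attention-head count and feed-forward width, widths
# grow with depth. All numbers here are invented for illustration.

def layer_params(d_model: int, n_heads: int, ffn_mult: float) -> int:
    """Rough parameter count for one transformer block."""
    head_dim = 64                       # assumed fixed per-head size
    d_attn = n_heads * head_dim         # effective attention width
    attn = 4 * d_model * d_attn         # Q, K, V, output projections
    ffn = 2 * d_model * int(ffn_mult * d_model)  # up + down projections
    return attn + ffn

d_model, n_layers = 1024, 8

# Uniform model: identical head count and FFN multiplier in every layer.
uniform = sum(layer_params(d_model, 16, 4.0) for _ in range(n_layers))

# Layer-wise scaled model: widths grow linearly with depth
# (heads 8 -> 22, FFN multiplier 2.0 -> 5.5).
scaled = sum(
    layer_params(d_model, 8 + 2 * i, 2.0 + 0.5 * i)
    for i in range(n_layers)
)

print(f"uniform: {uniform/1e6:.1f}M params, scaled: {scaled/1e6:.1f}M params")
# → uniform: 100.7M params, scaled: 94.4M params
```

Same depth, same building blocks, yet the scaled model here uses fewer parameters overall by spending them where (the paper argues) they matter most. That redistribution is the ‘gas mileage’ trick.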
Key Takeaways:
Open for developers to adapt & build new apps geared toward mobile LLM users
Likely to be faster & more accurate*
Built on (supposedly) high quality training data, giving it better ‘gas mileage’
*We can't definitively say OpenELM will be faster or more accurate than other LLMs yet. It's a new technology, and benchmarks have not been widely shared.