
Source: VentureBeat
Summary
Multiverse Computing has launched an app and an API for its compressed AI models, developed in collaboration with major AI labs such as OpenAI, Meta, DeepSeek, and Mistral AI. The app showcases the models' capabilities, while the API makes them more widely available. The company's stated aim is to make AI more accessible and efficient; according to the report, the compressed models can run on devices with limited computing power.
Our Reading
The announcement sounds ambitious.
Multiverse Computing has unveiled an app and an API for its compressed AI models, because what the world really needed was another AI app. The company has partnered with major AI labs to make these models more accessible. The app is meant to show off what the compressed models can do, while the API is supposed to make them more widely available. Because, you know, the 12 AI assistants on your phone just weren’t enough. The compressed models can supposedly run on devices with limited computing power, because who needs actual processing power when you can just compress it?
Original observation: Because “accessible AI” is just a euphemism for “AI that still doesn’t work on your phone”.
Author: Evan Null