Open-source platform to train, download, and run LLMs on device.
Hello, everyone!
Meet Kolosal AI, an open-source platform that lets you run LLMs right on your own device, whether on a CPU or a GPU. With Kolosal, your privacy stays protected and energy consumption remains low, making it more eco-friendly than large-scale AI systems like ChatGPT or Gemini.
Kolosal is designed to be fast, lightweight, and sustainable. It's only 20 MB in size (just 0.1–0.2% of the size of similar platforms like Ollama or LM Studio), yet it can run LLMs as fast as, or even faster than, its competitors. Soon (once we fix a few bugs), Kolosal will also run on other devices, including smartphones and single-board computers like the Raspberry Pi or Jetson.
We’re a team of passionate students committed to optimizing on-device LLM training and deployment.
Try running it locally with Kolosal. Check it out at https://kolosal.ai/.
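Many local LLM runtimes expose an OpenAI-style chat-completions HTTP endpoint on localhost. Assuming Kolosal offers a similar interface (the URL, port, path, and model name below are assumptions for illustration, not documented Kolosal behavior), a minimal client sketch needs nothing beyond the Python standard library:

```python
import json
import urllib.request

# Hypothetical local endpoint: the host, port, and path are assumptions
# modeled on the common OpenAI-compatible convention, not Kolosal docs.
KOLOSAL_URL = "http://localhost:8080/v1/chat/completions"


def build_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }


def chat(model: str, prompt: str, url: str = KOLOSAL_URL) -> str:
    """POST the prompt to the local server and return the reply text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

With a server listening locally, `chat("some-model", "Say hello in one sentence.")` would return the model's reply; the request stays entirely on your machine, which is the point of running LLMs on device.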