AI, IoT & Emerging Technologies

Microsoft researchers build 1-bit AI LLM with 2B parameters — model small enough to run on some CPUs

Microsoft researchers have created BitNet b1.58 2B4T, an open-source 1-bit large language model with two billion parameters, trained on four trillion tokens. What makes this model unique is that it is lightweight enough to run efficiently on a CPU; according to TechCrunch, an Apple M2 chip can handle it. The model is also available on Hugging Face, so anyone can experiment with it.

BitNets use so-called 1-bit weights that take only three possible values: -1, 0, and +1 (technically about 1.58 bits per weight, hence the model's name). This saves a great deal of memory compared with mainstream AI models that store weights in 32-bit or 16-bit floating-point formats, letting BitNets run with far less memory and computational power. This simplicity has a drawback, though: the quantized weights are less accurate than those of full-precision models. BitNet b1.58 2B4T compensates with its massive training data, estimated to be equivalent to more than 33 million books.
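To make the ternary idea concrete, here is a minimal sketch of "absmean" ternary quantization, the scheme described in the BitNet b1.58 paper: each weight is divided by the mean absolute value of the tensor, then rounded and clamped to {-1, 0, +1}. The function name and the toy matrix are illustrative, not from Microsoft's actual implementation.

```python
import numpy as np

def absmean_ternary_quantize(w: np.ndarray, eps: float = 1e-6):
    """Quantize a weight matrix to ternary values {-1, 0, +1}.

    Sketch of the "absmean" scheme from the BitNet b1.58 paper:
    scale by the mean absolute weight, then round and clip.
    """
    scale = np.mean(np.abs(w)) + eps          # per-tensor scaling factor
    w_q = np.clip(np.rint(w / scale), -1, 1)  # round, then clamp to [-1, 1]
    return w_q.astype(np.int8), scale         # int8 storage plus float scale

# Toy example: a small weight matrix (values chosen for illustration)
w = np.array([[0.9, -0.05, -1.2],
              [0.3, -0.7, 0.02]])
w_q, scale = absmean_ternary_quantize(w)
print(w_q)          # ternary matrix: [[1, 0, -1], [1, -1, 0]]
print(w_q * scale)  # coarse dequantized approximation of w
```

Because every weight is -1, 0, or +1, matrix multiplication reduces to additions and subtractions with no floating-point multiplies, which is why such models can run efficiently on ordinary CPUs.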
