News

The BitNet b1.58 2B4T model was developed by Microsoft's General Artificial Intelligence group and contains two billion ...
Microsoft's BitNet challenges industry norms with a minimalist approach using ternary weights that require just 400MB of ...
Microsoft’s new large language model (LLM) puts significantly less strain on hardware than other LLMs—and it’s free to ...
Microsoft researchers have developed — and released — a hyper-efficient AI model that can run on CPUs, including Apple's M2.
Lower memory requirements are the most obvious advantage of reducing the complexity of a model's internal weights. The BitNet b1.58 ...
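
As a sanity check on the roughly 400MB figure cited above, here is a back-of-envelope estimate of the weight storage alone. It is a rough sketch only: it ignores activations, the KV cache, and any tensors kept at higher precision, and it assumes the ~2 billion parameter count and ~1.58 bits per weight reported in the coverage.

    # Rough weight-only memory estimate for a ~2B-parameter model.
    # Ignores activations, KV cache, and higher-precision tensors.
    PARAMS = 2e9  # ~2 billion parameters (BitNet b1.58 2B4T)

    fp16_gb = PARAMS * 16 / 8 / 1e9       # 16-bit weights   -> ~4.0 GB
    ternary_gb = PARAMS * 1.58 / 8 / 1e9  # ~1.58-bit weights -> ~0.4 GB

    print(f"FP16 weights:    {fp16_gb:.1f} GB")    # 4.0 GB
    print(f"Ternary weights: {ternary_gb:.2f} GB")  # 0.40 GB

The ternary estimate lands at roughly 0.4 GB, consistent with the ~400MB figure mentioned above, while a conventional 16-bit version of the same weights would need about ten times as much memory.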
Explore the new AI model from Microsoft designed to run efficiently on a CPU, ensuring powerful performance without a GPU.
Microsoft’s BitNet b1.58 2B4T model is available on Hugging Face, but it doesn’t run on GPUs and requires a proprietary framework.
Microsoft’s new BitNet b1.58 model significantly reduces memory and energy requirements while matching the capabilities of ...
Microsoft released what it describes as the largest-scale 1-bit AI model to date, BitNet b1.58 2B4T. Unlike traditional ...
Microsoft Research has introduced BitNet b1.58 2B4T, a new 2-billion parameter language model that uses only 1.58 bits per ...
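
The "1.58 bits" figure comes from each weight taking one of three values, -1, 0, or +1, and log2(3) ≈ 1.58. Below is a minimal sketch of absmean-style ternary weight quantization in the spirit of the BitNet b1.58 work; the per-tensor scaling and rounding details are simplified, and the function name is illustrative rather than the model's actual code.

    import numpy as np

    def ternary_quantize(w: np.ndarray, eps: float = 1e-5):
        """Map weights to {-1, 0, +1} using an absmean scale (simplified sketch)."""
        scale = np.abs(w).mean() + eps           # per-tensor absmean scale
        q = np.clip(np.round(w / scale), -1, 1)  # each weight becomes -1, 0, or +1
        return q, scale                          # approximate dequantization: q * scale

    print(np.log2(3))  # 1.584..., i.e. ~1.58 bits of information per ternary weight

    q, s = ternary_quantize(np.random.randn(4, 4) * 0.02)
    print(q)           # a small ternary weight matrix

Because every weight is restricted to three values, matrix multiplications reduce largely to additions and subtractions, which is what lets the model run efficiently on ordinary CPUs.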