https://www.primeintellect.ai/blog/intellect-3
INTELLECT-3: A 100B+ MoE trained with large-scale RL
Today, we release INTELLECT-3, a 100B+ parameter Mixture-of-Experts model trained on our RL stack, achieving state-of-the-art performance for its size across...
https://huggingface.co/PrimeIntellect/INTELLECT-3-FP8
PrimeIntellect/INTELLECT-3-FP8 · Hugging Face
We’re on a journey to advance and democratize artificial intelligence through open source and open science.
https://model.aibase.com/models/details/1998557669044260864
INTELLECT-3-MXFP4_MOE-GGUF Open-Source Inference Model - Free Support for Math, Coding, and...
🚀 INTELLECT-3 MXFP4_MOE Quantized Model: an MXFP4_MOE quantization of INTELLECT-3, a powerful 100B+ MoE reasoning model trained with large-scale RL.
https://huggingface.co/collections/PrimeIntellect/intellect-3
INTELLECT-3 - a PrimeIntellect Collection
INTELLECT-3: A 100B+ MoE trained with large-scale RL