MapAnything: Universal Feed-Forward Metric 3D Reconstruction
Open-source, high-performance Mixture-of-Experts large language model
Run Mixtral-8x7B models in Colab or on consumer desktops
Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models
Building Mixture-of-Experts from LLaMA with Continual Pre-training
Quantitative analysis, strategies and backtests
Privacy-preserving generation of a synthetic twin to a data set
We provide a PyTorch implementation of the paper Voice Separation
Large language model providing reasoning and conversational capabilities
Open language model developed by NVIDIA as part of the Nemotron-3 family
Model that fuses instruction-following, reasoning, and agentic skills
Open-source code agent designed for Lean 4