| Name | Modified | Size |
|---|---|---|
| README.md | 2026-03-09 | 2.4 kB |
| v2.0.0 source code.tar.gz | 2026-03-09 | 152.5 MB |
| v2.0.0 source code.zip | 2026-03-09 | 154.4 MB |
# PhysicsNeMo General Release v2.0.0

📝 NVIDIA PhysicsNeMo v2.0 contains a significant reorganization of all features, with easier installation and integration with external packages. See the migration guide for more details!
## Added
- Refactored diffusion preconditioners in `physicsnemo.diffusion.preconditioners`, relying on a new abstract base class `BaseAffinePreconditioner` for preconditioning schemes that use affine transformations. Existing preconditioners (`VPPrecond`, `VEPrecond`, `iDDPMPrecond`, `EDMPrecond`) are reimplemented on top of this new interface.
- New `physicsnemo.experimental.nn.symmetry` module that implements building blocks preserving 2D and 3D rotational equivariance, using a grid-based layout for efficient GPU parallelization and an emphasis on compact `einsum` operations.
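To illustrate the affine-preconditioning idea behind the new base class, here is a minimal, self-contained sketch in the style of EDM preconditioning, where the denoiser is an affine combination `D(x) = c_skip * x + c_out * F(c_in * x, c_noise)` of the input and the raw network output. The class name, constructor, and call signature below are hypothetical and are not PhysicsNeMo's actual `BaseAffinePreconditioner` API:

```python
import math

class EDMStylePreconditioner:
    """Hedged sketch of an affine diffusion preconditioner (EDM-style).

    Hypothetical class; illustrates the affine scheme only, not the
    real PhysicsNeMo interface.
    """

    def __init__(self, sigma_data: float = 0.5):
        self.sigma_data = sigma_data

    def coefficients(self, sigma: float):
        """Affine coefficients for D(x) = c_skip * x + c_out * F(c_in * x, c_noise)."""
        sd = self.sigma_data
        c_skip = sd ** 2 / (sigma ** 2 + sd ** 2)
        c_out = sigma * sd / math.sqrt(sigma ** 2 + sd ** 2)
        c_in = 1.0 / math.sqrt(sd ** 2 + sigma ** 2)
        c_noise = 0.25 * math.log(sigma)
        return c_skip, c_out, c_in, c_noise

    def __call__(self, raw_model, x: float, sigma: float) -> float:
        """Apply the affine wrapping around a raw model F(x, noise_label)."""
        c_skip, c_out, c_in, c_noise = self.coefficients(sigma)
        return c_skip * x + c_out * raw_model(c_in * x, c_noise)
```

Under this scheme, swapping in a different preconditioner (VP, VE, iDDPM) amounts to changing only the four coefficient formulas, which is the kind of factoring an abstract affine base class enables.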
## Changed
- PhysicsNeMo v2.0 contains a significant reorganization of tools. Please see `v2.0-MIGRATION-GUIDE.md` to understand what has changed and why.
- DiT (Diffusion Transformer) has been moved from `physicsnemo.experimental.models.dit` to `physicsnemo.models.dit`.
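For code that must run against both pre- and post-2.0 layouts, a small fallback-import helper can smooth the transition. This is a generic pattern, not an official migration utility; the two dotted paths in the usage comment come from the move described above:

```python
import importlib

def import_first(*module_paths):
    """Return the first module in `module_paths` that imports cleanly."""
    for path in module_paths:
        try:
            return importlib.import_module(path)
        except ModuleNotFoundError:
            continue
    raise ModuleNotFoundError(f"none of {module_paths} could be imported")

# Hypothetical usage for the DiT relocation (new path first, old as fallback):
# dit = import_first("physicsnemo.models.dit", "physicsnemo.experimental.models.dit")
```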
## Fixed
- Shape mismatch bug in the Lennard-Jones example
## Dependencies
- The CUDA backend is now selected via orthogonal `cu12`/`cu13` extras rather than being hardcoded to CUDA 13. Feature extras (`nn-extras`, `utils-extras`, etc.) are now CUDA-agnostic and can be combined with either backend, e.g. `pip install "nvidia-physicsnemo[cu13,nn-extras]"`. When neither `cu12` nor `cu13` is specified, PyTorch is installed from PyPI using its default build (currently CUDA 12.8 on Linux). For development with `uv`, use `uv sync --extra cu13` (or `--extra cu12`) to select the backend.
## Contributors
We’re grateful to everyone who contributed issues, feature ideas, fixes, and documentation updates — your input is what helps us continuously improve PhysicsNeMo for the whole community! A special shout-out to the authors of the pull requests listed above, in no particular order:
@jleinonen @dran-dev @aayushg55 @saikrishnanc-nv @jeis4wpi @albertocarpentieri @paveltomin @weilr @giprayogo @tonishi-nv @younes-abid @dakhare-creator @Alexey-Kamenev
Thank you :heart: — we truly appreciate your contributions and hope to see more from you in the future!