# PhysicsNeMo General Release v1.2.0

## Added

- Diffusion Transformer (DiT) model, available as `physicsnemo.experimental.models.dit.DiT`. ⚠️ Experimental feature subject to future API changes.
- Improved documentation for diffusion models and diffusion utils.
- Safe API to override `__init__` arguments saved in a checkpoint file, via `Module.from_checkpoint("chkpt.mdlus", override_args=set(...))`.
- PyTorch Geometric MeshGraphNet backend.
- Functionality in DoMINO to take an arbitrary number of scalar or vector global parameters and encode them using `ParameterModel`.
- TopoDiff model and example.
- Ability for the DoMINO model to return volume neighbors.
- Functionality in the DoMINO recipe to introduce physics residual losses.
- Diffusion models, metrics, and utils: implementation of the Student-t distribution for EDM-based diffusion models (t-EDM), adapted from the paper Heavy-Tailed Diffusion Models, Pandey et al. This includes a new EDM preconditioner (`tEDMPrecondSuperRes`), a loss function (`tEDMResidualLoss`), and a new option in the CorrDiff `diffusion_step`. ⚠️ This is an experimental feature accessible through the `physicsnemo.experimental` module; it may also be subject to API changes without notice.
- Bumped Ruff from 0.0.290 to 0.12.5; replaced Black with `ruff-format`.
- DoMINO improvements with a UNet attention module and user configs.
- Hybrid MeshGraphNet for modeling structural deformation.
- Enabled the TransformerEngine backend in the `transolver` model.
- Inference code for the `x-meshgraphnet` example for external aerodynamics.
- New external aerodynamics example: training `transolver` on irregular mesh data for DrivAerML surface data.
- New external aerodynamics example for fine-tuning pretrained models.
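As a rough illustration of the `override_args` semantics described above, the stand-in class below restores constructor arguments recorded at save time, except those explicitly whitelisted. The class, argument names, and in-memory checkpoint are hypothetical; the real `Module.from_checkpoint` loads a `.mdlus` file.

```python
# Hypothetical sketch of override_args semantics -- NOT the real
# physicsnemo implementation, just the idea behind it.

class TinyModule:
    """Stand-in for a physicsnemo.Module that records its __init__ args."""

    def __init__(self, hidden_dim=64, activation="gelu"):
        self.hidden_dim = hidden_dim
        self.activation = activation
        # In the real library these are saved alongside the weights.
        self._init_args = {"hidden_dim": hidden_dim, "activation": activation}

    def save(self):
        # Returns an in-memory "checkpoint" instead of writing a file.
        return {"init_args": dict(self._init_args)}

    @classmethod
    def from_checkpoint(cls, ckpt, override_args=None, **overrides):
        override_args = override_args or set()
        args = dict(ckpt["init_args"])
        for name in override_args:
            if name in overrides:  # only whitelisted names may be changed
                args[name] = overrides[name]
        return cls(**args)


ckpt = TinyModule(hidden_dim=128).save()
# Only "activation" may be overridden; hidden_dim is restored from the checkpoint.
model = TinyModule.from_checkpoint(
    ckpt, override_args={"activation"}, activation="relu"
)
```

The whitelist makes overrides opt-in, so a stale keyword argument cannot silently replace a value the checkpoint was trained with.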
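The heavy-tailed noise behind t-EDM can be sketched with the standard Student-t construction t = Z / sqrt(V / nu), where Z is standard normal and V is chi-square with nu degrees of freedom. This is a generic plain-Python illustration, not PhysicsNeMo code; `student_t_noise` and its parameters are invented for the example.

```python
import math
import random


def student_t_noise(nu, n, rng):
    """Draw n samples from a Student-t with integer nu degrees of freedom.

    Small nu gives heavier tails than the Gaussian noise used in standard
    EDM; as nu grows, the distribution approaches the standard normal.
    """
    out = []
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        # Chi-square with nu dof as a sum of nu squared standard normals.
        v = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(nu))
        out.append(z / math.sqrt(v / nu))
    return out


rng = random.Random(0)
samples = student_t_noise(nu=3, n=10_000, rng=rng)
```

With nu = 3 the sample routinely contains outliers several standard deviations out, which is exactly the behavior the t-EDM preconditioner and loss are built to model.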
## Changed

- Diffusion utils: `physicsnemo.utils.generative` renamed to `physicsnemo.utils.diffusion`.
- Diffusion models: in the CorrDiff model wrappers (`EDMPrecondSuperResolution` and `UNet`), the arguments `profile_mode` and `amp_mode` can no longer be overridden by `from_checkpoint`. They are now properties that can be changed dynamically after model instantiation, for example `model.amp_mode = True` and `model.profile_mode = False`.
- Updated the HEALPix data module to use the correct `DistributedSampler` target for the test data loader.
- The existing DGL-based vortex shedding example has been renamed to `vortex_shedding_mgn_dgl`. Added a new `vortex_shedding_mgn` example that uses PyTorch Geometric instead.
- `HEALPixLayer` can now use earth2grid HEALPix padding ops, if desired.
- Migrated the Vortex Shedding Reduced Mesh example to PyTorch Geometric.
- CorrDiff example: fixed bugs when training the regression `UNet`.
- Diffusion models: fixed bugs related to gradient checkpointing on non-square images.
- Diffusion models: created a separate `Attention` class for clarity and modularity, and updated `UNetBlock` to use it instead of custom attention logic. This updates the model architecture for `SongUNet`-based diffusion models; the changes are not BC-breaking and are transparent to the user.
- ⚠️ BC-breaking: refactored the automatic mixed precision (AMP) API in the layers and models defined in `physicsnemo/models/diffusion/` for improved usability. Note: it is now not only possible but required to explicitly set `model.amp_mode = True` in order to use a model inside a `torch.autocast` context. This applies to all `SongUNet`-based models.
- Diffusion models: fixed and improved the API for enabling fp16 forward passes in the `UNet` and `EDMPrecondSuperResolution` model wrappers; the fp16 forward pass can now be toggled on or off by setting `model.use_fp16 = True`.
- Diffusion models: improved the API for Apex group norm. `SongUNet`-based models now automatically convert input tensors to the `torch.channels_last` memory format when `model.use_apex_gn` is `True`. New warnings are raised when attempting to use Apex group norm on CPU.
- Diffusion utils: systematic compilation of patching operations in `stochastic_sampler` for improved performance.
- CorrDiff example: added an option for Student-t EDM (t-EDM) in `train.py` and `generate.py`. When training a CorrDiff diffusion model, this feature can be enabled with the Hydra overrides `++training.hp.distribution=student_t` and `++training.hp.nu_student_t=<nu_value>`. For generation, use the similar overrides `++generation.distribution=student_t` and `++generation.nu_student_t=<nu_value>`.
- CorrDiff example: the parameters `P_mean` and `P_std` (used to compute the noise level `sigma`) are now configurable. They can be set with the Hydra overrides `++training.hp.P_mean=<P_mean_value>` and `++training.hp.P_std=<P_std_value>` for training (and the analogous overrides with `training.hp` replaced by `generation` for generation).
- Diffusion utils: patch-based inference and lead-time support with the deterministic sampler.
- The existing DGL-based XAeroNet example has been renamed to `xaeronet_dgl`. Added a new `xaeronet` example that uses PyTorch Geometric instead.
- Updated the deforming plate example to use the Hybrid MeshGraphNet model.
- ⚠️ BC-breaking: refactored the `transolver` model to improve readability and performance and to extend it to more use cases.
- Diffusion models: improved lead-time support for `SongUNetPosLtEmbd` and `EDMLoss`. Lead-time embeddings can now be used with or without positional embeddings.
- Diffusion models: consolidated `ApexGroupNorm` and `GroupNorm` in `models/diffusion/layers.py` with a factory `get_group_norm` that can instantiate either one. `get_group_norm` is now the recommended way to instantiate a GroupNorm layer in `SongUNet`-based and other diffusion models.
- PhysicsNeMo models: improved the checkpoint loading API in `Module.from_checkpoint`, which now exposes a `strict` parameter to raise an error on missing/unexpected keys, similar to `torch.nn.Module.load_state_dict`.
- Migrated the Hybrid MGN and deforming plate example to PyTorch Geometric.
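The property-based `amp_mode` flag described above can be illustrated with a minimal stand-in class; this is a purely hypothetical sketch of the pattern, not the real wrapper code.

```python
# Hypothetical illustration of the amp_mode property pattern: the flag is
# a plain property that the forward pass consults, so it can be flipped
# after instantiation and is deliberately excluded from checkpoint overrides.

class DiffusionModelStub:
    def __init__(self):
        self._amp_mode = False  # AMP is opt-in; off by default

    @property
    def amp_mode(self):
        return self._amp_mode

    @amp_mode.setter
    def amp_mode(self, value):
        if not isinstance(value, bool):
            raise TypeError("amp_mode must be a bool")
        self._amp_mode = value

    def forward(self, x):
        # Stand-in for the real forward pass: low-precision casts only
        # happen when amp_mode is True, which is why running under
        # torch.autocast now requires the explicit opt-in.
        dtype = "float16" if self._amp_mode else "float32"
        return x, dtype


m = DiffusionModelStub()
m.amp_mode = True  # explicit opt-in, set before entering autocast
```

Making the flag a property (rather than a constructor argument restored from the checkpoint) lets the same loaded model be switched between AMP and full precision without re-instantiation.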
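The `strict` behavior mirrors `torch.nn.Module.load_state_dict`. A dependency-free sketch of the semantics follows; the function name and keys are illustrative, not the real implementation.

```python
# Illustrative sketch of strict checkpoint-key checking, modeled on the
# semantics of torch.nn.Module.load_state_dict.

def load_state(model_keys, ckpt_state, strict=True):
    """Copy matching entries; with strict=True, raise on any key mismatch."""
    missing = sorted(set(model_keys) - set(ckpt_state))
    unexpected = sorted(set(ckpt_state) - set(model_keys))
    if strict and (missing or unexpected):
        raise KeyError(f"missing={missing}, unexpected={unexpected}")
    # Non-strict mode: load what matches, ignore the rest.
    return {k: ckpt_state[k] for k in model_keys if k in ckpt_state}


# "b" is missing from the checkpoint and "extra" is unexpected, so
# strict=True would raise; strict=False loads only the matching key.
loaded = load_state({"w", "b"}, {"w": 1.0, "extra": 2.0}, strict=False)
```

Raising eagerly under `strict=True` surfaces architecture/checkpoint mismatches at load time instead of as silent partially-initialized weights.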
## Fixed

- Bug fixes in the DoMINO model for sphere sampling and tensor reshaping.
- Bug fixes in DoMINO utils for random sampling and in `test.py`.
- Optimized DoMINO config params based on DrivAerML.