| Name | Modified | Size | Downloads / Week |
|---|---|---|---|
| 0.6.0 source code.tar.gz | 2025-04-25 | 1.8 MB | |
| 0.6.0 source code.zip | 2025-04-25 | 1.9 MB | |
| README.md | 2025-04-25 | 4.3 kB | |
| Totals: 3 Items | | 3.8 MB | 0 |
This release of EvoTorch introduces new functional programming capabilities, updates to reinforcement learning components, and a new data structure, along with several improvements and bug fixes.
New Features
- Functional API for Optimization (#98 by @engintoklu):
  - Introduces an alternative functional API for EvoTorch, compatible with `torch.func.vmap`. This allows single populations or batches of populations to be optimized simultaneously (a conceptual sketch of this style follows below).
  - Functional Algorithms: includes functional versions of the Cross Entropy Method (CEM) and Policy Gradients with Parameter-based Exploration (PGPE). These can be used with `vmap` or by providing batched initial centers (`center_init`).
  - Functional Optimizers: adds functional counterparts for the Adam, ClipUp, and SGD optimizers. Their interfaces are similar to the functional CEM and PGPE, facilitating easier switching between evolutionary and gradient-based approaches.
  - `@expects_ndim` decorator: a new decorator to declare the expected number of dimensions for each positional argument of a function. If input tensors have more dimensions than expected, the function automatically applies `vmap` to operate across the batch dimensions.
  - `@rowwise` decorator: a new decorator for functions implemented with the assumption of a vector input. If a tensor with 2 or more dimensions is received, the function automatically applies `vmap` to operate across the batch dimensions.
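
  As a conceptual sketch of the style this enables (using plain `torch.func.vmap` here, not the actual EvoTorch decorators or algorithm signatures), a fitness function can be written for a single solution vector and then applied to a whole population, or to a batch of populations:

  ```python
  import torch

  def sphere(solution: torch.Tensor) -> torch.Tensor:
      # Written as if `solution` is a single 1-D decision vector.
      return torch.sum(solution ** 2)

  # Evaluate a population (popsize x ndim) by mapping over the leftmost dimension.
  population = torch.randn(50, 10)
  fitnesses = torch.func.vmap(sphere)(population)  # shape: (50,)

  # Evaluate a batch of populations (batch x popsize x ndim) by mapping twice,
  # which is the kind of batching the functional algorithms rely on.
  batched_populations = torch.randn(8, 50, 10)
  batched_fitnesses = torch.func.vmap(torch.func.vmap(sphere))(batched_populations)  # shape: (8, 50)
  ```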
- Functional Genetic Algorithm Operators (#109 by @engintoklu):
  - Provides alternative implementations for genetic algorithm (GA) operators that follow functional programming principles.
  - These operators are batchable, either by adding a leftmost dimension to the population or by using `torch.func.vmap`.
  - Users can combine these operators to implement custom GAs (see the sketch below).
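
  The functional operator style can be illustrated with plain PyTorch (the operator names and signatures below are illustrative only, not EvoTorch's actual functional operators): each operator is a pure function mapping tensors to tensors, so operators compose freely into a custom GA step.

  ```python
  import torch

  def tournament_select(population: torch.Tensor, fitnesses: torch.Tensor, num: int) -> torch.Tensor:
      # Winners of random pairwise tournaments (minimization assumed).
      a = torch.randint(0, population.shape[0], (num,))
      b = torch.randint(0, population.shape[0], (num,))
      winners = torch.where(fitnesses[a] < fitnesses[b], a, b)
      return population[winners]

  def uniform_crossover(parents1: torch.Tensor, parents2: torch.Tensor) -> torch.Tensor:
      # Each gene is taken from one of the two parents with equal probability.
      mask = torch.rand_like(parents1) < 0.5
      return torch.where(mask, parents1, parents2)

  def gaussian_mutation(population: torch.Tensor, stdev: float = 0.1) -> torch.Tensor:
      return population + stdev * torch.randn_like(population)

  # One GA step, expressed as a composition of pure functions.
  population = torch.randn(20, 5)
  fitnesses = torch.sum(population ** 2, dim=-1)
  children = gaussian_mutation(
      uniform_crossover(
          tournament_select(population, fitnesses, 20),
          tournament_select(population, fitnesses, 20),
      )
  )
  ```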
- TensorFrame Data Structure (#120 by @engintoklu):
  - Introduces `TensorFrame`, a new tabular data structure.
  - It is inspired by `pandas.DataFrame` but is designed to work with PyTorch tensors.
  - `TensorFrame` is compatible with `torch.vmap`, enabling it to be batched and used within fitness functions.
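
  These notes do not spell out the `TensorFrame` API, so the snippet below is only a hypothetical mental model of a column-oriented table of equal-length tensor columns, not actual `TensorFrame` usage:

  ```python
  import torch

  # Hypothetical mental model only -- NOT the TensorFrame API: a "frame" as a set of
  # named, equal-length tensor columns, where a row is an index shared by all columns.
  frame = {
      "x": torch.randn(100),     # column of 100 scalars
      "y": torch.randn(100, 3),  # column of 100 3-dimensional vectors
  }

  def row(frame: dict, i: int) -> dict:
      # Select the i-th row across every column.
      return {name: column[i] for name, column in frame.items()}

  first_row = row(frame, 0)
  ```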
- Notebook Demonstrating Object Evolution (#102 by @engintoklu): Added a Jupyter notebook to illustrate how to evolve arbitrary Python objects using EvoTorch.
- Jupyter Notebook for Visualizing Brax Agents (#105 by @engintoklu): Added a notebook for visualizing agents trained with Brax.
Improvements
- Updated Vectorized Reinforcement Learning (#104 by @engintoklu): Vectorized RL functionalities are now compatible with the Gymnasium `1.0.x` API, while maintaining compatibility with Gymnasium `0.29.x`. Key updates include an EvoTorch-specific `SyncVectorEnv`, performance enhancements, and refactored Brax notebook examples (the vectorized-environment interface targeted here is sketched below).
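
  For orientation, the Gymnasium `1.0.x` vectorized-environment interface that this update targets looks roughly like the following (standard Gymnasium API; the EvoTorch-specific `SyncVectorEnv` is not reproduced here):

  ```python
  import gymnasium as gym
  import numpy as np

  # Four synchronous copies of the same environment.
  envs = gym.vector.SyncVectorEnv([lambda: gym.make("CartPole-v1") for _ in range(4)])

  obs, infos = envs.reset(seed=0)  # obs is batched: shape (4, 4) for CartPole
  actions = np.array([envs.single_action_space.sample() for _ in range(4)])
  obs, rewards, terminations, truncations, infos = envs.step(actions)
  envs.close()
  ```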
- Updated Hyperparameters for Brax Example (#108 by @engintoklu): Hyperparameters in the Brax example were updated.
- Updated `general_usage.md` (#107 by @engintoklu): The general usage documentation was updated.
- Improved Logging Documentation (#116 by @flukeskywalker): Documentation for logging was improved.
Bug Fixes
- CMAES with Bounded Problems (#100 by @flukeskywalker): CMAES now correctly indicates failure if the problem is bounded.
- VecGymNE with Adaptive Popsize (#106 by @engintoklu): `VecGymNE` is now compatible with adaptive population sizes.
- CMAES Center Dimensionality (#111 by @engintoklu): The "center" of CMAES is now correctly treated as 1-dimensional.
Maintenance
- Updated GitHub Actions (#112 by @Higgcz): GitHub Actions workflows were updated.