| Name | Modified | Size |
|---|---|---|
| Optax 0.2.3 source code.tar.gz | 2024-07-09 | 1.7 MB |
| Optax 0.2.3 source code.zip | 2024-07-09 | 1.8 MB |
| README.md | 2024-07-09 | 7.4 kB |
| Totals: 3 items | | 3.5 MB |
## What's Changed
- Fix the KeyboardInterrupt exception from https://github.com/google-deepmind/optax/issues/860 by removing the timeout by @copybara-service in https://github.com/google-deepmind/optax/pull/886
- Beginning of 0.2.3 development by @copybara-service in https://github.com/google-deepmind/optax/pull/893
- Add a mathematical description of AdamW by @gbruno16 in https://github.com/google-deepmind/optax/pull/894
- Suppress not-callable pylint error for now, since it is being flagged erroneously all over the place. by @copybara-service in https://github.com/google-deepmind/optax/pull/908
- Fix doc link by @yixiaoer in https://github.com/google-deepmind/optax/pull/903
- Fixed pseudocode for Nesterov in description of SGD. by @satyenkale in https://github.com/google-deepmind/optax/pull/901
- Fix softmax_cross_entropy to handle -inf logits correctly when corresponding label is 0. by @carlosgmartin in https://github.com/google-deepmind/optax/pull/898
- Upstream sparsemax jaxopt loss to optax. by @copybara-service in https://github.com/google-deepmind/optax/pull/899
- Reorganize tree_utils. by @copybara-service in https://github.com/google-deepmind/optax/pull/914
- Revert of [#898]. by @copybara-service in https://github.com/google-deepmind/optax/pull/915
- Fix jax.tree_map deprecation warnings. by @copybara-service in https://github.com/google-deepmind/optax/pull/917
- Correct handling of -inf in softmax_cross_entropy. Fix [#898]. by @copybara-service in https://github.com/google-deepmind/optax/pull/916 (usage sketch after this list)
- Added mathematical documentation to AdaMax by @hmludwig in https://github.com/google-deepmind/optax/pull/918
- Fix pip install command for doc dependencies. by @mblondel in https://github.com/google-deepmind/optax/pull/922
- Start documentation for projections. by @mblondel in https://github.com/google-deepmind/optax/pull/921
- Add projection_simplex. by @copybara-service in https://github.com/google-deepmind/optax/pull/919 (usage sketch after this list)
- Move gradient transformations to optax.transforms sub-package - 1/N by @copybara-service in https://github.com/google-deepmind/optax/pull/923
- Added a NTXent loss by @GrantMcConachie in https://github.com/google-deepmind/optax/pull/897
- fix(docs): broken link in README by @jeertmans in https://github.com/google-deepmind/optax/pull/940
- Add a deprecation module to warn or raise errors for deprecations (following jax semantics). by @copybara-service in https://github.com/google-deepmind/optax/pull/931
- chore(ci): add markdown-link-check action by @jeertmans in https://github.com/google-deepmind/optax/pull/939
- Implementation of MoMo algorithm by @fabian-sp in https://github.com/google-deepmind/optax/pull/721
- Weight decay for COCOB by @albcab in https://github.com/google-deepmind/optax/pull/945
- Add a nesterov flag to radam optimizer. by @carlosgmartin in https://github.com/google-deepmind/optax/pull/949
- Formatting in momo docstring + doctest by @fabianp in https://github.com/google-deepmind/optax/pull/950
- docstring formatting by @fabianp in https://github.com/google-deepmind/optax/pull/952
- Port schedule_free optimizer to optax. Original pytorch repo: https://github.com/facebookresearch/schedule_free by @copybara-service in https://github.com/google-deepmind/optax/pull/911
- Fix RST formatting issues. by @fabianp in https://github.com/google-deepmind/optax/pull/953
- remove duplicated BATCH_SIZE argument by @fabianp in https://github.com/google-deepmind/optax/pull/956
- Replace deprecated `jax.tree_*` functions with `jax.tree.*` by @copybara-service in https://github.com/google-deepmind/optax/pull/963
- Remove residues from previous builds before running tests by @fabianp in https://github.com/google-deepmind/optax/pull/967
- Fix docs errors by @copybara-service in https://github.com/google-deepmind/optax/pull/941
- Removing sophia optimizer by @copybara-service in https://github.com/google-deepmind/optax/pull/973
- move clipping transforms to optax.transforms. by @copybara-service in https://github.com/google-deepmind/optax/pull/926
- Expose components in sub-package by @copybara-service in https://github.com/google-deepmind/optax/pull/978
- Add multiclass_sparsemax_loss. by @copybara-service in https://github.com/google-deepmind/optax/pull/971
- Remove useless inner jit by @copybara-service in https://github.com/google-deepmind/optax/pull/957
- Fix memory leak in radam optimizer by @lukekulik in https://github.com/google-deepmind/optax/pull/974
- Add end_scale argument by @stefanocortinovis in https://github.com/google-deepmind/optax/pull/975
- Fix error with x64 loss by @stefanocortinovis in https://github.com/google-deepmind/optax/pull/976
- LBFGS solver part 1: chainable preconditioner. by @copybara-service in https://github.com/google-deepmind/optax/pull/980
- Fix docs errors (following warnings displayed in doc logs of github actions) by @copybara-service in https://github.com/google-deepmind/optax/pull/984
- [JAX] Update users of jax.tree.map() to be more careful about how they handle Nones. by @copybara-service in https://github.com/google-deepmind/optax/pull/983
- LBFGS solver part 2: implementing linesearch ensuring sufficient decrease and small curvature by @copybara-service in https://github.com/google-deepmind/optax/pull/981
- CI: add test against oldest supported JAX version by @jakevdp in https://github.com/google-deepmind/optax/pull/987
- Internal change by @copybara-service in https://github.com/google-deepmind/optax/pull/988
- Ignore some linesearch tests on gpu/tpu by @copybara-service in https://github.com/google-deepmind/optax/pull/986
- LBFGS part 3: combine lbfgs and zoom linesearch by @copybara-service in https://github.com/google-deepmind/optax/pull/989
- Add arxiv reference to schedule_free optimizer. by @copybara-service in https://github.com/google-deepmind/optax/pull/997
- LBFGS part 4: notebook illustrating how to use lbfgs with linesearch as a solver. by @copybara-service in https://github.com/google-deepmind/optax/pull/991 (see the sketch after this list)
- Add common schedule_free wrappers. by @copybara-service in https://github.com/google-deepmind/optax/pull/998 (see the sketch after this list)
- Add schedule_free check for b1 != 0. by @copybara-service in https://github.com/google-deepmind/optax/pull/999
- feat: add `normalize_by_update_norm` by @SauravMaheshkar in https://github.com/google-deepmind/optax/pull/958
- Saurav maheshkar saurav/scale by grad norm by @fabianp in https://github.com/google-deepmind/optax/pull/1000
- Fix doctest normalize_by_update_norm by @copybara-service in https://github.com/google-deepmind/optax/pull/1002
- Release v0.2.3 by @copybara-service in https://github.com/google-deepmind/optax/pull/1001
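A minimal sketch of the `softmax_cross_entropy` behaviour addressed in [#898]/[#916]: a logit of `-inf` whose corresponding label entry is 0 should contribute nothing to the loss rather than producing NaN. The numbers below are illustrative only.

```python
import jax.numpy as jnp
import optax

# A -inf logit marks a class that is masked out; its target probability is 0,
# so the loss should stay finite instead of becoming NaN.
logits = jnp.array([[2.0, -jnp.inf, 0.5]])
labels = jnp.array([[0.8, 0.0, 0.2]])  # probability targets

loss = optax.softmax_cross_entropy(logits, labels)
print(loss)  # finite per-example loss
```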
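For the new projections work ([#919], [#921]), a small usage sketch, assuming the function is exposed as `optax.projections.projection_simplex` and accepts a single array:

```python
import jax.numpy as jnp
import optax

# Project an arbitrary vector onto the probability simplex:
# non-negative entries that sum to 1.
x = jnp.array([1.2, -0.3, 0.6])
p = optax.projections.projection_simplex(x)
print(p, p.sum())  # all entries >= 0, sum == 1
```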
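A sketch of how the LBFGS pieces ([#980], [#981], [#989], [#991]) fit together on a toy quadratic; the extra-argument names passed to `update` (`value`, `grad`, `value_fn`) follow the linesearch API as I understand it and should be checked against the notebook from [#991].

```python
import jax
import jax.numpy as jnp
import optax

def f(params):
    # Toy quadratic objective with minimiser at 1.0.
    return jnp.sum((params - 1.0) ** 2)

params = jnp.zeros(3)
opt = optax.lbfgs()  # L-BFGS preconditioner chained with a zoom linesearch
state = opt.init(params)

for _ in range(10):
    value, grad = jax.value_and_grad(f)(params)
    # The linesearch needs the current value, gradient and the objective itself.
    updates, state = opt.update(
        grad, state, params, value=value, grad=grad, value_fn=f)
    params = optax.apply_updates(params, updates)

print(params)  # approaches the minimiser at 1.0
```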
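A sketch of the schedule-free wrappers from [#998], assuming they live in `optax.contrib` as `schedule_free_adamw`, with `schedule_free_eval_params` for retrieving evaluation parameters; note the `b1 != 0` requirement added in [#999].

```python
import jax
import jax.numpy as jnp
import optax
from optax import contrib

def loss_fn(params, x, y):
    # Toy least-squares loss.
    return jnp.mean((x @ params - y) ** 2)

params = jnp.zeros(4)
opt = contrib.schedule_free_adamw(learning_rate=1e-3)  # b1 must be nonzero
state = opt.init(params)

x, y = jnp.ones((8, 4)), jnp.ones(8)
grads = jax.grad(loss_fn)(params, x, y)
updates, state = opt.update(grads, state, params)
params = optax.apply_updates(params, updates)

# Schedule-free maintains separate parameters for evaluation.
eval_params = contrib.schedule_free_eval_params(state, params)
```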
## New Contributors
- @gbruno16 made their first contribution in https://github.com/google-deepmind/optax/pull/894
- @satyenkale made their first contribution in https://github.com/google-deepmind/optax/pull/901
- @GrantMcConachie made their first contribution in https://github.com/google-deepmind/optax/pull/897
- @fabian-sp made their first contribution in https://github.com/google-deepmind/optax/pull/721
- @lukekulik made their first contribution in https://github.com/google-deepmind/optax/pull/974
- @jakevdp made their first contribution in https://github.com/google-deepmind/optax/pull/987
Full Changelog: https://github.com/google-deepmind/optax/compare/v0.2.2...v0.2.3