Nevergrad is a Python library for derivative-free optimization, offering robust implementations of many algorithms suited to black-box functions (i.e., functions whose gradients are unavailable or unreliable). It targets hyperparameter search, architecture search, control problems, and experimental tuning: domains in which gradient-based methods may fail or be inapplicable. The library provides a simple interface for defining an optimization problem (parameter space, loss function, budget) and then experimenting with multiple strategies: evolutionary algorithms, Bayesian optimization, bandit methods, genetic algorithms, and more. Nevergrad supports parallelization, budget scheduling, and multiple cost/resource constraints, allowing it to scale to nontrivial optimization problems. It also includes visualization tools and diagnostic metrics for comparing strategy performance, tracking parameter evolution, and detecting stagnation.
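The problem-definition workflow described above rests on three ingredients: a parameter space, a loss function, and an evaluation budget. The snippet below is a minimal pure-Python sketch of that workflow using random search; it is an illustrative stand-in, not Nevergrad's actual API (the function and parameter names here are hypothetical).

```python
import random

def random_search(loss, bounds, budget, seed=0):
    """Minimal derivative-free optimizer: sample uniformly within
    per-dimension bounds and keep the best candidate seen."""
    rng = random.Random(seed)
    best_x, best_loss = None, float("inf")
    for _ in range(budget):  # the budget caps total loss evaluations
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = loss(x)  # black-box call: no gradient information used
        if fx < best_loss:
            best_x, best_loss = x, fx
    return best_x, best_loss

# Example: minimize a shifted sphere function over [-5, 5]^2.
sphere = lambda x: sum((xi - 1.0) ** 2 for xi in x)
x_best, f_best = random_search(sphere, bounds=[(-5, 5)] * 2, budget=500)
```

Swapping in a different strategy only means replacing how candidates are proposed; the problem definition (bounds, loss, budget) stays the same, which is the separation Nevergrad's interface is built around.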
Features
- Multiple derivative-free optimization algorithms (evolutionary, Bayesian, bandit, genetic)
- Support for parallel execution and distributed evaluation of candidate solutions
- Budget and constraint management (e.g. total evaluations, cost limits)
- Visualization and diagnostic tools for optimization trajectories and performance
- Easy problem definition API (parameter spaces, loss functions, budgets)
- Strategy comparison framework to test and compare optimizers systematically
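The systematic-comparison idea in the last bullet can be sketched outside the library: run two derivative-free strategies on the same problem under the same budget and compare the best loss each achieves. The two optimizers below (pure random search and a greedy coordinate-step search with a shrinking step size) are hypothetical stand-ins for illustration, not Nevergrad algorithms.

```python
import random

def random_search(loss, dim, budget, rng):
    """Baseline strategy: uniform sampling over [-5, 5]^dim."""
    best = float("inf")
    for _ in range(budget):
        x = [rng.uniform(-5, 5) for _ in range(dim)]
        best = min(best, loss(x))
    return best

def coordinate_search(loss, dim, budget, rng):
    """Greedy coordinate steps; halve the step when no move improves."""
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    best, step, evals = loss(x), 1.0, 1
    while evals < budget:
        improved = False
        for i in range(dim):
            for delta in (step, -step):
                if evals >= budget:
                    break
                y = list(x)
                y[i] += delta
                fy = loss(y)
                evals += 1
                if fy < best:
                    x, best, improved = y, fy, True
        if not improved:
            step *= 0.5
    return best

# Same loss, same budget, so the comparison is apples to apples.
sphere = lambda x: sum((xi - 1.0) ** 2 for xi in x)
results = {
    "random": random_search(sphere, dim=3, budget=300, rng=random.Random(0)),
    "coordinate": coordinate_search(sphere, dim=3, budget=300, rng=random.Random(1)),
}
```

Fixing the budget and the random seeds makes such comparisons reproducible, which is the property a strategy-comparison framework needs in order to rank optimizers fairly.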