...We provide fast objective functions and gradients, and in some cases Hessians or approximations thereof. As a user, you can easily define custom loss functions; for each one, you decide whether to supply an analytic gradient or to rely on finite-difference approximation or automatic differentiation. You can also mix loss functions native to this package with your own, in which case you optimize the sum of the individual objectives (e.g., maximum likelihood plus a ridge penalty). The same applies to gradients: you may supply analytic gradients throughout, use automatic differentiation throughout, or combine analytic and automatic differentiation. ...
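
The sketch below illustrates the general pattern, not this package's actual API: a toy Gaussian maximum-likelihood objective stands in for a native loss, a user-defined ridge penalty is added on top, and the combined gradient mixes an analytic gradient (for the ML part) with automatic differentiation via JAX (for the penalty). All function names, the data, and the use of SciPy's optimizer are assumptions made for illustration.

```python
# Hypothetical sketch: sum-of-objectives optimization (ML + Ridge) with a
# mixed analytic/automatic gradient. Not this package's real API.
import jax
import jax.numpy as jnp
import numpy as np
from scipy.optimize import minimize

# Toy data for a linear-Gaussian model (stand-in for a "native" ML objective).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=100)

def neg_log_likelihood(beta):
    """Gaussian negative log-likelihood up to a constant (the 'native' part)."""
    resid = y - X @ beta
    return 0.5 * float(resid @ resid)

def neg_log_likelihood_grad(beta):
    """Analytic gradient of the negative log-likelihood."""
    return -X.T @ (y - X @ beta)

def ridge_penalty(beta, lam=0.1):
    """User-defined ridge penalty; its gradient is derived automatically."""
    return lam * jnp.sum(beta ** 2)

ridge_grad = jax.grad(ridge_penalty)  # automatic differentiation for the penalty

def objective(beta):
    # Sum of objectives: ML + Ridge.
    return neg_log_likelihood(beta) + float(ridge_penalty(jnp.asarray(beta)))

def gradient(beta):
    # Mixed gradient: analytic for the ML part, autodiff for the penalty.
    return neg_log_likelihood_grad(beta) + np.asarray(ridge_grad(jnp.asarray(beta)))

result = minimize(objective, x0=np.zeros(3), jac=gradient, method="BFGS")
print(result.x)  # estimates shrink toward zero relative to beta_true
```

Omitting the `jac` argument would make the optimizer fall back on a finite-difference approximation of the gradient, which corresponds to the third option mentioned above.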