Implementation: avhz RustQuant GradientDescent optimize
| Knowledge Sources | |
|---|---|
| Domains | Optimization, Model_Calibration |
| Last Updated | 2026-02-07 20:00 GMT |
Overview
A concrete tool for gradient descent optimization using the automatic differentiation provided by the RustQuant math crate.
Description
The GradientDescent struct provides a complete optimization loop that integrates with the autodiff system. Each iteration creates a fresh Graph, evaluates the objective function, computes the gradient via accumulate() and wrt(), and then takes a fixed step against the gradient, updating the parameter vector as x ← x − learning_rate · ∇f(x). The result is returned as a GradientDescentResult containing the minimizer, the minimum value, the iteration count, and the elapsed wall-clock time.
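The loop described above can be sketched in plain Rust. This is an illustrative sketch, not the RustQuant source: the gradient is supplied analytically here (RustQuant computes it with reverse-mode autodiff instead), and the function name and signature are hypothetical.

```rust
// Hypothetical sketch of a gradient descent loop: fixed-step updates
// with a gradient-norm stopping test, mirroring the behavior described
// above. Returns the final point and the number of iterations taken.
fn gradient_descent(
    grad: impl Fn(&[f64]) -> Vec<f64>, // gradient of the objective
    x0: &[f64],
    learning_rate: f64,
    max_iterations: usize,
    tolerance: f64,
) -> (Vec<f64>, usize) {
    let mut x = x0.to_vec();
    for iter in 0..max_iterations {
        let g = grad(&x);
        // Convergence test: stop when the gradient norm falls below tolerance.
        let norm = g.iter().map(|gi| gi * gi).sum::<f64>().sqrt();
        if norm < tolerance {
            return (x, iter);
        }
        // Fixed-step update: x <- x - learning_rate * grad(x)
        for (xi, gi) in x.iter_mut().zip(&g) {
            *xi -= learning_rate * gi;
        }
    }
    (x, max_iterations)
}

fn main() {
    // Minimize f(x) = x^2 (gradient 2x), starting from x = 10.
    let (xmin, iters) =
        gradient_descent(|x| vec![2.0 * x[0]], &[10.0], 0.1, 1000, 1e-6);
    println!("x* = {:.6} after {} iterations", xmin[0], iters);
}
```

The fresh-Graph-per-iteration detail matters in the real implementation because the tape would otherwise grow without bound; the sketch above has no tape, so that concern does not arise.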
Usage
Create a GradientDescent optimizer with a learning rate, max iterations, and tolerance. Define the objective function as a closure taking &[Variable] and returning Variable. Call optimize() with the function and initial guess.
Code Reference
Source Location
- Repository: RustQuant
- File: crates/RustQuant_math/src/optimization/gradient_descent.rs
- Lines: L87-218
Signature
pub struct GradientDescent {
    pub learning_rate: f64,
    pub max_iterations: usize,
    pub tolerance: Option<f64>,
}

pub struct GradientDescentResult {
    pub minimizer: Vec<f64>,
    pub minimum: f64,
    pub iterations: usize,
    pub elapsed: Duration,
}

impl GradientDescent {
    pub fn new(
        learning_rate: f64,
        max_iterations: usize,
        tolerance: Option<f64>,
    ) -> Self

    pub fn optimize<F>(
        &self,
        f: F,
        x0: &[f64],
        verbose: bool,
    ) -> GradientDescentResult
    where
        F: for<'v> Fn(&[Variable<'v>]) -> Variable<'v>,
}
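The `for<'v>` bound on optimize() means the objective must accept variables of *any* graph lifetime, since each iteration builds a fresh Graph with its own lifetime. A plain fn item always satisfies such a bound, which is why the examples below use fn items rather than closures (closures can sometimes need explicit lifetime annotations to be higher-ranked). A toy demonstration of the pattern, using a hypothetical stand-in type `Var<'v>` rather than RustQuant's `Variable`:

```rust
// Hypothetical stand-in for an autodiff variable tied to a graph lifetime.
#[derive(Clone, Copy)]
struct Var<'v>(&'v f64);

// Mirrors optimize()'s bound: f must work for ANY lifetime 'v, because
// the caller creates the short-lived variables itself on each call.
fn call_per_iteration<F>(f: F) -> f64
where
    F: for<'v> Fn(&[Var<'v>]) -> f64,
{
    let storage = [3.0];
    let vars = [Var(&storage[0])];
    f(&vars)
}

// A fn item is automatically higher-ranked over 'v.
fn square<'v>(x: &[Var<'v>]) -> f64 {
    x[0].0 * x[0].0
}

fn main() {
    println!("{}", call_per_iteration(square)); // prints 9
}
```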
Import
use RustQuant::math::GradientDescent;
use RustQuant::autodiff::Variable;
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| learning_rate | f64 | Yes | Step size for parameter updates |
| max_iterations | usize | Yes | Maximum number of gradient descent iterations |
| tolerance | Option<f64> | No | Gradient norm threshold for convergence (defaults to sqrt(EPSILON)) |
| f | Fn(&[Variable]) -> Variable | Yes | Objective function to minimize |
| x0 | &[f64] | Yes | Initial parameter guess |
| verbose | bool | Yes | Print iteration progress to stdout |
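Per the table, passing None for tolerance falls back to sqrt(EPSILON) as the convergence threshold. A quick check of what that default evaluates to in Rust (the option-handling line is a sketch of the described behavior, not the library source):

```rust
fn main() {
    // Default convergence threshold when tolerance is None:
    // the square root of f64 machine epsilon, roughly 1.49e-8.
    let default_tol = f64::EPSILON.sqrt();
    println!("{:e}", default_tol);

    // Sketch of the described fallback behavior:
    let tolerance: Option<f64> = None;
    let tol = tolerance.unwrap_or(f64::EPSILON.sqrt());
    assert_eq!(tol, default_tol);
}
```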
Outputs
| Name | Type | Description |
|---|---|---|
| minimizer | Vec<f64> | Parameter values at the minimum |
| minimum | f64 | Function value at the minimum |
| iterations | usize | Number of iterations performed |
| elapsed | Duration | Wall-clock time for the optimization |
Usage Examples
Minimize x^2
use RustQuant::math::GradientDescent;
use RustQuant::autodiff::Variable;
fn f<'v>(x: &[Variable<'v>]) -> Variable<'v> {
    x[0] * x[0]
}
let gd = GradientDescent::new(0.1, 1000, Some(0.000_001));
let result = gd.optimize(f, &[10.0], false);
println!("Minimum: {:.6}", result.minimum); // ~0.0
println!("Minimizer: {:?}", result.minimizer); // ~[0.0]
println!("Iterations: {}", result.iterations);
Booth Function (Multivariate)
use RustQuant::math::GradientDescent;
use RustQuant::autodiff::Variable;
use RustQuant::autodiff::overload::Powf;
fn booth<'v>(vars: &[Variable<'v>]) -> Variable<'v> {
    let x = vars[0];
    let y = vars[1];
    (x + 2.0 * y - 7.0).powf(2.0) + (2.0 * x + y - 5.0).powf(2.0)
}
let gd = GradientDescent::new(0.1, 1000, Some(0.000_001));
let result = gd.optimize(booth, &[5.0, 5.0], false);
println!("Minimum: {:.6}", result.minimum); // ~0.0
println!("Minimizer: {:?}", result.minimizer); // ~[1.0, 3.0]
Model Calibration (Volatility Fitting)
use RustQuant::math::GradientDescent;
use RustQuant::autodiff::Variable;
// Calibrate implied volatility to match a market price of 10.45
// (e.g. an at-the-money call: spot 100, strike 100, rate 0.05, expiry 1y).
// Values are inlined because a fn item cannot capture outer variables.
fn calibrate<'v>(params: &[Variable<'v>]) -> Variable<'v> {
    let sigma = params[0];
    // Simplified: minimize (model_price(sigma) - market_price)^2.
    // In practice, compute model_price with the full Black-Scholes
    // formula expressed in Variable arithmetic.
    let market_price = 10.45;
    let model_price = sigma * 40.0; // simplified placeholder
    let diff = model_price - market_price;
    diff * diff
}
let gd = GradientDescent::new(0.01, 5000, Some(0.000_001));
let result = gd.optimize(calibrate, &[0.30], true);
println!("Calibrated vol: {:.4}", result.minimizer[0]);