Implementation: avhz RustQuant Graph and Variable
| Knowledge Sources | |
|---|---|
| Domains | Optimization, Numerical_Methods, Automatic_Differentiation |
| Last Updated | 2026-02-07 20:00 GMT |
Overview
Concrete tool, provided by the RustQuant autodiff crate, for building reverse-mode automatic differentiation computation graphs.
Description
The Graph struct is a Wengert list implemented as a RefCell<Vec<Vertex>>. Variables are added via graph.var(value) or graph.vars(values), which return Variable structs. Variables support the standard arithmetic operators (+, -, *, /) and mathematical functions (sin, cos, exp, ln, sqrt, powf, etc.), each of which records a vertex in the graph; comparison operators only compare the stored values and record nothing.
After computing a function of Variables, call accumulate() on the result to perform the reverse-mode sweep and obtain the full adjoint vector, then call wrt() on that vector to extract the gradient with respect to specific inputs.
Usage
Import Graph and Variable when you need to compute gradients of arbitrary functions for optimization, model calibration, or sensitivity analysis.
Code Reference
Source Location
- Repository: RustQuant
- File: crates/RustQuant_autodiff/src/graph.rs (L29-187)
- File: crates/RustQuant_autodiff/src/variable.rs (L29-148)
- File: crates/RustQuant_autodiff/src/accumulate.rs (L23-52)
- File: crates/RustQuant_autodiff/src/gradient.rs (L29-72)
- File: crates/RustQuant_autodiff/src/overload.rs (L1-2005, operator overloading)
Signature
// Computation graph (Wengert List)
pub struct Graph {
pub vertices: RefCell<Vec<Vertex>>,
}
impl Graph {
pub const fn new() -> Self
pub fn with_capacity(capacity: usize) -> Self
pub fn var(&self, value: f64) -> Variable
pub fn vars<'v>(&'v self, values: &[f64]) -> Vec<Variable<'v>>
pub fn len(&self) -> usize
pub fn clear(&self)
pub fn zero(&self)
}
// Variable (node in the graph)
pub struct Variable<'v> {
pub graph: &'v Graph,
pub value: f64,
pub index: usize,
}
// Accumulate trait (reverse-mode sweep)
pub trait Accumulate {
fn accumulate(&self) -> Vec<f64>;
}
// Gradient trait (extract partial derivatives)
pub trait Gradient<T> {
fn wrt(&self, variables: T) -> Vec<f64>; // or f64 for single var
}
Import
use RustQuant::autodiff::{Graph, Variable, Accumulate, Gradient};
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| value | f64 | Yes (for var()) | Initial value of the variable |
| values | &[f64] | Yes (for vars()) | Initial values for multiple variables |
Outputs
| Name | Type | Description |
|---|---|---|
| Variable | Variable<'v> | A differentiable variable that tracks operations on the graph |
| accumulate() | Vec<f64> | Full gradient vector (all adjoints) |
| wrt(&vars) | Vec<f64> | Partial derivatives with respect to specified variables |
Usage Examples
Simple Derivative
use RustQuant::autodiff::{Graph, Accumulate, Gradient};
let graph = Graph::new();
// Create variable x = 3.0
let x = graph.var(3.0);
// Compute f(x) = x^2 + 2x + 1
let f = x * x + 2.0 * x + 1.0;
// f(3) = 9 + 6 + 1 = 16
assert_eq!(f.value, 16.0);
// f'(x) = 2x + 2
// f'(3) = 8
let gradient = f.accumulate();
let df_dx = gradient.wrt(&x);
assert_eq!(df_dx, 8.0);
Multivariate Gradient
use RustQuant::autodiff::{Graph, Accumulate, Gradient};
use RustQuant::autodiff::overload::Powf;
let graph = Graph::new();
let vars = graph.vars(&[1.0, 3.0]); // x=1, y=3
let x = vars[0];
let y = vars[1];
// Booth function: f(x,y) = (x + 2y - 7)^2 + (2x + y - 5)^2
let f = (x + 2.0 * y - 7.0).powf(2.0) + (2.0 * x + y - 5.0).powf(2.0);
let gradient = f.accumulate();
let grad = gradient.wrt(&vars);
// grad[0] = df/dx, grad[1] = df/dy; (1, 3) is Booth's global minimum, so both print as 0
println!("Gradient: [{:.4}, {:.4}]", grad[0], grad[1]);