Implementation: Alibaba MNN Express Optimizer
Metadata
| Field | Value |
|---|---|
| Source Repository | https://github.com/alibaba/MNN |
| Source File | express/Optimizer.cpp (29 lines) |
| Language | C++ |
| Namespace | MNN::Express |
| Domains | Training, Optimization |
| Last Updated | 2026-02-10 |
Summary
Optimizer provides the base implementation of the optimizer class for the MNN Express framework. This file defines the inner Parameters class for managing floating-point parameter arrays and a factory method Optimizer::create() for constructing optimizer instances. In its current form, the factory method is a stub/placeholder that returns nullptr, serving as the foundation for concrete optimizer implementations (e.g., SGD, Adam) to be built upon.
Note: This is currently a minimal scaffolding implementation. The create() factory method does not yet instantiate any concrete optimizer. Concrete optimizer subclasses are expected to override or extend this base.
Import
#include <MNN/expr/Optimizer.hpp>
I/O Contract
| Input | Output |
|---|---|
| `int n` (parameter count) | `Parameters` object owning a `float[n]` array |
| `Optimizer::Config config` | `std::shared_ptr<Optimizer>` (currently `nullptr`) |
Key Class: Optimizer
Optimizer::Parameters::Parameters(int n) (L13-17)
Optimizer::Parameters::Parameters(int n) {
MNN_ASSERT(n > 0);
mValue = new float[n];
mSize = n;
}
Allocates a raw floating-point array of size n to store optimizer parameters (e.g., learning rates, momentum values, weight arrays). The MNN_ASSERT guard ensures the allocation size is always positive.
Optimizer::Parameters::~Parameters() (L18-22)
Optimizer::Parameters::~Parameters() {
if (nullptr != mValue) {
delete[] mValue;
}
}
Deallocates the parameter array. The null check is defensive rather than strictly necessary, since `delete[]` on a null pointer is a well-defined no-op; it simply makes the "never allocated" case explicit.
Optimizer::create(Config config) (L23-26)
std::shared_ptr<Optimizer> Optimizer::create(Config config) {
// Do nothing
return nullptr;
}
Factory method intended to construct optimizer instances based on the provided configuration. Currently returns nullptr, indicating this is a placeholder for future implementation. Concrete optimizer types (SGD, Adam, etc.) are expected to be instantiated through this factory once their implementations are available.
Internal Dependencies
#include <MNN/expr/Optimizer.hpp>
#include "core/Backend.hpp"
The inclusion of core/Backend.hpp suggests the optimizer is designed to be backend-aware, potentially allowing optimization computations to be dispatched to different hardware backends (CPU, GPU, etc.).
Design Notes
- The `Parameters` inner class uses raw `new[]`/`delete[]` for memory management rather than smart pointers or standard containers, likely for direct memory control in performance-sensitive training workloads.
- The factory pattern via `Optimizer::create()` allows runtime selection of the optimizer type based on configuration, following the same pattern used throughout MNN for backend and session creation.
- The stub nature of this implementation means it serves primarily as an API contract definition, establishing the interface that concrete optimizers must fulfill.
Related Pages
- Alibaba_MNN_Express_Gradient_Optimizer_Interface -- Principles of gradient-based optimization in the MNN Express framework