Implementation: Alibaba MNN MNNConvert Info
| Field | Value |
|---|---|
| Implementation Name | MNNConvert_Info |
| Type | API Doc |
| Category | Model_Conversion_Pipeline |
| Source | tools/converter/source/common/cli.cpp:L99-140 (dumpModelInfo), tools/converter/source/common/cli.cpp:L258-290 (--info/--JsonFile parsing), tools/converter/source/common/cli.cpp:L1087-1214 (mnn2json/json2mnn) |
| External Dependencies | MNN core library (for Module::load), flatbuffers |
Summary
MNNConvert provides two model inspection modes: a quick info dump that prints model metadata to stdout, and a full JSON export that serializes the entire model structure (operators, tensors, sub-graphs) to a JSON file. Both modes operate on existing .mnn model files and are accessed through the same MNNConvert CLI.
API
Model Info Dump

```shell
MNNConvert --info --modelFile model.mnn -f MNN
```

JSON Export

```shell
MNNConvert --JsonFile output.json --modelFile model.mnn -f MNN
```

JSON to MNN (Reverse Conversion)

```shell
MNNConvert -f JSON --modelFile model.json --MNNModel output.mnn
```
Key Parameters
| Parameter | Type | Description |
|---|---|---|
| --info | flag | Dump model metadata (inputs, outputs, version, format) to stdout. Only valid with -f MNN. |
| --JsonFile | string | Export the MNN model to the specified JSON file path. Only valid with -f MNN. |
| --modelFile | string | Path to the MNN model file (.mnn) to inspect. |
| -f | string | Must be MNN for info dump or JSON export; JSON for JSON-to-MNN conversion. |
Inputs
- MNN model file (.mnn) -- the compiled MNN model to inspect
- JSON model file (.json) -- for reverse conversion from JSON back to MNN binary
Outputs
--info Mode Output
Prints to stdout:
```
Model default dimensionFormat is NCHW
Model Inputs:
[ input ]: dimensionFormat: NCHW, size: [ 1,3,224,224 ], type is float
Model Outputs:
[ output ]
Model Version: 2.8.0
```
Information reported:
- Default dimension format -- NCHW, NHWC, or NC4HW4
- Input details -- For each input: name, dimension format, shape, data type
- Output names -- Names of all model outputs
- Model version -- MNN version string (or "< 2.0.0" for legacy models)
- Metadata -- Any custom key-value metadata pairs embedded in the model
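Because the dump uses a fixed layout, it is straightforward to scrape from scripts. A minimal sketch, parsing the sample output shown above into a dict (the `parse_info_dump` helper and its regexes are hypothetical, not part of MNN):

```python
import re

def parse_info_dump(text):
    """Parse the stdout of `MNNConvert --info` into a dict (hypothetical helper)."""
    info = {"defaultFormat": None, "inputs": [], "outputs": [], "version": None}
    section = None
    for line in text.splitlines():
        line = line.strip()
        m = re.match(r"Model default dimensionFormat is (\w+)", line)
        if m:
            info["defaultFormat"] = m.group(1)
        elif line == "Model Inputs:":
            section = "inputs"
        elif line == "Model Outputs:":
            section = "outputs"
        elif line.startswith("Model Version:"):
            info["version"] = line.split(":", 1)[1].strip()
        elif line.startswith("[") and section == "inputs":
            m = re.match(r"\[ (.+?) \]: dimensionFormat: (\w+), size: \[ (.+?) \], type is (\w+)", line)
            if m:
                info["inputs"].append({
                    "name": m.group(1),
                    "format": m.group(2),
                    "shape": [int(x) for x in m.group(3).split(",")],
                    "type": m.group(4),
                })
        elif line.startswith("[") and section == "outputs":
            info["outputs"].append(line.strip("[] "))
    return info
```

Such a helper is handy in CI pipelines that want to assert on a converted model's input shapes without loading the MNN runtime.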
--JsonFile Mode Output
Produces a JSON file containing the complete FlatBuffers-serialized model structure, including:
- All operators (oplists) with their types, names, parameters, and input/output indices
- Tensor names (tensorName)
- Extra tensor descriptors (quantization info, formats)
- Sub-graphs for control flow
- Weight data (unless flag-based truncation is active)
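For a quick structural overview of an export, the oplists array can be tallied by operator type. A minimal sketch, assuming only the top-level oplists field with per-op type entries as described above (the helper itself is hypothetical):

```python
import json
from collections import Counter

def summarize_ops(json_path):
    """Count operator types in an MNNConvert JSON export.

    Assumes a top-level `oplists` array whose entries carry a `type` field,
    as in the exported model structure described above.
    """
    with open(json_path) as f:
        model = json.load(f)
    return Counter(op.get("type", "Unknown") for op in model.get("oplists", []))
```

This kind of tally is useful for spotting unexpected fallback ops (e.g. a layer that failed to fuse) after conversion.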
Internal Implementation
dumpModelInfo (cli.cpp:L99-140)
```cpp
// tools/converter/source/common/cli.cpp:L99-140
static int dumpModelInfo(const char* modelName) {
    std::vector<std::string> empty;
    std::shared_ptr<MNN::Express::Module> module(
        MNN::Express::Module::load(empty, empty, modelName));
    if (nullptr == module.get()) {
        MNN_ERROR("Load MNN from %s Failed\n", modelName);
        return 1;
    }
    auto info = module->getInfo();
    // Print default dimensionFormat (NCHW/NHWC/NC4HW4)
    // Print each input: name, dimensionFormat, shape, type
    // Print each output name
    // Print model version
    // Print metadata key-value pairs if present
    return 0;
}
```
The info dump uses MNN's Express API to load the model as a Module, then calls getInfo() to retrieve:
- info->defaultFormat -- the model's default dimension format
- info->inputNames / info->inputs -- input names and variable info (dimensions, order, type)
- info->outputNames -- output tensor names
- info->version -- model version string
- info->metaData -- custom metadata map
mnn2json (cli.cpp:L1087-1190)
```cpp
// tools/converter/source/common/cli.cpp:L1087-1190
bool Cli::mnn2json(const char* modelFile, const char* jsonFile, int flag) {
    // Read binary model file into buffer
    // If flag > 3: strip large data (conv weights, biases, blob data, etc.)
    // If flag > 4: dump each sub-graph to a separate file
    // Convert FlatBuffers binary to string using FlatBufferToString()
    // Write to output JSON file
}
```
The JSON export reads the raw MNN binary and uses FlatBuffers' FlatBufferToString() to produce a human-readable representation. When flag > 3, large tensors are cleared before export:
- Convolution weights and biases
- Blob data (for tensors with > 20 elements)
- MatMul weights and biases
- PReLU slopes
- Extra op info buffers
- LSTM weight matrices
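If an export was produced with weights included, a similar truncation can be applied after the fact to the JSON itself. The sketch below is a hypothetical post-processing helper that approximates the flag > 3 behavior by clearing numeric arrays above the same 20-element threshold; it is not the converter's own code:

```python
def strip_large_arrays(node, threshold=20):
    """Recursively clear numeric arrays longer than `threshold` elements.

    Hypothetical helper approximating the converter's flag > 3 stripping;
    leaves small arrays (shapes, strides) untouched.
    """
    if isinstance(node, dict):
        for key, value in node.items():
            if (isinstance(value, list) and len(value) > threshold
                    and all(isinstance(x, (int, float)) for x in value)):
                node[key] = []  # drop bulk weight data, keep the key
            else:
                strip_large_arrays(value, threshold)
    elif isinstance(node, list):
        for item in node:
            strip_large_arrays(item, threshold)
    return node
```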
json2mnn (cli.cpp:L1192-1214)
```cpp
// tools/converter/source/common/cli.cpp:L1192-1214
bool Cli::json2mnn(const char* jsonFile, const char* modelFile) {
    // Parse JSON using rapidjson
    // Convert JSON to FlatBuffers using Json2Flatbuffer::writeJsonToFlatbuffer()
    // Write binary output to model file
}
```
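Before feeding a hand-edited JSON file back through the converter, a cheap pre-flight check can catch obvious omissions. This is a hypothetical sketch; the required-key set below is an assumption based on the fields described in this document, and the authoritative schema is MNN's FlatBuffers definition:

```python
import json

# Assumed minimal top-level keys; the real schema is MNN's flatbuffers definition.
REQUIRED_KEYS = {"oplists", "tensorName"}

def sanity_check_model_json(path):
    """Raise ValueError if an edited model JSON lacks assumed top-level keys."""
    with open(path) as f:
        model = json.load(f)
    missing = REQUIRED_KEYS - model.keys()
    if missing:
        raise ValueError(f"missing top-level keys: {sorted(missing)}")
    return True
```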
CLI Argument Routing (cli.cpp:L258-290)
The --info and --JsonFile arguments are processed in initializeMNNConvertArgs():
```cpp
// tools/converter/source/common/cli.cpp:L439-452
if (result.count("MNNModel")) {
    // Standard conversion output
} else if (result.count("JsonFile")) {
    modelPath.mnn2json = true;
    modelPath.MNNModel = JsonFilePath;
} else if (result.count("info") && modelPath.model == modelConfig::MNN) {
    modelPath.dumpInfo = true;
    return true;
}
```
In convertModel(), the dumpInfo flag triggers the info dump path:
```cpp
// tools/converter/source/common/cli.cpp:L673-677
bool Cli::convertModel(modelConfig& modelPath) {
    if (modelPath.dumpInfo) {
        dumpModelInfo(modelPath.modelFile.c_str());
        return true;
    }
    // ... normal conversion flow
}
```
Usage Examples
Quick Model Inspection
```shell
./MNNConvert -f MNN --modelFile resnet18.mnn --info
```
Example output:
```
Model default dimensionFormat is NCHW
Model Inputs:
[ input.1 ]: dimensionFormat: NCHW, size: [ 1,3,224,224 ], type is float
Model Outputs:
[ 497 ]
Model Version: 2.8.0
```
Full JSON Export
```shell
./MNNConvert -f MNN --modelFile resnet18.mnn --JsonFile resnet18.json
```
JSON Export Without Weights (Structure Only)
The mnn2json function's flag parameter controls weight inclusion. This is accessible programmatically via the C++ API:
```cpp
// flag > 3 strips weights for readability
MNN::Cli::mnn2json("model.mnn", "model_structure.json", 4);
// flag > 4 additionally separates sub-graphs into individual files
MNN::Cli::mnn2json("model.mnn", "model_structure.json", 5);
```
Convert JSON Back to MNN
```shell
./MNNConvert -f JSON --modelFile model.json --MNNModel model_rebuilt.mnn
```
This enables a workflow where models can be exported to JSON, manually edited (e.g., adjusting parameters or removing ops), and then re-serialized to MNN binary format.
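As an example of the edit step, dropping an operator from the exported JSON before re-serializing might look like the sketch below (remove_op_by_name is a hypothetical helper; removing a real op usually also requires rewiring the surrounding input/output indices):

```python
def remove_op_by_name(model, op_name):
    """Drop an operator from `oplists` by name.

    Hypothetical edit helper: in a real graph the removed op's input/output
    tensor indices must be reconnected by hand afterwards.
    """
    model["oplists"] = [op for op in model.get("oplists", [])
                        if op.get("name") != op_name]
    return model
```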
Practical Validation Workflow
A recommended post-conversion validation sequence:
```shell
# 1. Convert the model
./MNNConvert -f ONNX --modelFile model.onnx --MNNModel model.mnn --bizCode MNN
# 2. Inspect model metadata
./MNNConvert -f MNN --modelFile model.mnn --info
# 3. Export full structure for review (optional)
./MNNConvert -f MNN --modelFile model.mnn --JsonFile model_debug.json
# 4. Run numerical verification
python tools/script/testMNNFromOnnx.py model.onnx
```