
Implementation:Onnx Onnx Helper Make Graph

From Leeroopedia


Knowledge Sources
Domains Model_Construction, Computation_Graph
Last Updated 2026-02-10 00:00 GMT

Overview

A concrete tool from the ONNX helper module for assembling computation-graph protobuf messages.

Description

The make_graph function constructs a GraphProto protobuf message by combining operator nodes, input/output specifications, and optional constant initializers into a complete computation graph. It is the central composition step in ONNX model construction, connecting the outputs of make_node and make_tensor_value_info into a single coherent structure. The function also supports sparse initializers and intermediate value information for shape inference.

Usage

Call this function after creating all operator nodes (via make_node) and tensor specifications (via make_tensor_value_info). The resulting GraphProto is passed directly to make_model to produce the final ONNX model.

Code Reference

Source Location

  • Repository: onnx
  • File: onnx/helper.py
  • Lines: 200-240

Signature

def make_graph(
    nodes: Sequence[NodeProto],
    name: str,
    inputs: Sequence[ValueInfoProto],
    outputs: Sequence[ValueInfoProto],
    initializer: Sequence[TensorProto] | None = None,
    doc_string: str | None = None,
    value_info: Sequence[ValueInfoProto] | None = None,
    sparse_initializer: Sequence[SparseTensorProto] | None = None,
) -> GraphProto:
    """Construct a GraphProto.

    Args:
        nodes: List of NodeProto (computation operations).
        name: Graph name string.
        inputs: List of ValueInfoProto for graph inputs.
        outputs: List of ValueInfoProto for graph outputs.
        initializer: Optional list of TensorProto for constant weights.
        doc_string: Optional graph documentation.
        value_info: Optional list of ValueInfoProto for intermediate values.
        sparse_initializer: Optional list of SparseTensorProto.
    """

Import

from onnx import helper

I/O Contract

Inputs

Name Type Required Description
nodes Sequence[NodeProto] Yes List of computation operator nodes
name str Yes Graph name identifier
inputs Sequence[ValueInfoProto] Yes Graph input tensor specifications
outputs Sequence[ValueInfoProto] Yes Graph output tensor specifications
initializer Sequence[TensorProto] or None No Constant weight tensors (default: [])
doc_string str or None No Optional documentation string
value_info Sequence[ValueInfoProto] or None No Intermediate value type information
sparse_initializer Sequence[SparseTensorProto] or None No Sparse constant tensors (default: [])

Outputs

Name Type Description
return GraphProto Complete computation graph protobuf message

Usage Examples

Simple Linear Graph

from onnx import helper, TensorProto

# 1. Define inputs and outputs
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, 3])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, 3])

# 2. Create operator node
relu_node = helper.make_node("Relu", ["X"], ["Y"])

# 3. Assemble the graph
graph = helper.make_graph(
    [relu_node],         # nodes
    "simple_relu",       # name
    [X],                 # inputs
    [Y],                 # outputs
)

Graph with Initializers

import numpy as np
from onnx import helper, TensorProto, numpy_helper

# Create weight tensor as initializer
weight_data = np.random.randn(10, 784).astype(np.float32)
W = numpy_helper.from_array(weight_data, name="W")

# Define specs
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, 784])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, 10])

# Create MatMul node
matmul = helper.make_node("MatMul", ["X", "W"], ["Y"])

# Assemble with initializer
graph = helper.make_graph(
    [matmul],
    "linear_model",
    [X],
    [Y],
    initializer=[W],
)
