
Implementation:Tensorflow Serving Mnist Training Example

From Leeroopedia
Knowledge Sources
Domains Deep_Learning, Training
Last Updated 2026-02-13 17:00 GMT

Overview

A concrete implementation, provided by the TensorFlow Serving example scripts, for training a softmax regression model on MNIST data.

Description

The main() function in mnist_saved_model.py implements a complete MNIST training pipeline using TensorFlow v1 APIs. It creates an interactive session, defines a softmax regression graph with cross-entropy loss, trains via gradient descent for a configurable number of iterations, and prepares the session with trained weights for subsequent export.
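The forward pass and loss that this graph computes can be written out in plain NumPy. The sketch below is an illustrative re-implementation, not code from the example script; the shapes ([784, 10] weights, zero initialization) and the 0.01 learning rate mirror the TF graph described above, and the toy batch is an assumption made so the snippet runs standalone.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Toy batch: 4 flattened 28x28 images, 10 classes (shapes match the TF graph).
rng = np.random.default_rng(0)
x = rng.random((4, 784)).astype(np.float32)
y_true = np.eye(10, dtype=np.float32)[rng.integers(0, 10, size=4)]

w = np.zeros((784, 10), dtype=np.float32)  # mirrors tf.zeros([784, 10])
b = np.zeros(10, dtype=np.float32)

y = softmax(x @ w + b)              # tf.nn.softmax(tf.matmul(x, w) + b)
loss = -np.sum(y_true * np.log(y))  # -reduce_sum(y_ * log(y))

# One gradient-descent step (lr = 0.01); y - y_true is the closed-form
# gradient of this summed cross-entropy with respect to the logits.
grad_logits = y - y_true
w -= 0.01 * (x.T @ grad_logits)
b -= 0.01 * grad_logits.sum(axis=0)

loss_after = -np.sum(y_true * np.log(softmax(x @ w + b)))
```

At zero initialization the softmax is uniform, so the initial loss is exactly `batch_size * log(10)`; a single step already reduces it.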

Usage

Use this implementation when you need to train a simple classification model as a demonstration of the TensorFlow Serving export pipeline. This is the canonical example showing how to produce a trained TensorFlow session that can be serialized with SavedModelBuilder.

Code Reference

Source Location

  • Repository: tensorflow/serving
  • File: tensorflow_serving/example/mnist_saved_model.py
  • Lines: L49-93

Signature

def main(_):
    """
    Train a softmax regression model on MNIST data.

    Uses tf.compat.v1 APIs:
      - tf.compat.v1.InteractiveSession()
      - tf.nn.softmax(tf.matmul(x, w) + b)
      - tf.compat.v1.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)

    CLI Flags:
      --training_iteration: int (default 1000) - number of gradient descent steps
      --model_version: int (default 1) - version number for export
      --work_dir: str (default '/tmp') - MNIST data directory
    """

Import

import tensorflow as tf
from tensorflow.python.ops import lookup_ops
import mnist_input_data

I/O Contract

Inputs

Name               | Type                        | Required | Description
MNIST dataset      | Datasets (mnist_input_data) | Yes      | Downloaded via read_data_sets(); 55k train, 5k validation, 10k test images
training_iteration | int                         | No       | CLI flag, default 1000; number of gradient descent iterations
model_version      | int                         | No       | CLI flag, default 1; version number for export path
work_dir           | str                         | No       | CLI flag, default '/tmp'; working directory for MNIST data

Outputs

Name               | Type                            | Description
sess               | tf.compat.v1.InteractiveSession | Trained TF session with optimized weights
w                  | tf.Variable                     | Weight matrix of shape [784, 10]
b                  | tf.Variable                     | Bias vector of shape [10]
y                  | tf.Tensor                       | Softmax prediction tensor (name='y')
x                  | tf.Tensor                       | Input placeholder tensor (name='x')
prediction_classes | tf.Tensor                       | String tensor of predicted class labels
values             | tf.Tensor                       | Top-k probability values
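The prediction_classes and values outputs come from a top-k over the softmax output followed by an index-to-string lookup. A NumPy sketch of that post-processing (hypothetical helper code, not taken from the script; the toy probability row is an assumption):

```python
import numpy as np

# Toy softmax output for one image; in the real graph this is the tensor y.
probs = np.array([[0.05, 0.01, 0.6, 0.02, 0.02, 0.2, 0.03, 0.03, 0.02, 0.02]])

k = 3
indices = np.argsort(-probs, axis=1)[:, :k]          # like tf.nn.top_k(y, k)
values = np.take_along_axis(probs, indices, axis=1)  # top-k probabilities

# The lookup table maps class indices to the label strings '0'..'9'.
labels = np.array([str(i) for i in range(10)])
prediction_classes = labels[indices]
```

For this toy row the top three classes are '2', '5', and '0' with probabilities 0.6, 0.2, and 0.05.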

Usage Examples

Training MNIST Model

# Train model with default 1000 iterations, export as version 1
python tensorflow_serving/example/mnist_saved_model.py /tmp/mnist_model

# Train with custom iterations and version
python tensorflow_serving/example/mnist_saved_model.py \
    --training_iteration=2000 \
    --model_version=2 \
    /tmp/mnist_model

Programmatic Training (extracted logic)

import tensorflow as tf
import mnist_input_data

# 1. Load MNIST data
mnist = mnist_input_data.read_data_sets('/tmp', one_hot=True)

# 2. Create session and define model (graph mode is required for placeholders
#    when running under TensorFlow 2.x)
tf.compat.v1.disable_eager_execution()
sess = tf.compat.v1.InteractiveSession()
x = tf.compat.v1.placeholder(tf.float32, shape=[None, 784], name='x')
w = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
sess.run(tf.compat.v1.global_variables_initializer())
y = tf.nn.softmax(tf.matmul(x, w) + b, name='y')

# 3. Define loss and optimizer
y_ = tf.compat.v1.placeholder('float', shape=[None, 10])
cross_entropy = -tf.math.reduce_sum(y_ * tf.math.log(y))
train_step = tf.compat.v1.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)

# 4. Train
for _ in range(1000):
    batch = mnist.train.next_batch(50)
    train_step.run(feed_dict={x: batch[0], y_: batch[1]})
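The same loop can be sanity-checked without TensorFlow or the MNIST download. The NumPy sketch below is an assumption-laden stand-in: a toy 2-feature, 3-class dataset replaces the 784-pixel, 10-class MNIST task, but the batch size (50), learning rate (0.01), and summed cross-entropy update rule match the code above, and the loss falls over the run.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Toy linearly separable data standing in for MNIST (an assumption).
x_all = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]] * 40)
y_all = np.eye(3)[[0, 1, 2] * 40]

w = np.zeros((2, 3))
b = np.zeros(3)

def total_loss():
    return -np.sum(y_all * np.log(softmax(x_all @ w + b) + 1e-12))

first = total_loss()
for _ in range(100):                        # mirrors the training_iteration loop
    i = rng.integers(0, len(x_all), 50)     # like mnist.train.next_batch(50)
    xb, yb = x_all[i], y_all[i]
    g = softmax(xb @ w + b) - yb            # gradient of summed cross-entropy
    w -= 0.01 * (xb.T @ g)                  # GradientDescentOptimizer(0.01)
    b -= 0.01 * g.sum(axis=0)
last = total_loss()
```

At zero initialization the loss is `num_examples * log(3)`; after 100 SGD steps it is substantially lower.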

