
Implementation:Dotnet Machinelearning AutoMLExperiment Configuration

From Leeroopedia


Knowledge Sources
Domains: Machine_Learning, AutoML
Last Updated: 2026-02-09 00:00 GMT

Overview

A concrete tool for configuring an AutoML experiment through the fluent builder API provided by ML.NET AutoML.

Description

The AutoMLExperiment class and its extension methods provide a fluent builder API for configuring all aspects of an AutoML search. The AutoCatalog.CreateExperiment() factory method instantiates the experiment. Each subsequent configuration method (SetTrainingTimeInSeconds, SetBinaryClassificationMetric, SetDataset, SetPipeline, SetTuner) returns the same AutoMLExperiment instance, enabling method chaining. The experiment accumulates configuration state and validates it when execution begins.

Usage

Add a using directive for Microsoft.ML.AutoML, construct a sweepable pipeline, and load the training and validation datasets. Then chain all configuration calls in a single fluent expression. The experiment must have a time budget, a metric, a dataset, and a pipeline set before execution begins.

Code Reference

Source Location

  • Repository: ML.NET
  • File: src/Microsoft.ML.AutoML/API/AutoCatalog.cs (Lines 308-313 for CreateExperiment)
  • File: src/Microsoft.ML.AutoML/AutoMLExperiment/AutoMLExperiment.cs (Lines 77-89 for SetTrainingTimeInSeconds)
  • File: src/Microsoft.ML.AutoML/API/AutoMLExperimentExtension.cs (Lines 86-90 for SetBinaryClassificationMetric, Lines 141-151 for SetPipeline, Lines 30-45 for SetDataset)

Signature

// Create a new AutoML experiment
public AutoMLExperiment CreateExperiment(AutoMLExperiment.AutoMLExperimentSettings settings = null)

// Set the maximum training time in seconds
public AutoMLExperiment SetTrainingTimeInSeconds(uint trainingTimeInSeconds)

// Set the optimization metric for binary classification
public static AutoMLExperiment SetBinaryClassificationMetric(
    this AutoMLExperiment experiment,
    BinaryClassificationMetric metric,
    string labelColumn = "Label",
    string predictedColumn = "PredictedLabel")

// Set the training and validation datasets
public static AutoMLExperiment SetDataset(
    this AutoMLExperiment experiment,
    IDataView train,
    IDataView validation,
    bool subSamplingTrainDataset = false)

// Set the sweepable pipeline to search
public static AutoMLExperiment SetPipeline(
    this AutoMLExperiment experiment,
    SweepablePipeline pipeline)

// Set the tuner algorithm for hyperparameter search
public AutoMLExperiment SetTuner<TTuner>()
    where TTuner : class, ITuner

Import

using Microsoft.ML.AutoML;

I/O Contract

Inputs

  • trainingTimeInSeconds (uint, required): Maximum wall-clock time for the experiment, in seconds.
  • metric (BinaryClassificationMetric, required): Optimization metric (Accuracy, AreaUnderRocCurve, AreaUnderPrecisionRecallCurve, F1Score, PositivePrecision, PositiveRecall, NegativePrecision, NegativeRecall).
  • labelColumn (string, optional, default: "Label"): Name of the label column.
  • predictedColumn (string, optional, default: "PredictedLabel"): Name of the predicted label column.
  • train (IDataView, required): Training dataset.
  • validation (IDataView, required): Validation dataset used for metric evaluation.
  • subSamplingTrainDataset (bool, optional, default: false): Whether to sub-sample the training data for faster early trials.
  • pipeline (SweepablePipeline, required): The pipeline search space to explore.
  • settings (AutoMLExperimentSettings, optional): Initial settings object passed to CreateExperiment.

Outputs

  • (return) (AutoMLExperiment): The same experiment instance (fluent builder), fully configured and ready for execution.

Usage Examples

Basic Example

var mlContext = new MLContext();

// Split data (an IDataView loaded earlier, e.g. with mlContext.Data.LoadFromTextFile) into train/validation sets
var trainTestSplit = mlContext.Data.TrainTestSplit(data, testFraction: 0.2);

// Build the sweepable pipeline
var pipeline = mlContext.Auto()
    .Featurizer(trainTestSplit.TrainSet, outputColumnName: "Features")
    .Append(mlContext.Auto().BinaryClassification(
        labelColumnName: "Label",
        featureColumnName: "Features"));

// Configure the AutoML experiment using the fluent API
var experiment = mlContext.Auto()
    .CreateExperiment()
    .SetTrainingTimeInSeconds(120)
    .SetBinaryClassificationMetric(BinaryClassificationMetric.AreaUnderRocCurve, labelColumn: "Label")
    .SetDataset(trainTestSplit.TrainSet, trainTestSplit.TestSet)
    .SetPipeline(pipeline);
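The configured experiment is executed asynchronously, a step this page otherwise stops short of showing. A minimal sketch of running the experiment from the example above and consuming the result, assuming the RunAsync method and the Model / Metric members of TrialResult from the Microsoft.ML.AutoML API:

```csharp
// Run the search; returns once the time budget is exhausted
TrialResult result = await experiment.RunAsync();

// The best pipeline found, fitted and ready for scoring
ITransformer bestModel = result.Model;
Console.WriteLine($"Best validation metric: {result.Metric}");

// Persist the winning model for later use
mlContext.Model.Save(bestModel, trainTestSplit.TrainSet.Schema, "model.zip");
```

RunAsync must be awaited inside an async method (or bridged with .GetAwaiter().GetResult() in synchronous code).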

With Custom Tuner Example

// Configure with a specific tuner algorithm
var experiment = mlContext.Auto()
    .CreateExperiment()
    .SetTrainingTimeInSeconds(300)
    .SetBinaryClassificationMetric(BinaryClassificationMetric.F1Score)
    .SetDataset(trainData, validationData)
    .SetPipeline(pipeline)
    .SetTuner<GridSearchTuner>();
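In addition to the fixed time budget, the search can be stopped cooperatively: RunAsync accepts a CancellationToken. A hedged sketch using a timeout token with the experiment above; the exact behavior on cancellation (returning the best trial found so far versus surfacing an OperationCanceledException) can depend on timing, so both paths are handled:

```csharp
// Stop the search after 5 minutes of wall-clock time
using var cts = new CancellationTokenSource(TimeSpan.FromMinutes(5));

try
{
    TrialResult result = await experiment.RunAsync(cts.Token);
    Console.WriteLine($"Best metric so far: {result.Metric}");
}
catch (OperationCanceledException)
{
    // No trial completed before the token fired
    Console.WriteLine("Search cancelled before any trial finished.");
}
```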

Related Pages

Implements Principle

Requires Environment

Uses Heuristic
