
Implementation:Mistralai Client python Embeddings Create

From Leeroopedia
Knowledge Sources
Domains NLP, Embeddings, Semantic_Search
Last Updated 2026-02-15 14:00 GMT

Overview

A concrete tool for generating text embeddings via the Mistral API, provided by the Embeddings resource of the Python client.

Description

The Embeddings.create() and Embeddings.create_async() methods send text inputs to the Mistral embedding model and return dense vector representations. Both methods accept a single string or a list of strings as inputs, along with the embedding model name. Optional parameters include output_dimension (for dimensionality reduction), output_dtype, and encoding_format. The response is an EmbeddingResponse containing a data list of embedding objects and token usage information.
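As a sketch of how the optional parameters compose, the helper below wraps a create() call; the client argument is assumed to be a configured mistralai.Mistral instance, and the choice to expose only output_dimension is illustrative, not part of the documented API surface.

```python
from typing import List, Optional, Union

def embed(client, texts: Union[str, List[str]],
          output_dimension: Optional[int] = None) -> List[List[float]]:
    """Request embeddings, optionally reducing dimensionality.

    `client` is assumed to be a configured mistralai.Mistral instance.
    """
    response = client.embeddings.create(
        model="mistral-embed",
        inputs=texts,
        # None keeps the model's native dimension; an int requests a reduced one
        output_dimension=output_dimension,
    )
    return [item.embedding for item in response.data]
```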

Usage

Call client.embeddings.create() with one or more text strings. Use the mistral-embed model for general-purpose text embeddings.
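Large corpora are usually embedded batch by batch, since a single request with a very long input list can exceed request limits. A minimal chunking sketch (the batch size of 128 is an assumption, not a documented limit; client is assumed to be a configured mistralai.Mistral instance):

```python
from typing import Iterator, List

def chunked(texts: List[str], size: int = 128) -> Iterator[List[str]]:
    """Yield successive fixed-size batches from a list of texts."""
    for start in range(0, len(texts), size):
        yield texts[start:start + size]

def embed_corpus(client, texts: List[str], size: int = 128) -> List[List[float]]:
    """Embed a corpus one batch at a time, preserving input order."""
    vectors: List[List[float]] = []
    for batch in chunked(texts, size):
        response = client.embeddings.create(model="mistral-embed", inputs=batch)
        vectors.extend(item.embedding for item in response.data)
    return vectors
```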

Code Reference

Source Location

  • Repository: client-python
  • File: src/mistralai/client/embeddings.py
  • Lines: L20-129 (sync), L131-240 (async)

Signature

class Embeddings:
    def create(
        self,
        *,
        model: str,
        inputs: Union[str, List[str]],
        metadata: Optional[Dict] = None,
        output_dimension: Optional[int] = None,
        output_dtype: Optional[EmbeddingDtype] = None,
        encoding_format: Optional[EncodingFormat] = None,
    ) -> EmbeddingResponse:
        ...

    async def create_async(
        self,
        *,
        model: str,
        inputs: Union[str, List[str]],
        # Same optional parameters as create()
    ) -> EmbeddingResponse:
        ...

Import

from mistralai import Mistral
# Access via: client.embeddings.create(...)

I/O Contract

Inputs

Name Type Required Description
model str Yes Embedding model ID (e.g., "mistral-embed")
inputs Union[str, List[str]] Yes Text(s) to embed
output_dimension Optional[int] No Custom embedding dimension
output_dtype Optional[EmbeddingDtype] No Output vector data type
encoding_format Optional[EncodingFormat] No Output encoding format

Outputs

Name Type Description
response EmbeddingResponse Contains data list and usage info
response.data[i].embedding List[float] Embedding vector for i-th input
response.usage UsageInfo Token consumption statistics

Usage Examples

Generate Embeddings

import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Single text
response = client.embeddings.create(
    model="mistral-embed",
    inputs="What is machine learning?",
)
vector = response.data[0].embedding
print(f"Dimension: {len(vector)}")

# Batch of texts
response = client.embeddings.create(
    model="mistral-embed",
    inputs=[
        "Machine learning is a subset of AI.",
        "Deep learning uses neural networks.",
        "The weather is sunny today.",
    ],
)
for i, item in enumerate(response.data):
    print(f"Text {i}: {len(item.embedding)} dimensions")

Compute Similarity

import numpy as np

def cosine_similarity(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

response = client.embeddings.create(
    model="mistral-embed",
    inputs=["cat", "dog", "computer"],
)
vecs = [item.embedding for item in response.data]
print(f"cat-dog similarity: {cosine_similarity(vecs[0], vecs[1]):.3f}")
print(f"cat-computer similarity: {cosine_similarity(vecs[0], vecs[2]):.3f}")
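The same cosine similarity extends to a small retrieval routine. The vectors below are synthetic stand-ins for response.data[i].embedding values, so the ranking logic can be illustrated without a network call:

```python
import numpy as np

def top_k(query_vec, doc_vecs, k=2):
    """Return indices of the k most similar documents by cosine similarity."""
    q = np.asarray(query_vec, dtype=float)
    docs = np.asarray(doc_vecs, dtype=float)
    # Normalize rows so a plain dot product equals cosine similarity
    q = q / np.linalg.norm(q)
    docs = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    scores = docs @ q
    # Highest-scoring indices first
    return np.argsort(scores)[::-1][:k].tolist()

# Synthetic 2-D vectors standing in for real embeddings
docs = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]]
print(top_k([1.0, 0.05], docs, k=2))  # → [0, 1]
```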

