
Implementation: groq-python Embeddings Create

From Leeroopedia
Knowledge Sources
Domains: NLP, Embeddings
Last Updated: 2026-02-15 16:00 GMT

Overview

A concrete tool, provided by the Groq Python SDK, for generating text embedding vectors.

Description

The Embeddings.create() method POSTs to /openai/v1/embeddings with input text and model identifier. It returns a CreateEmbeddingResponse containing a list of Embedding objects, each with a float vector and position index, plus usage statistics.

Usage

Access the method via client.embeddings.create(), providing input (a string or a list of strings) and a model identifier.

Code Reference

Source Location

  • Repository: groq-python
  • File: src/groq/resources/embeddings.py
  • Lines: L47-100 (sync), L123-176 (async)

Signature

class Embeddings(SyncAPIResource):
    def create(
        self,
        *,
        input: Union[str, SequenceNotStr[str]],
        model: Union[str, Literal["nomic-embed-text-v1_5"]],
        encoding_format: Literal["float", "base64"] | Omit = omit,
        user: Optional[str] | Omit = omit,
    ) -> CreateEmbeddingResponse:

Import

from groq import Groq
# Access via: client.embeddings.create(...)

I/O Contract

Inputs

  • input (Union[str, List[str]], required): text(s) to embed.
  • model (str or Literal["nomic-embed-text-v1_5"], required): embedding model ID.
  • encoding_format (Literal["float", "base64"], optional): output vector format.
  • user (Optional[str], optional): end-user identifier for abuse monitoring.

Outputs

  • (return) CreateEmbeddingResponse: object with data (List[Embedding]), model, and usage fields.
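When encoding_format="base64" is requested, OpenAI-compatible embeddings endpoints return each vector as a base64 string encoding the raw little-endian float32 array rather than a JSON list of floats; the sketch below shows how such a payload could be decoded. This is a minimal sketch assuming that wire format; the sample string is a locally constructed stand-in, not real API output.

```python
import base64
import struct


def decode_embedding(b64: str) -> list[float]:
    """Decode a base64-encoded little-endian float32 array into a list of floats."""
    raw = base64.b64decode(b64)
    count = len(raw) // 4  # each float32 occupies 4 bytes
    return list(struct.unpack(f"<{count}f", raw))


# Stand-in payload: what a 3-dimensional embedding would look like on the wire.
sample = base64.b64encode(struct.pack("<3f", 0.1, -0.2, 0.3)).decode()
vector = decode_embedding(sample)
print(f"Decoded {len(vector)} floats")
```

With encoding_format="float" (or omitted), no decoding is needed and response.data[i].embedding is already a list of floats.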

Usage Examples

Single Text Embedding

from groq import Groq

client = Groq()

response = client.embeddings.create(
    input="What is machine learning?",
    model="nomic-embed-text-v1_5",
)

vector = response.data[0].embedding
print(f"Dimension: {len(vector)}")
print(f"First 5 values: {vector[:5]}")

Batch Embedding

from groq import Groq

client = Groq()

response = client.embeddings.create(
    input=["hello world", "goodbye world"],
    model="nomic-embed-text-v1_5",
    encoding_format="float",
)

for emb in response.data:
    print(f"Index {emb.index}: dim={len(emb.embedding)}")

print(f"Tokens used: {response.usage.total_tokens}")
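Embedding vectors are typically compared with cosine similarity. A stdlib-only sketch follows; vec_a and vec_b are hypothetical stand-ins for values such as response.data[0].embedding and response.data[1].embedding, not real model output.

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Hypothetical stand-ins for two embedding vectors returned by the API.
vec_a = [0.1, 0.3, -0.2]
vec_b = [0.2, 0.1, -0.4]
score = cosine_similarity(vec_a, vec_b)
print(f"cosine similarity: {score:.3f}")
```

Scores near 1.0 indicate semantically similar texts; real embeddings have many more dimensions, but the formula is unchanged.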

Related Pages

Implements Principle

Requires Environment
