
Implementation:Liu00222 Open Prompt Injection create model

From Leeroopedia
Knowledge Sources
Domains NLP, LLM, Model_Loading
Last Updated 2026-02-14 15:00 GMT

Overview

A concrete factory function, provided by the OpenPromptInjection models module, for creating LLM wrapper instances.

Description

The create_model function dispatches to the correct Model subclass (GPT, PaLM2, Flan, Llama, Llama3, Vicuna, DeepSeekWrapper, Internlm) based on the `model_info.provider` key in the config dictionary. Each wrapper implements a unified `.query(prompt)` interface that handles provider-specific API calls, tokenization, generation, and response extraction.
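The dispatch described above can be sketched as a provider-to-class registry. The class names mirror the wrappers listed, but the stub classes, the registry dict, the constructor signature, and the `ValueError` on an unknown provider are all illustrative assumptions, not the library's actual implementation.

```python
# Minimal sketch of the dispatch pattern, with stubs standing in
# for the real provider wrappers. Illustrative only.

class Model:
    def __init__(self, config):
        info = config["model_info"]
        self._name, self._provider = info["name"], info["provider"]

    def query(self, prompt):
        raise NotImplementedError  # real wrappers call the provider API


class GPT(Model): pass
class PaLM2(Model): pass
class Flan(Model): pass
class Llama(Model): pass
class Llama3(Model): pass
class Vicuna(Model): pass
class DeepSeekWrapper(Model): pass
class Internlm(Model): pass


# Assumed provider-key -> wrapper-class mapping, following the order
# in which the providers and classes are listed above.
_PROVIDERS = {
    "openai": GPT,
    "google": PaLM2,
    "flan": Flan,
    "llama": Llama,
    "llama3": Llama3,
    "vicuna": Vicuna,
    "deepseek": DeepSeekWrapper,
    "internlm": Internlm,
}


def create_model(config):
    provider = config["model_info"]["provider"]
    if provider not in _PROVIDERS:
        raise ValueError(f"Unknown provider: {provider}")
    return _PROVIDERS[provider](config)
```

A registry dict keeps the factory flat: adding a new provider means one wrapper class and one dict entry, with no change to the dispatch logic.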

Usage

Import this function when setting up an experiment to construct the LLM used for querying. It requires a config dict loaded via `open_config`, with `api_key_info.api_key_use` set to the index of the API key to use.
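For orientation, the config dict expected here has the shape below. The key names follow the signature's docstring; the concrete values (model name, parameters, keys) are placeholders, not defaults from the library.

```python
# Illustrative shape of the config dict consumed by create_model.
# Values are placeholders; key names follow the documented signature.
model_config = {
    "model_info": {
        "provider": "openai",        # one of the documented provider keys
        "name": "gpt-3.5-turbo",     # placeholder model identifier
    },
    "params": {                      # model-specific parameters (assumed examples)
        "temperature": 0.1,
        "max_output_tokens": 100,
    },
    "api_key_info": {
        "api_keys": ["sk-..."],      # placeholder key list
        "api_key_use": 0,            # index into api_keys
    },
}
```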

Code Reference

Source Location

Signature

def create_model(config):
    """
    Factory function to create a model wrapper.

    Args:
        config (dict): Model configuration with keys:
            - model_info.provider (str): One of 'openai', 'google', 'flan',
              'llama', 'llama3', 'vicuna', 'deepseek', 'internlm'
            - model_info.name (str): Model identifier
            - params: Model-specific parameters
            - api_key_info: API key configuration
    Returns:
        Model: Subclass instance with .query(prompt) -> str method
    """

Import

import OpenPromptInjection as PI
# or
from OpenPromptInjection import create_model

I/O Contract

Inputs

Name   | Type | Required | Description
config | dict | Yes      | Model config with `model_info.provider`, `model_info.name`, `params`, `api_key_info`

Outputs

Name  | Type  | Description
model | Model | Instance with `.query(prompt: str) -> str` method, `.print_model_info()`, `.name()`, `.provider()`
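To make the contract above concrete, a stub that satisfies the listed interface might look like the following. The stub and its canned response are illustrative only; the real wrappers perform provider-specific API calls inside `query`.

```python
class StubModel:
    """Illustrative stand-in satisfying the documented Model interface."""

    def __init__(self, provider, name):
        self._provider = provider
        self._name = name

    def query(self, prompt: str) -> str:
        # Real wrappers call the provider's API here; this stub
        # returns a canned placeholder response.
        return f"[{self._provider}] response to: {prompt}"

    def print_model_info(self):
        print(f"provider={self._provider}, name={self._name}")

    def name(self) -> str:
        return self._name

    def provider(self) -> str:
        return self._provider


m = StubModel("openai", "gpt-3.5-turbo")
```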

Usage Examples

Creating an OpenAI GPT Model

import OpenPromptInjection as PI
from OpenPromptInjection.utils import open_config

model_config = open_config("configs/model_configs/gpt_config.json")
model_config["api_key_info"]["api_key_use"] = 0  # Select API key index
model = PI.create_model(config=model_config)
model.print_model_info()

response = model.query("What is the sentiment of: I love this movie")
print(response)

Related Pages

Implements Principle

Requires Environment
