
Implementation:Liu00222 Open Prompt Injection Model query

From Leeroopedia
Knowledge Sources
Domains: NLP, LLM
Last Updated: 2026-02-14 15:00 GMT

Overview

Abstract method defining the LLM query interface, declared on the Model base class and overridden by nine or more concrete implementations.

Description

The Model.query method is the abstract interface that all LLM wrappers must implement. It takes a string prompt and returns a string response. Concrete implementations include GPT (OpenAI API with retry), PaLM2 (Google API), Flan (HuggingFace T5), Llama (pipeline-based), Llama3 (chat template), Vicuna (FastChat conversation), DeepSeek (with think-block stripping), InternLM (pipeline-based), and QLoraModel (4-bit quantized with LoRA).
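The interface can be illustrated with a minimal concrete subclass. The sketch below re-declares the base class locally for self-containment (the real one lives in OpenPromptInjection/models/Model.py), and EchoModel is a hypothetical implementation used only to show the contract: a string prompt in, a string response out.

```python
class Model:
    """Local stand-in for the abstract base in OpenPromptInjection/models/Model.py."""
    def query(self, msg):
        raise NotImplementedError

class EchoModel(Model):
    """Hypothetical concrete implementation that echoes the prompt (for illustration)."""
    def query(self, msg):
        # A real wrapper (GPT, Llama, Vicuna, ...) would call its backend here
        # and return the generated text.
        return f"echo: {msg}"

model = EchoModel()
print(model.query("Summarize the text.\nText: hello world"))
```

Real subclasses differ only in what happens inside `query`: API calls with retries (GPT), HuggingFace pipelines (Llama, InternLM), or chat-template formatting (Llama3).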

Usage

This method is called by `Application.query` during pipeline execution and directly by `main.py:L62` for injected task baseline evaluation (bypassing the Application defense pipeline).

Code Reference

Source Location

  • Repository: Open-Prompt-Injection
  • File: OpenPromptInjection/models/Model.py
  • Lines: L25-26 (abstract definition)

Signature

class Model:
    def query(self, msg):
        """
        Send a prompt to the LLM and return the response.

        Args:
            msg (str): The prompt string, typically formatted as
                       "{instruction}\nText: {data_prompt}".
        Returns:
            str: The model's text response.
        Raises:
            NotImplementedError: If called on base class directly.
        """
        raise NotImplementedError

Import

from OpenPromptInjection.models.Model import Model
# Concrete instances created via:
# model = PI.create_model(config)

I/O Contract

Inputs

Name Type Required Description
msg str Yes Prompt string for the LLM (typically `"{instruction}\nText: {data_prompt}"`)

Outputs

Name Type Description
response str Raw model response text. May be `None` if generation fails.
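The contract above can be exercised with two small helpers: one that builds the documented prompt shape, and one that normalizes a possibly-`None` response. Both helper names are illustrative, not part of the library.

```python
def build_prompt(instruction, data_prompt):
    # Matches the documented prompt shape "{instruction}\nText: {data_prompt}".
    return f"{instruction}\nText: {data_prompt}"

def safe_query(model, msg, default=""):
    # The I/O contract allows the response to be None if generation fails;
    # normalize that to a default string for downstream code.
    response = model.query(msg)
    return default if response is None else response
```

Downstream consumers that index into or split the response should use such a guard rather than assume a string is always returned.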

Usage Examples

Direct Model Query (Injected Task Baseline)

# As used in main.py for injected task baseline evaluation
import OpenPromptInjection as PI

model = PI.create_model(model_config)
inject_task = PI.create_task(inject_config, data_num=100, for_injection=True)

for i, (data_prompt, label) in enumerate(inject_task):
    prompt = inject_task.get_instruction() + '\nText: ' + data_prompt
    response = model.query(prompt)
    print(response)
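Because only some implementations (e.g. GPT) retry transient failures internally, callers that bypass the Application pipeline may want their own guard. A minimal sketch, with an illustrative retry policy that is not part of the library:

```python
import time

def query_with_retry(model, prompt, max_retries=3, backoff=1.0):
    """Retry transient query failures with exponential backoff; re-raise on the last attempt."""
    for attempt in range(max_retries):
        try:
            response = model.query(prompt)
            if response is not None:
                return response
        except Exception:
            if attempt == max_retries - 1:
                raise
        # Back off before retrying: backoff, 2*backoff, 4*backoff, ...
        time.sleep(backoff * (2 ** attempt))
    return None
```

This also covers the documented case where `query` returns `None` on a failed generation, retrying instead of propagating the empty result.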

Related Pages

Implements Principle

Requires Environment
