Implementation:OpenCompass VLMEvalKit LMDeployWrapper

From Leeroopedia
Field    Value
source   VLMEvalKit
domain   Vision, API_Integration

Overview

LMDeployWrapper provides a VLMEvalKit API adapter for models served through the LMDeploy inference framework.

Description

LMDeployWrapper inherits from BaseAPI and connects to a locally or remotely deployed LMDeploy model server via its OpenAI-compatible REST API. It supports custom prompt strategies (InternVL2, CogVLM2) through a prompt_map mechanism, allowing dataset-specific prompt engineering. The class automatically discovers available model IDs from the deployment endpoint.
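The automatic model-ID discovery mentioned above relies on the OpenAI-compatible `GET /v1/models` endpoint that an LMDeploy `api_server` exposes. A minimal sketch of parsing that endpoint's list response (the helper name `extract_model_ids` and the sample payload are illustrative, not VLMEvalKit internals):

```python
import json

def extract_model_ids(models_json: str) -> list:
    """Pull model IDs out of an OpenAI-compatible /v1/models list response."""
    payload = json.loads(models_json)
    return [entry["id"] for entry in payload.get("data", [])]

# Sample payload in the shape an OpenAI-compatible server returns.
sample = '{"object": "list", "data": [{"id": "internvl2", "object": "model"}]}'
print(extract_model_ids(sample))  # → ['internvl2']
```

In practice the wrapper would issue an HTTP request to `api_base + '/models'` and pick the first (or matching) ID for subsequent chat-completion calls.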

Usage

Use this adapter when evaluating vision-language models that are served via LMDeploy's API server, such as InternVL2 or CogVLM2 deployments.
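The prompt_map mechanism behind those model-specific strategies can be pictured as a dispatch table keyed by model family. The sketch below is a hedged illustration under assumed names (`PROMPT_MAP`, `apply_prompt`, and the builder functions are hypothetical, not the actual VLMEvalKit implementation):

```python
# Hypothetical prompt_map dispatch: map a model family to a dataset-aware
# prompt builder. All names here are illustrative.
def build_internvl2_prompt(question: str, dataset: str) -> str:
    # Multiple-choice datasets commonly ask for just the option letter.
    if "MCQ" in dataset:
        return question + "\nAnswer with the option's letter directly."
    return question

def build_cogvlm2_prompt(question: str, dataset: str) -> str:
    if "Y/N" in dataset:
        return question + " Please answer yes or no."
    return question

PROMPT_MAP = {
    "internvl2": build_internvl2_prompt,
    "cogvlm2": build_cogvlm2_prompt,
}

def apply_prompt(model_name: str, question: str, dataset: str = "") -> str:
    # Fall back to the raw question when no strategy is registered.
    for key, builder in PROMPT_MAP.items():
        if key in model_name.lower():
            return builder(question, dataset)
    return question
```

This lets the same adapter serve InternVL2- and CogVLM2-style deployments while keeping dataset-specific prompt engineering in one place.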

Code Reference

  • Source: vlmeval/api/lmdeploy.py, Lines: L1-333
  • Import: from vlmeval.api.lmdeploy import LMDeployWrapper

Signature:

class LMDeployWrapper(BaseAPI):
    def __init__(self, model=None, retry=5, key='sk-123456', verbose=True,
                 temperature=0.0, timeout=60, api_base=None,
                 system_prompt=None, max_tokens=1024, **kwargs): ...
    def generate_inner(self, inputs, **kwargs): ...

I/O Contract

Direction  Description
Inputs     message — a list of text/image/video content items; model-specific parameters via kwargs
Outputs    generate() returns a str prediction; generate_inner() returns an (int, str, str) tuple
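The relationship between the two return types can be sketched as a retry loop: `generate()` (inherited from BaseAPI) repeatedly calls `generate_inner()` and unpacks its `(return code, answer, log)` tuple, where a code of 0 is assumed to signal success. This is an illustrative stub of the pattern, not VLMEvalKit's actual BaseAPI code:

```python
def generate(generate_inner, inputs, retry: int = 5,
             fail_msg: str = "Failed to obtain answer via API.") -> str:
    # Call generate_inner up to `retry` times; keep the first successful answer.
    for _ in range(retry):
        ret_code, answer, log = generate_inner(inputs)
        if ret_code == 0 and answer:
            return answer
    # All attempts failed: surface a fallback string rather than raising.
    return fail_msg

# Stub generate_inner that succeeds on the first call.
ok = lambda inputs: (0, "a red apple", "raw response log")
print(generate(ok, "describe the image"))  # → a red apple
```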

Usage Examples

# Example instantiation (assumes an LMDeploy api_server is reachable at this address)
model = LMDeployWrapper(model='internvl2', api_base='http://localhost:23333/v1')
# Build an interleaved message: a list of dicts with 'type' and 'value' keys
message = [dict(type='image', value='demo.jpg'), dict(type='text', value='Describe the image.')]
response = model.generate(message)
