# Implementation: Prompter ABC (spcl/graph-of-thoughts)
| Knowledge Sources | |
|---|---|
| Source File | `graph_of_thoughts/prompter/prompter.py`, lines 14-86 |
| Import | `from graph_of_thoughts.prompter import Prompter` |
| Domains | Prompt_Engineering, LLM_Orchestration |
| Last Updated | 2026-02-14 12:00 GMT |
## Overview
The Prompter class is the abstract base class (ABC) that defines the interface for all prompters in the Graph of Thoughts framework. Prompters are responsible for generating the prompts that are sent to language models for each type of graph operation.
## Interface

### Class Definition
```python
from abc import ABC, abstractmethod
from typing import Dict, List


class Prompter(ABC):
    """
    Abstract base class that defines the interface for all prompters.
    Prompters are used to generate the prompts for the language models.
    """
```
### Abstract Methods

The Prompter class defines five abstract methods, one for each operation type in the Graph of Operations:
#### aggregation_prompt

```python
@abstractmethod
def aggregation_prompt(self, state_dicts: List[Dict], **kwargs) -> str:
    """
    Generate an aggregation prompt for the language model.

    :param state_dicts: The thought states that should be aggregated.
    :type state_dicts: List[Dict]
    :param kwargs: Additional keyword arguments.
    :return: The aggregation prompt.
    :rtype: str
    """
    pass
```
#### improve_prompt

```python
@abstractmethod
def improve_prompt(self, **kwargs) -> str:
    """
    Generate an improve prompt for the language model.
    The thought state is unpacked to allow for additional keyword arguments
    and concrete implementations to specify required arguments explicitly.

    :param kwargs: Additional keyword arguments.
    :return: The improve prompt.
    :rtype: str
    """
    pass
```
#### generate_prompt

```python
@abstractmethod
def generate_prompt(self, num_branches: int, **kwargs) -> str:
    """
    Generate a generate prompt for the language model.
    The thought state is unpacked to allow for additional keyword arguments
    and concrete implementations to specify required arguments explicitly.

    :param num_branches: The number of responses the prompt should ask the LM to generate.
    :type num_branches: int
    :param kwargs: Additional keyword arguments.
    :return: The generate prompt.
    :rtype: str
    """
    pass
```
#### validation_prompt

```python
@abstractmethod
def validation_prompt(self, **kwargs) -> str:
    """
    Generate a validation prompt for the language model.
    The thought state is unpacked to allow for additional keyword arguments
    and concrete implementations to specify required arguments explicitly.

    :param kwargs: Additional keyword arguments.
    :return: The validation prompt.
    :rtype: str
    """
    pass
```
#### score_prompt

```python
@abstractmethod
def score_prompt(self, state_dicts: List[Dict], **kwargs) -> str:
    """
    Generate a score prompt for the language model.

    :param state_dicts: The thought states that should be scored,
                        if more than one, they should be scored together.
    :type state_dicts: List[Dict]
    :param kwargs: Additional keyword arguments.
    :return: The score prompt.
    :rtype: str
    """
    pass
```
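As a quick illustration of the ABC contract (a standalone sketch that mirrors the interface above, not code from the library): a subclass must implement all five abstract methods before it can be instantiated, otherwise Python raises a `TypeError`.

```python
from abc import ABC, abstractmethod
from typing import Dict, List


# Minimal local mirror of the Prompter interface above, for illustration only.
class Prompter(ABC):
    @abstractmethod
    def aggregation_prompt(self, state_dicts: List[Dict], **kwargs) -> str: ...
    @abstractmethod
    def improve_prompt(self, **kwargs) -> str: ...
    @abstractmethod
    def generate_prompt(self, num_branches: int, **kwargs) -> str: ...
    @abstractmethod
    def validation_prompt(self, **kwargs) -> str: ...
    @abstractmethod
    def score_prompt(self, state_dicts: List[Dict], **kwargs) -> str: ...


# Implements only one of the five methods, so it stays abstract.
class Incomplete(Prompter):
    def generate_prompt(self, num_branches: int, **kwargs) -> str:
        return "..."


try:
    Incomplete()
except TypeError as e:
    # Python lists the abstract methods that are still missing.
    print("cannot instantiate:", e)
```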
## Input / Output
| Method | Input | Output |
|---|---|---|
| `aggregation_prompt` | `state_dicts: List[Dict]` -- list of thought state dicts to aggregate; `**kwargs` | `str` -- the aggregation prompt |
| `improve_prompt` | `**kwargs` -- unpacked thought state | `str` -- the improve prompt |
| `generate_prompt` | `num_branches: int` -- how many responses to request; `**kwargs` -- unpacked thought state | `str` -- the generate prompt |
| `validation_prompt` | `**kwargs` -- unpacked thought state | `str` -- the validation prompt |
| `score_prompt` | `state_dicts: List[Dict]` -- thought states to score; `**kwargs` | `str` -- the score prompt |
## Design Notes

- Methods like `improve_prompt`, `generate_prompt`, and `validation_prompt` accept the thought state as unpacked `**kwargs`. This allows concrete subclasses to declare explicit keyword arguments that they require (e.g., `input`, `current`), giving clear signatures while remaining flexible.
- Methods like `aggregation_prompt` and `score_prompt` take a `List[Dict]` because they operate on multiple thought states simultaneously (merging or comparing them).
- All methods return a plain `str` prompt -- the framework does not impose any prompt templating library.
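The first design note can be made concrete with a short sketch (the `EchoPrompter` class and the `current`/`phase` state keys are hypothetical, chosen here for illustration): a concrete prompter names the state keys it needs as explicit parameters, and the caller supplies them by unpacking a thought's state dict with `**state`; any extra keys fall harmlessly into `**kwargs`.

```python
from typing import Dict


# Hypothetical concrete prompter: 'current' is declared explicitly,
# everything else in the state dict is absorbed by **kwargs.
class EchoPrompter:
    def improve_prompt(self, current: str, **kwargs) -> str:
        return f"Improve the following draft:\n{current}"


state: Dict = {"current": "1, 3, 2", "phase": 1}  # a thought's state dict
prompt = EchoPrompter().improve_prompt(**state)   # unpacking supplies 'current'
print(prompt)
```

The explicit `current: str` parameter documents the method's real requirement, while `**kwargs` keeps the call site uniform across all prompter methods.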
## Example: Subclassing Prompter
```python
from typing import Dict, List

from graph_of_thoughts.prompter import Prompter


class SortingPrompter(Prompter):
    """Concrete prompter for the sorting task."""

    sort_prompt = (
        "<Instruction> Sort the following list of numbers in ascending order. "
        "Output only the sorted list of numbers, no additional text. </Instruction>\n"
        "Input: {input}"
    )

    def generate_prompt(self, num_branches: int, **kwargs) -> str:
        # num_branches is unused here: the same prompt is simply sampled
        # multiple times to produce multiple candidate thoughts.
        return self.sort_prompt.format(input=kwargs["input"])

    def aggregation_prompt(self, state_dicts: List[Dict], **kwargs) -> str:
        sub_lists = [str(s["current"]) for s in state_dicts]
        return "Merge these sorted sub-lists into one sorted list:\n" + "\n".join(sub_lists)

    def improve_prompt(self, **kwargs) -> str:
        return f"Improve this sorted list if any elements are out of order:\n{kwargs['current']}"

    def validation_prompt(self, **kwargs) -> str:
        return f"Is the following list sorted in ascending order? Answer Yes or No.\n{kwargs['current']}"

    def score_prompt(self, state_dicts: List[Dict], **kwargs) -> str:
        return f"Rate the correctness of this sorted list from 0 to 10:\n{state_dicts[0]['current']}"
```
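A hedged usage sketch of such a prompter (the `input` state key is this example's convention, not one fixed by the framework; the class below is an abridged stand-in so the snippet runs standalone, with the `Prompter` base omitted):

```python
# Abridged stand-in for the SortingPrompter above; in real code it
# subclasses Prompter and implements all five abstract methods.
class SortingPrompter:
    sort_prompt = (
        "<Instruction> Sort the following list of numbers in ascending order. "
        "Output only the sorted list of numbers, no additional text. </Instruction>\n"
        "Input: {input}"
    )

    def generate_prompt(self, num_branches: int, **kwargs) -> str:
        return self.sort_prompt.format(input=kwargs["input"])


prompter = SortingPrompter()
state = {"input": "[3, 1, 2]"}                 # a thought's state dict
prompt = prompter.generate_prompt(1, **state)  # state unpacked into kwargs
print(prompt)
```

The returned string is what gets sent to the language model; the prompter itself never talks to the model.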
## Related Pages

### Concrete Implementations (Examples)

- SortingPrompter in `examples/sorting/sorting_032.py`
- KeywordCountingPrompter in `examples/keyword_counting/`
- DocMergePrompter in `examples/doc_merge/`
- SetIntersectionPrompter in `examples/set_intersection/`