
Implementation:Princeton nlp Tree of thought llm Get Proposals

From Leeroopedia
Knowledge Sources
Domains LLM_Reasoning, Search_Algorithms, NLP
Last Updated 2026-02-14 03:30 GMT

Overview

A concrete utility in the Tree of Thoughts BFS module for generating structured thought proposals from the LLM.

Description

The get_proposals function implements the propose generation strategy. It wraps the current input and partial solution into a propose prompt via the task object, calls the LLM once (n=1), splits the response by newlines to extract individual proposed steps, and appends each step to the current partial solution to produce candidate continuations. It is complemented by get_samples, which implements the sample strategy using independent LLM completions.
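The behavior described above can be sketched as follows. This is a hedged reconstruction from the description, not the repository's actual code: `gpt` and `ToyTask` are illustrative stand-ins for the real LLM wrapper and task classes.

```python
def get_proposals(task, x, y, gpt):
    """Propose strategy: one LLM call, one proposed step per response line."""
    propose_prompt = task.propose_prompt_wrap(x=x, y=y)
    # Single call (n=1); the response enumerates candidate steps line by line.
    proposals = gpt(propose_prompt, n=1, stop=None)[0].split('\n')
    # Each proposed step extends the current partial solution y.
    return [y + p + '\n' for p in proposals]

# --- illustrative stand-ins (not part of the repository) ---
class ToyTask:
    def propose_prompt_wrap(self, x, y):
        return f"Input: {x}\nSteps so far:\n{y}Possible next steps:"

def toy_gpt(prompt, n=1, stop=None):
    # Pretend the model proposed two arithmetic steps in one response.
    return ["1 + 2 = 3 (left: 3 3 4)\n1 * 4 = 4 (left: 2 3 4)"]

print(get_proposals(ToyTask(), '1 2 3 4', '', toy_gpt))
# ['1 + 2 = 3 (left: 3 3 4)\n', '1 * 4 = 4 (left: 2 3 4)\n']
```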

Usage

Called from solve() when args.method_generate == 'propose'. Used primarily by the Game of 24 task, where the possible arithmetic operations can be enumerated.
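The dispatch inside solve() presumably looks something like the sketch below. This is a hedged illustration of the pattern only: `args` is a simplified stand-in, and the two generation functions are stubbed out rather than the repository's real implementations.

```python
from types import SimpleNamespace

args = SimpleNamespace(method_generate='propose', n_generate_sample=5)

def get_proposals(task, x, y):  # stub stand-in for the real function
    return [y + 'step\n']

def get_samples(task, x, y, n, prompt_sample, stop):  # stub stand-in
    return [y + 'sample'] * n

task, x, ys = None, '1 2 3 4', ['']  # ys is the current BFS frontier
if args.method_generate == 'sample':
    new_ys = [get_samples(task, x, y, args.n_generate_sample,
                          prompt_sample='cot', stop=None) for y in ys]
elif args.method_generate == 'propose':
    new_ys = [get_proposals(task, x, y) for y in ys]
new_ys = [ny for nys in new_ys for ny in nys]  # flatten candidates per frontier node
print(new_ys)  # ['step\n']
```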

Code Reference

Source Location

  • Repository: tree-of-thought-llm
  • File: src/tot/methods/bfs.py
  • Lines: 34-37 (get_proposals), 39-47 (get_samples)

Signature

def get_proposals(task, x, y):
    """
    Generate thought proposals using structured propose prompt.

    Args:
        task: Task object with propose_prompt_wrap() method.
        x (str): Original problem input.
        y (str): Current partial solution.

    Returns:
        list[str]: List of candidate continuations (y + proposed_step + '\n').
    """

def get_samples(task, x, y, n_generate_sample, prompt_sample, stop):
    """
    Generate thought samples using independent LLM completions.

    Args:
        task: Task object with standard_prompt_wrap() and cot_prompt_wrap().
        x (str): Original problem input.
        y (str): Current partial solution.
        n_generate_sample (int): Number of independent samples to generate.
        prompt_sample (str): 'standard' or 'cot' prompt type.
        stop (str or None): Stop token for generation.

    Returns:
        list[str]: List of candidate continuations (y + sample).
    """

Import

from tot.methods.bfs import get_proposals, get_samples

I/O Contract

Inputs (get_proposals)

Name  Type  Required  Description
task  Task  Yes       Task object with propose_prompt_wrap(x, y) method
x     str   Yes       Original problem input string
y     str   Yes       Current partial solution string

Inputs (get_samples)

Name               Type      Required  Description
task               Task      Yes       Task object with standard_prompt_wrap and cot_prompt_wrap methods
x                  str       Yes       Original problem input
y                  str       Yes       Current partial solution
n_generate_sample  int       Yes       Number of independent samples
prompt_sample      str       Yes       'standard' or 'cot'
stop               str/None  Yes       Stop token for generation

Outputs

Name    Type       Description
return  list[str]  List of candidate continuation strings, each extending the partial solution y

Usage Examples

Propose Generation (Game of 24)

from tot.tasks import get_task
from tot.methods.bfs import get_proposals

task = get_task('game24')
x = task.get_input(900)   # e.g., "1 2 3 4"
y = ''                     # empty partial solution at step 0

# Get proposed next steps (arithmetic operations)
proposals = get_proposals(task, x, y)
# proposals might be:
# ["1 + 2 = 3 (left: 3 3 4)\n", "1 * 4 = 4 (left: 2 3 4)\n", ...]

Sample Generation (Creative Writing)

from tot.tasks import get_task
from tot.methods.bfs import get_samples

task = get_task('text')
x = task.get_input(0)
y = ''

# Get 5 independent CoT completions
samples = get_samples(task, x, y, n_generate_sample=5, prompt_sample='cot', stop='\nPassage:\n')

Related Pages

Implements Principle

Requires Environment
