# Principle: Iamhankai Forest-of-Thought CGDM Answer Selection
| Knowledge Sources | |
|---|---|
| Domains | Ensemble_Methods, Decision_Theory, Reasoning |
| Last Updated | 2026-02-14 03:00 GMT |
## Overview
A two-stage answer selection strategy combining majority voting with LLM expert arbitration for robust final answer determination.
## Description
Confidence-Guided Decision Making (CGDM) is a two-stage answer selection strategy designed for the Forest-of-Thought framework. In the first stage, majority voting selects the answer if a clear consensus exists. In the second stage, when no majority exists (a tie), an LLM expert judge is prompted to analyze all candidate answers and select the most correct one. This hybrid approach combines the efficiency of voting with the reasoning ability of LLMs for ambiguous cases.
CGDM supports four stopping strategies:
- `cgdm`: majority vote first, LLM judge for ties (default and recommended)
- `majority`: pure majority voting
- `random`: random selection from the activated answers
- `score`: selection of the answer with the highest MCTS reward score
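The four strategies can be sketched as a single dispatcher. The function and parameter names below are illustrative, not the actual FoT API, and the Stage-2 LLM judge is passed in as a callable rather than implemented here:

```python
import random
from collections import Counter

def select_answer(strategy, answers, scores=None, query=None, judge=None):
    """Illustrative dispatcher over the four stopping strategies."""
    if strategy == "majority":
        return Counter(answers).most_common(1)[0][0]
    if strategy == "random":
        return random.choice(answers)
    if strategy == "score":
        # pair each answer with its MCTS reward and take the best-scored one
        return max(zip(answers, scores), key=lambda pair: pair[1])[0]
    if strategy == "cgdm":
        counts = Counter(answers).most_common()
        top = [a for a, c in counts if c == counts[0][1]]
        if len(top) == 1:
            return top[0]              # Stage 1: clear consensus
        return judge(query, answers)   # Stage 2: LLM arbitration on a tie
    raise ValueError(f"unknown strategy: {strategy}")
```

Passing the judge as a callable keeps the voting logic testable without an LLM in the loop.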
## Usage
Use this principle for final answer determination in FoT evaluation. CGDM is the default stopping strategy and is invoked via `get_fot_final_answer()` after all trees have completed (or after early stopping triggers).
## Theoretical Basis
CGDM implements a cascaded decision framework:
```python
# CGDM selection: majority vote, falling back to an LLM judge on ties
from collections import Counter

def cgdm_select(answers, query):
    counts = Counter(answers).most_common()
    # all answers tied for the highest vote count
    majority = [a for a, c in counts if c == counts[0][1]]
    if len(majority) == 1:
        return majority[0]             # Stage 1: clear consensus
    return llm_judge(query, answers)   # Stage 2: expert arbitration
```
Rationale:
- Stage 1 is fast and accurate when the trees converge (the common case)
- Stage 2 handles ambiguous cases where statistical voting fails
- The expert judge sees the original question and all candidate answers, enabling an informed selection
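To make Stage 2 concrete, a tie-breaking prompt might be assembled as below. The wording and the `build_judge_prompt` helper are assumptions for illustration, not the exact prompt used in FoT:

```python
def build_judge_prompt(query, answers):
    """Assemble an arbitration prompt giving the judge full context:
    the original question plus every candidate answer."""
    lines = [
        "You are an expert judge. Given the question and the candidate",
        "answers below, reply with the single most correct answer verbatim.",
        "",
        f"Question: {query}",
        "Candidate answers:",
    ]
    lines += [f"{i + 1}. {a}" for i, a in enumerate(answers)]
    return "\n".join(lines)
```

Numbering the candidates lets the judge reference them unambiguously even when answers are long.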