Graph of Thoughts Prompting
A prompt engineering series on techniques for eliciting the reasoning capabilities of LLMs
Original Paper: Graph of Thoughts: Solving Elaborate Problems with Large Language Models
Abstract:
The paper introduces Graph of Thoughts (GoT), a framework that enhances prompting capabilities in large language models by modelling the information they generate as a graph, enabling arbitrary thoughts to be combined into synergistic outcomes and refined through feedback loops.
GoT offers advantages over existing paradigms like Chain-of-Thought or Tree of Thoughts, improving task quality and reducing costs, and it is extensible to new prompting schemes.
Practical Implications:
Graph of Thoughts (GoT) enhances the prompting capabilities of large language models, improving quality on tasks such as sorting while significantly reducing cost.
GoT allows ideas to be combined into synergistic outcomes, mimicking human reasoning processes and improving the quality of results over existing paradigms.
The framework is extensible for new thought transformations, leading to more principled prompt engineering and potentially advancing LLM reasoning closer to human thinking.
Methodology:
GoT models information as a graph, where units of information (thoughts) are vertices and edges represent dependencies between them, enabling thoughts to be combined into synergistic outcomes and refined using feedback loops.
For the sorting use case, the framework employs merge-based sorting: the input sequence is decomposed into subarrays, each subarray is sorted individually, and the sorted subarrays are merged into a final solution.
Self-reflection and self-evaluation are used in GoT to expand the graph of thoughts within a prompt and to improve decision-making (a minimal sketch of these operations follows below).
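To make these operations concrete, here is a minimal Python sketch of the sorting use case as a thought graph. The names (Thought, generate, score, refine, aggregate, llm_sort) are assumptions for this sketch rather than the paper's API, and llm_sort is a deterministic mock standing in for a real LLM call.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Thought:
        content: List[int]                                       # a partial solution (vertex)
        parents: List["Thought"] = field(default_factory=list)   # dependencies (edges)

    def llm_sort(chunk: List[int]) -> List[int]:
        """Mock LLM call; a real system would prompt a model to sort the chunk."""
        return sorted(chunk)

    def generate(root: Thought, k: int) -> List[Thought]:
        """Decompose the input into k subarrays and sort each one (one thought per chunk)."""
        nums = root.content
        size = max(1, (len(nums) + k - 1) // k)
        chunks = [nums[i:i + size] for i in range(0, len(nums), size)]
        return [Thought(content=llm_sort(c), parents=[root]) for c in chunks]

    def score(t: Thought) -> float:
        """Self-evaluation: fraction of adjacent pairs already in order."""
        pairs = list(zip(t.content, t.content[1:]))
        return sum(a <= b for a, b in pairs) / len(pairs) if pairs else 1.0

    def refine(t: Thought) -> Thought:
        """Feedback loop: re-prompt with the previous attempt to fix remaining mistakes."""
        return Thought(content=llm_sort(t.content), parents=[t])

    def aggregate(thoughts: List[Thought]) -> Thought:
        """Merge several sorted thoughts into one; the new vertex has multiple parents."""
        merged = sorted(x for t in thoughts for x in t.content)
        return Thought(content=merged, parents=list(thoughts))

    if __name__ == "__main__":
        root = Thought(content=[9, 3, 7, 1, 8, 2, 6, 4])
        partials = generate(root, k=2)
        partials = [p if score(p) == 1.0 else refine(p) for p in partials]  # refine weak thoughts
        final = aggregate(partials)
        print(final.content)  # [1, 2, 3, 4, 6, 7, 8, 9]

The aggregation step is what distinguishes the graph view: the final thought has several parents, something neither a chain nor a tree can express.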
Conclusion:
The paper introduces Graph of Thoughts (GoT) as a framework that advances prompting capabilities in large language models beyond existing paradigms like Chain-of-Thought or Tree of Thoughts.
GoT outperforms other prompting schemes, for example improving the quality of sorting by about 62% over ToT while reducing costs by more than 31%.
The framework enables combining arbitrary LLM thoughts into synergistic outcomes, distilling the essence of whole networks of thoughts, or enhancing thoughts using feedback loops.
GoT reflects non-linear task-solving processes by allowing intermediate solutions to be combined into final ones or the flow of reasoning to be changed when new insights are discovered.
How is the Graph of Thoughts framework different from Tree of Thoughts prompting?
Graph of Thoughts (GoT) allows LLM thoughts to form an arbitrary graph structure, enabling more complex networks of thoughts, unlike the rigid tree structure imposed by Tree of Thoughts (ToT) prompting.
GoT facilitates combining intermediate solutions into final ones or changing the flow of reasoning upon discovering new insights, reflecting non-linear task-solving processes, which is not naturally expressible with ToT.
ToT limits reasoning abilities within a prompt by imposing a rigid tree structure on the thought process, while GoT enhances LLM capabilities through networked reasoning without resorting to model updates.
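Structurally, the difference can be reduced to how many parents a thought may have. The class names below are illustrative, not taken from either paper's code.

    class TreeThought:
        """ToT-style node: exactly one parent, so branches can be compared or pruned
        but never fused into a single thought."""
        def __init__(self, content, parent=None):
            self.content, self.parent = content, parent

    class GraphThought:
        """GoT-style vertex: any number of parents, so intermediate thoughts can be merged."""
        def __init__(self, content, parents=()):
            self.content, self.parents = content, list(parents)

    a = GraphThought("sorted first half")
    b = GraphThought("sorted second half")
    merged = GraphThought("fully sorted list", parents=[a, b])  # aggregation has no equivalent in a strict tree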
How is the Graph of Thoughts framework different from Chain-of-Thought prompting?
Graph of Thoughts (GoT) allows LLM thoughts to be modelled as an arbitrary graph structure, enabling complex networks of thoughts, unlike the linear chain structure of Chain-of-Thought prompting.
GoT facilitates combining arbitrary thoughts into synergistic outcomes or enhancing thoughts using feedback loops, offering advantages over the linear progression of Chain-of-Thought.
Chain-of-Thought follows a linear progression of thoughts, while GoT enables more flexible, non-linear task-solving by allowing intermediate solutions to be combined into final ones or the flow of reasoning to be changed upon discovering new insights.
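Put differently, a Chain-of-Thought prompt produces one linear sequence of steps, whereas a GoT plan is a schedule of graph operations that can branch, prune, merge, and loop. The operation names below are illustrative assumptions, not the paper's API.

    cot_plan = ["step 1", "step 2", "step 3", "final answer"]   # strictly linear: each step feeds only the next

    got_plan = [
        ("generate",  {"branches": 4}),   # branch: propose several partial solutions
        ("score",     {}),                # self-evaluate each partial solution
        ("keep_best", {"n": 2}),          # prune, as in ToT
        ("aggregate", {}),                # merge survivors into one thought, not expressible in CoT or ToT
        ("refine",    {"loops": 2}),      # feedback loop over the merged thought
    ]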
Paper Infographic