
Everything of Thoughts: Defying the Law of Penrose Triangle for Thought Generation

Ruomeng Ding, Chaoyun Zhang, Lu Wang, Yong Xu, Minghua Ma, Wei Zhang, Si Qin, Saravan Rajmohan, Qingwei Lin, Dongmei Zhang


Abstract
This paper introduces a novel thought prompting approach called “Everything of Thoughts” (XoT) for Large Language Models (LLMs) to defy the law of the “Penrose triangle” of existing thought paradigms and achieve three key perspectives in thought generation simultaneously: performance, efficiency, and flexibility. XoT leverages pretrained reinforcement learning and Monte Carlo Tree Search (MCTS) to incorporate external domain knowledge and planning capability into thoughts, thereby enhancing LLMs’ decision-making capabilities. Through the MCTS-LLM collaborative thought revision framework, XoT autonomously produces high-quality comprehensive cognitive mappings with minimal LLM interactions. Additionally, XoT empowers LLMs to utilize flexible cognitive mappings for solving problems with multiple solutions. We evaluate XoT on several challenging problem-solving tasks, including the Game of 24, the 8-Puzzle, and the Pocket Cube. Our results demonstrate that XoT significantly outperforms existing approaches across multiple dimensions, showcasing its remarkable proficiency in addressing complex problems across diverse domains. The data and code are available at https://github.com/microsoft/Everything-of-Thoughts-XoT.
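To make the MCTS planning component concrete, the sketch below runs a plain single-player UCT search on the Game of 24, one of the paper's benchmark tasks. This is only an illustration of generic MCTS, not the authors' method: XoT additionally guides the search with pretrained policy and value networks and revises trajectories with an LLM, none of which is shown here. All function and variable names (`legal_actions`, `apply_action`, `mcts`, etc.) are our own.

```python
import math
import random
from itertools import combinations

# A state is a tuple of numbers; an action (i, j, op) combines state[i]
# and state[j] with op. The episode ends when one number remains, with
# reward 1 if that number equals 24 and 0 otherwise.

def legal_actions(state):
    """All (i, j, op) moves available from a state."""
    acts = []
    for i, j in combinations(range(len(state)), 2):
        a, b = state[i], state[j]
        acts += [(i, j, '+'), (i, j, '*'), (i, j, '-'), (j, i, '-')]
        if b != 0:
            acts.append((i, j, '/'))
        if a != 0:
            acts.append((j, i, '/'))
    return acts

def apply_action(state, action):
    """Replace the two chosen numbers with the result of the operation."""
    i, j, op = action
    a, b = state[i], state[j]
    val = {'+': a + b, '-': a - b, '*': a * b,
           '/': a / b if b != 0 else None}[op]
    rest = tuple(v for k, v in enumerate(state) if k not in (i, j))
    return rest + (val,)

class Node:
    def __init__(self, state, parent=None):
        self.state, self.parent = state, parent
        self.children = {}                    # action -> Node
        self.untried = legal_actions(state)   # unexpanded actions
        self.visits, self.value = 0, 0.0

def rollout(state):
    """Play random moves to the end and return the terminal reward."""
    while len(state) > 1:
        state = apply_action(state, random.choice(legal_actions(state)))
    return 1.0 if state[0] == 24 else 0.0

def mcts(root_state, iters=2000, c=1.4):
    root = Node(root_state)
    for _ in range(iters):
        node = root
        # Selection: descend via the UCT rule while fully expanded.
        while not node.untried and node.children:
            node = max(node.children.values(),
                       key=lambda n: n.value / n.visits
                       + c * math.sqrt(math.log(node.visits) / n.visits))
        # Expansion: try one untried action.
        if node.untried:
            action = node.untried.pop()
            child = Node(apply_action(node.state, action), node)
            node.children[action] = child
            node = child
        # Simulation and backpropagation.
        reward = rollout(node.state)
        while node:
            node.visits += 1
            node.value += reward
            node = node.parent
    # Extract the most-visited trajectory as the "thought" sequence.
    path, node = [], root
    while node.children:
        action, node = max(node.children.items(), key=lambda kv: kv[1].visits)
        path.append((action, node.state))
    return path
```

For example, `mcts((4, 6), iters=500)` returns a one-step trajectory whose final state is `(24,)`, since multiplication is the only action with nonzero reward. In XoT this search is run offline with learned networks in place of random rollouts, so the LLM only needs a handful of calls to revise the resulting thought trajectory.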
Anthology ID:
2024.findings-acl.95
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1638–1662
URL:
https://aclanthology.org/2024.findings-acl.95
DOI:
10.18653/v1/2024.findings-acl.95
Cite (ACL):
Ruomeng Ding, Chaoyun Zhang, Lu Wang, Yong Xu, Minghua Ma, Wei Zhang, Si Qin, Saravan Rajmohan, Qingwei Lin, and Dongmei Zhang. 2024. Everything of Thoughts: Defying the Law of Penrose Triangle for Thought Generation. In Findings of the Association for Computational Linguistics: ACL 2024, pages 1638–1662, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Everything of Thoughts: Defying the Law of Penrose Triangle for Thought Generation (Ding et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.95.pdf