Mind Your Theory: Theory of Mind Goes Deeper Than Reasoning
Published in ArXiv, 2024
Recommended citation: Wagner, Alon, Barnby, and Abend (2024). "Mind Your Theory: Theory of Mind Goes Deeper Than Reasoning." arXiv. https://arxiv.org/pdf/2412.13631
This work criticizes current methods for evaluating Theory of Mind (ToM) capabilities in Large Language Models (LLMs). Drawing on cognitive science, we propose that ToM involves two critical steps:
1. Determining When to Invoke ToM: This step involves recognizing when ToM is needed and deciding the appropriate Depth of Mentalizing (DoM), i.e., the level of recursive thinking the task requires.
2. Applying the Correct Inference: Once the need for ToM is established, this step focuses on making the correct inference at the identified DoM. (A minimal sketch of this two-step view follows the list.)
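
The sketch below illustrates the two-step decomposition in Python. It is only an illustration of the idea, not the paper's method: the function names, the keyword-counting heuristic for DoM, and the placeholder inference step are all assumptions made for the example.

```python
# Illustrative two-step ToM pipeline: (1) decide whether to mentalize and at
# what depth, (2) run the inference at that depth. Names and heuristics here
# are hypothetical, not taken from the paper.

from dataclasses import dataclass


@dataclass
class MentalizingDecision:
    invoke_tom: bool  # Step 1a: is ToM needed at all?
    depth: int        # Step 1b: required Depth of Mentalizing (DoM); 0 = none


def decide_mentalizing(context: str) -> MentalizingDecision:
    """Step 1: decide whether ToM is needed and at what depth.

    Toy heuristic: count nested belief markers ('thinks that', 'believes
    that', 'knows that') as a rough proxy for the required DoM.
    """
    markers = ("thinks that", "believes that", "knows that")
    depth = sum(context.lower().count(m) for m in markers)
    return MentalizingDecision(invoke_tom=depth > 0, depth=depth)


def infer_mental_state(context: str, question: str, depth: int) -> str:
    """Step 2: apply the inference at the identified DoM.

    Placeholder: in practice this could prompt an LLM conditioned on the
    required depth of recursion.
    """
    return f"[inference at DoM={depth} for: {question}]"


def answer(context: str, question: str) -> str:
    decision = decide_mentalizing(context)
    if not decision.invoke_tom:
        return "[no mentalizing required; answer literally]"
    return infer_mental_state(context, question, decision.depth)


if __name__ == "__main__":
    story = "Sally thinks that Anne believes that the marble is in the basket."
    print(answer(story, "Where will Sally look for the marble?"))
```

The point of the decomposition is that step 1 can fail independently of step 2: an agent may reason correctly at the wrong depth, or mentalize when no mentalizing is needed.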
We observe that current AI research predominantly emphasizes the second step, often treating ToM tasks as static logic problems. This approach overlooks the dynamic nature of real-world social interactions, where both recognizing the need for ToM and applying it appropriately are essential.
To address this gap, we suggest evaluating ToM capabilities in LLMs with dynamic environments inspired by cognitive tasks used with biological agents. This perspective aims to foster a more comprehensive understanding of LLMs' social intelligence, moving beyond static assessments toward evaluations that better reflect real-life complexity.
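
As a rough illustration of what a dynamic evaluation could look like, the sketch below scores an agent on step 1 alone across a multi-turn scenario: did it pick the right DoM at each turn? The toy environment, its turn structure, and the scoring rule are assumptions for this example, not the paper's benchmark.

```python
# Toy dynamic ToM evaluation: the required Depth of Mentalizing changes from
# turn to turn, and the agent is scored on choosing the right depth.
# Everything here is an illustrative assumption, not the paper's setup.


class ToyFalseBeliefEnv:
    """Each turn yields (observation shown to the agent, ground-truth DoM)."""

    def __init__(self):
        self.turns = [
            ("The marble is in the basket.", 0),
            ("Sally leaves; Anne moves the marble to the box.", 1),
            ("Anne thinks Sally saw the move.", 2),
        ]

    def __iter__(self):
        return iter(self.turns)


def evaluate(agent, env) -> float:
    """Fraction of turns on which the agent selected the correct DoM."""
    turns = list(env)
    hits = sum(int(agent(obs) == true_dom) for obs, true_dom in turns)
    return hits / len(turns)


if __name__ == "__main__":
    # A naive agent that counts mental/perception verbs as a proxy for DoM.
    naive_agent = lambda obs: obs.lower().count("thinks") + obs.lower().count("saw")
    print(f"DoM-selection accuracy: {evaluate(naive_agent, ToyFalseBeliefEnv()):.2f}")
```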