On the Representational Capacity of Neural Language Models with Chain-of-Thought Reasoning
Published in ACL 2024, August 2024
This work investigates the computational expressivity of language models based on recurrent neural networks (RNNs). We extend the Turing completeness result of Siegelmann and Sontag (1992) to the probabilistic case, showing how a rationally weighted recurrent neural language model (RLM) with unbounded computation time can simulate any probabilistic Turing machine (PTM).
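The key mechanism behind such Turing-machine simulations is encoding an unbounded tape or stack in a single rational-valued neuron. Below is a minimal, illustrative sketch (not the paper's construction verbatim) of the base-4 "Cantor" encoding from Siegelmann and Sontag's proof, where a binary stack lives in one rational number; function names are my own.

```python
from fractions import Fraction

# Illustrative sketch: a binary stack encoded in one rational value.
# Bit b is stored as base-4 digit 2*b + 1 (so bits 0/1 become digits 1/3),
# which keeps distinct stack contents at a fixed distance apart.

def push(q: Fraction, bit: int) -> Fraction:
    # Shift existing digits right and insert the new top digit.
    return q / 4 + Fraction(2 * bit + 1, 4)

def top(q: Fraction) -> int:
    # The leading base-4 digit recovers the top bit.
    return (int(4 * q) - 1) // 2

def pop(q: Fraction) -> Fraction:
    # Remove the leading digit by shifting left and subtracting it.
    return 4 * q - (2 * top(q) + 1)

q = Fraction(0)
for b in (1, 0, 1):
    q = push(q, b)
assert top(q) == 1   # last bit pushed
q = pop(q)
assert top(q) == 0
```

In an RNN with rational weights, `push` and `pop` become affine updates of a hidden unit combined with saturated activations, which is how a fixed-size network can carry unbounded state across computation steps.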
BibTeX citation:
@inproceedings{nowak-etal-2024-representational,
    title = "On the Representational Capacity of Neural Language Models with Chain-of-Thought Reasoning",
    author = "Nowak, Franz and
      Svete, Anej and
      Butoi, Alexandra and
      Cotterell, Ryan",
    month = aug,
    year = "2024",
    address = "Bangkok, Thailand",
    publisher = "Association for Computational Linguistics",
    abstract = "This work investigates the computational expressivity of language models based on recurrent neural networks. We extend the Turing completeness result by Siegelmann and Sontag (1992) to the probabilistic case, showing how a rationally weighted RLM with unbounded computation time can simulate any probabilistic Turing machine (PTM).",
}