Exact Expressive Power of Transformers with Padding
arXiv:2505.18948v2 Abstract: Chain of thought is a natural inference-time method for increasing the computational power of transformer-based large language models (LLMs), but it comes at the cost of sequential decoding. Are there more efficient alternatives to expand a…
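To make the contrast the abstract draws concrete, here is a minimal sketch (not from the paper) of the two inference modes: sequential chain-of-thought decoding, where each new token needs a fresh forward pass, versus padding, where blank tokens are appended up front and a single parallel pass does the extra work. The names `toy_model`, `chain_of_thought_decode`, and `padded_decode` are hypothetical stand-ins, and `toy_model` is a placeholder for a real transformer forward pass.

```python
# A toy illustration of the structural difference between chain-of-thought
# decoding and padded inference; it is not the paper's construction.

def toy_model(tokens):
    """Hypothetical stand-in for a transformer forward pass: one output
    token per input position (here a running sum mod 10 as filler compute)."""
    return [sum(tokens[: i + 1]) % 10 for i in range(len(tokens))]

def chain_of_thought_decode(prompt, num_steps):
    """Sequential decoding: each generated token requires a full forward
    pass over the growing sequence, so the num_steps passes cannot be
    parallelized across steps."""
    tokens = list(prompt)
    for _ in range(num_steps):
        tokens.append(toy_model(tokens)[-1])  # append last-position output
    return tokens[len(prompt):]

PAD = 0  # a designated blank/padding token

def padded_decode(prompt, num_pads):
    """Padded inference: append blank tokens and run one forward pass.
    All positions are computed in parallel; the extra compute comes from
    a wider input, not from sequential decoding depth."""
    tokens = list(prompt) + [PAD] * num_pads
    return toy_model(tokens)[len(prompt):]

if __name__ == "__main__":
    prompt = [3, 1, 4]
    print("CoT   :", chain_of_thought_decode(prompt, num_steps=4))
    print("Padded:", padded_decode(prompt, num_pads=4))
```

In this sketch the padded variant performs all of its extra computation in one pass, which is the efficiency the abstract alludes to when it asks for alternatives to sequential decoding; whether padding matches chain of thought in expressive power is exactly the question the paper studies.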
