Archives AI News

On the Quantization Robustness of Diffusion Language Models in Coding Benchmarks

arXiv:2604.20079v1 Announce Type: new Abstract: Auto-regressive Large Language Models (LLMs) achieve strong performance on coding tasks, but incur high memory and inference costs. Diffusion-based language models (d-LLMs) offer bounded inference cost via iterative denoising, but their behavior under post-training quantization…
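The abstract studies behavior under post-training quantization. As a general illustration of the technique (not this paper's specific method or bit-width), a minimal symmetric per-tensor int8 weight quantizer can be sketched as:

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 post-training quantization:
    map floats into [-127, 127] with a single shared scale."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 codes."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)

# Rounding error is bounded by half a quantization step.
err = np.max(np.abs(w - w_hat))
print(err <= s / 2 + 1e-6)
```

Robustness studies like the one above then compare task accuracy of the model before and after replacing `w` with `w_hat`.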

Concept Graph Convolutions: Message Passing in the Concept Space

arXiv:2604.20082v1 Announce Type: new Abstract: Trust in the predictions of Graph Neural Networks (GNNs) is limited by their opaque reasoning process. Prior methods have tried to explain graph networks via concept-based explanations extracted from the latent representations obtained after message…
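For context on the message passing the abstract refers to, here is a minimal GCN-style convolution step in NumPy; this is a generic sketch of graph message passing, not the concept-space variant proposed in the paper:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution step: symmetrically normalized
    neighborhood aggregation, a linear map, then ReLU."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # D^{-1/2}
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

# Toy graph: 3 nodes in a path 0-1-2.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3)               # one-hot node features
W = np.full((3, 2), 0.5)    # toy weight matrix
H = gcn_layer(A, X, W)
print(H.shape)  # (3, 2)
```

Concept-based explanation methods typically probe the latent representations `H` produced after such aggregation steps.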

Agnostic Language Identification and Generation

arXiv:2601.23258v2 Announce Type: replace Abstract: Recent works on language identification and generation have established tight statistical rates at which these tasks can be achieved. These works typically operate under a strong realizability assumption: that the input data is drawn from…

CASS: Nvidia to AMD Transpilation with Data, Models, and Benchmark

arXiv:2505.16968v4 Announce Type: replace-cross Abstract: Cross-architecture GPU code transpilation is essential for unlocking low-level hardware portability, yet no scalable solution exists. We introduce CASS, the first dataset and model suite for source- and assembly-level GPU translation (CUDA ↔ HIP, SASS ↔ RDNA3).…

Gauge-covariant stochastic neural fields: Stability and finite-width effects

arXiv:2508.18948v2 Announce Type: replace-cross Abstract: We develop a gauge-covariant stochastic effective field theory for stability and finite-width effects in deep neural systems. The model uses classical commuting fields: a complex matter field, a real Abelian connection field, and a fictitious…