Archives AI News

Federated Multi-Task Clustering

arXiv:2512.22897v2 Announce Type: replace Abstract: Spectral clustering has emerged as one of the most effective clustering algorithms due to its superior performance. However, most existing models are designed for centralized settings, rendering them inapplicable in modern decentralized environments. Moreover, current…
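The snippet above refers to spectral clustering in the centralized setting. As a reminder of what that baseline looks like, here is a minimal NumPy sketch of classic two-way spectral clustering (RBF affinity, graph Laplacian, sign of the Fiedler vector); the function name and toy data are my own, not from the paper:

```python
import numpy as np

def spectral_bipartition(X, sigma=1.0):
    """Classic (centralized) spectral clustering into two groups:
    build an RBF affinity, form the unnormalized graph Laplacian,
    and split on the sign of the Fiedler vector (the eigenvector
    for the second-smallest eigenvalue)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))      # pairwise affinity matrix
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(1)) - W               # Laplacian L = D - W
    _, vecs = np.linalg.eigh(L)             # eigenvalues ascending
    return (vecs[:, 1] > 0).astype(int)     # sign split on Fiedler vector

# two well-separated 1-D blobs
X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
labels = spectral_bipartition(X)
```

The federated question the paper raises is how to perform steps like the eigendecomposition when the rows of `W` live on different clients and cannot be centralized.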

A Survey on Graph Neural Networks for Fraud Detection in Ride Hailing Platforms

arXiv:2512.23777v1 Announce Type: new Abstract: This study investigates fraud detection in ride hailing platforms through Graph Neural Networks (GNNs), focusing on the effectiveness of various models. By analyzing prevalent fraudulent activities, the research highlights and compares the existing work related to…

Myopically Verifiable Probabilistic Certificates for Safe Control and Learning

arXiv:2404.16883v2 Announce Type: replace-cross Abstract: This paper addresses the design of safety certificates for stochastic systems, with a focus on ensuring long-term safety through fast real-time control. In stochastic environments, set invariance-based methods that restrict the probability of risk events…

Deep sequence models tend to memorize geometrically; it is unclear why

arXiv:2510.26745v2 Announce Type: replace Abstract: Deep sequence models are said to store atomic facts predominantly in the form of associative memory: a brute-force lookup of co-occurring entities. We identify a dramatically different form of storage of atomic facts that we…

Tazza: Shuffling Neural Network Parameters for Secure and Private Federated Learning

arXiv:2412.07454v3 Announce Type: replace Abstract: Federated learning enables decentralized model training without sharing raw data, preserving data privacy. However, its vulnerability to critical security threats, such as gradient inversion and model poisoning by malicious clients, remains unresolved. Existing solutions often…
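The abstract does not specify Tazza's exact mechanism, but the core idea of parameter shuffling can be illustrated with a hypothetical sketch: a client permutes its flattened weight vector with a seed it keeps secret, hiding the parameter layout from an eavesdropper while remaining exactly invertible on the client side. All names here are my own:

```python
import numpy as np

def shuffle_params(params, seed):
    """Hypothetical sketch: permute a flattened parameter vector with
    a client-held seed so its layout is hidden in transit; the stored
    permutation restores the original order exactly."""
    perm = np.random.default_rng(seed).permutation(params.size)
    return params[perm], perm

def unshuffle_params(shuffled, perm):
    out = np.empty_like(shuffled)
    out[perm] = shuffled                  # invert the permutation
    return out

rng = np.random.default_rng(0)
w = rng.normal(size=16)                   # stand-in for model weights
w_shuf, perm = shuffle_params(w, seed=42)
w_back = unshuffle_params(w_shuf, perm)
```

Note the shuffled vector still exposes the multiset of weight values; a real scheme must argue why reordering alone (or combined with other measures) defeats attacks like gradient inversion.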

Learning Network Dismantling Without Handcrafted Inputs

arXiv:2508.00706v2 Announce Type: replace Abstract: The application of message-passing Graph Neural Networks has been a breakthrough for important network science problems. However, the competitive performance often relies on using handcrafted structural features as inputs, which increases computational cost and introduces…
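For context on the task itself: network dismantling seeks a small set of nodes whose removal minimizes the largest connected component. A minimal hand-rolled baseline (not the paper's GNN method) is greedy removal by current degree; the graph and helper names below are my own illustration:

```python
from collections import deque

def largest_cc(adj, removed):
    """Size of the largest connected component among surviving nodes."""
    seen, best = set(), 0
    for s in adj:
        if s in removed or s in seen:
            continue
        seen.add(s)
        q, comp = deque([s]), 0
        while q:
            u = q.popleft()
            comp += 1
            for v in adj[u]:
                if v not in removed and v not in seen:
                    seen.add(v)
                    q.append(v)
        best = max(best, comp)
    return best

def degree_dismantle(adj, k):
    """Greedy baseline: repeatedly remove the highest-degree node."""
    removed, order = set(), []
    for _ in range(k):
        u = max((n for n in adj if n not in removed),
                key=lambda n: sum(v not in removed for v in adj[n]))
        removed.add(u)
        order.append(u)
    return order, largest_cc(adj, removed)

# two triangles joined through bridge node 3
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1],
       3: [0, 4], 4: [3, 5, 6], 5: [4, 6], 6: [4, 5]}
order, lcc = degree_dismantle(adj, k=1)
```

On this toy graph the degree heuristic removes node 0 (leaving a component of 4 nodes), while removing the lower-degree bridge node 3 would have done better — exactly the kind of structural signal learned methods try to capture without handcrafted features.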

Nonlinear Noise2Noise for Efficient Monte Carlo Denoiser Training

arXiv:2512.24794v1 Announce Type: cross Abstract: The Noise2Noise method allows for training machine learning-based denoisers with pairs of input and target images where both the input and target can be noisy. This removes the need for training with clean target images,…
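The (linear) Noise2Noise principle the paper builds on: under an L2 loss, the minimizer of E[(f(x) − y_noisy)²] is E[y_noisy], which equals the clean target when the target noise is zero-mean — so noisy targets train the same denoiser as clean ones. A toy per-pixel illustration (my own, with a constant predictor in place of a network):

```python
import numpy as np

rng = np.random.default_rng(1)

clean = np.linspace(0.0, 1.0, 64)                       # toy 1-D "image"
noisy_targets = clean + rng.normal(0, 0.1, (4000, 64))  # zero-mean noise

# The closed-form L2 minimizer over the training set is the
# per-pixel mean of the noisy targets, which converges to `clean`.
fit = noisy_targets.mean(axis=0)
err = np.abs(fit - clean).max()
```

The "nonlinear" qualifier in the title suggests the paper extends this beyond the plain L2 setting, where the mean-recovery argument above no longer applies directly.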

Optimal Approximation — Smoothness Tradeoffs for Soft-Max Functions

arXiv:2010.11450v2 Announce Type: replace Abstract: A soft-max function has two main efficiency measures: (1) approximation – which corresponds to how well it approximates the maximum function, (2) smoothness – which shows how sensitive it is to changes of its input.…
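The tradeoff is easy to see with the standard log-sum-exp soft-max: a temperature λ bounds the approximation error by log(n)/λ, so raising λ tightens the approximation to max(x) while making the function more sensitive to its input. A small numeric check (my own sketch, not the paper's construction):

```python
import numpy as np

def soft_max(x, lam):
    """Log-sum-exp soft-max: (1/lam) * log(sum(exp(lam * x))).
    Satisfies max(x) <= soft_max(x) <= max(x) + log(n)/lam, so
    larger lam means a tighter approximation (but less smoothness)."""
    m = x.max()                                   # stabilized evaluation
    return m + np.log(np.exp(lam * (x - m)).sum()) / lam

x = np.array([0.2, 0.9, 1.0])
err_small = soft_max(x, lam=1.0) - x.max()        # loose but smooth
err_large = soft_max(x, lam=50.0) - x.max()       # tight but sharp
```

The paper's contribution is characterizing the optimal frontier of this approximation-smoothness tradeoff, beyond what the log-sum-exp family achieves.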