AI News Archives

Windows 11 SSD issues blamed on reviewers using ‘early versions of firmware’


Reports have been circulating for the past few weeks that recent Windows 11 updates (KB5063878 and KB5062660) were causing some SSDs using Phison controllers to fail. While plenty of YouTube and TikTok videos have blamed Microsoft for the issues, Phison has identified “early versions of firmware and BIOS” as the problem instead. “Many of the […]

It’s time to change your Plex password again


The Plex media streaming platform has been breached in what looks to be a repeat of a 2022 incident that saw authentication data and encrypted passwords compromised. The company is urging users to change their password, enable two-factor authentication, and sign out of any connected devices that might already be logged in. In an email sent to […]

Lenovo Coupon Codes and Deals: Up to $890 Off

Whether you’re shopping for a ThinkPad, Yoga laptop, or Legion gaming PC, these Lenovo discount codes and promotions can help you save big on your next tech upgrade.

Amortized In-Context Mixed Effect Transformer Models: A Zero-Shot Approach for Pharmacokinetics

arXiv:2508.15659v2 Announce Type: replace Abstract: Accurate dose-response forecasting under sparse sampling is central to precision pharmacotherapy. We present the Amortized In-Context Mixed-Effect Transformer (AICMET) model, a transformer-based latent-variable framework that unifies mechanistic compartmental priors with amortized in-context Bayesian inference. AICMET is pre-trained on hundreds of thousands of synthetic pharmacokinetic trajectories with Ornstein-Uhlenbeck priors over the parameters of compartment models, endowing the model with strong inductive biases and enabling zero-shot adaptation to new compounds. At inference time, the decoder conditions on the collective context of previously profiled trial participants, generating calibrated posterior predictions for newly enrolled patients after a few early drug concentration measurements. This capability collapses traditional model-development cycles from weeks to hours while preserving a role for expert modelling. Experiments across public datasets show that AICMET attains state-of-the-art predictive accuracy and faithfully quantifies inter-patient variability, outperforming both nonlinear mixed-effects baselines and recent neural ODE variants. Our results highlight the feasibility of transformer-based, population-aware neural architectures as an alternative to bespoke pharmacokinetic modeling pipelines, charting a path toward truly population-aware personalized dosing regimens.
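
To make the setup more concrete, here is a minimal sketch of amortized in-context pharmacokinetic prediction in the spirit of the abstract: synthetic one-compartment trajectories are generated with parameters drawn from a stationary Ornstein-Uhlenbeck-style prior, and a small transformer conditions on observed (time, concentration) tokens from context patients to fill in a new patient's unobserved samples. The model class, token layout, and one-compartment simplification are assumptions for illustration, not the authors' released code.

```python
# Hedged sketch of amortized in-context PK prediction (assumed names and shapes).
import numpy as np
import torch
import torch.nn as nn

def one_compartment(t, ka, ke, V, dose=100.0):
    """Oral one-compartment concentration curve (toy mechanistic prior)."""
    return dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def sample_population(n_patients, n_times, rng):
    """Synthetic pre-training data: log-parameters drawn from a stationary
    Ornstein-Uhlenbeck-style prior (only its Gaussian marginal, for simplicity)."""
    t = np.sort(rng.uniform(0.5, 24.0, size=(n_patients, n_times)), axis=1)
    log_params = rng.normal([np.log(1.0), np.log(0.2), np.log(30.0)], 0.3,
                            size=(n_patients, 3))
    ka, ke, V = np.exp(log_params).T
    conc = one_compartment(t, ka[:, None], ke[:, None], V[:, None])
    conc *= np.exp(rng.normal(0.0, 0.1, size=conc.shape))   # residual noise
    return t, conc

class InContextPKModel(nn.Module):
    """Transformer over (time, concentration, is_observed) tokens from context
    patients plus a few early samples of the query patient."""
    def __init__(self, d_model=64, nhead=4, nlayers=3):
        super().__init__()
        self.embed = nn.Linear(3, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=nlayers)
        self.head = nn.Linear(d_model, 2)   # predictive mean and log-variance

    def forward(self, tokens):
        return self.head(self.encoder(self.embed(tokens)))

rng = np.random.default_rng(0)
t, c = sample_population(n_patients=8, n_times=6, rng=rng)
# Token layout: observed points carry their concentration; hidden points carry 0
# with the "is_observed" flag cleared, and the model is trained to fill them in.
obs_flag = np.ones_like(c)
obs_flag[-1, 2:] = 0.0                      # hide late samples of the last patient
masked_c = c * obs_flag
tokens = torch.tensor(np.stack([t, masked_c, obs_flag], -1).reshape(1, -1, 3),
                      dtype=torch.float32)
model = InContextPKModel()
mean, log_var = model(tokens).unbind(-1)    # untrained forward pass
print(mean.shape)                           # (1, 48): one prediction per token
```

In this toy layout, pre-training would minimize a Gaussian negative log-likelihood on the hidden tokens across many synthetic populations, so that at deployment the same forward pass yields calibrated predictions for a new patient without refitting.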

Simulation Priors for Data-Efficient Deep Learning

arXiv:2509.05732v1 Announce Type: new Abstract: How do we enable AI systems to learn efficiently in the real world? First-principles models are widely used to simulate natural systems, but often fail to capture real-world complexity due to simplifying assumptions. In contrast, deep learning approaches can estimate complex dynamics with minimal assumptions but require large, representative datasets. We propose SimPEL, a method that efficiently combines first-principles models with data-driven learning by using low-fidelity simulators as priors in Bayesian deep learning. This enables SimPEL to benefit from simulator knowledge in low-data regimes and to leverage deep learning's flexibility when more data is available, all while carefully quantifying epistemic uncertainty. We evaluate SimPEL on diverse systems from biological, agricultural, and robotic domains, showing superior performance in learning complex dynamics. For decision-making, we demonstrate that SimPEL bridges the sim-to-real gap in model-based reinforcement learning. On a high-speed RC car task, SimPEL learns a highly dynamic parking maneuver involving drifting with substantially less data than state-of-the-art baselines. These results highlight the potential of SimPEL for data-efficient learning and control in complex real-world environments.
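
As an illustration of the core idea of using a low-fidelity simulator as a prior, the sketch below pairs a crude point-mass car model with a small ensemble of residual networks: the prediction is the simulator output plus a learned residual, and ensemble disagreement serves as a rough epistemic-uncertainty proxy. The residual-ensemble formulation and all names here are assumptions for illustration; the paper's exact Bayesian treatment may differ.

```python
# Hedged sketch of a simulator-as-prior dynamics model (assumed formulation).
import numpy as np
import torch
import torch.nn as nn

def low_fidelity_sim(state, action, dt=0.05):
    """Crude first-principles model: point-mass car ignoring tire slip."""
    x, y, theta, v = state
    steer, accel = action
    return np.array([x + v * np.cos(theta) * dt,
                     y + v * np.sin(theta) * dt,
                     theta + v * steer * dt,
                     v + accel * dt])

class ResidualNet(nn.Module):
    """Small MLP that learns the gap between the simulator and real dynamics."""
    def __init__(self, state_dim=4, action_dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim + action_dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, hidden), nn.Tanh(),
                                 nn.Linear(hidden, state_dim))

    def forward(self, state, action):
        return self.net(torch.cat([state, action], dim=-1))

class SimPriorDynamics:
    """Prediction = simulator rollout + learned residual; an ensemble of residual
    nets gives a cheap stand-in for epistemic uncertainty."""
    def __init__(self, n_members=5):
        self.members = [ResidualNet() for _ in range(n_members)]

    def predict(self, state, action):
        prior = low_fidelity_sim(state, action)
        s = torch.tensor(state, dtype=torch.float32)
        a = torch.tensor(action, dtype=torch.float32)
        with torch.no_grad():
            residuals = torch.stack([m(s, a) for m in self.members])
        mean = prior + residuals.mean(0).numpy()
        epistemic_std = residuals.std(0).numpy()   # ensemble spread as uncertainty proxy
        return mean, epistemic_std

model = SimPriorDynamics()
state = np.array([0.0, 0.0, 0.0, 2.0])     # x, y, heading, speed
action = np.array([0.1, 0.5])              # steer, accelerate
mean, std = model.predict(state, action)
print(mean, std)                           # next-state estimate with uncertainty
```

With little data, the untrained (or weakly trained) residual contributes almost nothing and the simulator prior dominates; as more transitions are collected, the residual networks absorb the unmodeled effects, which is one plausible reading of how a simulator prior supports the low-data-to-rich-data transition described in the abstract.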