Bounds in Wasserstein Distance for Locally Stationary Processes

arXiv:2412.03414v2 Announce Type: replace-cross Abstract: Locally stationary processes (LSPs) constitute an essential modeling paradigm for capturing the nuanced dynamics of time series whose statistical characteristics, such as the mean and variance, evolve smoothly over time. In this paper, we introduce a novel conditional probability distribution estimator tailored to LSPs, based on the Nadaraya-Watson (NW) kernel smoothing methodology. The NW estimator, a prominent local averaging technique, uses kernel smoothing to approximate the conditional distribution of a response variable given its covariates. We rigorously establish convergence rates for the NW-based conditional probability estimator in the univariate setting under the Wasserstein metric, providing explicit bounds and conditions that guarantee optimal performance. Extending this framework, we generalize the analysis to the multivariate setting via the sliced Wasserstein distance, an approach that circumvents the computational and analytical challenges typically associated with high-dimensional settings. To corroborate the theoretical results, we conduct extensive numerical simulations on synthetic datasets and provide empirical validation on real-world data, demonstrating the estimator's effectiveness in capturing intricate temporal dependencies and its relevance for analyzing complex nonstationary phenomena.
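The paper itself does not publish code here; the following is a minimal sketch of the ingredients named in the abstract, assuming a Nadaraya-Watson weighting with a product Gaussian kernel over the covariate and rescaled time, an empirical conditional CDF, and a Monte-Carlo sliced Wasserstein distance built from 1D Wasserstein distances of random projections. All kernel choices, bandwidths, and function names are illustrative assumptions, not the authors' implementation.

```python
# Sketch (assumptions, not the paper's code): NW-style conditional distribution
# estimate for a locally stationary sample, plus a sliced Wasserstein distance.
import numpy as np
from scipy.stats import wasserstein_distance  # 1D Wasserstein-1 between samples

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def nw_weights(X, times, x, u, h_x, h_t):
    """NW weights at covariate value x and rescaled time u in [0, 1].
    Product kernel over covariate distance and time distance (an assumption)."""
    w = gaussian_kernel((X - x) / h_x) * gaussian_kernel((times - u) / h_t)
    return w / w.sum()

def nw_conditional_cdf(Y, weights, y_grid):
    """Weighted empirical CDF of Y on y_grid, using the NW weights."""
    return np.array([(weights * (Y <= y)).sum() for y in y_grid])

def sliced_wasserstein(A, B, n_proj=100, seed=0):
    """Average 1D Wasserstein distance over random unit projections of
    d-dimensional samples A and B (rows are observations)."""
    rng = np.random.default_rng(seed)
    d = A.shape[1]
    dists = []
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        dists.append(wasserstein_distance(A @ theta, B @ theta))
    return float(np.mean(dists))

# Toy usage: a time-varying AR(1)-type sequence (locally stationary in spirit),
# predicting Y_t from Y_{t-1} at rescaled time u = 0.5.
rng = np.random.default_rng(1)
n = 2000
times = np.arange(n) / n
Y = np.zeros(n)
for i in range(1, n):
    phi = 0.3 + 0.4 * times[i]          # smoothly varying AR coefficient
    Y[i] = phi * Y[i - 1] + rng.normal()
X_cov, Y_resp = Y[:-1], Y[1:]

w = nw_weights(X_cov, times[1:], x=0.0, u=0.5, h_x=0.3, h_t=0.1)
y_grid = np.linspace(-3.0, 3.0, 61)
F_hat = nw_conditional_cdf(Y_resp, w, y_grid)
print(F_hat[::15])
```

The sliced Wasserstein helper mirrors the abstract's multivariate strategy at a high level: reduce a d-dimensional comparison to many 1D Wasserstein computations along random directions and average them, which sidesteps the cost of high-dimensional optimal transport.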

Original: https://arxiv.org/abs/2412.03414