Nyström Kernel Stein Discrepancy
Kernel methods underpin many of the most successful approaches in data science and statistics, and they allow representing probability measures as elements of a reproducing kernel Hilbert space without loss of information. Recently, the kernel Stein discrepancy (KSD), which combines Stein's method with the flexibility of kernel techniques, has gained considerable attention. Through the Stein operator, KSD allows the construction of powerful goodness-of-fit tests where it suffices to know the target distribution up to a multiplicative constant. However, the typical U- and V-statistic-based KSD estimators suffer from a quadratic runtime complexity, which hinders their application in large-scale settings. In this work, we propose a Nyström-based KSD acceleration, with runtime O(mn + m³) for n samples and m ≪ n Nyström points, show its √n-consistency under a classical sub-Gaussian assumption, and demonstrate its applicability for goodness-of-fit testing on a suite of benchmarks. We also show the √n-consistency of the quadratic-time KSD estimator.
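To make the runtime contrast concrete, below is a minimal NumPy sketch of the two estimator families the abstract contrasts: the quadratic-time V-statistic KSD and a Nyström plug-in with the stated O(mn + m³) cost. It assumes the Langevin Stein operator with an RBF base kernel and a target known through its score function; the function names and this particular plug-in construction are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def stein_kernel_rbf(X, Y, score, sigma=1.0):
    """Langevin Stein kernel h_p(x, y) for the RBF base kernel
    k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    d = X.shape[1]
    Sx, Sy = score(X), score(Y)            # scores s_p at each sample
    D = X[:, None, :] - Y[None, :, :]      # pairwise differences x_i - y_j
    sq = np.sum(D**2, axis=-1)
    K = np.exp(-sq / (2 * sigma**2))
    t1 = (Sx @ Sy.T) * K                                 # s(x)^T s(y) k
    t2 = np.einsum('id,ijd->ij', Sx, D) / sigma**2 * K   # s(x)^T grad_y k
    t3 = -np.einsum('jd,ijd->ij', Sy, D) / sigma**2 * K  # s(y)^T grad_x k
    t4 = (d / sigma**2 - sq / sigma**4) * K              # tr(grad_x grad_y k)
    return t1 + t2 + t3 + t4

def ksd_v(X, score, sigma=1.0):
    """Quadratic-time V-statistic: mean of h_p over all n^2 sample pairs."""
    return stein_kernel_rbf(X, X, score, sigma).mean()

def ksd_nystrom(X, score, m=50, sigma=1.0, seed=0):
    """Nystrom plug-in: approximate the full Stein kernel matrix H by
    H_nm H_mm^+ H_mn, needing O(nm) kernel evaluations and an O(m^3)
    pseudo-inverse instead of the O(n^2) full matrix."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=min(m, n), replace=False)  # m Nystrom points
    Xm = X[idx]
    H_nm = stein_kernel_rbf(X, Xm, score, sigma)   # (n, m), O(nm)
    H_mm = stein_kernel_rbf(Xm, Xm, score, sigma)  # (m, m), O(m^2)
    a = H_nm.mean(axis=0)                          # (1/n) H_mn 1
    return a @ np.linalg.pinv(H_mm) @ a            # (1/n^2) 1^T H_approx 1
```

For a standard Gaussian target the score is simply `score = lambda x: -x`; both estimates are close to zero for samples drawn from the target and grow for a shifted sample, which is what the goodness-of-fit tests exploit.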
| Item Type | Article |
|---|---|
| Copyright holders | © 2025 The Author(s) |
| Departments | LSE > Academic Departments > Statistics |
| Date Deposited | 14 Jun 2024 |
| Acceptance Date | 22 Jan 2024 |
| URI | https://researchonline.lse.ac.uk/id/eprint/123872 |