No Statistical-Computational Gap in Spiked Matrix Models with Generative Network Priors

Cocola, Jorio and Hand, Paul and Voroninski, Vladislav (2021) No Statistical-Computational Gap in Spiked Matrix Models with Generative Network Priors. Entropy, 23 (1). p. 115. ISSN 1099-4300

entropy-23-00115-v2.pdf - Published Version (626kB)
Abstract

We provide a non-asymptotic analysis of the spiked Wishart and Wigner matrix models with a generative neural network prior. Spiked random matrices have the form of a rank-one signal plus noise and have been used as models for high-dimensional Principal Component Analysis (PCA), community detection, and synchronization over groups. Depending on the prior imposed on the spike, these models can display a statistical-computational gap between the information-theoretically optimal reconstruction error that can be achieved with unbounded computational resources and the sub-optimal performance of currently known polynomial-time algorithms. These gaps are believed to be fundamental, as in the emblematic case of Sparse PCA. In stark contrast to such cases, we show that there is no statistical-computational gap under a generative network prior, in which the spike lies in the range of a generative neural network. Specifically, we analyze a gradient descent method for minimizing a nonlinear least-squares objective over the range of an expansive-Gaussian neural network and show that it can recover in polynomial time an estimate of the underlying spike with a rate-optimal sample complexity and dependence on the noise level.
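To make the recovery procedure described in the abstract concrete, the following is a minimal sketch (not the authors' code) of gradient descent on a nonlinear least-squares objective over the range of an expansive Gaussian ReLU network, for a spiked Wigner-type observation. The network widths, noise level, step size, and iteration count are illustrative assumptions and may need tuning; the paper's analyzed method also includes refinements (e.g. handling the sign ambiguity of the rank-one spike) that are omitted here.

```python
# Sketch: spike recovery under a generative prior via gradient descent
# on a nonlinear least-squares objective. All parameters are assumptions.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
k, n1, n = 10, 100, 400  # latent dimension and layer widths (assumed)

# Expansive two-layer network with i.i.d. Gaussian weights and ReLU activations.
k1, k2, k3, k4 = jax.random.split(key, 4)
W1 = jax.random.normal(k1, (n1, k)) / jnp.sqrt(n1)
W2 = jax.random.normal(k2, (n, n1)) / jnp.sqrt(n)

def G(z):
    return jax.nn.relu(W2 @ jax.nn.relu(W1 @ z))

# Planted spike x* = G(z*) and a symmetric noisy rank-one observation.
z_star = jax.random.normal(k3, (k,))
x_star = G(z_star)
noise = jax.random.normal(k4, (n, n))
Y = jnp.outer(x_star, x_star) + 0.1 * (noise + noise.T) / jnp.sqrt(2)

# Nonlinear least-squares objective over the range of G.
def loss(z):
    x = G(z)
    return jnp.sum((Y - jnp.outer(x, x)) ** 2)

grad = jax.grad(loss)
z = jax.random.normal(jax.random.PRNGKey(1), (k,))  # random initialization
for _ in range(500):
    z = z - 1e-3 * grad(z)  # plain gradient descent; step size is illustrative

rel_err = jnp.linalg.norm(G(z) - x_star) / jnp.linalg.norm(x_star)
print(f"relative reconstruction error: {rel_err:.3f}")
```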

Item Type: Article
Uncontrolled Keywords: spiked matrix models; generative networks; rank-one matrix recovery; statistical-computational gap
Subjects: STM Repository > Physics and Astronomy
Depositing User: Managing Editor
Date Deposited: 08 Mar 2023 08:20
Last Modified: 17 Jun 2024 06:06
URI: http://classical.goforpromo.com/id/eprint/437
