A family of measurement matrices for generalized compressed sensing
We consider the problem of recovering a structured signal x that lies close to a subset of interest T in R^n from random noisy linear measurements y = B A x + w, a generalization of the classic compressed sensing problem. Here B is a fixed matrix and A has independent sub-gaussian rows. Varying B and the sub-gaussian distribution of A yields a family of measurement matrices that may have heavy tails, dependent rows and columns, and singular values with a large dynamic range. Classic (generalized) compressed sensing typically assumes a random measurement matrix whose singular values are nearly uniform (with high probability), and asks: How many measurements are needed to recover x? In our setting, this question becomes: What properties of B allow good recovery? We show that the “effective rank” of B may be used as a surrogate for the number of measurements, and that if it exceeds the squared Gaussian complexity of T, then accurate recovery is guaranteed. We also derive the optimal dependence on the sub-gaussian norm of the rows of A; to our knowledge, this was not known previously, even when B is the identity. We specialize our results to low-rank matrix recovery, where T is a set of low-rank matrices.
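To make the measurement model concrete, the following is a minimal numerical sketch, not taken from the paper: it forms y = B A x + w with a Rademacher (hence sub-gaussian) A and a fixed B whose singular values span a large dynamic range, and it computes the stable rank ||B||_F^2 / ||B||^2 as one common instantiation of “effective rank”. The dimensions, the sparse stand-in for T, and the identification of effective rank with stable rank are illustrative assumptions, not the paper's definitions.

```python
# Illustrative sketch of the generalized measurement model y = B A x + w.
# Assumption: "effective rank" is taken to be the stable rank
# r(B) = ||B||_F^2 / ||B||_op^2; the paper's exact definition may differ.
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 200, 400, 120  # signal dimension, rows of A, rows of B

# B: fixed matrix with a decaying spectrum (large dynamic range of singular values)
U, _ = np.linalg.qr(rng.standard_normal((k, k)))
V, _ = np.linalg.qr(rng.standard_normal((m, m)))
s = np.geomspace(1.0, 1e-2, num=k)
B = U @ np.diag(s) @ V[:k, :]                  # k x m

# A: independent sub-gaussian (Rademacher) rows
A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)

# x: a sparse vector as a simple stand-in for a structured set T
x = np.zeros(n)
x[rng.choice(n, size=10, replace=False)] = rng.standard_normal(10)

w = 0.01 * rng.standard_normal(k)              # noise
y = B @ A @ x + w                              # generalized CS measurements

stable_rank = np.linalg.norm(B, "fro") ** 2 / np.linalg.norm(B, 2) ** 2
print(f"stable rank of B (surrogate for the number of measurements): {stable_rank:.1f}")
```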