
For-All Sparse Recovery in Near-Optimal Time

Published: 06 March 2017

Abstract

An approximate sparse recovery system in the ℓ1 norm consists of parameters k, ϵ, N; an m-by-N measurement matrix Φ; and a recovery algorithm R. Given a vector x, the system approximates x by x̂ = R(Φx), which must satisfy ‖x̂ − x‖_1 ≤ (1 + ϵ)‖x − x_k‖_1, where x_k is the best k-term approximation to x. We consider the “for all” model, in which a single matrix Φ, possibly “constructed” non-explicitly using the probabilistic method, is used for all signals x. The best existing sublinear algorithm, by Porat and Strauss [2012], uses O(ϵ^{-3} k log(N/k)) measurements and runs in time O(k^{1−α} N^α) for any constant α > 0.
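To make the guarantee concrete, here is a small sketch (ours, not from the paper) that checks the ℓ1/ℓ1 criterion for a candidate approximation; the helper names `best_k_term` and `satisfies_l1_guarantee` and the example signal are illustrative, not from the article.

```python
import numpy as np

def best_k_term(x, k):
    """Return x_k: x with all but its k largest-magnitude entries zeroed."""
    xk = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    xk[idx] = x[idx]
    return xk

def satisfies_l1_guarantee(x, x_hat, k, eps):
    """Check the for-all guarantee: ||x_hat - x||_1 <= (1 + eps) ||x - x_k||_1."""
    tail = np.linalg.norm(x - best_k_term(x, k), 1)
    return np.linalg.norm(x_hat - x, 1) <= (1 + eps) * tail

# A nearly 2-sparse signal: reporting just its top-2 entries already
# satisfies the guarantee, since the error then equals the tail norm.
x = np.array([10.0, -7.0, 0.1, -0.2, 0.05, 0.0])
x_hat = best_k_term(x, 2)
print(satisfies_l1_guarantee(x, x_hat, k=2, eps=0.1))  # True
```

Note that the guarantee is relative to the tail ‖x − x_k‖_1: for exactly k-sparse signals the tail is zero, so recovery must be exact.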
In this article, we improve the number of measurements to O(ϵ^{-2} k log(N/k)), matching the best existing upper bound (attained by super-linear algorithms), and the runtime to O(k^{1+β} poly(log N, 1/ϵ)), with a modest restriction that k ≤ N^{1−α} and ϵ ≤ (log k / log N)^γ for any constants α, β, γ > 0. When k ≤ log^c N for some c > 0, the runtime is reduced to O(k poly(log N, 1/ϵ)). With no restrictions on ϵ, we have an approximate recovery system with m = O((k/ϵ) log(N/k) ((log N / log k)^γ + 1/ϵ)) measurements.
The overall architecture of this algorithm is similar to that of Porat and Strauss [2012] in that we repeatedly use a weak recovery system (with varying parameters) to obtain a top-level recovery algorithm. The weak recovery system consists of a two-layer hashing procedure (or, for a deterministic algorithm, two unbalanced expanders). The algorithmic innovation is a novel encoding procedure that is reminiscent of network coding and that reflects the structure of the hashing stages. The idea is to encode the signal position index i by associating it with a unique message m_i, which is then encoded into a longer message m_i′ (in contrast to Porat and Strauss [2012], in which the encoding is simply the identity). Portions of the longer message m_i′ correspond to repetitions of the hashing, and we use a regular expander graph to encode the linkages among these portions.
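A toy illustration of the hashing repetitions (ours, not the paper's construction; the parameters N, B, R and the example signal are arbitrary): each of R independent hash functions throws the N positions into B buckets, and for a heavy index the sequence of its bucket values across the repetitions plays the role of the "portions" of its longer message.

```python
import numpy as np

rng = np.random.default_rng(0)
N, B, R = 64, 8, 5          # signal length, buckets per repetition, repetitions

# R independent hash functions, one per repetition of the hashing layer.
hashes = rng.integers(0, B, size=(R, N))

x = np.zeros(N)
x[[3, 41]] = [5.0, -4.0]    # a 2-sparse signal

# Bucket measurements: each repetition sums the signal within each bucket.
buckets = np.zeros((R, B))
for r in range(R):
    for i in range(N):
        buckets[r, hashes[r, i]] += x[i]

# In most repetitions, a heavy index's bucket is dominated by that index
# alone; occasional collisions corrupt some portions, which is what the
# expander-coded linkages are there to detect and correct.
for i in [3, 41]:
    portions = [buckets[r, hashes[r, i]] for r in range(R)]
    print(i, portions)
```

With only two heavy entries, a portion differs from x[i] only in repetitions where the two heavy indices collide, so a robust estimator (e.g., a median across repetitions) recovers each heavy value.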
The decoding or recovery algorithm consists of recovering the portions of the longer messages m_i′ and then decoding to the original messages m_i, all the while ensuring that corruptions can be detected and/or corrected. The recovery algorithm is similar to the list recovery introduced in Indyk et al. [2010] and used in Gilbert et al. [2013]. In our algorithm, the messages {m_i} are independent of the hashing, which enables us to obtain a better result.
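As a crude stand-in for the corruption-correcting decoding step (the paper uses expander-coded linkages and list recovery; majority voting below is only an analogy we supply for intuition), one can recover a message from several noisy copies of its portions:

```python
from collections import Counter

def majority_decode(received_copies):
    """Recover each message symbol by majority vote over its noisy copies.

    A toy analogy for the error-correcting decoding in the paper: each
    copy is a tuple of portions, and isolated corruptions are outvoted.
    """
    return tuple(Counter(symbols).most_common(1)[0][0]
                 for symbols in zip(*received_copies))

# Three noisy copies of an encoded message; each copy has one corrupted
# symbol, in a different position, so the vote recovers the original.
copies = [(1, 0, 1, 1), (1, 1, 1, 0), (0, 0, 1, 1)]
print(majority_decode(copies))  # (1, 0, 1, 1)
```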

References

[1]
Radu Berinde, Anna C. Gilbert, Piotr Indyk, Howard Karloff, and Martin J. Strauss. 2008. Combining geometry and combinatorics: A unified approach to sparse signal recovery. In Proceedings of the 46th Annual Allerton Conference on Communication, Control, and Computing. IEEE, 798--805.
[2]
E. Candès, J. Romberg, and T. Tao. 2006. Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inf. Theory 52, 2 (2006), 489--509.
[3]
Moses Charikar, Kevin Chen, and Martin Farach-Colton. 2002. Finding frequent items in data streams. In Proceedings of the 29th International Colloquium on Automata, Languages and Programming (ICALP). 693--703.
[4]
Mahdi Cheraghchi. 2013. Noise-resilient group testing: Limitations and constructions. Discr. Appl. Math. 161, 1--2 (2013), 81--95.
[5]
Mahdi Cheraghchi and Piotr Indyk. 2016. Nearly optimal deterministic algorithm for sparse Walsh-Hadamard transform. In Proceedings of the 27th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA'16). Society for Industrial and Applied Mathematics, Philadelphia, PA, 298--317.
[6]
Raphaël Clifford, Klim Efremenko, Ely Porat, and Amir Rothschild. 2009. From coding theory to efficient pattern matching. In Proceedings of the 20th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA). Society for Industrial and Applied Mathematics, Philadelphia, PA, 778--784.
[7]
Albert Cohen, Wolfgang Dahmen, and Ronald DeVore. 2009. Compressed sensing and best k-term approximation. J. Am. Math. Soc. 22 (2009), 211--231.
[8]
G. Cormode and S. Muthukrishnan. 2006. Combinatorial algorithms for compressed sensing. In Proceedings of the 40th Annual Conference on Information Sciences and Systems (CISS). Princeton, NJ.
[9]
D. L. Donoho. 2006. Compressed sensing. IEEE Trans. Inf. Theory 52, 4 (Apr. 2006), 1289--1306.
[10]
M. F. Duarte, M. A. Davenport, D. Takhar, J. N. Laska, K. F. Kelly, and R. G. Baraniuk. 2008. Single-pixel imaging via compressive sampling. IEEE Sign. Process. Mag. 25, 2 (2008), 83--91.
[11]
J. Friedman, J. Kahn, and E. Szemerédi. 1989. On the second eigenvalue of random regular graphs. In Proceedings of the 21st Annual ACM Symposium on Theory of Computing (STOC). 587--598.
[12]
Anna Gilbert, Martin Strauss, Joel Tropp, and Roman Vershynin. 2006. Algorithmic linear dimension reduction in the ℓ1 norm for sparse vectors. In Proceedings of the 44th Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[13]
Anna C. Gilbert, Yi Li, Ely Porat, and Martin J. Strauss. 2012. Approximate sparse recovery: Optimizing time and measurements. SIAM J. Comput. 41, 2 (2012), 436--453.
[14]
Anna C. Gilbert, Hung Q. Ngo, Ely Porat, Atri Rudra, and Martin J. Strauss. 2013. ℓ2/ℓ2-Foreach sparse recovery with low risk. In Automata, Languages, and Programming. Lecture Notes in Computer Science, Vol. 7965. Springer, Berlin, 461--472.
[15]
A. C. Gilbert, M. J. Strauss, J. A. Tropp, and R. Vershynin. 2007. One sketch for all: Fast algorithms for compressed sensing. In Proceedings of the 39th Annual ACM Symposium on Theory of Computing (STOC). ACM, New York, NY, 237--246.
[16]
Venkatesan Guruswami, Christopher Umans, and Salil Vadhan. 2009. Unbalanced expanders and randomness extractors from Parvaresh-Vardy codes. J. ACM 56, 4, Article 20 (July 2009).
[17]
Piotr Indyk, Hung Q. Ngo, and Atri Rudra. 2010. Efficiently decodable non-adaptive group testing. In Proceedings of the 21st Annual ACM-SIAM Symposium on Discrete Algorithms (SODA). 1126--1142.
[18]
Piotr Indyk and Milan Ruzic. 2008. Near-optimal sparse recovery in the L1 norm. In Proceedings of the 49th Annual IEEE Symposium on Foundations of Computer Science (FOCS). 199--207.
[19]
Michael Lustig, David Donoho, and John M. Pauly. 2007. Sparse MRI: The application of compressed sensing for rapid MR imaging. Magn. Reson. Med. 58, 6 (2007), 1182--1195.
[20]
Jelani Nelson, Huy L. Nguyen, and David P. Woodruff. 2014. On deterministic sketching and streaming for sparse recovery and norm estimation. Lin. Algebr. Appl. 441 (2014), 152--167.
[21]
Farzad Parvaresh and Alexander Vardy. 2005. Correcting errors beyond the Guruswami-Sudan radius in polynomial time. In Proceedings of the 46th Annual IEEE Symposium on Foundations of Computer Science (FOCS). 285--294.
[22]
Ely Porat and Martin J. Strauss. 2012. Sublinear time, measurement-optimal, sparse recovery for all. In Proceedings of the 23rd Annual ACM-SIAM Symposium on Discrete Algorithms (SODA). 1215--1227.
[23]
Amnon Ta-Shma and David Zuckerman. 2004. Extractor codes. IEEE Trans. Inf. Theory 50, 12 (2004), 3015--3025.
[24]
Eli Upfal. 1992. Tolerating a linear number of faults in networks of bounded degree. In Proceedings of the 11th Annual ACM Symposium on Principles of Distributed Computing (PODC). 83--89.



Published In

ACM Transactions on Algorithms, Volume 13, Issue 3 (July 2017), 390 pages
ISSN: 1549-6325
EISSN: 1549-6333
DOI: 10.1145/3058789

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Received: 01 April 2015
Revised: 01 September 2016
Accepted: 01 November 2016
Published: 06 March 2017, in TALG Volume 13, Issue 3


Author Tags

  1. Compressive sensing
  2. list decoding
  3. sparse recovery

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • DARPA/ONR
  • NSF CCF

