A fundamental problem in applications such as group testing and communications is to acquire the support of a $K$-sparse signal $x$, whose nonzero elements are 1, from an underdetermined noisy linear model. This paper first designs an algorithm called binary least squares (BLS) to reconstruct $x$ and analyzes its complexity. Then, we establish two sufficient conditions for the exact reconstruction of the support of $x$ in $K$ iterations of BLS, based respectively on the mutual coherence and on the restricted isometry property of the measurement matrix. Finally, extensive numerical tests compare the efficiency and effectiveness of BLS with those of batch orthogonal matching pursuit (Batch-OMP), which, to the best of our knowledge, is the fastest implementation of OMP, orthogonal least squares (OLS), compressive sampling matching pursuit (CoSaMP), hard thresholding pursuit (HTP), Newton-step-based iterative hard thresholding (NSIHT), Newton-step-based hard thresholding pursuit (NSHTP), binary matching pursuit (BMP), and $\ell_1$-regularized least squares. Test results show that: (1) BLS can be 10-200 times more efficient than Batch-OMP, OLS, CoSaMP, HTP, NSIHT, and NSHTP, while also achieving a higher probability of support reconstruction, with improvements of 20%-80%; (2) BLS improves the support reconstruction probability by more than 25% over the explicit BMP algorithm, at slightly higher computational complexity; (3) BLS is around 100 times faster than $\ell_1$-regularized least squares, with a lower support reconstruction probability for small $K$ and a higher one for large $K$. Numerical tests on generalized space shift keying (GSSK) detection indicate that although BLS is slightly slower than BMP, it is more efficient than the other seven tested sparse recovery algorithms, and although it is less effective than $\ell_1$-regularized least squares, it is more effective than the other seven algorithms.
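The abstract does not spell out the BLS algorithm itself, so the sketch below is not the paper's method. It only illustrates the problem setup (recovering the support of a $K$-sparse binary signal from $y = Ax + n$) with a generic greedy baseline: at each of $K$ iterations, add the index whose inclusion, with its coefficient fixed to 1, most reduces the least-squares residual. The function name and selection rule are assumptions for illustration.

```python
import numpy as np

def greedy_binary_support(A, y, K):
    """Hypothetical greedy sketch (NOT the paper's BLS) for recovering
    the support of a K-sparse signal whose nonzero entries are all 1.

    At each iteration, the column whose addition (with coefficient
    fixed to 1) yields the smallest residual norm joins the support.
    """
    m, n = A.shape
    support = []
    residual = y.astype(float).copy()
    for _ in range(K):
        best_j, best_norm = None, np.inf
        for j in range(n):
            if j in support:
                continue
            # Binary constraint: the candidate coefficient is exactly 1,
            # so the trial residual is residual minus column j.
            trial_norm = np.linalg.norm(residual - A[:, j])
            if trial_norm < best_norm:
                best_norm, best_j = trial_norm, j
        support.append(best_j)
        residual = residual - A[:, best_j]
    x_hat = np.zeros(n)
    x_hat[support] = 1.0
    return sorted(support), x_hat
```

In a noiseless, well-conditioned setting this greedy rule recovers the true support exactly; the paper's mutual-coherence and RIP conditions characterize when such exact recovery is guaranteed for BLS in the underdetermined noisy case.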