Volume 43, Issue 2
Binary Least Squares: An Algorithm for Binary Sparse Signal Recovery

Jinming Wen

J. Comp. Math., 43 (2025), pp. 493-514.

Published online: 2024-11

  • Abstract

A fundamental problem in applications including group testing and communications is to acquire the support of a $K$-sparse signal $x$, whose nonzero elements are 1, from an underdetermined noisy linear model. This paper first designs an algorithm called binary least squares (BLS) to reconstruct $x$ and analyzes its complexity. Then, we establish two sufficient conditions for the exact reconstruction of $x$'s support with $K$ iterations of BLS, based on the mutual coherence and the restricted isometry property of the measurement matrix, respectively. Finally, extensive numerical tests are performed to compare the efficiency and effectiveness of BLS with those of batch orthogonal matching pursuit (Batch-OMP), which, to the best of our knowledge, is the fastest implementation of OMP, orthogonal least squares (OLS), compressive sampling matching pursuit (CoSaMP), hard thresholding pursuit (HTP), Newton-step-based iterative hard thresholding (NSIHT), Newton-step-based hard thresholding pursuit (NSHTP), binary matching pursuit (BMP), and $\ell_1$-regularized least squares. Test results show that: (1) BLS can be 10-200 times more efficient than Batch-OMP, OLS, CoSaMP, HTP, NSIHT, and NSHTP while achieving a higher probability of support reconstruction, with improvements of 20%-80%; (2) BLS improves the support reconstruction probability by more than 25% over the explicit BMP algorithm at slightly higher computational complexity; (3) BLS is around 100 times faster than $\ell_1$-regularized least squares, with a lower support reconstruction probability for small $K$ and a higher one for large $K$. Numerical tests on generalized space shift keying (GSSK) detection indicate that although BLS is slightly slower than BMP, it is more efficient than the other seven tested sparse recovery algorithms, and although it is less effective than $\ell_1$-regularized least squares, it is more effective than the other seven algorithms.
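The setting above is support recovery of a binary $K$-sparse $x$ from $y = Ax + e$. The paper's exact BLS procedure is not reproduced on this page; the following is a minimal illustrative sketch, assuming a greedy scheme that, at each of $K$ iterations, activates the index (sets $x_i = 1$) whose inclusion most reduces the least-squares residual. The function name, demo dimensions, and random setup are all hypothetical, not taken from the paper.

```python
import numpy as np

def greedy_binary_ls(A, y, K):
    """Illustrative greedy support recovery for a binary K-sparse signal.

    At each of K iterations, try setting each inactive entry x_i = 1 and
    keep the index that yields the smallest residual ||y - A x||_2.
    This is a sketch of the general idea only, not the paper's BLS.
    """
    m, n = A.shape
    support = []
    x = np.zeros(n)
    for _ in range(K):
        best_i, best_res = None, np.inf
        for i in range(n):
            if i in support:
                continue
            x[i] = 1.0  # tentatively activate index i
            res = np.linalg.norm(y - A @ x)
            if res < best_res:
                best_i, best_res = i, res
            x[i] = 0.0  # undo the tentative activation
        support.append(best_i)
        x[best_i] = 1.0  # commit the best index of this iteration
    return sorted(support), x

# Hypothetical noiseless demo: m = 8 measurements, n = 16, K = 3.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 16)) / np.sqrt(8)
x_true = np.zeros(16)
x_true[[2, 7, 11]] = 1.0
y = A @ x_true
est_support, x_hat = greedy_binary_ls(A, y, 3)
print(est_support)
```

Because every coefficient is constrained to the value 1, each candidate can be scored with a single residual norm; this is what distinguishes a binary-aware greedy step from OLS, which must solve a least-squares problem per candidate.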

  • AMS Subject Headings

94A12, 65F22, 65J22

  • Copyright

COPYRIGHT: © Global Science Press

  • Email address
  • BibTex
  • RIS
  • TXT
@Article{JCM-43-493,
  author  = {Wen, Jinming},
  title   = {Binary Least Squares: An Algorithm for Binary Sparse Signal Recovery},
  journal = {Journal of Computational Mathematics},
  year    = {2024},
  volume  = {43},
  number  = {2},
  pages   = {493--514},
  issn    = {1991-7139},
  doi     = {10.4208/jcm.2308-m2023-0044},
  url     = {http://global-sci.org/intro/article_detail/jcm/23547.html}
}
TY  - JOUR
T1  - Binary Least Squares: An Algorithm for Binary Sparse Signal Recovery
AU  - Wen, Jinming
JO  - Journal of Computational Mathematics
VL  - 43
IS  - 2
SP  - 493
EP  - 514
PY  - 2024
DA  - 2024/11
SN  - 1991-7139
DO  - 10.4208/jcm.2308-m2023-0044
UR  - https://global-sci.org/intro/article_detail/jcm/23547.html
KW  - Binary sparse signal
KW  - Precise support reconstruction
KW  - Binary least squares
ER  - 

Wen, Jinming. (2024). Binary Least Squares: An Algorithm for Binary Sparse Signal Recovery. Journal of Computational Mathematics. 43 (2). 493-514. doi:10.4208/jcm.2308-m2023-0044