Volume 24, Issue 1
A Gradient-Enhanced $ℓ_1$ Approach for the Recovery of Sparse Trigonometric Polynomials

Zhiqiang Xu & Tao Zhou

Commun. Comput. Phys., 24 (2018), pp. 286-308.

Published online: 2018-03

  • Abstract

In this paper, we discuss a gradient-enhanced $ℓ_1$ approach for the recovery of sparse Fourier expansions. By $gradient$-$enhanced$ approaches we mean that the directional derivatives along given vectors are utilized to improve the sparse approximations. We first consider the case where both the function values and the directional derivatives at the sampling points are known. We show that, under some mild conditions, the inclusion of the derivative information can indeed decrease the coherence of the measurement matrix, and thus leads to improved sparse recovery conditions for the $ℓ_1$ minimization. We also consider the case where either the function values or the directional derivatives are known at the sampling points, for which we present a sufficient condition under which the measurement matrix satisfies the RIP, provided that the samples are distributed according to the uniform measure. This result shows that the derivative information plays a role similar to that of the function values. Several numerical examples are presented to support the theoretical statements. Potential applications to function (Hermite-type) interpolation and uncertainty quantification are also discussed.
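The idea sketched in the abstract can be illustrated with a small, self-contained example (not the authors' code): a 1-D sparse trigonometric polynomial is sampled at random points, rows built from both function values and derivatives are stacked into a single measurement matrix, and the coefficients are recovered by basis pursuit. For simplicity the sketch uses a real cosine/sine dictionary instead of complex exponentials and solves the $ℓ_1$ problem as a linear program; all sizes (K, s, m) and helper names are illustrative assumptions.

```python
# Illustrative sketch of gradient-enhanced ell_1 recovery (basis pursuit) for a
# 1-D sparse trigonometric polynomial; real cos/sin dictionary, random samples.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
K, s, m = 40, 4, 12                  # max frequency, sparsity, sample points
N = 2 * (K + 1)                      # number of cos/sin coefficients
k = np.arange(K + 1)

def value_rows(t):
    # rows [cos(k t), sin(k t)]: samples of the basis functions
    return np.hstack([np.cos(np.outer(t, k)), np.sin(np.outer(t, k))])

def derivative_rows(t):
    # rows [-k sin(k t), k cos(k t)]: samples of the basis derivatives
    return np.hstack([-k * np.sin(np.outer(t, k)), k * np.cos(np.outer(t, k))])

# sparse ground-truth coefficient vector and random sampling points
c_true = np.zeros(N)
c_true[rng.choice(N, size=s, replace=False)] = rng.standard_normal(s)
t = rng.uniform(0.0, 2 * np.pi, size=m)

# gradient-enhanced measurement matrix: function values AND derivatives
A = np.vstack([value_rows(t), derivative_rows(t)])
b = A @ c_true

# basis pursuit  min ||c||_1  s.t.  A c = b,  as an LP in (c, u) with |c| <= u
cost = np.concatenate([np.zeros(N), np.ones(N)])
A_ub = np.block([[ np.eye(N), -np.eye(N)],
                 [-np.eye(N), -np.eye(N)]])
res = linprog(cost, A_ub=A_ub, b_ub=np.zeros(2 * N),
              A_eq=np.hstack([A, np.zeros((2 * m, N))]), b_eq=b,
              bounds=[(None, None)] * N + [(0, None)] * N)
c_hat = res.x[:N]
print("recovery error:", np.linalg.norm(c_hat - c_true))
```

Dropping the derivative rows recovers the value-only setting, which makes it straightforward to compare the two measurement matrices numerically for a fixed budget of sampling points.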

  • AMS Subject Headings

65D15, 41A10, 41A63

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex
  • RIS
  • TXT
@Article{CiCP-24-286, author = {Zhiqiang Xu and Tao Zhou}, title = {A Gradient-Enhanced $ℓ_1$ Approach for the Recovery of Sparse Trigonometric Polynomials}, journal = {Communications in Computational Physics}, year = {2018}, volume = {24}, number = {1}, pages = {286--308}, abstract = {

In this paper, we discuss a gradient-enhanced $ℓ_1$ approach for the recovery of sparse Fourier expansions. By $gradient$-$enhanced$ approaches we mean that the directional derivatives along given vectors are utilized to improve the sparse approximations. We first consider the case where both the function values and the directional derivatives at the sampling points are known. We show that, under some mild conditions, the inclusion of the derivative information can indeed decrease the coherence of the measurement matrix, and thus leads to improved sparse recovery conditions for the $ℓ_1$ minimization. We also consider the case where either the function values or the directional derivatives are known at the sampling points, for which we present a sufficient condition under which the measurement matrix satisfies the RIP, provided that the samples are distributed according to the uniform measure. This result shows that the derivative information plays a role similar to that of the function values. Several numerical examples are presented to support the theoretical statements. Potential applications to function (Hermite-type) interpolation and uncertainty quantification are also discussed.

}, issn = {1991-7120}, doi = {https://doi.org/10.4208/cicp.OA-2018-0006}, url = {http://global-sci.org/intro/article_detail/cicp/10938.html} }
TY - JOUR T1 - A Gradient-Enhanced $ℓ_1$ Approach for the Recovery of Sparse Trigonometric Polynomials JO - Communications in Computational Physics VL - 24 IS - 1 SP - 286 EP - 308 PY - 2018 DA - 2018/03 SN - 1991-7120 DO - https://doi.org/10.4208/cicp.OA-2018-0006 UR - https://global-sci.org/intro/article_detail/cicp/10938.html KW - Gradient-enhanced $ℓ_1$ minimization, compressed sensing, sparse Fourier expansions, restricted isometry property, mutual incoherence. AB -

In this paper, we discuss a gradient-enhanced $ℓ_1$ approach for the recovery of sparse Fourier expansions. By $gradient$-$enhanced$ approaches we mean that the directional derivatives along given vectors are utilized to improve the sparse approximations. We first consider the case where both the function values and the directional derivatives at the sampling points are known. We show that, under some mild conditions, the inclusion of the derivative information can indeed decrease the coherence of the measurement matrix, and thus leads to improved sparse recovery conditions for the $ℓ_1$ minimization. We also consider the case where either the function values or the directional derivatives are known at the sampling points, for which we present a sufficient condition under which the measurement matrix satisfies the RIP, provided that the samples are distributed according to the uniform measure. This result shows that the derivative information plays a role similar to that of the function values. Several numerical examples are presented to support the theoretical statements. Potential applications to function (Hermite-type) interpolation and uncertainty quantification are also discussed.

Zhiqiang Xu & Tao Zhou. (2018). A Gradient-Enhanced $ℓ_1$ Approach for the Recovery of Sparse Trigonometric Polynomials. Communications in Computational Physics. 24 (1). 286-308. doi:10.4208/cicp.OA-2018-0006