TY - JOUR
T1 - Gradient Type Methods for Linear Hyperspectral Unmixing
AU - Xu, Fangfang
AU - Wang, Yating
AU - Li, Yanyan
AU - Liu, Lu
AU - Tian, Tonghua
JO - CSIAM Transactions on Applied Mathematics
VL - 1
SP - 109
EP - 132
PY - 2022
DA - 2022/03
SN - 3
DO - http://doi.org/10.4208/csiam-am.SO-2021-0001
UR - https://global-sci.org/intro/article_detail/csiam-am/20291.html
KW - Hyperspectral unmixing
KW - minimum volume simplex
KW - linear mixture model
KW - alternating minimization
KW - proximal gradient method
KW - adaptive moments method
KW - stochastic variance reduction strategy
AB -

Hyperspectral unmixing (HU) plays an important role in terrain classification, agricultural monitoring, mineral recognition and quantification, and military surveillance. The standard linear HU model requires each observed vector to be a linear combination of the vertices of a simplex. To account for noise and other perturbations, we relax this linear constraint and penalize its violation in the objective function. The resulting model is solved by a sequence of gradient type steps, each involving a projection onto the simplex constraint. We propose two gradient type algorithms for linear HU, which find the vertices of the minimum volume simplex containing the observed hyperspectral vectors. When the number of given pixels is huge, the computational time and complexity make solving HU efficiently challenging. A key observation is that our objective function is a sum of many similar simple functions, so the computational cost can be reduced by randomly sampling a small portion of the data points at each iteration. Furthermore, a stochastic variance reduction strategy is employed. Preliminary numerical results show that our new algorithms outperform state-of-the-art algorithms on both synthetic and real data.
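The projection onto the simplex constraint mentioned in the abstract admits a closed-form, sort-based computation; a minimal NumPy sketch of this standard routine is below (the function name and example values are illustrative, not taken from the paper):

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of vector v onto the probability simplex
    {x : x >= 0, sum(x) = 1}, using the classical sort-based algorithm."""
    u = np.sort(v)[::-1]                      # sort entries in decreasing order
    css = np.cumsum(u)                        # running sums of the sorted entries
    k = np.arange(1, len(v) + 1)
    # largest index rho with u[rho] + (1 - css[rho]) / (rho + 1) > 0
    rho = np.nonzero(u + (1.0 - css) / k > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)      # shift that makes the result sum to 1
    return np.maximum(v + theta, 0.0)         # clip negatives after shifting

# Illustrative usage: a projected-gradient step would compute
#   x_new = project_simplex(x - step * grad)
print(project_simplex(np.array([2.0, 0.0])))  # -> [1. 0.]
```

In the stochastic variants described above, the gradient in such a step would be estimated from a random subsample of the pixels, optionally with variance reduction.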