Volume 13, Issue 1
Legendre Neural Network for Solving Linear Variable Coefficients Delay Differential-Algebraic Equations with Weak Discontinuities

Hongliang Liu, Jingwen Song, Huini Liu, Jie Xu & Lijuan Li

Adv. Appl. Math. Mech., 13 (2021), pp. 101-118.

Published online: 2020-10

  • Abstract

In this paper, we propose a novel Legendre neural network combined with the extreme learning machine algorithm to solve linear variable-coefficient delay differential-algebraic equations with weak discontinuities. First, the solution interval is divided into subintervals at the weak-discontinuity points. Then, on each subinterval, the Legendre neural network eliminates the hidden layer by expanding the input pattern in Legendre polynomials. Finally, the parameters of the neural network are obtained by training with the extreme learning machine. Numerical examples show that the proposed method effectively handles the numerical difficulties caused by the discontinuities.
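The expansion-plus-least-squares idea summarized in the abstract can be sketched in a few lines. The toy below is a hypothetical illustration, not the paper's method: it replaces the hidden layer with Legendre polynomial features and fits the output weights in a single least-squares step (the ELM step) for the simple test problem $y'(t) = -y(t)$, $y(0) = 1$ on $[0,1]$, whereas the paper treats linear variable-coefficient delay differential-algebraic equations split at weak-discontinuity points.

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical sketch: "Legendre neural network" features + ELM-style training.
# Solve y'(t) = -y(t), y(0) = 1 on [0, 1] by collocation; the problem, the
# number of basis functions, and the collocation grid are illustrative choices.

n_basis = 10                    # number of Legendre basis functions
t = np.linspace(0.0, 1.0, 50)   # collocation points on the (sub)interval
x = 2.0 * t - 1.0               # map [0, 1] onto [-1, 1], the Legendre domain

# Feature matrix P[i, j] = P_j(x_i); dP holds d/dt P_j(x(t)) = 2 P_j'(x)
P = legendre.legvander(x, n_basis - 1)
dP = np.zeros_like(P)
for j in range(n_basis):
    c = np.zeros(n_basis)
    c[j] = 1.0
    dP[:, j] = 2.0 * legendre.legval(x, legendre.legder(c))

# Stack the residual equations (y' + y = 0) with the initial condition
# y(0) = 1, then solve for the output weights in one least-squares step.
A = np.vstack([dP + P, P[:1]])
b = np.concatenate([np.zeros(len(t)), [1.0]])
w, *_ = np.linalg.lstsq(A, b, rcond=None)

y = P @ w
print(np.max(np.abs(y - np.exp(-t))))  # max error vs the exact solution e^{-t}
```

Because the model is linear in the output weights, no iterative training is needed; one `lstsq` call plays the role of the extreme learning machine. For the delay-DAE setting of the paper, this fit would be repeated per subinterval, with the algebraic constraints and the delayed terms entering the stacked system.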

  • Keywords

Convergence, delay differential-algebraic equations, Legendre activation function, neural network.

  • AMS Subject Headings

65L80, 68T07, 68W25

  • Copyright

COPYRIGHT: © Global Science Press

@Article{AAMM-13-101,
  author  = {Liu, Hongliang and Song, Jingwen and Liu, Huini and Xu, Jie and Li, Lijuan},
  title   = {Legendre Neural Network for Solving Linear Variable Coefficients Delay Differential-Algebraic Equations with Weak Discontinuities},
  journal = {Advances in Applied Mathematics and Mechanics},
  year    = {2020},
  volume  = {13},
  number  = {1},
  pages   = {101--118},
  issn    = {2075-1354},
  doi     = {10.4208/aamm.OA-2019-0281},
  url     = {http://global-sci.org/intro/article_detail/aamm/18342.html}
}
Hongliang Liu, Jingwen Song, Huini Liu, Jie Xu & Lijuan Li. (2020). Legendre Neural Network for Solving Linear Variable Coefficients Delay Differential-Algebraic Equations with Weak Discontinuities. Advances in Applied Mathematics and Mechanics. 13 (1). 101-118. doi:10.4208/aamm.OA-2019-0281