ReLU deep neural networks from the hierarchical basis perspective

Juncai He, Lin Li, Jinchao Xu

Research output: Contribution to journal › Article › peer-review

21 Scopus citations

Abstract

We study ReLU deep neural networks (DNNs) by investigating their connections with the hierarchical basis method in finite element methods. First, we show that the approximation schemes of ReLU DNNs for x² and xy are composition versions of the hierarchical basis approximation for these two functions. Based on this fact, we obtain a geometric interpretation and systematic proof for the approximation result of ReLU DNNs for polynomials, which plays an important role in a series of recent exponential approximation results of ReLU DNNs. Through our investigation of connections between ReLU DNNs and the hierarchical basis approximation for x² and xy, we show that ReLU DNNs with this special structure can be applied only to approximate quadratic functions. Furthermore, we obtain a concise representation to explicitly reproduce any linear finite element function on a two-dimensional uniform mesh by using ReLU DNNs with only two hidden layers.
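The composition scheme the abstract refers to can be illustrated by the well-known construction that approximates x² on [0, 1] with repeatedly composed hat (sawtooth) functions, each level of composition corresponding to one level of the hierarchical basis. The sketch below is a standard illustration of this idea, not the paper's exact scheme; the function names and the choice of depth m are assumptions for the example.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Hat function g on [0, 1] expressed with three ReLU units:
    # g(x) = 2x on [0, 1/2] and g(x) = 2(1 - x) on [1/2, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def approx_square(x, m):
    # f_m(x) = x - sum_{s=1}^{m} g^{(s)}(x) / 4^s, where g^{(s)} is the
    # s-fold composition of the hat function. f_m is the piecewise linear
    # interpolant of x^2 on a uniform grid of size 2^{-m}, so the error
    # decays like 4^{-(m+1)}, i.e. exponentially in the network depth.
    out = np.asarray(x, dtype=float).copy()
    g = np.asarray(x, dtype=float)
    for s in range(1, m + 1):
        g = hat(g)          # one more level of composition / refinement
        out = out - g / 4.0**s
    return out

x = np.linspace(0.0, 1.0, 1001)
err = np.max(np.abs(approx_square(x, 6) - x**2))
```

With m = 6 levels the sampled error is below 4⁻⁷ ≈ 6.1e-5, consistent with the exponential approximation rates discussed in the abstract.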
Original language: English (US)
Pages (from-to): 105-114
Number of pages: 10
Journal: Computers and Mathematics with Applications
Volume: 120
DOIs
State: Published - Aug 15 2022
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2023-02-15

ASJC Scopus subject areas

  • Modeling and Simulation
  • Computational Theory and Mathematics
  • Computational Mathematics

