Sobolev Spaces

Sharp Approximation Rates for Deep ReLU Neural Networks on Sobolev Spaces

Assistant Professor Jonathan Siegel, Texas A&M University

Nov 22, 15:30 - 17:00

B1 L3 R3119


Abstract: Sobolev spaces are centrally important objects in PDE theory. Consequently, to understand how deep neural networks can be used to numerically solve PDEs, a necessary first step is to determine how efficiently they can approximate Sobolev functions. In this talk we consider this problem for deep ReLU neural networks, which are the most important class of neural networks in practical applications.
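
To make the setting concrete, the following is a minimal sketch (not taken from the talk) of fitting a deep ReLU network to a univariate target of limited smoothness and estimating its L2 approximation error by Monte Carlo. It assumes PyTorch; the network sizes and the target f(x) = |x|^{3/2} are purely illustrative.

import torch
import torch.nn as nn

def relu_net(d_in=1, width=64, depth=4):
    # Fully connected network with 'depth' hidden layers of ReLU units
    layers, w = [], d_in
    for _ in range(depth):
        layers += [nn.Linear(w, width), nn.ReLU()]
        w = width
    layers.append(nn.Linear(w, 1))
    return nn.Sequential(*layers)

# Illustrative target with limited smoothness: its second derivative blows up at 0
target = lambda x: x.abs() ** 1.5

net = relu_net()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    x = 2 * torch.rand(256, 1) - 1                 # uniform samples in [-1, 1]
    loss = ((net(x) - target(x)) ** 2).mean()      # empirical squared L2 error
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    x = 2 * torch.rand(10_000, 1) - 1
    err = ((net(x) - target(x)) ** 2).mean().sqrt()
    print(f"estimated L2 error: {err.item():.4f}")

Repeating such an experiment while varying the width and depth gives an empirical counterpart to the approximation rates discussed in the talk, though the talk's results are proved theoretically rather than measured this way.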

Approximation and Generalization Errors in Deep Neural Networks for Sobolev Spaces measured by Sobolev Norms

Dr. Yahong Yang, Department of Mathematics, Penn State University

Feb 14, 16:00 - 17:00

KAUST


Abstract: In this presentation, we first discuss the approximation capabilities of deep neural networks (DNNs) with ReLU and the square of ReLU as activation functions for Sobolev functions, measured in the Sobolev norms \(W^{m,p}\) with \(m \ge 1\). We then consider how to address the curse of dimensionality in DNN approximation. Finally, we analyze the generalization errors associated with DNNs trained with such Sobolev loss functions. Additionally, we provide recommendations on when to opt for deeper versus wider networks, considering factors such as the number of samples.
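
As an illustration of what a Sobolev-type loss can look like in practice (again not taken from the talk), the sketch below assumes PyTorch and computes an empirical \(W^{1,2}\) loss: the mean-squared error of the network's values plus the mean-squared error of its gradient, with the gradient obtained by automatic differentiation. The names model, f_vals, and grad_f_vals are placeholders.

import torch

def sobolev_w12_loss(model, x, f_vals, grad_f_vals):
    # Empirical W^{1,2} (H^1) loss: MSE of function values plus MSE of gradients.
    # f_vals has shape (N, 1), grad_f_vals has shape (N, d).
    x = x.clone().requires_grad_(True)
    u = model(x)                                                   # shape (N, 1)
    # Gradient of the network output with respect to the inputs;
    # create_graph=True keeps this term differentiable w.r.t. the parameters.
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]    # shape (N, d)
    return ((u - f_vals) ** 2).mean() + ((du - grad_f_vals) ** 2).mean()

Because the loss penalizes errors in derivatives as well as values, it is the natural empirical counterpart of measuring approximation error in a Sobolev norm rather than in L2 alone.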
