On Kolmogorov's Superposition Theorem and Its Applications
Reviews

(3.00 Goodreads rating)

Description

We present a Regularization Network approach based on Kolmogorov's superposition theorem (KST) to reconstruct higher-dimensional continuous functions from their values on discrete data points. The ansatz is based on a new constructive proof of a version of the theorem. Additionally, the thesis gives a comprehensive overview of the various versions of KST and of their relation to well-known approximation schemes and neural networks. The efficient representation of higher-dimensional continuous functions as superpositions of univariate continuous functions suggests the conjecture that the exponential dependence of the numerical cost of a reconstruction on the dimensionality, the so-called curse of dimensionality, can be circumvented. However, this is not the case, since the univariate functions involved are either unknown or not smooth. Therefore, we develop a Regularization Network approach in a reproducing kernel Hilbert space setting such that the restriction of the underlying approximation spaces defines a nonlinear model for function reconstruction. Finally, the model is verified and analysed on various numerical examples.
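For context, KST states that every continuous f on [0,1]^n can be written as f(x_1, ..., x_n) = sum_{q=0}^{2n} Phi_q( sum_{p=1}^{n} psi_{q,p}(x_p) ) with continuous univariate outer functions Phi_q and inner functions psi_{q,p}. The Regularization Network reconstruction mentioned in the abstract can be sketched in its generic kernel-ridge form in an RKHS; this is not the thesis's specific KST-restricted algorithm, and the Gaussian kernel, the toy target function, and all parameter values below are illustrative assumptions:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=0.3):
    # Gaussian (RBF) kernel matrix from pairwise squared distances
    # between the rows of X and the rows of Y.
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_regularization_network(X, y, lam=1e-6, sigma=0.3):
    # Regularization Network / kernel ridge regression in the RKHS of the
    # kernel: solve (K + lam * n * I) c = y for the coefficient vector c,
    # which minimizes the regularized empirical risk.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(X_train, c, X_new, sigma=0.3):
    # Evaluate the kernel expansion f(x) = sum_i c_i k(x, x_i).
    return gaussian_kernel(X_new, X_train, sigma) @ c

# Reconstruct a smooth 2-D function from scattered samples (toy example).
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 2))
y = np.sin(2 * np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])

c = fit_regularization_network(X, y)
X_test = rng.uniform(size=(50, 2))
y_test = np.sin(2 * np.pi * X_test[:, 0]) * np.cos(np.pi * X_test[:, 1])
err = np.max(np.abs(predict(X, c, X_test) - y_test))
```

In this generic form the regularization parameter lam trades data fit against the RKHS norm of the reconstruction; the thesis instead restricts the underlying approximation spaces to KST-type superpositions of univariate functions, which is what makes its model nonlinear.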

