Noisy nonlinear information and entropy numbers

David Krieg, Erich Novak, Leszek Plaskota, Mario Ullrich

Research output: Working paper and reports › Preprint

Abstract

It is impossible to recover a vector from $\mathbb{R}^m$ with fewer than $m$ linear measurements, even if the measurements are chosen adaptively. Recently, it has been shown that one can recover vectors from $\mathbb{R}^m$ with arbitrary precision using only $O(\log m)$ continuous (even Lipschitz) adaptive measurements, resulting in an exponential speed-up of continuous information over linear information for various approximation problems. In this note, we characterize the quality of optimal (dis-)continuous information that is disturbed by deterministic noise in terms of entropy numbers. This shows that in the presence of noise the potential gain of continuous over linear measurements is limited, yet still significant in some cases.
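The lower bound stated at the start of the abstract rests on a simple linear-algebra fact: any $k < m$ linear functionals on $\mathbb{R}^m$ have a nontrivial common null space, so two distinct vectors can produce identical measurements. A minimal numerical sketch of this argument (dimensions and the random measurement matrix are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
m, k = 5, 3                      # ambient dimension m, number of measurements k < m
A = rng.standard_normal((k, m))  # k linear functionals, stacked as rows

# Since k < m, A has a nontrivial null space; take a unit vector h in it.
_, _, Vt = np.linalg.svd(A)      # full SVD: last m - k rows of Vt span null(A)
h = Vt[-1]

x = rng.standard_normal(m)
y = x + h                        # a different vector with identical measurements

assert np.allclose(A @ x, A @ y)       # same information ...
assert np.linalg.norm(x - y) > 0.5     # ... yet the vectors are far apart
print("measurement gap:", np.linalg.norm(A @ x - A @ y))
print("vector distance:", np.linalg.norm(x - y))
```

Adaptivity does not help here: whichever functionals are chosen along the way, after $k < m$ of them a null-space direction remains, so no recovery map can distinguish `x` from `y`.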
Original language: English
Number of pages: 15
DOIs
Publication status: Published - 27 Oct 2025

Publication series

Name: arXiv.org
No.: 2510.23213

Fields of science

  • 101032 Functional analysis
  • 102 Computer Sciences

JKU Focus areas

  • Digital Transformation
