
The first notion of continuity encountered in a math class is usually the one based on metric spaces, in particular the $\epsilon, \delta$ definition of continuity.

But in topology, a more general notion of continuity is defined.
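For reference, here are the two definitions side by side (standard statements, included only for concreteness):

```latex
% Metric-space continuity of f : (X, d_X) -> (Y, d_Y) at a point x:
\forall \epsilon > 0 \;\exists \delta > 0 :\quad
    d_X(x, x') < \delta \implies d_Y\big(f(x), f(x')\big) < \epsilon

% Topological continuity (no metric required):
f : X \to Y \text{ is continuous} \iff
    f^{-1}(U) \text{ is open in } X \text{ for every open } U \subseteq Y
```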

In machine learning and computational learning theory, I've encountered $\epsilon, \delta$-based results (e.g., in PAC learning and in function approximation theorems for neural nets).

Are there "topological" style theorems about machine learning/computational learning theory that don't make use of $\epsilon$'s and $\delta$'s?

user56834

1 Answer


> Are there "topological" style theorems about machine learning/computational learning theory that don't make use of $\epsilon$'s and $\delta$'s?

I think it's worth mentioning why most ML results use "$\epsilon$'s and $\delta$'s": the field is grounded in statistics, and if you actually want to use a statistical result for anything, you need guarantees that hold for finite samples, not just statements about limiting behavior.
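For concreteness, here is the shape such finite-sample guarantees take. This is the textbook realizable-case PAC bound for a finite hypothesis class $H$, stated only as a reference point:

```latex
% With probability at least 1 - \delta over m i.i.d. samples, empirical
% risk minimization over a finite class H outputs a hypothesis with true
% error at most \epsilon, provided
m \;\geq\; \frac{1}{\epsilon} \left( \ln|H| + \ln\frac{1}{\delta} \right)
```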

I recall one theorem that may actually look like what you've described: On Fiber Diameters of Continuous Maps uses the Borsuk–Ulam theorem to establish that every dimensionality reduction method necessarily 'glues together' arbitrarily distant points.
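To make the 'gluing' concrete in the simplest possible case, here is a minimal numerical sketch (my own illustration, not code from the paper): for a linear map $\mathbb{R}^3 \to \mathbb{R}^2$, any unit vector in the kernel and its antipode are maximally far apart on the sphere $S^2$, yet they have the same image.

```python
# Minimal sketch (my own, not from the paper): exhibit the Borsuk-Ulam
# style "gluing" for a random *linear* map R^3 -> R^2.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))   # a continuous (linear) map R^3 -> R^2

# A unit vector spanning the kernel of A: the last right-singular vector.
_, _, Vt = np.linalg.svd(A)
v = Vt[-1]                        # A @ v is (numerically) zero

# v and -v are antipodal points on S^2 ...
print("distance on the sphere:  ", np.linalg.norm(v - (-v)))          # 2.0
# ... yet the map sends them to (numerically) the same point in R^2.
print("distance of their images:", np.linalg.norm(A @ v - A @ (-v)))  # ~0
```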

If on the other hand you're asking about interesting topological ideas in data science/ML, then topological data analysis is the area to look at; persistent homology and the Mapper algorithm are the usual entry points.

I think you would also find the Topological and Geometric Data Reduction and Visualization course interesting.
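If you want a hands-on feel for the simplest of these ideas, below is a self-contained sketch of 0-dimensional persistence: counting connected components of a point cloud as the connection radius grows. It uses only numpy/scipy; dedicated TDA libraries such as gudhi or ripser compute full persistence diagrams.

```python
# Hand-rolled sketch of 0-dimensional persistence: track how connected
# components of a point cloud merge as the connection radius increases.
import numpy as np
from scipy.sparse.csgraph import connected_components
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(1)
# Two well-separated Gaussian blobs in the plane.
points = np.vstack([rng.normal(0, 0.3, (30, 2)),
                    rng.normal(5, 0.3, (30, 2))])

dists = squareform(pdist(points))
for radius in [0.2, 1.0, 8.0]:
    graph = dists < radius        # connect points closer than `radius`
    n_components, _ = connected_components(graph, directed=False)
    print(f"radius {radius:4.1f}: {n_components} connected components")
```

The feature that persists over a wide range of radii (two components) reflects the two blobs; components that appear and vanish quickly are noise. Persistent homology generalizes this bookkeeping to loops and higher-dimensional holes.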

Jakub Bartczuk