The first notion of continuity one meets in a math class is usually the one based on metric spaces, i.e. the $\epsilon,\delta$ definition of continuity.
In topology, however, a more general notion of continuity is defined purely in terms of open sets, with no mention of distances.
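For concreteness, the two definitions I have in mind are the metric-space one,

$$\forall \epsilon > 0 \;\exists \delta > 0 : \; d_X(x, x_0) < \delta \implies d_Y(f(x), f(x_0)) < \epsilon,$$

versus the purely topological one: $f : X \to Y$ is continuous iff $f^{-1}(U)$ is open in $X$ for every open $U \subseteq Y$.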
In machine learning and computational learning theory, I've encountered $\epsilon,\delta$-based results (e.g. in PAC learning, and in function approximation theorems for neural nets).
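To illustrate the flavor I mean: PAC learnability of a hypothesis class $\mathcal{H}$ is (roughly) the statement that for all $\epsilon, \delta \in (0,1)$ there is a sample size $m(\epsilon, \delta)$ such that

$$\Pr_{S \sim \mathcal{D}^m}\big[\,\mathrm{err}_{\mathcal{D}}(A(S)) \le \epsilon\,\big] \ge 1 - \delta \quad \text{for all } m \ge m(\epsilon, \delta),$$

which quantifies over $\epsilon$'s and $\delta$'s in exactly the metric-space style.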
Are there "topological"-style theorems in machine learning/computational learning theory that don't make use of $\epsilon$'s and $\delta$'s?