Recall that in meta-learning we have a meta-set, which is a data-set of data-sets:
$$ D_{\text{meta-set}} = \{ D_n \}_{n=1}^{N} $$
where $D_n$ is a data-set (usually corresponding to a task), e.g. data sampled from a target function for regression, or a sampled set of classes for a classification task. Each individual data-set $D_n$ is usually split into a support set (train set) and a query set (test set).
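To make the setup concrete, here is a minimal toy sketch of what I mean by a meta-set; the sine-regression family and all names (`make_meta_set`, the support/query sizes) are just illustrative, not from any particular library:

```python
import math
import random

def make_meta_set(num_tasks, support_size=5, query_size=15):
    """Build a toy meta-set: a list of tasks, each split into support/query."""
    meta_set = []
    for _ in range(num_tasks):
        # Each task D_n: data sampled from a task-specific target function
        # y = a * sin(x + b) (toy regression example).
        a, b = random.uniform(0.1, 5.0), random.uniform(0.0, math.pi)
        xs = [random.uniform(-5.0, 5.0) for _ in range(support_size + query_size)]
        data = [(x, a * math.sin(x + b)) for x in xs]
        meta_set.append({
            "support": data[:support_size],  # used to adapt the learner
            "query": data[support_size:],    # used to evaluate the adapted learner
        })
    return meta_set
```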
I've seen the term *episode* used in meta-learning, but its meaning is not clear to me. There are two possible definitions:
- 1 episode means sampling 1 single data-set $D_n$ (i.e. a single task)
- 1 episode means sampling M data-sets, i.e. a batch (meta-batch) of tasks
Which one is it? I've sketched both readings below to make the question concrete.
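Here is a minimal sketch of the two readings; `meta_set` is assumed to be a list of tasks (e.g. as built in the sketch above) and the function names are purely illustrative:

```python
import random

def sample_episode_as_single_task(meta_set):
    # Reading 1: an episode is one sampled data-set/task D_n,
    # i.e. a single support/query pair.
    return random.choice(meta_set)

def sample_episode_as_task_batch(meta_set, meta_batch_size=4):
    # Reading 2: an episode is a meta-batch of M tasks sampled at once.
    return random.sample(meta_set, meta_batch_size)
```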
References:
- https://github.com/tristandeleu/pytorch-meta/issues/78
- https://www.quora.com/unanswered/What-does-the-term-episode-mean-in-meta-learning
- https://www.reddit.com/r/MLQuestions/comments/hve478/what_does_the_term_episode_mean_in_metalearning/?
- https://discuss.pytorch.org/t/what-does-the-term-episode-mean-in-meta-learning/90051