
I came across this warning in the higher PyTorch library:

    # Crucially in our testing procedure here, we do *not* fine-tune
    # the model during testing for simplicity.
    # Most research papers using MAML for this task do an extra
    # stage of fine-tuning here that should be added if you are
    # adapting this code for research.

which I found confusing. Does anyone know what this means?


Thoughts:

Since MAML already has its own fine-tuning step (its adaptation step), I'm unsure what this means.

Does this mean that people update the meta-learned weights (the slow weights) again on the validation set? Wouldn't that effectively just be making the training data set larger? Or does it mean something else?
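
To clarify what I mean by the adaptation step: below is a minimal sketch of how I understand test-time evaluation with higher. The model, data, learning rate, and number of inner steps are just toy stand-ins, not the library's actual test code; the point is that only the fast weights inside the context get adapted on the support set, while the slow (meta-learned) weights in model stay fixed.

    import torch
    import torch.nn as nn
    import higher

    model = nn.Linear(10, 5)  # stands in for a meta-trained model
    inner_opt = torch.optim.SGD(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()

    # Random stand-ins for one test task's support and query sets.
    support_x, support_y = torch.randn(25, 10), torch.randint(0, 5, (25,))
    query_x, query_y = torch.randn(25, 10), torch.randint(0, 5, (25,))

    # Test-time adaptation: inner-loop steps on the support set only.
    # higher builds a functional copy (fmodel) whose fast weights are
    # updated; the slow weights in `model` are left untouched.
    with higher.innerloop_ctx(model, inner_opt, track_higher_grads=False) as (fmodel, diffopt):
        for _ in range(5):  # number of adaptation steps is arbitrary here
            diffopt.step(loss_fn(fmodel(support_x), support_y))
        # Evaluate the adapted fast weights on the query set.
        with torch.no_grad():
            query_acc = (fmodel(query_x).argmax(dim=1) == query_y).float().mean()
    print(query_acc.item())
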


Original question on the GitHub repo: https://github.com/facebookresearch/higher/issues/120

