
I am watching a video on Thompson Sampling for machine learning, and at [16:38] the presenter mentions that we can use various models to fit the data (such as a Gaussian process or an SVM, in my opinion), but he used a neural network because it doesn't imply any structure on the model, so it can go and discover whatever is there.

What does it mean that a neural network doesn't imply any structure? What is "structure" here?

GENIVI-LEARNER
    Negative log-likelihood based optimization has a big assumption attached. Unless he is learning a loss function implicitly with semi-supervised learning, then Neural Networks aren't as free as is implied. Forests in general are way more assumption free in my opinion. – Firebug Dec 25 '19 at 22:40
  • So negative log-likelihood based optimization imposes a constraint on the learning process and hence can't be considered model-free? – GENIVI-LEARNER Dec 26 '19 at 11:32
  • Yes, you must assume a distribution to perform likelihood-based estimation. Most (not all) neural networks do that. – Firebug Dec 26 '19 at 15:49
  • @Firebug Oh, I see, so in general, if we are using likelihood-based estimation, we are "implying" a distribution, right? – GENIVI-LEARNER Dec 26 '19 at 21:44

1 Answer


Structure in this context refers to a predetermined model, or to assumptions about that model or about the underlying sample or population distributions. For example, when we use a linear regression model to fit the data, we assume that the assumptions of linear regression hold for our data. When we use a random forest, the assumptions behind random forests need to hold. The structure of the model is also what makes it tractable. A neural network, on the other hand, has no predetermined structure or assumptions about the underlying distribution of the data. The downside is that the resulting model is not tractable and cannot be interpreted easily.
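To make the comment-thread point concrete (that the objective function itself implies a distributional assumption), here is a minimal sketch showing that, for a fixed noise scale, minimizing mean squared error and minimizing the Gaussian negative log-likelihood pick out the same parameters. The toy linear data and the grid search are my own illustrative choices, not from the video.

```python
import numpy as np

# Toy data: y = 2x + 1 plus Gaussian noise (illustrative, not from the video)
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=200)

def mse(w, b):
    """Mean squared error of the line w*x + b."""
    return np.mean((y - (w * x + b)) ** 2)

def gaussian_nll(w, b, sigma=0.5):
    """Gaussian negative log-likelihood with fixed noise scale sigma.
    Up to additive/multiplicative constants this is the MSE, which is
    exactly the 'implied' Gaussian assumption behind squared-error loss."""
    resid = y - (w * x + b)
    return np.sum(0.5 * np.log(2 * np.pi * sigma**2)
                  + resid**2 / (2 * sigma**2))

# Grid-search the slope under both objectives (intercept fixed at 1.0)
ws = np.linspace(1.0, 3.0, 201)
best_mse = min(ws, key=lambda w: mse(w, 1.0))
best_nll = min(ws, key=lambda w: gaussian_nll(w, 1.0))
print(best_mse == best_nll)  # same minimizer for both objectives
```

Because the Gaussian NLL is a monotone (affine) function of the MSE when sigma is fixed, the two objectives are minimized at the same point; choosing squared error is choosing a Gaussian noise model whether or not you say so explicitly.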

tomanizer
  • So the resulting model cannot be analytically represented, like it can in the case of the others, right? – GENIVI-LEARNER Dec 25 '19 at 21:59
  • Yes, it cannot be represented analytically. The neural network would be the weighted connections between your perceptron layers, but they do not have any meaning outside the model. – tomanizer Dec 25 '19 at 22:05
  • What assumptions are there in random forests? They are just as liberal as, if not more liberal than, the ones in neural networks. And neural networks do not lack structure; in their most successful applications they make structural assumptions on the data as well. – Firebug Dec 25 '19 at 22:42
  • You are correct. Random forests were a poorly chosen example, since they make no assumptions about the structure. There are some ongoing discussions about the effect of multicollinearity on random forests on this site and elsewhere. In any case, I should have chosen a model with strict assumptions as an example. – tomanizer Dec 25 '19 at 22:50
  • @Firebug So the structure of a neural network is basically the number of hidden layers and their depth, right? – GENIVI-LEARNER Dec 26 '19 at 11:35
  • @GENIVI-LEARNER No, even the objective function imposes an assumption on the distribution of the outputs. Also, neural networks are so diverse that it's really hard to pinpoint what they even are. See [this question](https://stats.stackexchange.com/questions/362425/what-is-an-artificial-neural-network) – Firebug Dec 26 '19 at 15:48
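As a hedged illustration of that last comment (the choice of objective encodes a distributional assumption about the outputs), this sketch shows that squared-error loss is minimized at the mean (the Gaussian assumption) while absolute-error loss is minimized at the median (the Laplace assumption), so the two "model-free" losses learn different things from the same skewed data. The numbers are made up for the demonstration.

```python
import numpy as np

# Skewed "outputs": which summary we learn depends on the loss we pick
y = np.array([1.0, 1.5, 2.0, 2.5, 10.0])

# Evaluate both losses over a grid of candidate predictions
cands = np.linspace(0.0, 10.0, 10001)
sq_loss = ((y[None, :] - cands[:, None]) ** 2).sum(axis=1)
ab_loss = np.abs(y[None, :] - cands[:, None]).sum(axis=1)

# Squared error (Gaussian NLL) is minimized at the mean;
# absolute error (Laplace NLL) is minimized at the median.
print(cands[sq_loss.argmin()], np.mean(y))    # both ~3.4
print(cands[ab_loss.argmin()], np.median(y))  # both 2.0
```

Neither loss is assumption-free: picking one over the other silently commits you to a Gaussian or Laplace view of the output noise, which is exactly the kind of implied structure the comments are discussing.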