
Suppose a single perceptron is set up to work like logistic regression.

How correct is it to say that I have made a perceptron work as logistic regression?

The question came to mind because the activation function of a perceptron is a step function and learning is not done by back-propagation. https://en.wikipedia.org/wiki/Perceptron

Also, if the activation function is changed and back-propagation is used, is it still a perceptron?

Ajey

1 Answer


It is fair to say that logistic regression is a single perceptron: logistic regression is a specific case of a perceptron in which the step activation is replaced by a sigmoid. In fact, in Andrew Ng's deep learning specialization, the intuition for a perceptron is built up by starting from logistic regression.

Back-propagation helps with scaling up when you combine several logistic-regression-like units in series (several hidden layers) and in parallel (multiple units within a single hidden layer). For a single unit, back-propagation reduces to gradient descent for logistic regression, as in the sketch below.
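A minimal sketch (assuming NumPy and a toy dataset I made up for illustration): a single "perceptron" whose step activation is swapped for a sigmoid and which is trained by gradient descent on the cross-entropy loss is exactly logistic regression.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_unit(X, y, lr=0.1, epochs=1000):
    """Train a single sigmoid unit sigmoid(w.x + b) by gradient descent
    on the binary cross-entropy loss, i.e. logistic regression."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)           # forward pass: predicted probabilities
        grad = p - y                      # d(cross-entropy)/d(logit)
        w -= lr * (X.T @ grad) / n_samples
        b -= lr * grad.mean()
    return w, b

# Toy usage: two linearly separable clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])
w, b = train_logistic_unit(X, y)
preds = (sigmoid(X @ w + b) > 0.5).astype(int)
print("training accuracy:", (preds == y).mean())
```

With the step activation and the classic perceptron update rule instead, the same unit would be a plain perceptron; the difference is only the activation and the loss being minimized.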


whisperer