
Not a duplicate: the linked question does not answer this one. A measure of similarity should be maximal for instances that are identical; for example, the similarity between $(1,1)$ and $(1,1)$ should be higher than the similarity between $(1,1)$ and $(1,2)$. The dot product does not satisfy this, so it does not seem to be a measure of similarity. Where am I wrong?
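
To make that claim concrete, this is the arithmetic I have in mind (the same numbers come up again in the comments below):

$$\langle (1,1),\,(1,1)\rangle = 2 \;<\; 3 = \langle (1,1),\,(1,2)\rangle,$$

so the dot product of $(1,1)$ with a *different* vector exceeds the dot product of $(1,1)$ with itself.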

All explanations of the kernel trick I have read take it for granted that the dot product is a good measure of similarity between instances. I do know that the dot product is zero if the vectors are orthogonal.

However, in the image below, my intuitive understanding is that $x_1$ and $x_3$ are quite similar, while $x_1$ and $x_2$ are very different. Yet the dot product between $x_1$ and $x_2$ is large, while the dot product between $x_1$ and $x_3$ is much smaller.

Take for example $x_1 = (1, 1)^T$, $x_2 = (20, 20)^T$, $x_3 = (1, 0.9)^T$. In what sense are $x_1$ and $x_2$ more similar than $x_1$ and $x_3$? Why does the angle matter so much in practice?

[Figure: plot of the example vectors $x_1$, $x_2$, and $x_3$]
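
As a quick numeric check (a minimal sketch in Python with NumPy, using the vectors from my example; `cosine_similarity` is just a helper defined here, not a library function):

```python
import numpy as np

# The three example vectors from the question
x1 = np.array([1.0, 1.0])
x2 = np.array([20.0, 20.0])
x3 = np.array([1.0, 0.9])

def cosine_similarity(a, b):
    """Dot product after normalising both vectors to unit length."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Raw dot products: x1.x2 is much larger than x1.x3 (and even than x1.x1)
print(x1 @ x2)  # 40.0
print(x1 @ x3)  # 1.9
print(x1 @ x1)  # 2.0

# Cosine similarity discards length: x1 and x2 point in exactly the same direction
print(cosine_similarity(x1, x2))  # 1.0 (up to floating point)
print(cosine_similarity(x1, x3))  # ~0.9986
```

Note that even the normalised (cosine) version ranks $x_2$ as marginally more similar to $x_1$ than $x_3$ is, since $x_1$ and $x_2$ point in exactly the same direction; only the length information is discarded.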

PascalIv
  • See also https://math.stackexchange.com/questions/689022/how-does-the-dot-product-determine-similarity – Tim Sep 19 '19 at 10:32
  • I know that cosine similarity is a measure of similarity for vectors. But the dot product lacks normalization, so it does not make sense to me to say "it is a measure of similarity". Why is (1,1) more similar to (1,2) than to itself? – PascalIv Sep 19 '19 at 10:41
  • A measure of similarity should be maximal for instances which are the same. This is not the case. Therefore the dot product is not a measure of similarity. Where am I wrong? – PascalIv Sep 19 '19 at 11:02
  • Re your edit: you seem to have a different concept of "same" than that implied by the use of the dot product, but you haven't articulated your concept. – whuber Sep 19 '19 at 14:54
  • Do we agree that (1,1) is the same as (1,1), because it is equal? Therefore (1,1) should have the maximal similarity to (1,1). But the dot product between (1,1) and (1,2) is higher than between (1,1) and (1,1), although the vectors intuitively have less similarity. – PascalIv Sep 19 '19 at 15:30
  • I agree that "equal" implies "same," but the converse often is not true. The use of cosine similarity, for instance, implies that *length* is immaterial. It's important to distinguish what it means for two objects you are modeling to be equal from what it means for their *mathematical representations* to be equal. In particular, there is no universal or general meaning to "intuitive" in this context: it depends on the application. – whuber Sep 19 '19 at 17:08
  • @whuber Those are great points, thanks. I cannot think of any application, though, where the length of an instance vector in my feature space is immaterial. If I, for example, have a feature "house price" and two instances differ only in the "house price", why are they considered "equal objects" (= their mathematical representations differ only in length, not angle)? – PascalIv Sep 19 '19 at 17:40
