
I would like to use KDE to predict the future positions of surrounding vehicles. Given a data set [I: some input features, O: future position], I learn the joint distribution of the input and output with KDE. Then I compute the distribution conditioned on the input feature vector and obtain an estimate of the position from this conditional distribution.

Has anyone done this before who can share their implementation? The challenging part for me is how to compute the conditional distribution after estimating the joint distribution with KDE. Thanks for any help.
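One way to sketch this (not the asker's implementation, just a minimal illustration): with a product Gaussian kernel, conditioning the joint KDE on the input x gives a mixture over y whose weights are the normalized kernel values at x, and the conditional mean is a weighted average of the training outputs (the Nadaraya-Watson estimator). The function name, the single shared bandwidth `h`, and the isotropic kernel are simplifying assumptions.

```python
import numpy as np

def conditional_kde_mean(X, Y, x_query, h):
    """Nadaraya-Watson estimate of E[Y | X = x_query].

    For a joint KDE with a product Gaussian kernel, p(y | x) is a
    mixture of Gaussians centered at the training outputs y_i, with
    weights proportional to the x-kernel values; its mean is the
    kernel-weighted average of the y_i.

    X: (n, d) input features; Y: (n, k) future positions;
    x_query: (d,) query input; h: input-kernel bandwidth (assumed
    shared across all input dimensions for simplicity).
    """
    d2 = np.sum((X - x_query) ** 2, axis=1)   # squared distances in x
    logw = -0.5 * d2 / h**2                   # log Gaussian kernel values
    logw -= logw.max()                        # for numerical stability
    w = np.exp(logw)
    w /= w.sum()                              # normalized mixture weights
    return w @ Y                              # conditional mean of y
```

With a small bandwidth the estimate follows the nearest training examples; with a large one it shrinks toward the overall mean of Y, so the bandwidth plays the role that mixture-component covariances play in a GMM.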

  • Why do you want to use it like this? Usually learning the joint distribution is a harder task than using the conditional distribution, so the usual solution would be to use some kind of regression to learn the conditional expectation and use it to make the predictions. – Tim Apr 27 '21 at 09:24
  • Thank you for the comment! I did this with a Gaussian Mixture Model (I fitted a joint distribution and then conditioned it to make predictions), so I just want to do the analogous thing with a KDE. Could you be more specific about the regression to learn the conditional expectation? Some keywords? – zouloulou Apr 27 '21 at 10:11
  • KDE with a Gaussian kernel is the same as a Gaussian Mixture Model with one mixing component per data point and equal mixing proportions, so if you have code for conditioning a GMM, it would work the same for KDE. As for regression, regression algorithms do learn conditional expectations: https://stats.stackexchange.com/questions/173660/definition-and-delimitation-of-regression-model/211229#211229 – Tim Apr 27 '21 at 10:16
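The GMM equivalence in the comment above can be sketched as follows (a hypothetical illustration, not code from the thread): treat each training pair as one mixture component with equal prior weight, condition on x exactly as for a GMM, and draw samples from the resulting conditional. The function name and the separate bandwidths `h_x`, `h_y` are assumptions for the sketch.

```python
import numpy as np

def sample_conditional_kde(X, Y, x_query, h_x, h_y, n_samples=1000, seed=0):
    """Sample from p(y | x) of a product-Gaussian-kernel joint KDE.

    Conditioning works exactly as for a GMM with equal mixing
    proportions: the conditional over y is a mixture with components
    N(y_i, h_y^2 I) and weights proportional to the x-kernel values
    at x_query.

    X: (n, d) inputs; Y: (n, k) outputs; h_x, h_y: bandwidths for
    the input and output kernels (assumed isotropic).
    """
    rng = np.random.default_rng(seed)
    d2 = np.sum((X - x_query) ** 2, axis=1)
    logw = -0.5 * d2 / h_x**2
    logw -= logw.max()                        # numerical stability
    w = np.exp(logw)
    w /= w.sum()                              # posterior component weights
    idx = rng.choice(len(X), size=n_samples, p=w)   # pick components
    # Each component is a Gaussian centered at its training output.
    return Y[idx] + h_y * rng.standard_normal((n_samples, Y.shape[1]))
```

Sampling (rather than just taking the conditional mean) keeps the multimodality of the predictive distribution, which matters for vehicle prediction when several distinct maneuvers are plausible for the same input features.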

0 Answers