Let X=(x1,x2,x3,x4,x5) be the input and let Y=(y1,y2) be the target variable. A generative model learns the joint probability distribution P(X,Y)=P(x1,x2,x3,x4,x5,y1,y2).
Now think of this P(X,Y) as a table with one column per variable and one extra column giving the probability of that particular configuration of values. As you know, a generative model describes how likely the label (y1,y2) is to have generated the data (x1,x2,x3,x4,x5). Now consider this: if my data is (x1,x2,x3,x4,x5,x6,x7), then I can learn a joint probability distribution over it and use Bayes' theorem to fill in any missing values, such as (x3,x7), as P(x3,x7 | x1,x2,x4,x5,x6).
These missing values can be your target variables too.
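Here is a minimal sketch of that table view, shrunk to three binary variables (two inputs x1, x2 and one label y) so the whole joint fits on screen; the probabilities and the `conditional` helper are purely illustrative, not taken from any real model:

```python
# The joint distribution as a literal table: one row per configuration
# (x1, x2, y) -> P(x1, x2, y). All numbers are made up for illustration.
joint = {
    (0, 0, 0): 0.20, (0, 0, 1): 0.05,
    (0, 1, 0): 0.10, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (1, 0, 1): 0.20,
    (1, 1, 0): 0.05, (1, 1, 1): 0.20,
}

def conditional(missing_index, observed):
    """P(missing variable | observed variables) via Bayes / marginalization.

    observed: dict mapping variable index -> observed value.
    Returns P(missing = v | observed) for v in {0, 1}.
    """
    # Sum the joint over all rows consistent with the observations.
    scores = {0: 0.0, 1: 0.0}
    for config, p in joint.items():
        if all(config[i] == v for i, v in observed.items()):
            scores[config[missing_index]] += p
    evidence = sum(scores.values())  # P(observed)
    return {v: s / evidence for v, s in scores.items()}

# "Fill in" the missing label y (index 2) given the observed inputs:
print(conditional(2, {0: 1, 1: 0}))  # P(y | x1=1, x2=0) -> {0: 0.2, 1: 0.8}
```

Nothing in `conditional` cares whether the missing variable is a label or an input; it just marginalizes the table, which is exactly the point made next.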
You can call them target variables or whatever you like. The point is that a generative model defines a joint probability distribution over variables, regardless of what names you give them. Since an RBM defines a joint probability distribution over the input variables alone, that is, just the data and no labels, it is unsupervised learning.
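To make that concrete, here is a minimal sketch of a toy RBM whose joint P(v,h) is proportional to exp(-E(v,h)), normalized by brute-force enumeration; the layer sizes and random weights are arbitrary assumptions for illustration, and the enumeration only works at toy scale. Notice there is no label variable anywhere in it:

```python
# A toy RBM: the model's variables are visible units v (the data) and
# hidden units h. No y column exists, hence unsupervised learning.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 3, 2
W = rng.normal(scale=0.5, size=(n_visible, n_hidden))  # visible-hidden weights
a = rng.normal(scale=0.5, size=n_visible)              # visible biases
b = rng.normal(scale=0.5, size=n_hidden)               # hidden biases

def energy(v, h):
    # Standard RBM energy: E(v, h) = -a.v - b.h - v.W.h
    return -(a @ v) - (b @ h) - (v @ W @ h)

# Unnormalized joint P(v, h) ~ exp(-E(v, h)); normalize by enumerating
# every binary configuration (feasible only for tiny dimensions).
configs = [(np.array(v), np.array(h))
           for v in itertools.product([0, 1], repeat=n_visible)
           for h in itertools.product([0, 1], repeat=n_hidden)]
weights = np.array([np.exp(-energy(v, h)) for v, h in configs])
Z = weights.sum()  # partition function

# Marginal over the data P(v): sum out the hidden units.
p_v = {}
for (v, h), w in zip(configs, weights):
    p_v[tuple(v)] = p_v.get(tuple(v), 0.0) + w / Z
print(p_v)  # distribution over all 2^3 visible configurations, sums to 1
```

Training such a model only ever needs unlabeled samples of v, which is the whole argument in code form.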