Suppose I have a finite set of discrete data whose values are all positive integers (the data remain discrete in every observation/survey), divided into several identical categories. For each category I calculate the mean of some random variable, which gives me a sequence of mean values. Treating the mean as a random variable itself (call it $M$), it can obviously take non-integer (decimal) values as well; for example, the mean of $1, 2, 2$ is $5/3$. But I am not sure whether $M$ is a discrete or a continuous random variable.
My intuition suggests that it should be discrete: the original data from which I calculated the means are discrete and finite, so the mean can take only finitely many values, which I suppose is not the case for a continuous random variable, whose values can be anything in an interval. Is my intuition right or wrong? And is it true that a discrete random variable takes finitely many values, while a continuous random variable can take infinitely many possible values within an interval?
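For what it's worth, here is a minimal sketch in Python that enumerates every achievable mean under made-up numbers (category size $n = 4$ and integer values $1$ through $6$; both are my own assumptions for illustration, not part of the actual data). The support comes out as a finite set of fractions with denominator $n$:

```python
from itertools import product
from fractions import Fraction

n = 4                  # hypothetical category size
values = range(1, 7)   # hypothetical data values: the integers 1..6

# Enumerate every possible sample of size n and collect the distinct means.
means = {Fraction(sum(sample), n) for sample in product(values, repeat=n)}

print(len(means))                           # 21 -- finitely many possible means
print([str(m) for m in sorted(means)[:5]])  # ['1', '5/4', '3/2', '7/4', '2']
```

More generally, if a category holds $n$ observations, their mean is always a multiple of $1/n$, so for bounded integer data only finitely many mean values are possible.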