I have a sample from a large population in which every selected item is either a 1 or a 5. I want to find the probability that the average of a sample is greater than 4.76, and I want to chart that probability as the sample size increases. I assume the probability starts off high and decreases.
I have here the PMF: https://en.wikipedia.org/wiki/Hypergeometric_distribution
The problem is that the PMF is only defined for integer counts, so only certain discrete averages are attainable, and I would like a continuous theoretical function showing that the probability of getting an average of 4.76 or greater decreases as the sample size grows.
For example, with a sample size of 2, you would have to select two fives to get an average of 4.76 or greater, the average in that case being exactly 5. What I would like is the theoretical probability that would result if it were possible to select 1.x fives, i.e. one and a fraction of a five rather than a whole number.
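To make the two-fives example concrete, here is a minimal sketch of the exact discrete calculation. Since the population is large, I am treating the draws as effectively independent, so the hypergeometric count of fives is approximated by a binomial; p = (4.76 - 1) / (5 - 1) = 0.94 is the fraction of fives implied by the population mean of 4.76 noted below. The helper name is just for illustration.

```python
# Sketch only: the large population makes the hypergeometric count of fives
# approximately Binomial(n, p), with p = 0.94 assumed from the mean of 4.76.
from math import comb, ceil

p = 0.94  # fraction of fives implied by a population mean of 4.76

def prob_mean_at_least(n, thresh=4.76):
    """P(sample average >= thresh) for a sample of size n.

    With k fives among n draws the average is 1 + 4*k/n, so the average
    reaches thresh exactly when k >= n * (thresh - 1) / 4.
    """
    k_min = ceil(n * (thresh - 1) / 4)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, n + 1))

print(prob_mean_at_least(2))    # about 0.8836: both draws must be fives
print(prob_mean_at_least(100))  # smaller for a larger sample
```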
The population, however large, is assumed to have a mean of 4.76.
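What I have in mind for the continuous side is something like the normal approximation to the sampling distribution of the mean, centred at 4.76 with standard deviation sigma / sqrt(n), where sigma = 4 * sqrt(p * (1 - p)) is the population standard deviation under the 0.94 assumption above. A rough sketch of charting it next to the exact probability, assuming scipy and matplotlib are available:

```python
# Sketch: exact discrete probability vs. a plain normal approximation for the
# sample mean, charted against sample size. p = 0.94 is assumed from the
# population mean of 4.76; scipy and matplotlib are assumed to be installed.
from math import comb, ceil
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

p = 0.94
sigma = 4 * np.sqrt(p * (1 - p))  # population standard deviation, about 0.95

def exact_prob(n, thresh=4.76):
    """P(sample average >= thresh), counting fives with a Binomial(n, p)."""
    k_min = ceil(n * (thresh - 1) / 4)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, n + 1))

ns = np.arange(2, 201)
exact = [exact_prob(int(n)) for n in ns]
# Normal approximation: sample mean ~ Normal(4.76, sigma / sqrt(n)). Because
# the threshold sits exactly at the mean, this continuous curve is 0.5 at every n.
approx = norm.sf(4.76, loc=4.76, scale=sigma / np.sqrt(ns))

plt.plot(ns, exact, label="exact (integer counts of fives)")
plt.plot(ns, approx, label="normal approximation")
plt.xlabel("sample size n")
plt.ylabel("P(sample average >= 4.76)")
plt.legend()
plt.show()
```

Whether that plain normal curve is the right continuous stand-in for the hypergeometric PMF is the part I am unsure about.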
Am I just overcomplicating this? Would the answer really just be 50% if it could be done that way?