The maximum entropy probability distribution has entropy at least as great as that of every other member of a specified class of probability distributions (pdfs). Does that mean the pdf with maximum entropy can only be found by computing the entropies of, say, pdfs 1, 2, 3, 4, and 5 and comparing them side by side to conclude that pdf 3 has the highest entropy?
Whereas, if you are given only pdf 1 on its own, do you have no basis for saying that it has the maximum entropy within its class?
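
To make concrete what I mean by a side-by-side comparison, here is a minimal sketch, assuming the class in question is continuous distributions supported on [0, 1]; the particular candidate pdfs are illustrative only.

```python
# Side-by-side entropy comparison of a few candidate pdfs on [0, 1].
# Assumption: the "specified class" is continuous distributions with
# support [0, 1]; the candidates below are just examples I picked.
from scipy import stats

candidates = {
    "uniform on [0, 1]":          stats.uniform(loc=0, scale=1),
    "beta(2, 2)":                 stats.beta(2, 2),
    "beta(5, 1)":                 stats.beta(5, 1),
    "truncated normal(0.5, 0.2)": stats.truncnorm(-2.5, 2.5, loc=0.5, scale=0.2),
}

# Differential entropy of each candidate, in nats.
for name, dist in candidates.items():
    print(f"{name:28s} h = {float(dist.entropy()):+.4f}")
```

In this run the uniform comes out on top (0 nats, the others negative), but the comparison only ranks the candidates I happened to list; it says nothing about pdfs outside that list, which is the heart of my question about how the maximum over the whole class can be established.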