I am reading the book "Bayesian Data Analysis", and it repeatedly uses the idea of a prior being "uniform in the log" of a parameter, which I don't understand. Two examples:
Example 1: Introduction of Dirichlet distribution as a conjugate for Multinomial distribution:
$$p(\theta|\alpha) \propto \prod\limits_{j=1}^{k}{\theta_{j}^{\alpha_{j}-1}}$$ where the distribution is restricted to nonnegative $\theta_{j}$'s with $\sum\limits_{j=1}^{k}{\theta_j}=1$. ... Setting $\alpha_j=0$ for all $j$ results in an improper prior distribution that is uniform in the $\log \theta_j$.
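Here is as far as I got with a change of variables for Example 1, ignoring the simplex constraint and looking at a single coordinate (I am not sure this is the right way to think about it). Writing $\phi_j = \log\theta_j$, so that $\left|\frac{d\theta_j}{d\phi_j}\right| = \theta_j$, the transformed density with $\alpha_j = 0$ would be

$$p(\phi_j) = p(\theta_j)\left|\frac{d\theta_j}{d\phi_j}\right| \propto \theta_j^{-1}\cdot\theta_j = 1,$$

i.e. flat in $\phi_j = \log\theta_j$. Is that what "uniform in the log" means here?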
Example 2: Elsewhere, discussing normal distributions, the book says that taking $p(\sigma^2)\propto \frac{1}{\sigma^2}$ gives a distribution that is uniform in $\sigma$ on the log scale.
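To check my reading of Example 2 numerically, I tried the following sketch (my assumption: the claim means that if $\log\sigma$ is uniform on some interval, the implied density of $\sigma^2$ on the corresponding range is proportional to $1/\sigma^2$; the interval endpoints are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw log(sigma) uniformly on [a, b]; then v = sigma^2 = exp(2 * log sigma)
# ranges over [exp(2a), exp(2b)].
a, b = 0.0, 1.0
log_sigma = rng.uniform(a, b, size=1_000_000)
v = np.exp(2.0 * log_sigma)  # v = sigma^2

# Empirical density of v from a histogram.
hist, edges = np.histogram(v, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# If p(v) is proportional to 1/v, then hist * centers should be
# roughly constant across bins (up to Monte Carlo noise).
ratio = hist * centers
print(ratio.max() / ratio.min())  # should be close to 1
```

The printed ratio comes out close to 1 for me, which seems to confirm the change-of-variables reading, but I would like to understand the general principle rather than just one simulation.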
Could somebody please explain the situation?