Measures like KL divergence can be symmetrized (into JS divergence), and the Bhattacharyya distance serves a similar function. Both are well suited to continuous and to discrete (e.g., multinomial) distributions. Is there a measure that conveys the divergence or distance between ordered distributions, one that penalizes more heavily when probability mass sits in a more distant category?
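To make the problem concrete, here is a small sketch (the distributions are made up for illustration) showing that JS divergence is blind to category order: moving all mass to an adjacent category and moving it to a far category give exactly the same value.

```python
import numpy as np

def js_divergence(p, q):
    """Jensen-Shannon divergence (symmetrized KL, base-2 logs)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0  # 0 * log(0) is taken as 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p  = [1.0, 0.0, 0.0]   # all mass in category 1
q1 = [0.0, 1.0, 0.0]   # mass moved to the adjacent category
q2 = [0.0, 0.0, 1.0]   # mass moved to the far category

print(js_divergence(p, q1))  # 1.0
print(js_divergence(p, q2))  # 1.0 -- identical: JS ignores the ordering
```

Any measure built from per-category comparisons (KL, JS, Bhattacharyya, total variation) will behave this way, since permuting the categories leaves it unchanged.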
The particular application is comparing populations' responses to a Likert-scale survey question. I have multiple paired populations, and I treat each population's responses as a probability distribution over {1,...,7}. I want to compare the pairs' distances (or symmetric divergences)—e.g., are A1 and A2 farther apart than B1 and B2? From a cursory reading, I'm drawn to the Wasserstein distance, but perhaps there is something more standard or appropriate.
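For distributions on an ordered, unit-spaced scale like {1,...,7}, the 1-Wasserstein (earth mover's) distance reduces to the sum of absolute differences between the two CDFs, so it is trivial to compute. A minimal sketch, with hypothetical Likert response distributions:

```python
import numpy as np

def wasserstein_likert(p, q):
    """1-Wasserstein distance between two distributions over the
    ordered categories 1..K with unit spacing: the sum of absolute
    differences of the cumulative distribution functions."""
    return np.sum(np.abs(np.cumsum(p) - np.cumsum(q)))

# Hypothetical paired response distributions over {1,...,7}.
a1 = [0.05, 0.10, 0.20, 0.30, 0.20, 0.10, 0.05]
a2 = [0.05, 0.05, 0.10, 0.20, 0.30, 0.20, 0.10]  # slight shift toward 7
b1 = [0.30, 0.30, 0.20, 0.10, 0.05, 0.03, 0.02]
b2 = [0.02, 0.03, 0.05, 0.10, 0.20, 0.30, 0.30]  # mirror image of b1

print(wasserstein_likert(a1, a2))
print(wasserstein_likert(b1, b2))  # larger: the mass must travel farther
```

`scipy.stats.wasserstein_distance` gives the same result if you pass the category values with the probabilities as weights; the CDF formulation above just makes the order-sensitivity explicit.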