I just spent the last 2 hours trying to understand degrees of freedom. I read multiple interpretations, but one really stuck with me: it's the number of data points that are free to vary AFTER the parameter is calculated, and from that point on it's used instead of the sample size in any calculation that involves said parameter. This was the most agreed-upon definition to my knowledge (it's even the definition used by Stack Exchange for the tag :D). That being said, I have 2 questions:
1- Why do some sources say the opposite, that degrees of freedom is the number of values that are basically LOCKED IN once the parameter is calculated, so that the variance, for example, is divided by n - dof? In my opinion that runs contrary to common sense if you compare it to degrees of freedom in other sciences. (This convention is adopted by libraries like NumPy in Python; see the sketch below.)
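To show exactly what I mean, here's a quick sketch of the NumPy behavior (toy values, just for illustration): its `var` function takes a `ddof` ("delta degrees of freedom") argument and divides by `n - ddof`, so `ddof` counts the constrained values rather than the free ones:

```python
import numpy as np

data = np.array([2.0, 4.0, 6.0, 8.0])
n = len(data)

# NumPy divides by n - ddof ("delta degrees of freedom"), so ddof
# counts the values that are "locked in", not the ones free to vary.
pop_var = np.var(data, ddof=0)   # divides by n      (population variance)
samp_var = np.var(data, ddof=1)  # divides by n - 1  (sample variance)

# The same results computed by hand, to make the divisors explicit:
mean = data.mean()
assert np.isclose(pop_var, ((data - mean) ** 2).sum() / n)
assert np.isclose(samp_var, ((data - mean) ** 2).sum() / (n - 1))

print(pop_var, samp_var)  # 5.0 6.666...
```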
2- Why does the concept of degrees of freedom exist only for samples? If the definition is correct and we divide by (n-1) because "the last data point does not contribute new information", what makes this data point important in the case of a population and negligible in the case of a sample?
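For context, here's the small check that convinced me the "last data point" claim is at least mechanically true for a sample (toy data, just for illustration): once the sample mean is computed, the deviations sum to zero, so the last deviation is fully determined by the other n - 1:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5)   # a toy sample, n = 5
dev = x - x.mean()       # deviations from the sample mean

# The deviations always sum to zero, so the last one is fully
# determined by the first n - 1: only n - 1 of them are free to vary.
print(dev.sum())                 # ~0 (up to floating-point rounding)
print(dev[-1], -dev[:-1].sum())  # two identical values
```

So my confusion is only about why this constraint matters for a sample but not for a population.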