From Casella's Statistical Inference:
Definition 10.1.7 For an estimator $T_n$, if $\lim_{n\to \infty} k_n Var T_n = \tau^2 < \infty$, where $\{k_n\}$ is a sequence of constants, then $\tau^2$ is called the limiting variance or limit of the variances of $T_n$.
Definition 10.1.9 For an estimator $T_n$, suppose that $k_n(T_n - \tau(\theta)) \to \mathcal{N}(0, \sigma^2)$ in distribution. The parameter $\sigma^2$ is called the asymptotic variance or variance of the limit distribution of $T_n$.
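To make the two definitions concrete before my questions, here is a quick simulation I wrote (my own sketch, not from Casella) for the textbook case $T_n = \bar{X}_n$ with i.i.d. draws of variance $\sigma^2$. Taking $k_n = n$ in Definition 10.1.7 and $k_n = \sqrt{n}$ in Definition 10.1.9, both quantities should come out as $\sigma^2$ (the value $\sigma^2 = 4$ is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0          # true variance of each X_i (arbitrary choice for this sketch)
n, reps = 10_000, 5_000

# T_n = sample mean of n i.i.d. N(0, sigma2) draws, replicated reps times
X = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
Tn = X.mean(axis=1)

# Definition 10.1.7 with k_n = n: n * Var(T_n) should approach sigma2
print("limiting variance   ~", n * Tn.var())

# Definition 10.1.9 with k_n = sqrt(n), tau(theta) = mu = 0:
# sqrt(n) * (T_n - mu) is approximately N(0, sigma2), so the sample
# variance of the scaled statistic should also approach sigma2
print("asymptotic variance ~", (np.sqrt(n) * Tn).var())
```

For the sample mean the two numbers agree, which is part of what prompts my question below about when the two definitions coincide.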
I was wondering whether both definitions depend on the choice of the sequence $k_n$. I suspect that for some choices of $k_n$ the convergence fails, while for other choices it succeeds. Are the two definitions then not well defined? Aren't they supposed to be independent of the choice of $k_n$?
For example, in the Lyapunov CLT, $\frac{1}{s_n} \sum_{i=1}^{n} (X_i - \mu_i) \ \xrightarrow{d}\ \mathcal{N}(0,\;1)$, where $s_n^2 = \sum_{i=1}^n \sigma_i^2$. According to the above definition of asymptotic variance, $T_n = \sum_{i=1}^n X_i$, $\tau(\theta) = \sum_{i=1}^n \mu_i$ (should $\tau(\theta)$ be independent of the sample size $n$?), and the asymptotic variance of $\sum_{i=1}^n X_i$ is $1$. This is hard to believe, because the variance $\sigma_i^2$ of each $X_i$ can be anything as long as it is finite.
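I also tried checking the Lyapunov example numerically (the particular variance sequence below is just an arbitrary bounded choice of mine). The result suggests the asymptotic variance of $1$ comes from the chosen scaling $k_n = 1/s_n$, which absorbs the individual $\sigma_i^2$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 2_000, 5_000

# Heterogeneous standard deviations in [1, 2]; any bounded sequence of
# finite variances satisfying Lyapunov's condition would do (my assumption)
sigmas = 1.0 + np.sin(np.arange(1, n + 1)) ** 2
s_n = np.sqrt(np.sum(sigmas ** 2))   # s_n^2 = sum of sigma_i^2

# X_i independent with mean mu_i = 0 and variance sigmas[i]**2
X = rng.normal(0.0, sigmas, size=(reps, n))
Z = X.sum(axis=1) / s_n              # (1/s_n) * sum(X_i - mu_i)

# The variance of the standardized sum is close to 1 no matter what the
# individual sigma_i^2 are, because the scaling 1/s_n absorbs them
print("variance of standardized sum ~", Z.var())
```

So the value $1$ is not a statement about the spread of $\sum_i X_i$ itself, but about the spread after rescaling by $k_n$, which is exactly why I am asking whether the definitions depend on $k_n$.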
Can the limiting distribution in the definition of the asymptotic variance be other than a normal distribution?
When will the limiting variance and the asymptotic variance be the same?
Similarly but more generally,
how can we define limiting moments and asymptotic moments?
Is the limiting distribution in the definition of an asymptotic moment required to be a normal distribution?
When will the limiting moment and the asymptotic moment coincide?
For example, how would these two concepts look for means, i.e., the limiting mean and the asymptotic mean?
Thanks and regards!