Under the new fixed-G asymptotics, the centered two-step GMM estimator and the two continuously-updating estimators have the same asymptotic mixed normal distribution.
In addition, the t statistics, the J statistic, and the trinity of two-step GMM statistics (QLR, LM, and Wald) are all asymptotically pivotal, and each can be modified to have an asymptotic standard F or t distribution.
While conventional asymptotic theory completely ignores the variability in the cluster-robust GMM weighting matrix, the new asymptotic theory takes it into account, leading to more accurate approximations.
The key difference between these two types of asymptotics is whether the number of clusters G is regarded as fixed or growing when the sample size increases.
This paper proposes a computationally feasible variant of these standard two-step GMM estimators that applies the idea of continuous updating to the autoregressive parameter only, exploiting the fact that stationarity of the data-generating process requires the absolute value of the autoregressive parameter to be less than unity.
We show that our subset-continuous-updating method does not alter the asymptotic distribution of the two-step GMM estimators and therefore retains their consistency.
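The continuous-updating idea can be illustrated with a stripped-down sketch. The example below uses a scalar AR(1) model with two lagged-instrument moment conditions, so the autoregressive parameter rho is the only parameter and updating in rho is the whole of the continuous updating; in the paper's dynamic panel setting the remaining parameters would be held at their two-step values. The model, moment conditions, and sample size here are illustrative assumptions, not the paper's specification. Note how the search is confined to |rho| < 1, the stationarity requirement mentioned above.

```python
import numpy as np

# Simulate a stationary AR(1): y_t = rho * y_{t-1} + e_t (toy data-generating process)
rng = np.random.default_rng(1)
T = 2000
rho_true = 0.6
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho_true * y[t - 1] + rng.normal()

def moment_matrix(rho):
    """Lagged-instrument moments E[y_{t-k} (y_t - rho * y_{t-1})] = 0, k = 1, 2."""
    u = y[2:] - rho * y[1:-1]
    return np.column_stack([y[1:-1] * u, y[:-2] * u])  # (T-2, 2)

def cu_objective(rho):
    """Continuous-updating criterion: the optimal weighting matrix is
    re-estimated at every trial value of rho rather than held fixed."""
    G = moment_matrix(rho)
    gbar = G.mean(axis=0)
    W = np.linalg.inv(np.cov(G, rowvar=False))  # inverse moment covariance
    return G.shape[0] * gbar @ W @ gbar

# Grid search restricted to the stationary region |rho| < 1
grid = np.linspace(-0.99, 0.99, 1981)
rho_hat = grid[int(np.argmin([cu_objective(r) for r in grid]))]
print(rho_hat)  # close to rho_true = 0.6
```

Restricting the search to the open interval (-1, 1) is exactly what makes the subset-updating step computationally feasible: the parameter over which the weighting matrix is continuously updated lives in a known bounded set.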
The GMM method then minimizes a quadratic form in the sample averages of the moment conditions, weighted by a positive semi-definite matrix.
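A minimal sketch of this criterion, assuming a toy linear instrumental-variables model with a single moment condition (the model, instrument, and identity first-step weighting matrix are illustrative assumptions):

```python
import numpy as np

def gmm_objective(theta, moments, W):
    """Quadratic-form GMM criterion: gbar' W gbar, where gbar is the
    sample average of the moment conditions evaluated at theta."""
    gbar = moments(theta).mean(axis=0)  # (m,) average moment vector
    return gbar @ W @ gbar

# Toy linear IV model: y = x * beta + u, instrument z,
# moment condition g_i(theta) = z_i * (y_i - x_i * theta).
rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=n)
x = z + 0.5 * rng.normal(size=n)      # endogenous regressor, strong instrument
y = 2.0 * x + rng.normal(size=n)      # true beta = 2.0

def moments(theta):
    return (z * (y - x * theta))[:, None]  # (n, 1) moment contributions

# One-dimensional grid search with identity weighting (a first step), for illustration
grid = np.linspace(0.0, 4.0, 4001)
W = np.eye(1)
theta_hat = grid[int(np.argmin([gmm_objective(t, moments, W) for t in grid]))]
print(theta_hat)  # close to the true beta = 2.0
```

In the two-step version, the identity matrix `W` would then be replaced by the inverse of the estimated covariance of the moments evaluated at this first-step estimate, and the criterion minimized once more.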
Recent work on Generalized Method of Moments (GMM) estimators has suggested that the continuous updating estimator is less biased than the commonly used two-step estimator.
We show that the continuous updating estimator can be interpreted as a jackknife estimator.
This interpretation gives some insight into why this estimator is less biased.