Why is a maximum likelihood estimation consistent?
More data can only pin the estimate down more precisely; it does not move the value the estimate converges to.
The “maximum likelihood” estimate is the value at the peak of the likelihood curve (the center, in the case of a normal distribution). The law of large numbers, with the Central Limit Theorem describing how quickly, says that additional data can only sharpen that peak around the true value; the central value itself stays put. So the estimate homes in on the same value no matter how much data you collect, which is exactly what consistency means.
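A quick way to see this is to simulate it. The sketch below assumes a normal model with known spread, where the MLE of the center is simply the sample mean; `TRUE_MEAN` is a made-up parameter for illustration. As the sample grows, the estimate tightens around the true value instead of drifting elsewhere.

```python
import random
import statistics

random.seed(0)
TRUE_MEAN = 5.0  # hypothetical "true" parameter, chosen for this demo

# For a normal model with known variance, the MLE of the location
# parameter is the sample average. Watch it settle near TRUE_MEAN
# as the sample size n grows.
for n in (10, 1_000, 100_000):
    sample = [random.gauss(TRUE_MEAN, 2.0) for _ in range(n)]
    mle = statistics.fmean(sample)  # sample mean = MLE of the center
    print(n, round(mle, 3))
```

Running this a few times with different seeds shows the same pattern: small samples wobble, large samples land very close to 5.0.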
In probability theory, the central limit theorem establishes that, for the most commonly studied scenarios, when independent random variables are added, their sum tends toward a normal distribution even if the original variables themselves are not normally distributed.
The central limit theorem and the law of large numbers are the two fundamental theorems of probability. Roughly, the central limit theorem states that the distribution of the sum (or average) of a large number of independent, identically distributed variables will be approximately normal, regardless of the underlying distribution. The importance of the central limit theorem is hard to overstate; indeed it is the reason that many statistical procedures work. http://www.math.uah.edu/stat/sample/CLT.html
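The CLT claim above can also be checked empirically. This sketch sums draws from an exponential distribution, which is heavily skewed and nothing like a normal, then standardizes the sums; the choices of `n`, `trials`, and the exponential base distribution are illustrative, not from the quoted source. If the standardized sums are approximately standard normal, roughly 68% of them should fall within one standard deviation of zero.

```python
import random

random.seed(1)

# Each summand is exponential(1): mean 1, variance 1, strongly skewed.
n, trials = 50, 20_000
sums = [sum(random.expovariate(1.0) for _ in range(n)) for _ in range(trials)]

# Standardize: (S - n*mu) / (sigma * sqrt(n)), with mu = sigma = 1 here.
z = [(s - n) / n ** 0.5 for s in sums]

# For a standard normal, P(|Z| < 1) is about 0.68.
within_one_sd = sum(abs(v) < 1 for v in z) / trials
print(round(within_one_sd, 3))
```

Despite the skewed ingredients, the fraction comes out near the normal benchmark of 0.68, which is the CLT at work.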
Detailed discussion here: https://www.thoughtco.com/importance-of-the-central-limit-theorem-3126556