Like the previous poster said, it's an average. But to better answer
the "why" part of your question:
I'm going from memory here, but Moore's Law is based on observation
rather than pure theory. Every few years scientists expect Moore's
Law to break down, but it simply hasn't yet -- and I don't think it
will. For example, chip makers recently started building multi-core
CPUs because pushing single-core clock speeds higher simply wasn't
practical anymore. Quad-core CPUs exist now, and it won't stop there.
This goes back to the previous poster
mentioning how we always seem to find a way to keep Moore's Law
holding true.
So nothing says Moore's Law must hold true; it just has, because of
Human Will, Determination, and Capitalism (add whatever reasons you
wish), and so far nothing has surfaced to outweigh those forces.
Maybe nothing ever will.
If you knew exactly how much raw computational power it would take to
brute-force human-level AI, then you could use Moore's Law to estimate
the latest date by which affordable human-level AI would arrive.
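As a rough illustration of that projection (every number below is a
placeholder, not a real estimate, and I'm assuming a ~1.5-year doubling
period for compute per dollar), the arithmetic looks like this in Python:

    import math

    # Toy numbers to show the arithmetic; none of these are real measurements.
    doubling_years = 1.5         # assumed doubling period for compute per dollar
    ops_affordable_now = 1e11    # ops/sec you can buy at home-PC prices (assumed)
    ops_needed_for_ai = 1e17     # ops/sec assumed sufficient for human-level AI

    doublings = math.log2(ops_needed_for_ai / ops_affordable_now)
    years_until_affordable = doublings * doubling_years
    print("Affordable in roughly %.0f years" % years_until_affordable)

The only real inputs are the compute you think is needed and the
doubling period you believe in; everything else falls out of the math.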
Right now an educated guess could be made based on 1) the raw
computational power required by existing AI visual-recognition systems,
2) the number of neurons in a human optic nerve, and 3) the number of
neurons in the whole brain.
Based on that, the date should be around 2050 for human-level AI at
the price of today's home PC.
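If it helps, here's that scaling argument as arithmetic (again, the
vision-system figure is made up; the neuron counts are just rough
order-of-magnitude values):

    # Back-of-envelope version of the scaling above. The vision-system figure
    # is a placeholder; the neuron counts are order-of-magnitude estimates.
    ops_for_machine_vision = 1e11   # ops/sec used by a current vision system (assumed)
    optic_nerve_neurons = 1e6       # roughly a million fibers per optic nerve
    brain_neurons = 1e11            # on the order of 10^11 neurons in the brain

    # Scale the vision workload by the brain-to-optic-nerve neuron ratio.
    ops_for_whole_brain = ops_for_machine_vision * (brain_neurons / optic_nerve_neurons)
    print("Whole-brain estimate: about %.0e ops/sec" % ops_for_whole_brain)

Feed that ~10^16 ops/sec figure into the Moore's Law projection above
and, with an assumed 1.5-year doubling period, you land a few decades
out from now, which is the kind of reasoning behind the ~2050 figure.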