kony said: Nobody declared Mega a valid prefix for -bits, -bytes, until
the computer industry had already standardized the terms to
be representations of binary values, in a binary number
system where base 2, not base 10, determines the values.
Mega is a standard prefix for any unit. The computer industry can't change
that. End of story.
That's not what a standard is.
Yes it is.
Because you don't understand that it is a decimal
representation, you don't understand why it must be a
specific binary value, not one based on powers of 10. It
was clearly the standardized way to express values long
before NIST tried to step in and muddy the waters. Nobody
was using the term as you suggest until recently, and doing
so runs contrary to the majority use over many years, making
it impossible that the standard is actually what you claim.
Wrong. 99% of the population has used the prefix mega in its correct sense
of 10^6 for decades, if not centuries.
It does matter. It was all the people using the term for
decades, including scientists who realized it was a decimal
representation of a binary value, that made it the standard,
at a time when nobody was using the term the way you insist
must be the standard.
How long have people called dogs, "dogs"?
We don't need to know the exact date this began, but we see
it is a fairly standard term.
Nice example of why a standard shouldn't be changed. We don't want to change
the standard term for dogs just because a small proportion of society thinks
dogs are small, black, three-legged wooden horses! A dog is a dog: a
four-legged canine. Mega is a prefix meaning 10^6. Let's leave the standards
alone; they have worked for centuries. If you want to represent things that
are 2^20, then come up with a term for it, but you can't steal a
long-standing SI unit and change its value! That is ludicrous and leads to
confusion and these kinds of silly discussions!
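For reference, the numeric gap the whole thread is arguing about is easy to show. A minimal Python sketch (the names MEGA and MEBI are just illustrative, not from the thread):

```python
# The SI prefix mega means 10^6; the IEC binary prefix mebi means 2^20.
MEGA = 10**6   # 1 MB  = 1,000,000 bytes (decimal interpretation)
MEBI = 2**20   # 1 MiB = 1,048,576 bytes (binary interpretation)

# Relative difference between the two interpretations.
diff = (MEBI - MEGA) / MEGA
print(f"1 MiB is {diff:.2%} larger than 1 MB")  # about 4.86%

# Why a drive sold as "500 GB" (decimal) is reported as roughly
# 465.7 GiB by software that measures in binary units:
drive_bytes = 500 * 10**9
print(f"{drive_bytes / 2**30:.1f} GiB")
```

The gap compounds with each larger prefix (kilo ~2.4%, mega ~4.9%, giga ~7.4%), which is exactly why the decimal-vs-binary ambiguity fuels arguments like this one.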