You say megabyte, I say mebibyte

  • Thread starter: Grinder
oh boy

well, this aspect of your argument is unique to you. I would be
surprised if anybody else nodded in agreement to that. Looks like
mishmash. You said this sort of thing in the previous thread that I
mentioned.

Which part? I'll be sure to send all my remarks to your
email address first for editing purposes, but that would
seem to defeat the purpose of offering my opinion instead of
yours.

The industry did settle for a term that used a prefix having
a different value in the decimal system, knowing that it
did, choosing to do so to represent a binary value that was
different than the decimal value. Today we might think they
should have used a different prefix. Perhaps they
should've, _but_ they didn't. On the other hand it's not as
though there aren't many words in our language that have
more than one definition. Exceptions are the rule.
 
Hardly.

Standards authorities can help smooth the way, but they can equally well
just get in the way and botch things up. We should shun them when they
do the latter.

-Miles

Controlling bodies with authority are seldom the ones that
actually create a standard, and NIST is not a controlling
authority in the computer industry.

GT doesn't want to accept that other bodies with supposed
authority, like Webster's dictionary, don't typically invent
new words but instead incorporate them after a large enough
group of people have settled on a standard definition.
 
The industry did settle for a term that used a prefix having
a different value in the decimal system, knowing that it
did, choosing to do so to represent a binary value that was
different than the decimal value.

this is "chinese"



<snip>
 
Which part? <snip>

the part where you make use of words like binary, decimal, and number
system.
The industry did settle for a term that used a prefix having
a different value in the decimal system, knowing that it
did, choosing to do so to represent a binary value that was
different than the decimal value.  <snip>

I do understand what you are saying, which is what you said before,
and what I and Miles and we are all trying to drill into GT's head.

But the way you are using these terms just makes no sense.

What is "a prefix having a different value in the decimal system"?
I know exactly what you mean, but what you wrote does not mean
anything.

Different value to what?

When you said "a prefix", were you referring to a 2^ prefix, or a 10^
prefix? I know what you mean, and I know what you would have written
if asked: you would have written "binary prefix". But you did not say.
You define nothing.


Your mistake there (I can see how you misdefined things) is that one does
not have a prefix like Mega having a different value in different
number systems. The 2^20 prefix is a different prefix to the 10^6
prefix, even though both go by the same name of Mega.

Furthermore, it just so happens that, by some annoying terminology, the
2^20 prefix is known as the binary prefix. Binary in a general sense can
mean boolean, or it can, as in this case, relate to powers of 2.
"Binary prefix" does not mean the binary number system. Certainly not in
this case. If it were in the binary number system it would be in 1s and
0s.

I do understand what you mean. But the meaning of what you are writing
when -you- start using words like binary, decimal, number system, and
prefix together is nonsense. No meaning at all.
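The distinction the post above is driving at - two different prefixes that merely share the name Mega - can be shown in a few lines of Python (a minimal sketch; the constant names are mine, the values are the standard SI and binary ones):

```python
# Two different quantities that both go by the name "Mega":
MEGA = 10**6   # SI (decimal) prefix: 1,000,000
MEBI = 2**20   # binary prefix:       1,048,576

print(MEBI - MEGA)                 # 48576 -- the disagreement per "megabyte"
print((MEBI - MEGA) / MEGA * 100)  # ~4.86% -- and the gap widens up the scale
```

The gap grows with each step up: 2^30 vs 10^9 differ by about 7.4%, and 2^40 vs 10^12 by about 10%.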
 
kony said:
They aren't standard terms, they are an attempt to redefine
what the standard terms are and always were.

Mega is a standard term meaning 10^6. Mebi is a standard term meaning 2^20.
Trying to use Mega to mean 2^20 is an attempt to redefine the standard term
and is therefore wrong.
One last time - in the computer industry these terms were
developed and used as standard terms, long before NIST tried
to redefine them. The industry ignores NIST because they
are not the industry that standardized the term.

One last time - the computer industry didn't develop these terms. The terms
existed long before that and the computer industry has tried to re-define the
standard!
If you want your quest to succeed you have to convince the
industry to stop using the terms and use yours instead. The
manufacturers and windows have to change first, THEN the
industry has done what it did in the first place, decided on
a standard term. No 3rd party can change the meaning. It is
irrelevant that the prefixes mean something else in the
decimal system because computers don't use the decimal
system, it is only converted to that for the benefit of the
users.

Kony, you have said this before and I have tried to explain to you. It
doesn't matter what information is stored inside something, we can count
quantity in any base we choose. Bits are binary digits, but I can still have
decimal 37 of them, there is nothing wrong with counting in the base
familiar to 99.99% of the planet's population! Coin tosses are binary
results (heads or tails = 0 or 1). I can count the number of coin tosses in
decimal just as validly as I can count bits in decimal. If you want to count
them in base 2, then please go ahead, but you will find that 2^10, 2^20 etc.
are not binary numbers, so you are in fact still counting in base 10, so
your argument just crumbled around your ankles!
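The point that the base of the things being counted is independent of the base used to count them can be sketched in Python (the coin-toss/bit framing follows the post above; the variable names are mine):

```python
# Bits are binary digits, but nothing stops us counting them in decimal.
bits = [0, 1, 1, 0, 1] * 8   # forty binary digits
count = len(bits)            # the count itself is just a number

print(count)        # 40 -- the count written as a base-10 numeral
print(bin(count))   # 0b101000 -- the same count written in base 2
print(bin(2**20))   # 0b1 followed by twenty zeros; writing it as
                    # "1048576" is itself a base-10 representation
```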
No, it looks like they can't even understand basic computer
terms that've been around for decades.

Which computer terms are you talking about? Mega isn't a computer-specific
term, it's a mathematical term meaning 10^6.
 
this is "chinese"

As well as nonsense. Mega means 1,000,000 in decimal. You can convert this
decimal number to any base you choose, but the term mega still means 10^6.
No industry or individual is free to change that standard. Unfortunately
Microsoft has used the wrong value and everyone else has followed suit, but
it is still wrong! Just because one of the biggest companies in the world
mis-uses a term, doesn't mean we have changed the standard!
 
Miles Bader said:
Hardly.

Standards authorities can help smooth the way, but they can equally well
just get in the way and botch things up. We should shun them when they
do the latter.

Nonsense. Ever heard of ISO?
 
kony said:
Controlling bodies with authority are seldom the ones that
actually create a standard, and NIST is not a controlling
authority in the computer industry.

GT doesn't want to accept that other bodies with supposed
authority, like Webster's dictionary, don't typically invent
new words but instead incorporate them after a large enough
group of people have settled on a standard definition.

More than happy to accept this. I even agree with this. The problem I have
is when somebody or some company tries to change a standard that is already
well established. This change goes against the majority and is not valid.
 
well, this aspect of your argument is unique to you. I would be
surprised if anybody else nodded in agreement to that. Looks like
mishmash. You said this sort of thing in the previous thread that I
mentioned.

Which part? <snip>

the part where you make use of words like binary, decimal, and number
system.

[snip]

The 2^20 prefix is a different prefix to the 10^6
prefix, even though both go by the same name of Mega.

That is where you all make your mistake. The prefix / term Mega does not
have 2 meanings. It has 1 meaning - 10^6 AKA 1 million. 2^20 is not Mega -
it is commonly mistaken for Mega, but is not Mega.
 
Which part? <snip>
the part where you make use of words like binary, decimal, and number
system.
[snip]

The 2^20 prefix is a different prefix to the 10^6
prefix, even though both go by the same name of Mega.

That is where you all make your mistake. The prefix / term Mega does not
have 2 meanings. It has 1 meaning - 10^6 AKA 1 million. 2^20 is not Mega -
it is commonly mistaken for Mega, but is not Mega.

Don't play games and try to hitchhike your stupid argument onto
everything. You made your argument to me elsewhere in the thread, and
it is not relevant to what I wrote to Kony here. I responded to your
argument in this thread, when it was appropriate to do so. As did
everybody else. Fortunately for me, you repeatedly failed to even
write a response to me (no prizes for why I said "fortunately").
Don't hijack other points and drop your crap on them. This is a
different subject. The subject is Kony's usage of the terms
binary, number system, and prefix together. It is one point I separated
from his post; I separated it because it is one thing that had nothing
to do with you/this stupid argument you have been repeating over and
over. So unless you want to contribute for or against IT (IT
being the point I am making HERE), then don't.
 
<snip>
It is
irrelevant that the prefixes mean something else in the
decimal system because computers don't use the decimal
system, it is only converted to that for the benefit of the
users.
well, this aspect of your argument is unique to you. I would be
surprised if anybody else nodded in agreement to that. Looks like
mishmash. You said this sort of thing in the previous thread that I
mentioned.
Which part? <snip>
the part where you make use of words like binary, decimal, and number
system.
[snip]

The 2^20 prefix is a different prefix to the 10^6
prefix, even though both go by the same name of Mega.

That is where you all make your mistake. The prefix / term Mega does not
have 2 meanings. It has 1 meaning - 10^6 AKA 1 million. 2^20 is not
Mega -
it is commonly mistaken for Mega, but is not Mega.

Don't play games and try to hitchhike your stupid argument onto
everything. You made your argument to me elsewhere in the thread, and
it is not relevant to what I wrote to Kony here. I responded to your
argument in this thread, when it was appropriate to do so. As did
everybody else. Fortunately for me, you repeatedly failed to even
write a response to me (no prizes for why I said "fortunately").
Don't hijack other points and drop your crap on them. This is a
different subject. The subject is Kony's usage of the terms
binary, number system, and prefix together. It is one point I separated
from his post; I separated it because it is one thing that had nothing
to do with you/this stupid argument you have been repeating over and
over. So unless you want to contribute for or against IT (IT
being the point I am making HERE), then don't.

Your so called change of subject contained an attack on me... "trying to
drill into GT's head.". In my eyes this warranted a response from me and was
not, as you are trying to twist it, a response directly and solely to Kony.
It was also not a change of subject! This entire thread has turned into an
argument about the prefix Mega and whether it is or is not a standard. I
have yet to see a single reply (quoting Wikipedia indicates a lost
argument!) that supports your argument that Mega is 2^20. When your opinion
differs from mine, I will respond. If you feel I am saying the same thing
over and over, then perhaps that is because my point is simple and correct,
so when the replies repeat the same incorrect nonsense about standards being
changed, what am I supposed to do? I certainly won't be joining you on the
MS bandwagon and changing my view on mathematics!
 
As well as nonsense. Mega means 1,000,000 in decimal.

Except in the computer industry which uses it to describe a
binary, not decimal value.

You can convert this
decimal number to any base you choose,

So long as it still has the same numerical value this is
true. But it does not: the actual quantity is
necessarily based on binary true/false, on/off, 0/1 logical
design, not decimal. A decimal prefix is an invalid
expression in a binary system.
but the term mega still means 10^6.

Except in the computer industry

No industry or individual is free to change that standard.

Right, that's why it doesn't matter what NIST tried to
declare. That "standard" became a standard when the
computing industry made it their standard.

Unfortunately
Microsoft has used the wrong value and everyone else has followed suit, but
it is still wrong!

Whether you feel they shouldn't have used the (roughly
equivalent) wrong value or not, they DID. However, no, it
was not Microsoft that started it all; it was that computers
are inherently binary - MS and windows were preceded by the
industry standard using the binary value.

Just because one of the biggest companies in the world
mis-uses a term, doesn't mean we have changed the standard!

You don't seem to have the history right. When these
standards were first created, if you'd been around at the
time then you would have had an opportunity to plead your
case. After the industry established the standard - before
there was any "windows" at all, it is fairly irrelevant that
the prefix means something else in another field of study.
 
That is where you all make your mistake. The prefix / term Mega does not
have 2 meanings. It has 1 meaning - 10^6 AKA 1 million. 2^20 is not Mega -
it is commonly mistaken for Mega, but is not Mega.

Wanna bet? You might have some memory in your computer,
care to tell us how the manufacturer (following
international standards) rates its capacity?
 
I
have yet to see a single reply (quoting Wikipedia indicates a lost
argument!) that supports your argument that Mega is 2^20.

Ok, here ya go:
http://www.dramexchange.com/

Entire page full of memory chips bought and sold by the
major players in the industry. megabit, gigabit, megabyte,
gigabyte all in specific binary values.

It's not that there isn't anything supporting the binary
argument, it's that you choose to ignore everything around
you, the very industry that uses the term. Microsoft just
used the standard terms that pre-dated them, BECAUSE it was
standard, they certainly didn't pull the number out of thin
air for no reason.
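The memory-chip point can be made concrete: DRAM capacities come out as powers of two because an address bus with n lines selects 2^n locations (a sketch; the function and part sizes are illustrative, not quoted from dramexchange.com):

```python
# n binary address lines select 2**n locations, so capacities are
# naturally powers of two, not powers of ten.
def dram_bits(address_lines: int, width: int = 1) -> int:
    """Capacity in bits of a part with the given address lines and data width."""
    return (2 ** address_lines) * width

print(dram_bits(30))           # 1073741824 -- sold as a "1 Gb" (binary) part
print(10**9)                   # 1000000000 -- the SI (decimal) gigabit
print(dram_bits(30) - 10**9)   # 73741824 bits of difference
```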
 
Mega is a standard term meaning 10^6.

Except in the computer industry.


Mebi is a standard term meaning 2^20.

To someone not in the computer industry and thus ignorant
of the correct terms.

Trying to use Mega to mean 2^20 is an attempt to redefine the standard term
and is therefore wrong.

Except in the computer industry. The evidence is
staggering, all around us. Even those who claim we should
use different terms are already conceding we are supposed
to "change" the standard way of expression to their new way
that they feel is more philosophically correct. There'd be
nothing to change if they weren't trying to mess with
standard definitions.
 
More than happy to accept this. I even agree with this. The problem I have
is when somebody or some company tries to change a standard that is already
well established.

Show us any evidence of a person using the term megabyte by
the definition you claim is correct prior to, oh, let's pick
a date out of thin air like 1968, 40 years ago. Actually I
don't think you can find anyone using your definition prior
to '98, a full 30 years later except after the hard drive
manufacturers switched... and by the way there have been
several class action lawsuits about their doing so since a
hard drive is a binary storage medium.

If you can't find one example of someone using the terms
before they were considered a standard (decades ago) you
have no evidence there was even any minority, let alone a
majority decimally-defined value for megabyte.
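The hard-drive discrepancy behind those lawsuits is easy to reproduce: the manufacturer counts in decimal while the OS divides by powers of two (a sketch; the 500 GB figure is illustrative):

```python
# A drive marketed as 500 GB (decimal), as reported by an OS
# that divides by 2**30:
marketed_bytes = 500 * 10**9
reported_gib = marketed_bytes / 2**30

print(f"{reported_gib:.2f}")   # 465.66 -- the "missing" ~34 GB
```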


This change goes against the majority and is not valid.

There was no majority defining megabyte the way you insist
we should. Never. Once you tack "byte" on, the meaning is a
very specific one.
 
GT said:
Nonsense. Ever heard of ISO?

Of course I've heard of ISO. Your point?

ISO can **** things up just as badly as anybody; maybe worse, because
they have a position of power to abuse.

Here's a very famous example:

http://en.wikipedia.org/wiki/Open_Systems_Interconnection

In that case, the ISO "standard" was crap, was shunned, and eventually
died. That is as it should have been. However, their ham-handed abuse
of the standards process caused a lot of damage (ask anybody in Britain,
where proper wide-area networks were ****ed up for many years, in part
because of the whole OSI boondoggle).

-Miles
 