Welcome to the MacNN Forums.



When will OS X use KiB, MiB, GiB??
mactropolis
Senior User
Join Date: Nov 1999
Location: Milkyway Galaxy
Status: Offline
Reply With Quote
Aug 21, 2005, 10:54 PM
 
Hi,
I was just wondering when/if Apple will update the nomenclature for file sizes throughout the OS. Currently, the Finder (Get Info, List view, etc.) reports file sizes as KB, MB, GB, and so on. However, this can be confusing for users who assume 1 GB = 1,000,000,000 bytes, when the OS actually means 1 GB = 1,073,741,824 bytes. Making this distinction would let users understand why, when they buy a 60 GB (gigabyte) iPod, iTunes reports it as 55.8 GB (correctly 55.8 GiB, gibibytes). It would help prevent new users from assuming GB (binary) == GB (decimal). The IEC standardized these prefixes back in December 1998, yet to this day neither Windows, Mac OS, nor Linux uses KiB, MiB, or GiB. Why?
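The gap the OP describes is pure arithmetic; here is a minimal Python sketch (illustrative only, not how iTunes actually computes it):

```python
# A "60 GB" iPod as marketed: decimal gigabytes.
marketed_bytes = 60 * 10**9

# What the OS divides by when it prints "GB" (really GiB).
GIB = 2**30  # 1,073,741,824 bytes

print(f"{marketed_bytes / GIB:.1f} GiB")  # 55.9 GiB
```

The small remaining gap down to the 55.8 GB iTunes shows would come from formatting and filesystem overhead.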
Death To Extremists!
     
Chuckit
Clinically Insane
Join Date: Oct 2001
Location: San Diego, CA, USA
Status: Offline
Reply With Quote
Aug 21, 2005, 11:00 PM
 
Because nobody knows what the **** "MiB" is other than a Will Smith movie.
Chuck
___
"Instead of either 'multi-talented' or 'multitalented' use 'bisexual'."
     
Thinine
Mac Elite
Join Date: Jul 2002
Status: Offline
Reply With Quote
Aug 22, 2005, 12:06 AM
 
Why? Because that's not how computers measure file size.
     
mactropolis  (op)
Senior User
Join Date: Nov 1999
Location: Milkyway Galaxy
Status: Offline
Reply With Quote
Aug 22, 2005, 04:28 AM
 
Originally Posted by Thinine
Why? Because that's not how computers measure file size.
KiB, MiB, and GiB are the correct way operating systems are supposed to report file size information. The current system is ambiguous and misleading.
Death To Extremists!
     
red rocket
Mac Elite
Join Date: Mar 2002
Status: Offline
Reply With Quote
Aug 22, 2005, 05:12 AM
 
I41 might take the view that educating sheeple out of their retarded misledconceptions is a good thing. Not TEH good, mind, but argueably a valid point of view, 8infinitely more satanicimeanintelligent than ewr selfdelimitedinstructionist alternative, lolo. Just coz ewes are confused and mis-led, does not mean the system's @ fault, emu.
     
Link
Professional Poster
Join Date: Jun 2003
Location: Hyrule
Status: Offline
Reply With Quote
Aug 22, 2005, 05:28 AM
 
Complicating things to compensate for lack of education is silly. In essence, browsers would have to use KiB/sec instead of KB/sec, and that just makes things even more confusing.

People have enough trouble with KBps and Kbps!
Aloha
     
Maflynn
Professional Poster
Join Date: Mar 2002
Location: Boston
Status: Offline
Reply With Quote
Aug 22, 2005, 07:02 AM
 
Originally Posted by red rocket
I41 might take the view that educating sheeple out of their retarded misledconceptions is a good thing. Not TEH good, mind, but argueably a valid point of view, 8infinitely more satanicimeanintelligent than ewr selfdelimitedinstructionist alternative, lolo. Just coz ewes are confused and mis-led, does not mean the system's @ fault, emu.
How about English, or a good spell checker, next time?

I have no idea what point you're trying to make.
     
analogika
Posting Junkie
Join Date: Feb 2005
Location: 888500128
Status: Offline
Reply With Quote
Aug 22, 2005, 07:24 AM
 
His point was that people are confused enough as it is, and it is not the operating system's responsibility to educate them, especially at the risk of confusing them even more by deviating from the usual KB/MB de facto standard.

I'd have preferred English, as well, but it wasn't *that* difficult to decipher - eventually.
     
Millennium
Clinically Insane
Join Date: Nov 1999
Status: Offline
Reply With Quote
Aug 22, 2005, 08:11 AM
 
I'm going to agree with analogika and the others on this one. KB, MB, GB, and such are the standard terms, while KiB and such are recent inventions that are not as well suited to computers. They are not the right tools for the job, and even fewer people understand them than understand the standard terms.
You are in Soviet Russia. It is dark. Grue is likely to be eaten by YOU!
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Aug 22, 2005, 08:32 AM
 
I think these quantities were just brought up by hard drive manufacturers who wanted to make their drives appear bigger than they are.

All the OSes I have worked with over the years adhere to the usual 1024 convention.
I don't suffer from insanity, I enjoy every minute of it.
     
TETENAL
Addicted to MacNN
Join Date: Aug 2004
Location: FFM
Status: Offline
Reply With Quote
Aug 22, 2005, 08:47 AM
 
Originally Posted by OreoCookie
All the OSes I have worked with over the years adhere to the usual 1024 convention.
It must have been an American who invented this convention, because everywhere else in the world we use the decimal system. If the OS treated kilo as meaning thousand – like it's supposed to – then there wouldn't be any confusion.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Aug 22, 2005, 09:52 AM
 
Originally Posted by TETENAL
It must have been an American who invented this convention, because everywhere else in the world we use the decimal system. If the OS treated kilo as meaning thousand – like it's supposed to – then there wouldn't be any confusion.
I don't know who invented it, but somehow it seemed reasonable to use powers of 2 instead of powers of 10. Still, powers of 10 are more reasonable. Unless you are measuring in inches and feet, obviously.
I don't suffer from insanity, I enjoy every minute of it.
     
wataru
Addicted to MacNN
Join Date: Oct 2001
Location: Yokohama, Japan
Status: Offline
Reply With Quote
Aug 22, 2005, 11:40 AM
 
Originally Posted by OreoCookie
I think these quantities were just brought up by harddrive manufacturers who wanted to make their drives appear bigger than they are.
Um, you got it backwards. Hard drive manufacturers misleadingly use KB, MB, GB according to the standard metric definitions to inflate their sizes. Using KiB, MiB, and GiB would mean they report their drive sizes as smaller, e.g. "37.14GiB" instead of "40GB."
     
aristotles
Grizzled Veteran
Join Date: Jul 2004
Location: Canada
Status: Offline
Reply With Quote
Aug 22, 2005, 12:05 PM
 
Originally Posted by mactropolis
KiB, MiB, and GiB are the correct way operating systems are supposed to report file size information. The current system is ambiguous and misleading.
Aren't there more important things to worry about? I'm not confused and know that KB = 1024 bytes. How old are you? I've used computers since the early 80's and I have no problem with the convention.

The only thing that is misleading is the use of metric GB ratings for hard drive sizes when they bloody well know everyone uses the 1024 convention.

Do you think of sizes in terms of 1000 or 1024 when dealing with computers?
--
Aristotle
15" rMBP 2.7 Ghz ,16GB, 768GB SSD, 64GB iPhone 5 S⃣ 128GB iPad Air LTE
     
aristotles
Grizzled Veteran
Join Date: Jul 2004
Location: Canada
Status: Offline
Reply With Quote
Aug 22, 2005, 12:08 PM
 
Originally Posted by wataru
Um, you got it backwards. Hard drive manufacturers misleadingly use KB, MB, GB according to the standard metric definitions to inflate their sizes. Using KiB, MiB, and GiB would mean they report their drive sizes as smaller, e.g. "37.14GiB" instead of "40GB."
It is a little late to change standards. KB = 1024 bytes is the de facto standard, as it always has been.
--
Aristotle
15" rMBP 2.7 Ghz ,16GB, 768GB SSD, 64GB iPhone 5 S⃣ 128GB iPad Air LTE
     
wataru
Addicted to MacNN
Join Date: Oct 2001
Location: Yokohama, Japan
Status: Offline
Reply With Quote
Aug 22, 2005, 12:48 PM
 
Then get the hard drive makers to actually use KB = 1024 bytes instead of KB = 1000 bytes. That will solve the problem without this ugly KiB business.
     
Sven G
Professional Poster
Join Date: Dec 2000
Location: Milan, Europe
Status: Offline
Reply With Quote
Aug 22, 2005, 01:27 PM
 
To add to the confusion, the French use the term octet instead of byte.

BTW, there is a very interesting webpage on the whole KB, KiB, etc. (or Ko, Kio, etc.) subject on the French Octet Wikipedia page...

The freedom of all is essential to my freedom. - Mikhail Bakunin
     
Mr Scruff
Mac Enthusiast
Join Date: Feb 2001
Location: London, UK
Status: Offline
Reply With Quote
Aug 22, 2005, 04:38 PM
 
Originally Posted by OreoCookie
I don't know who invented it, but somehow it seemed reasonable to use powers of 2 instead of powers of 10. Still, powers of 10 are more reasonable. Unless you are measuring in inches and feet, obviously.
It wasn't invented by anyone. The reason why digital quantities tend to be powers of 2 (or multiples of powers of 2) is that computers 'count' in binary, not with their fingers. 1K is 1024 bytes because 1024 is 10000000000 in binary. If you really want to think about things in decimal think about it like this:

1K = 2^10 bytes
1MB = 2^20 bytes
1GB = 2^30 bytes
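A quick Python check of the claim that 1024 is the "round" number in binary (illustrative):

```python
# 1024 is a 1 followed by ten 0s in binary; 1000 is not round at all.
print(bin(1024))  # 0b10000000000
print(bin(1000))  # 0b1111101000

# The three sizes above, written out:
assert 2**10 == 1024
assert 2**20 == 1_048_576
assert 2**30 == 1_073_741_824
```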
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Aug 22, 2005, 04:49 PM
 
Originally Posted by Mr Scruff
It wasn't invented by anyone. The reason why digital quantities tend to be powers of 2 (or multiples of powers of 2) is that computers 'count' in binary, not with their fingers. 1K is 1024 bytes because 1024 is 10000000000 in binary. If you really want to think about things in decimal think about it like this:

1K = 2^10 bytes
1MB = 2^20 bytes
1GB = 2^30 bytes
I know it isn't a coincidence. However, usually, computers work in powers of 8 (8 bit, 16, 24, 32, 64, 128, etc.). So 2^10 `emulates' 1000, because it's close to it. At one point somebody thought it was a good idea to use 2^10 instead of 1000, and that stuck.
I don't suffer from insanity, I enjoy every minute of it.
     
Thinine
Mac Elite
Join Date: Jul 2002
Status: Offline
Reply With Quote
Aug 22, 2005, 04:59 PM
 
No, computers work in powers of 2. Of those numbers, only 24 isn't a power of 2, and I've only seen that used for some old, old graphics cards, where it was really just adding 16 and 8 to get more depth. And of those numbers, only 8 and 64 are actually powers of 8.
     
Weyland-Yutani
Mac Elite
Join Date: Mar 2005
Location: LV-426
Status: Offline
Reply With Quote
Aug 22, 2005, 05:06 PM
 
I never knew what KiB and MiB meant. I saw it in some download app though (Azureus?) and it rang a bell. I just thought it was the same as KB and MB. Ah well. At these sizes it doesn't matter much, but when you reach GB vs GiB and TB vs TiB it starts to matter.

cheers

W-Y

“Building Better Worlds”
     
Weyland-Yutani
Mac Elite
Join Date: Mar 2005
Location: LV-426
Status: Offline
Reply With Quote
Aug 22, 2005, 05:07 PM
 
Originally Posted by OreoCookie
I know it isn't a coincidence. However, usually, computers work in powers of 8 (8 bit, 16, 24, 32, 64, 128, etc.). So 2^10 `emulates' 1000, because it's close to it. At one point somebody thought it was a good idea to use 2^10 instead of 1000, and that stuck.
Certain older models used six-, seven-, or nine-bit bytes - for instance, the 36-bit architecture of the PDP-10. Another example of a non-eight-bit unit is the 12-bit slab of the NCR 315.

The term byte was coined by Werner Buchholz in 1956 during the early design phase for the IBM Stretch computer. Originally it was described as one to six bits; typical I/O equipment of the period used six-bit units. The move to an eight-bit byte happened in late 1956, and this size was later adopted and promulgated as a standard by the System/360. The word was coined by mutating the word bite so it would not be accidentally misspelled as bit.

The reason for 2^10 bytes in a kB is probably that 1024 is the closest power of 2 to 1000. But I don't really know. Mr Scruff's explanation makes more sense, though.

cheers

W-Y

“Building Better Worlds”
     
Millennium
Clinically Insane
Join Date: Nov 1999
Status: Offline
Reply With Quote
Aug 22, 2005, 08:49 PM
 
Originally Posted by OreoCookie
I don't know who invented it, but somehow it seemed reasonable to use powers of 2 instead of powers of 10. Still, powers of 10 are more reasonable. Unless you are measuring in inches and feet, obviously.
It is more reasonable, because computers "think" in powers of two. That's what binary is. By using a power of two, everything remains consistent between human and computer. When human terms are not the most appropriate, why should they be used?
You are in Soviet Russia. It is dark. Grue is likely to be eaten by YOU!
     
msuper69
Professional Poster
Join Date: Jan 2000
Location: Columbus, OH
Status: Offline
Reply With Quote
Aug 22, 2005, 11:13 PM
 
Originally Posted by Sven G
To add to the confusion, the French use the term octet instead of byte.

BTW, there is a very interesting webpage on the whole KB, KiB, etc. (or Ko, Kio, etc.) subject on the French Octet Wikipedia page...

Yes. The French always try to surrender in sets of eight.
     
Tsilou B.
Senior User
Join Date: May 2002
Location: Austria
Status: Offline
Reply With Quote
Aug 23, 2005, 02:22 AM
 
There are two problems with KB, MB and GB:

1.) Every other K/M/G stands for 1,000/1,000,000/1,000,000,000. This is "only" a problem for people who don't know much about computers.
2.) You can never know whether KB means 1024 or 1000 bytes. Most applications and operating systems use 1024 bytes, but hard drive manufacturers, for example, use the 1000-byte convention, and you cannot sue them, because their convention is the one which is approved by the IEC. This is a problem even for people who know a lot about computers.

If everyone who prefers powers of two used the new "KiB", "MiB" and "GiB" names, the problem would be solved. You would no longer feel deceived if your new 60GB iPod only held 55GiB, because that would be exactly what you had expected.
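A sketch of how software could make the two conventions explicit; `format_size` is a hypothetical helper written for illustration, not any real OS API:

```python
def format_size(n_bytes: int, binary: bool = True) -> str:
    """Format a byte count with IEC (KiB/MiB/GiB) or SI (KB/MB/GB) prefixes."""
    base = 1024 if binary else 1000
    units = ["B"] + [u + ("iB" if binary else "B") for u in "KMGT"]
    value = float(n_bytes)
    for unit in units:
        # Stop once the value fits under one step of the base,
        # or when we run out of prefixes.
        if value < base or unit == units[-1]:
            return f"{value:.1f} {unit}"
        value /= base

# The same iPod, described both ways:
print(format_size(60 * 10**9, binary=False))  # 60.0 GB
print(format_size(60 * 10**9, binary=True))   # 55.9 GiB
```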
     
Anubis IV
Dedicated MacNNer
Join Date: Nov 2003
Location: Huh?
Status: Offline
Reply With Quote
Aug 23, 2005, 03:04 AM
 
Originally Posted by Tsilou B.
you cannot sue them, because their convention is the one which is approved by the IEC.
Actually, you can sue them...

http://www.wired.com/news/business/0,1367,60505,00.html

And there are other problems with the nomenclature associated with memory sizes. Take the size "word", for example (yes, "word" is a size, just like bit or byte). I've heard "word" refer to everything from 4 bits to 8 bytes or more. It varies from system to system, as I understand it, but meh.

Going back to why it is called 1000 instead of 1024: I was always told it was because it was the closest power of 10. Humans today tend to think in base 10; computers currently almost all think in base 2. Introducing a new naming convention would confuse things, though. Maybe putting it out there a little and trying to transition wouldn't be a bad idea, but outright changing all documentation and advertising? No.

Look at what the W3C has been doing with the term "URL". "URL" is no longer the proper term, and it hasn't been for many years now. "URI" is the proper term. But neither they nor any of the other major web standards groups are forcing the term URI down people's throats. They're using it in their new documentation, and possibly slowly updating their old documentation, but otherwise they aren't doing much. Now, granted, switching to URI instead of URL doesn't make their websites suddenly appear to be smaller, as is the case with a switch from GB to GiB, but it is a somewhat similar circumstance.
( Last edited by Anubis IV; Aug 23, 2005 at 03:16 AM. )
"The captured hunter hunts your mind."
Profanity is the tool of the illiterate.
     
Maflynn
Professional Poster
Join Date: Mar 2002
Location: Boston
Status: Offline
Reply With Quote
Aug 23, 2005, 07:24 AM
 
Computers use base 2 because at the lowest level they're on and off switches (or gates), and it was and continues to be infinitely easier to create a computer that handles base 2 than base 10.

We humans are easily confused, so we round the KB to 1000. There's no real mystery why a KB is 1024 or why we use 1000.

Back in the early days of computers, the higher-level machine code (assembler) used base 8 (octal) and base 16 (hexadecimal), because it was and continues to be easier to count in those bases than in binary. Unlike base 10, bases 8 and 16 translate easily into binary. Most assembly languages and higher-level languages today use hexadecimal, and octal has fallen out of favor for the most part.
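The reason octal and hex "translate easily" is digit grouping: one hex digit is exactly four bits and one octal digit exactly three, while a decimal digit doesn't correspond to any whole number of bits. A small Python illustration:

```python
n = 0b1110_1010  # eight bits

print(hex(n))  # 0xea   -> 4-bit groups: 1110|1010 = e|a
print(oct(n))  # 0o352  -> 3-bit groups: 11|101|010 = 3|5|2
print(n)       # 234    -> no digit-for-digit mapping to the bits
```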

I'm surprised at all of the posts here, given that the nerdiness level of MacNN is usually pretty high.

Mike
     
mactropolis  (op)
Senior User
Join Date: Nov 1999
Location: Milkyway Galaxy
Status: Offline
Reply With Quote
Aug 23, 2005, 11:27 PM
 
Originally Posted by Maflynn
I'm surprised at all of the posts here given the nerdiness level of macnn is usually pretty high

Mike
I must say, I agree with you completely. I expected a bit more intelligence from my MacNN peers. I posted this same topic at the AppleInsider Forums (here) and discussed it on AppleInsider IRC, and most resoundingly agreed that "GB/Gigabyte" should refer _only_ to decimal, not binary as well. I guess it's true that the mean IQ of MacNN Forums members is now lower than at AppleInsider and the other Mac forums where I posted this topic.
Death To Extremists!
     
msuper69
Professional Poster
Join Date: Jan 2000
Location: Columbus, OH
Status: Offline
Reply With Quote
Aug 23, 2005, 11:41 PM
 
Originally Posted by Maflynn
...
Back in the early days of computers, the higher-level machine code (assembler) used base 8 (octal) and base 16 (hexadecimal), because it was and continues to be easier to count in those bases than in binary. Unlike base 10, bases 8 and 16 translate easily into binary. Most assembly languages and higher-level languages today use hexadecimal, and octal has fallen out of favor for the most part.
...
Where did you come up with that?????

Assembly language is one step away from the lowest of languages, machine code. It is most definitely not a higher level language.

The instructions are displayed in hexadecimal in the assembled output only so we humans can make sense of them. They are still coded in executable files in binary, as that's the only code a computer can execute. Every single file, be it an executable or data, is stored on disk as binary. No Mac, PC, mainframe or any other machine can use hexadecimal, octal or any other number base.
     
CharlieMac
Fresh-Faced Recruit
Join Date: Aug 2005
Status: Offline
Reply With Quote
Aug 23, 2005, 11:45 PM
 
Hey mactropolis! Yeah, it seems like all the idiots on MacNN descended on your topic. Sorry. But trust me, we do still have a few intelligent members around, so don't give up on us just yet...

The terms 'kilo', 'mega' and 'giga' meant base 10 (1000) long before the advent of the computer. Computers came along and effectively took over the meaning of the words 'kilo', 'mega', and 'giga' to mean base 2 (1024) -- which is just wrong. They should return to their original meaning, and the new 'kibi', 'mebi' and 'gibi' should be used instead. Stop the confusion for the next generation of computer users, however hard it may be for us in the present day.
I finally got my head together, and my body fell apart.
     
Maflynn
Professional Poster
Join Date: Mar 2002
Location: Boston
Status: Offline
Reply With Quote
Aug 24, 2005, 07:01 AM
 
Originally Posted by msuper69
Where did you come up with that?????

Assembly language is one step away from the lowest of languages, machine code. It is most definitely not a higher level language.

The instructions are displayed in hexadecimal in the assembled output only so we humans can make sense out of them. They are still coded in executable files in binary as that's the only code a computer can execute. Every single file be it an executable or data is stored on disk as binary. No Mac, PC, mainframe or any other machine can use hexadecimal, octal or any other number base.
Ummm, I didn't say high-level language but higher, you know, higher than the previous choice.

By your own reckoning assembler is higher than machine code. That was my point, not that it was a high-level language in and of itself.

As for executables being stored in binary, the last time I opened up an executable file I failed to see 1s and 0s. If it were, it would be humongous. Compiled (or assembled) code is stored in machine code, which includes hexadecimal representation. Additionally, a lot of high-level languages such as VB make heavy use of hexadecimal for various things such as flags and switches.
     
Chuckit
Clinically Insane
Join Date: Oct 2001
Location: San Diego, CA, USA
Status: Offline
Reply With Quote
Aug 24, 2005, 10:13 AM
 
Originally Posted by Maflynn
As for executables being stored in binary, the last time I opened up an executable file I failed to see 1s and 0s. If it were, it would be humongous.
Do you seriously think opening a binary in emacs will show you a string of 1s and 0s?

Every piece of digital data is a stream of bits, and bits are inherently binary. You can, of course, represent any number in any base. But when the computer sees the file, it is seeing it as binary — unless you think bits and bytes have become obsolete.
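That point can be shown in a couple of lines of Python: the stored byte values never change, only the base we render them in (the two bytes here are arbitrary examples, not anything from a real executable):

```python
data = b"OK"  # two bytes, stored as bits either way

for byte in data:
    # The same stored value rendered in binary, hex, and decimal.
    print(f"{byte:08b}  {byte:02x}  {byte:3d}")
```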
Chuck
___
"Instead of either 'multi-talented' or 'multitalented' use 'bisexual'."
     
Hugi
Grizzled Veteran
Join Date: Jun 2002
Status: Offline
Reply With Quote
Aug 24, 2005, 03:14 PM
 
Personally, I think the KiB, MiB and GiB convention should be adopted by the computer industry. The prefixes in question have a simple meaning in the metric system, and it's just silly to change that meaning just for computers. I can see how it started - 1024 bytes was probably thought to be "close enough" to call it a kilobyte. The problem is that as the units get larger, the difference increases a lot. It's already an issue with "gigabytes", and soon, when PC disk sizes are measured in terabytes, the difference will become absolutely intolerable.
     
mactropolis  (op)
Senior User
Join Date: Nov 1999
Location: Milkyway Galaxy
Status: Offline
Reply With Quote
Aug 24, 2005, 03:18 PM
 
Originally Posted by Hugi
Personally, I think the KiB, MiB and GiB convention should be adopted by the computer industry. The prefixes in question have a simple meaning in the metric system and it's just silly to change the meaning just for computers. I can see how it started - 1024 bytes was probably thought to be "close enough" to call it a kilobyte. Problem is that as the units get larger, the difference increases a lot. It's already an issue with "Gigabytes", and soon when PC disk size will be measured in Terabytes, the difference will become absolutely intolerable.
Well said.

In a few years people will be buying 500 and 600 GB hard drives and wondering "OMG why do I only get 465 GB/558 GB (respectively)?!?! Where did that 35/42 GB go?!?" Today, the decimal vs. binary difference is somewhat insignificant. What about when our hard drives are double or triple what we buy today?
Death To Extremists!
     
siMac
Mac Elite
Join Date: Aug 2004
Location: ZZ9 Plural Z Alpha
Status: Offline
Reply With Quote
Aug 24, 2005, 03:53 PM
 
Originally Posted by msuper69
Yes. The French always try to surrender in sets of eight.
Keep it in the lounge, eh?

|\|0\/\/ 15 7|-|3 71|\/|3
     
Hugi
Grizzled Veteran
Join Date: Jun 2002
Status: Offline
Reply With Quote
Aug 24, 2005, 04:01 PM
 
Originally Posted by siMac
Keep it in the lounge, eh?
Some people don't know how...
     
Wevah
Senior User
Join Date: Nov 2001
Location: State of Denial
Status: Offline
Reply With Quote
Aug 27, 2005, 11:23 AM
 
Originally Posted by Weyland-Yutani
I never knew what KiB and MiB meant. I saw it in some download app though (Azureus?) and it rang a bell. I just thought it was the same as KB and MB. Ah well. At these sizes it doesn't matter much, but when you reach GB vs GiB and TB vs TiB it starts to matter.

cheers

W-Y
Only if your OS's KB/MB/GB/TB/etc. are decimal (as opposed to binary), which is unlikely. Most of the time they (KB/KiB, MB/MiB, etc.) mean the same thing (unless you're dealing with what's printed on a hard drive, as was stated before). Now it's getting confusing, eh?
[Wevah setPostCount:[Wevah postCount] + 1];
     
Simon
Posting Junkie
Join Date: Nov 2000
Location: in front of my Mac
Status: Offline
Reply With Quote
Aug 28, 2005, 03:17 AM
 
Look people, the issue isn't binary or decimal. With computers we should use binary (bits, Bytes, 2^n, etc.) because it's the natural unit. The actual problem is that we combine those with prefixes like 'kilo', 'Mega' or 'Giga' which are inherently decimal (10^3, 10^6, 10^9, etc.).

The best approach would probably be to use Decabit (2^10), Icosabit (2^20) and Tricontabit (2^30) instead of kilobit, Megabit and Gigabit, but I think it's just too late for such name changes. You'll never get rid of the old abbreviations or terms. MiB can't be pronounced, it looks ridiculous and it doesn't mean anything either.

The bottom line is that it makes no sense to compensate for lack of knowledge with extended vocabulary. If you buy a 1GB disk and think it holds 1 billion Bytes, well, you just don't know enough about computing, so tough luck. Nobody is allowed to drive a car either w/o knowing what the red octagon with the STOP in it means, so I guess that's just the way the world is.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Aug 28, 2005, 10:44 AM
 
I'm generally opposed to using "KiB" and "MiB" etc. The reason one kB = 1024 bytes is that when you make RAM chips, you HAVE to have sizes that are powers of 2. You don't have to have it that way for HDs, but the sectors are still 512 bytes. The size of a file is then the number of sectors it uses divided by 2 with the answer in kB, and that's the way all OSes have calculated the sizes of files since the dawn of time.

The only reason there is confusion is that HD manufacturers decided to calculate sizes as 1 kB = 1000 bytes as a marketing trick. This is not a difference of opinion on what a kB is - it's flat-out lying. If there had been a difference of opinion, they would have made the sector sizes 500 bytes instead of 512 bytes to make it easy for the OSes to calculate file sizes the "right" way. No, force the HD manufacturers to stop lying instead of adapting the world to their lie.

And while we're at it, make monitor manufacturers report screen sizes of CRTs as the usable (=lit) area instead of the theoretical tube size.

Note that 1 kb (kilobit) = 1000 bits, not 1024 bits. But people get that wrong as well.
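The sector arithmetic P describes, sketched in Python (the sector count is an arbitrary example):

```python
SECTOR = 512  # bytes per sector on a classic hard disk

sectors = 200  # a file occupying 200 sectors
size_bytes = sectors * SECTOR

# "Sectors divided by 2" is exactly the binary-kB figure...
assert size_bytes / 1024 == sectors / 2  # 100.0 kB (binary)

# ...while the decimal convention yields a bigger-looking number.
print(size_bytes / 1000)  # 102.4
```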
     
ghporter
Administrator
Join Date: Apr 2001
Location: San Antonio TX USA
Status: Offline
Reply With Quote
Aug 28, 2005, 11:01 AM
 
So far, every computer item I've dealt with other than hard drives has consistently used binary counting (k = 1024...) to express sizes. I first started seeing the "gigabyte means one billion bytes" stuff when the first gigabyte hard drives came out, and of course a "hard drive gigabyte" is noticeably less than a "memory gigabyte." But since all hard drive formats use up a chunk of the drive anyway, I have always figured that the "specified size" of a hard drive was a ballpark figure anyway.

The only place I've ever had issues with expressing sizes in different terms has been with the various utilities that show file sizes, often in differing terms. Finder shows file sizes one way, and Disk Utility shows them another way... It isn't only OS X, either; Windows has always done the same thing (probably just because the Explorer people never figured out DOS, but that's a different issue!)

Glenn -----OTR/L, MOT, Tx
     
Detrius
Professional Poster
Join Date: Apr 2001
Location: Asheville, NC
Status: Offline
Reply With Quote
Aug 29, 2005, 01:34 AM
 
Originally Posted by Maflynn
Ummm, I didn't say high-level language but higher, you know, higher than the previous choice.

By your own reckoning assembler is higher than machine code. That was my point, not that it was a high-level language in and of itself.

As for executables being stored in binary, the last time I opened up an executable file I failed to see 1s and 0s. If it were, it would be humongous. Compiled (or assembled) code is stored in machine code, which includes hexadecimal representation. Additionally, a lot of high-level languages such as VB make heavy use of hexadecimal for various things such as flags and switches.

I acknowledge that what you originally said was misinterpreted, but I do have to say that assembly language isn't a higher-level language than machine code. In mathematical terms, there is a one-to-one and "onto" mapping between the two languages. The conversion is invertible. Basically, machine language is the same thing as assembly language. The difference is that the machine understands the binary version, and the human understands the hexadecimal and English versions. Claiming that assembly code isn't machine code is like claiming that C code isn't C++ code. At some level you can make the claim, but at the same time you can't.



Back to the original topic though... do you know how many people here in the US don't understand the metric prefixes kilo, mega, giga, and tera? To make things make sense for normal people over here, we would have to call things inchbit, footbyte, and milebyte. Anything above that would just be a big number. Then it would make sense to people here, but the rest of the planet would be confused. ( Don't argue against me--this is a joke. I'm not serious. )

Here are the things that would have to change to use a metric measurement of bytes:

1- People will have to learn the date that the change is made and learn that anything (other than hard drives) older than this date has the wrong value.

2- RAM sizes will continue to be in binary, so the numbers won't make a lick of sense.

3- All documentation will have to be updated to use the new convention. No longer will a virtual memory page be 4kB--it will be approximately 4.1kB.

4- All code referencing data sizes will have to be modified. You no longer have 1GB of RAM--you have approximately 1.1GB of RAM.

5- Processor architectures will need to change. All of the modified code involves more computation. Figuring out how many gigabytes a certain size is used to involve a simple bitshift. Now it involves actual mathematical calculations. The processors will need to more efficiently handle the conversion from binary to decimal, as the current conversion method may not be efficient enough, considering how often the conversions may now need to take place.

6- Compilers will need to be updated to take advantage of the new processor hardware, as no one writes in assembly anymore.

7- Hard drive architectures will need to be rebuilt, as it would no longer make a bit of sense to have a 512 byte block--this makes file size calculations far more difficult.

8- File systems will need to be redefined to accommodate the new block sizes and the new file sizes.




You know, it would probably just be easier to teach people that hard drive manufacturers lie, and everything else uses 1024 as the base--not 1000. This solution is far easier to deal with than completely redefining the architecture. Any American that can handle the metric system can handle the way the computer computes this stuff. Maybe you think I'm closed-minded, or I have a lower IQ than the people on other message boards, but I'm going to go back to my neuroscience programming job tomorrow and continue to not care what you think.
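Point 5 above, sketched in Python: with binary units the conversion is a single shift; with decimal units it's a real division (the sizes are illustrative):

```python
ram = 1 << 30  # exactly 1 GiB, a power of two

# Binary units: one right-shift recovers the GiB count.
print(ram >> 30)    # 1

# Decimal units: genuine division, with a fractional result.
print(ram / 10**9)  # 1.073741824
```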
ACSA 10.4/10.3, ACTC 10.3, ACHDS 10.3
     
Geobunny
Mac Elite
Join Date: Oct 2000
Location: Edinburgh, Scotland
Status: Offline
Reply With Quote
Aug 30, 2005, 05:40 PM
 
Just get the HD manufacturers to stop lying to everyone so we can all get on with our lives and this stupid situation can fade away and die. I (like many others here on MacNN) have a degree in Computer Science, but this wretched discussion comes up again and again and it makes my head spin. It's not that difficult to find a solution; after all, there's only one group of people (the HD manufacturers) causing the problem: get them to sort it!

Incidentally, WTF does the 'i' stand for in KiB and how do you pronounce it? Kib? Kibs? Kilo-eye-bytes? Kee-eelo-bytes?......
ClamXav - the free virus scanner for Mac OS X | Geobunny learns to fly
     
Chuckit
Clinically Insane
Join Date: Oct 2001
Location: San Diego, CA, USA
Status: Offline
Reply With Quote
Aug 30, 2005, 05:46 PM
 
Originally Posted by Geobunny
Incidentally, WTF does the 'i' stand for in KiB and how do you pronounce it? Kib? Kibs? Kilo-eye-bytes? Kee-eelo-bytes?......
Kibibytes. And the others are mebibytes and gibibytes. Yes, really.
Chuck
___
"Instead of either 'multi-talented' or 'multitalented' use 'bisexual'."
     
Wevah
Senior User
Join Date: Nov 2001
Location: State of Denial
Status: Offline
Reply With Quote
Aug 31, 2005, 01:46 AM
 
I think we should call them Tinkiewinkiebytes.
[Wevah setPostCount:[Wevah postCount] + 1];
     
Tsilou B.
Senior User
Join Date: May 2002
Location: Austria
Status: Offline
Reply With Quote
Aug 31, 2005, 02:20 AM
 
Originally Posted by Detrius
Here are the things that would have to change to use a metric measurement of bytes:

1- People will have to learn the date that the change is made and learn that anything (other than hard drives) older than this date has the wrong value.

2- RAM sizes will continue to be in binary, so the numbers won't make a lick of sense.

3- All documentation will have to be updated to use the new convention. No longer will a virtual memory page be 4kB--it will be approximately 4.1kB.

4- All code referencing data sizes will have to be modified. You no longer have 1GB of RAM--you have approximately 1.1GB of RAM.

5- Processor architectures will need to change. All of the modified code involves more computation: figuring out how many gigabytes a certain size is used to involve a simple bit shift, but now it involves actual division. Processors will need to handle the conversion from binary to decimal more efficiently, considering how often the conversions would now need to take place.

6- Compilers will need to be updated to take advantage of the new processor hardware, as no one writes in assembly anymore.

7- Hard drive architectures will need to be rebuilt, as it would no longer make a bit of sense to have a 512 byte block--this makes file size calculations far more difficult.

8- File systems will need to be redefined to accommodate the new block sizes and the new file sizes.

You know, it would probably just be easier to teach people that hard drive manufacturers lie, and everything else uses 1024 as the base--not 1000.
That's not the point. No one wants to use a metric system where it's absolutely stupid to do so. But it makes sense to use another nomenclature (TiB, GiB, MiB, KiB) instead of using the metric terms for things that are not metric, especially since the discrepancy keeps growing (KiB/KB = 102.4% -- almost negligible, but soon TiB/TB = 110.0% -- a 10% difference!).
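The growing discrepancy described above is easy to verify: each prefix step multiplies the gap by another factor of 1.024. A quick sketch:

```python
# Ratio of binary prefix to decimal prefix at each step:
# the gap is about 2.4% per prefix and compounds.
for name, power in [("Ki/K", 1), ("Mi/M", 2), ("Gi/G", 3), ("Ti/T", 4)]:
    ratio = 1024**power / 1000**power
    print(f"{name}: {ratio:.1%}")
# Ki/K: 102.4%
# Mi/M: 104.9%
# Gi/G: 107.4%
# Ti/T: 110.0%
```

So at the terabyte scale, a drive labeled in decimal units "loses" almost 10% of its nominal capacity when reported in binary units.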
     
Apfhex
Mac Elite
Join Date: Dec 2000
Location: Northern California
Status: Offline
Reply With Quote
Aug 31, 2005, 03:26 AM
 
Originally Posted by Chuckit
Kibibytes. And the others are mebibytes and gibibytes. Yes, really.
I'm sorry for butting in and not adding anything useful to the discussion, but I couldn't help but really laugh when I pronounced those to myself. Could anyone even take those words seriously?
Mac OS X 10.5.0, Mac Pro 2.66GHz/2 GB RAM/X1900 XT, 23" ACD
esdesign
     
Simon
Posting Junkie
Join Date: Nov 2000
Location: in front of my Mac
Status: Offline
Reply With Quote
Aug 31, 2005, 07:32 AM
 
Originally Posted by Apfhex
Could anyone even take those words seriously?
No. They're ridiculous. And all because of the HD manufacturers.
     
mactropolis  (op)
Senior User
Join Date: Nov 1999
Location: Milkyway Galaxy
Status: Offline
Reply With Quote
Sep 1, 2005, 01:36 AM
 
It may seem hilarious at first since we're so accustomed to the present words, but after seeing the new terms over and over again they appear no more out of place than kilo-, mega-, or gigabyte. Besides, any new user to computers in general will not have been exposed to the old terms, so kibi, mebi, and gibi would seem perfectly natural.
Death To Extremists!
     
Chuckit
Clinically Insane
Join Date: Oct 2001
Location: San Diego, CA, USA
Status: Offline
Reply With Quote
Sep 1, 2005, 09:36 AM
 
Originally Posted by Tsilou B.
That's not the point. No one wants to use a metric system when it's absolutely stupid to do so. But it makes sense to use another nomenclature (TiB, GiB, MiB, KiB) instead of using the metric terms for things that are not metric, especially if things get worse all the time (KiB/KB = 102,4% - almost negligible, but soon TiB/TB = 110,0% - 10% difference!).
So what happens when somebody goes and redefines these terms to be metric like the retards did with the originals?
Chuck
___
"Instead of either 'multi-talented' or 'multitalented' use 'bisexual'."
     
Tsilou B.
Senior User
Join Date: May 2002
Location: Austria
Status: Offline
Reply With Quote
Sep 1, 2005, 09:59 AM
 
Originally Posted by Chuckit
So what happens when somebody goes and redefines these terms to be metric like the retards did with the originals?
If someone (let's say a hard drive manufacturer) redefines these terms (GiB, MiB, KiB) to be metric even though they could simply use the existing metric terms (GB, MB, KB, etc.), then you can sue the hell out of them, because that wouldn't be mere simplification but malicious deceit. Gi, Mi, and Ki are never, not in a single case, metric -- quite the opposite of G, M, and K.
     
 
 
All contents of these forums © 1995-2017 MacNN. All rights reserved.