very basic memory question
gooser
Grizzled Veteran
Join Date: Jun 2006
Oct 30, 2012, 06:57 PM
 
i can remember the days when going from 256mb of memory to 384mb was a big deal. nowadays people are talking about going from 8gb to 16gb. the huge increases of today seemed to occur around the introduction of intel processors. my question is: why the need for so much memory? is it the operating system, the applications or the hardware that has raised the threshold so high?
imac g3 600
imac g4 800 superdrive
ibook 466
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Oct 31, 2012, 08:50 AM
 
Going to 64 bits did increase memory requirements a bit, integrated graphics do use some RAM, and Apple has switched to higher-res graphics everywhere, but I think the main thing that happened was that Apple began including reasonable amounts of RAM by default. My old iMac G5 came with 256 MB of RAM, which was barely enough to boot it. At some point Apple realized this and put a reasonable amount of RAM in their boxes, which decreased the pressure on developers to keep things working in low RAM.

There is also the fact that RAM is dirt cheap right now. I really didn't need more than the 4 GB my iMac came with, but I put in 8 GB more just to be on the safe side. Then I was buying RAM for something else and figured I might as well buy bigger DIMMs and move the original set to the new (lower-performance) machine. That extra RAM is useful when I occasionally fire up a VM while doing something else that uses a big chunk, but really, that's not common.
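To put a rough number on the 64-bit part, here's a toy C snippet. The struct is made up for illustration, but the pointer sizes are the standard ones for 32-bit vs 64-bit builds, and you can see how pointer-heavy data roughly doubles:

[CODE]
#include <stdio.h>

/* A made-up, pointer-heavy node of the kind apps use everywhere
   (lists, trees, object graphs). On a typical 32-bit build each
   pointer is 4 bytes; on a 64-bit build it is 8. */
struct node {
    struct node *next;
    struct node *prev;
    char        *name;
    int          value;
};

int main(void) {
    /* Prints 16 on a typical 32-bit build and 32 on a 64-bit
       build (28 bytes padded to 32) - same logical content,
       twice the RAM. */
    printf("sizeof(struct node) = %zu bytes\n", sizeof(struct node));
    return 0;
}
[/CODE]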
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
BLAZE_MkIV
Professional Poster
Join Date: Feb 2000
Location: Nashua NH, USA
Oct 31, 2012, 01:40 PM
 
Everyone progresses in lockstep. Memory gets cheaper (Moore's Law), and so software can add more of those features people like, which usually results in higher memory usage.
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Oct 31, 2012, 03:25 PM
 
If you build it, they will come.

Every now and then one has to catch up with the other, but as a rule the programmers will quickly use up the extra RAM when it's cheap and available. Then sometimes they work out how to do something cool that needs loads of memory, and we have to wait for the sizes to go up and the prices to come down again.

That said, I am wondering whether a number of technologies that have been improving at a constant pace since their inception are starting to plateau, at least in some ways.

CPUs, while doing more per cycle and adding cores, have not really progressed in clock speed in the last decade. The P4 hit 4 GHz ages ago; the Core series has yet to do so;
Apple in particular is going to have to be a bit more efficient when it comes to RAM in order to stop those 2GB non-upgradeable MacBook Airs from becoming obsolete too quickly. Whatever follows Mountain Lion is going to have to be able to run on 2GB, maybe even the one after that too, since those machines were sold as late as July this year;
Retina displays are already approaching the limits of the human eye, so further increases in pixel density would be largely pointless after that (see the sketch after this list);
I can only assume that once displays hit their natural limit, video cards will get there a little later. There is a limit on discernible frame rates as well as pixels; movies already run at much lower rates than many games will. GPUs do have other uses, of course, so this one is less true.
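To put rough numbers on the retina point, here's a toy C calculation. It assumes the usual 20/20 figure of about 1 arcminute of visual acuity; the viewing distances are just my guesses for a phone, a laptop and a desktop.

[CODE]
#include <stdio.h>
#include <math.h>

/* Toy "retina" arithmetic: at what pixel density does one pixel
   subtend about 1 arcminute (roughly 20/20 acuity)? Past that,
   extra pixels are hard to see. Distances are guesses. */
int main(void) {
    double arcmin = (1.0 / 60.0) * 3.141592653589793 / 180.0; /* radians */
    double dist_in[] = { 10.0, 20.0, 28.0 }; /* phone, laptop, desktop (inches) */
    for (int i = 0; i < 3; i++) {
        double pitch = dist_in[i] * tan(arcmin); /* pixel pitch in inches */
        printf("%2.0f in away -> ~%3.0f ppi looks 'retina'\n",
               dist_in[i], 1.0 / pitch);
    }
    return 0; /* prints roughly 344, 172 and 123 ppi */
}
[/CODE]

Which is why a 220 ppi retina MacBook Pro at arm's length is already about there.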

It'll be interesting. Innovation is likely going to have to start going sideways; we should start getting some really original stuff in the not too distant future....
I have plenty of more important things to do, if only I could bring myself to do them....
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Nov 1, 2012, 01:10 AM
 
Originally Posted by Waragainstsleep
CPUs, while doing more per cycle and adding cores, have not really progressed in clock speed in the last decade. The P4 hit 4 GHz ages ago; the Core series has yet to do so.
ObNitpick: The P4 hit 3.93 GHz, which always struck me as a conscious decision by someone at Intel to really rub in how the Prescott CPU had failed. The top Ivy Bridge i7-3770K turbos up to 3.9 GHz on one or two cores, which is within the margin of error. They have caught up, but barely - and AMD has turboed up to 4.2 GHz for some time now.

Originally Posted by Waragainstsleep
Apple in particular is going to have to be a bit more efficient when it comes to RAM in order to stop those 2GB non-upgradeable MacBook Airs from becoming obsolete too quickly. Whatever follows Mountain Lion is going to have to be able to run on 2GB, maybe even the one after that too, since those machines were sold as late as July this year.
There are some tricks they could do (memory deduping, for instance), but I'm hoping for a new filesystem, and those tend to use more RAM than oldies like HFS (which was designed to be memory-efficient back in 1986 or so). MBAs swap to an SSD, however, so the hit isn't nearly as bad as when you swap to an HDD.
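Since I brought up memory deduping: the idea, in toy form, is to hash pages and share identical ones. Linux does something like this for real with KSM; the sketch below is just my made-up illustration, with no copy-on-write and a hash table that ignores collisions.

[CODE]
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define PAGE_SIZE 4096
#define TABLE_SIZE 1024

/* djb2 hash over one page of memory. */
static unsigned long hash_page(const unsigned char *page) {
    unsigned long h = 5381;
    for (size_t i = 0; i < PAGE_SIZE; i++)
        h = h * 33 + page[i];
    return h;
}

static unsigned char *seen[TABLE_SIZE];

/* Return a shared canonical page if an identical one was seen
   before, else remember this one. A real kernel would map the
   shared page copy-on-write; a colliding different page here
   just evicts the old entry (it's a toy). */
static unsigned char *dedup_page(unsigned char *page) {
    unsigned long slot = hash_page(page) % TABLE_SIZE;
    if (seen[slot] && memcmp(seen[slot], page, PAGE_SIZE) == 0)
        return seen[slot];      /* duplicate content: share one copy */
    seen[slot] = page;          /* first sighting: keep it */
    return page;
}

int main(void) {
    unsigned char *a = calloc(1, PAGE_SIZE); /* two identical all-zero pages */
    unsigned char *b = calloc(1, PAGE_SIZE);
    printf("pages shared: %s\n",
           dedup_page(a) == dedup_page(b) ? "yes" : "no"); /* "yes" */
    free(b); free(a);
    return 0;
}
[/CODE]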

Originally Posted by Waragainstsleep
Retina displays are already approaching the limits of the human eye, so further increases in pixel density would be largely pointless after that.
I can only assume that once displays hit their natural limit, video cards will get there a little later. There is a limit on discernible frame rates as well as pixels; movies already run at much lower rates than many games will. GPUs do have other uses, of course, so this one is less true.
It'll be interesting. Innovation is likely going to have to start going sideways; we should start getting some really original stuff in the not too distant future....
The reason people like to game at "buttery smooth" 60 fps is that that number is an average. What you really need is 25-30 fps constant, reliable, every single frame. That is still not guaranteed in modern games, but the rate of increase there has certainly dropped now that all the consoles are so long in the tooth.
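With made-up numbers to illustrate: 59 quick frames plus one 80 ms stutter still averages around 90 fps, but the stutter is the part you actually see. A toy C calculation:

[CODE]
#include <stdio.h>

/* Toy numbers: 59 frames at 10 ms each plus one 80 ms hitch.
   The average looks great; the worst frame does not, and the
   worst frame is what your eyes notice. */
int main(void) {
    double total_ms = 0.0, worst_ms = 0.0;
    for (int i = 0; i < 60; i++) {
        double frame_ms = (i == 30) ? 80.0 : 10.0;
        total_ms += frame_ms;
        if (frame_ms > worst_ms) worst_ms = frame_ms;
    }
    printf("average: %.0f fps\n", 60 * 1000.0 / total_ms); /* ~90 fps */
    printf("worst frame: %.1f fps\n", 1000.0 / worst_ms);  /* 12.5 fps */
    return 0;
}
[/CODE]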
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Nov 1, 2012, 01:37 AM
 
Originally Posted by P
ObNitpick: The P4 hit 3.93 GHz, which always struck me as a conscious decision by someone at Intel to really rub in how the Prescott CPU had failed. The top Ivy Bridge i7-3770K turbos up to 3.9 GHz on one or two cores, which is within the margin of error. They have caught up, but barely - and AMD has turboed up to 4.2 GHz for some time now.
That is quite a nitpick!

I thought there was one clocked at 4.06 GHz, but maybe I was thinking too far back, to the 3.06 GHz. Either way, that was a long time ago and they have only just caught up, so my point stands. Obviously a modern chip will perform most tasks faster than a higher-clocked old one anyway, but even the race to add cores seems to have slowed just a touch, waiting for the software to catch up for the most part, I suspect.


Originally Posted by P
There are some tricks they could do (memory deduping, for instance), but I'm hoping for a new filesystem, and those tend to use more RAM than oldies like HFS (which was designed to be memory-efficient back in 1986 or so). MBAs swap to an SSD, however, so the hit isn't nearly as bad as when you swap to an HDD.
Well perhaps there is life in the RAM race for some years to come yet. Sounds like this new leg won't kick off for a little while though.

Originally Posted by P
The reason people like to game at "buttery smooth" 60 fps is that that number is an average. What you really need is 25-30 fps constant, reliable, every single frame. That is still not guaranteed in modern games, but the rate of increase there has certainly dropped now that all the consoles are so long in the tooth.
Fair enough. I wasn't suggesting they are quite there yet in terms of performance, but I have to imagine that a card running 60 fps average isn't going to dip below 25 too often. I guess it depends on the complexity of what it's drawing, but like I say, the pixel limit is going to cap that too, isn't it?
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Nov 1, 2012, 05:09 AM
 
Originally Posted by Waragainstsleep
That is quite a nitpick!
I thought there was one clocked at 4.06 GHz, but maybe I was thinking too far back, to the 3.06 GHz. Either way, that was a long time ago and they have only just caught up, so my point stands. Obviously a modern chip will perform most tasks faster than a higher-clocked old one anyway, but even the race to add cores seems to have slowed just a touch, waiting for the software to catch up for the most part, I suspect.
Now that I've checked up on it, I may have misremembered things as well. The Wikipedia page lists the Pentium 4 EE processor I remembered as 3.93 GHz as being only 3.73 GHz, which makes the regular 3.8 GHz the highest-clocked model. Weird; I remember something about it being just under 4 GHz.

Originally Posted by Waragainstsleep
Well perhaps there is life in the RAM race for some years to come yet. Sounds like this new leg won't kick off for a little while though.
Fair enough. I wasn't suggesting they are quite there yet in terms of performance, but I have to imagine that a card running 60 fps average isn't going to dip below 25 too often. I guess it depends on the complexity of what it's drawing, but like I say, the pixel limit is going to cap that too, isn't it?
It's more a comment on how GPU tests are so pointless now, where a card delivering 78 fps is deemed a winner over one delivering 75 fps when the display doesn't refresh more than 60 times per second anyway. What they should be talking about is how often the frame rate drops below 30 fps (or 60 fps if you have to); if the answer is "never", then the cards are equivalent at that resolution. Most games do pass the 30 fps level at 2560*1440 if you get the best of the best in GPUs today. The retina resolution for a 15", 2880*1800, is not tested so often, and we likely need to get a little higher still in resolution for desktops to be retina, but we are getting there.
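Just to show the pixel arithmetic behind that (nothing here but the two resolutions):

[CODE]
#include <stdio.h>

/* 2880x1800 pushes about 40% more pixels per frame than
   2560x1440, so a card that barely clears 30 fps at 1440p
   has real work left to do at retina resolution. */
int main(void) {
    long p1440 = 2560L * 1440;  /* 3,686,400 pixels */
    long p2880 = 2880L * 1800;  /* 5,184,000 pixels */
    printf("2560x1440: %ld pixels\n", p1440);
    printf("2880x1800: %ld pixels (+%.0f%%)\n",
           p2880, 100.0 * (p2880 - p1440) / p1440);
    return 0; /* prints +41% */
}
[/CODE]

So "fastest at 1440p" and "fast enough at retina" are really two different questions.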