The MBP had a spec bump
Professional Poster
Join Date: Dec 2000
Location: UK
Status:
Offline
(Last edited by ajprice; Oct 24, 2011 at 09:07 AM. Reason: added AI link and store image)
It'll be much easier if you just comply.
Clinically Insane
Join Date: Oct 2000
Location: Los Angeles
Status:
Offline
Yeah, looks like a pretty nice bump. Strange timing of its introduction, though.
Originally Posted by Engadget
If you meander on down to the Apple store this morning, you'll spot some nice little spec bumps to the MacBook Pro range -- without any increase to the prices. For a start, you can now splash out on a faster AMD Radeon HD 6770M discrete GPU with your 15-inch or 17-inch lappie. Even better, there are some CPU improvements to be had: the 13-incher gets the option of a 2.8GHz Core i7 or a 2.4GHz Core i5 dual-core processor, instead of the previous entry-level 2.3GHz i5 (and it also gets its HDD notched up to a minimum 500GB, or max 750GB). The 15-incher now goes up to a quad-core 2.4GHz i7 -- the same speedy processor that comes in the updated 17-inch variant. Oh, the cost/benefit dilemmas.
Why does Apple have to be so limited when it comes to graphics card options? Why offer only AMD, or only Nvidia, depending on the generation? If PC manufacturers can offer a ton of flexibility in that regard, why can't Apple provide just a little more?
(Last edited by Big Mac; Oct 24, 2011 at 09:18 AM.)
"The natural progress of things is for liberty to yield and government to gain ground." TJ
Fresh-Faced Recruit
Join Date: Jun 2009
Status:
Offline
I have read that there are issues with the newer standard 6G SSDs: they are nominally supported in the previous 17-inch models, but most have not been working. Hope this is fixed; I'll be checking here and Mac Performance Guide for the tests.
Posting Junkie
Join Date: Mar 2004
Location: UK
Status:
Offline
I think Apple just eventually lost patience with nVidia after the 9600M fiasco.
I have plenty of more important things to do, if only I could bring myself to do them....
Mac Enthusiast
Join Date: Apr 2003
Location: Monterrey, Mexico
Status:
Offline
I think you meant the 8600 GT? That's the bad one.
Posting Junkie
Join Date: Mar 2004
Location: UK
Status:
Offline
Originally Posted by polendo
I think you meant the 8600 GT? That's the bad one.
Yes, quite right.
I have plenty of more important things to do, if only I could bring myself to do them....
Clinically Insane
Join Date: Oct 2000
Location: Los Angeles
Status:
Offline
Originally Posted by Waragainstsleep
I think Apple just eventually lost patience with nVidia after the 9600M fiasco.
No, that's not it. Some of Apple's most successful MacBooks were based on the Core2/Nvidia 320M, a combo used for years. That was after the 8600M recall.
When SJ came back to Apple and simplified the product lines, it seems like he created an amendment to Apple's internal business constitution (haha) to never expand the lines or offer too much variety. I can definitely appreciate the simple lines and few configurations within models. Many of the PC makers have huge, very confusing product lines (look at the Asus laptop site, for example), and I would never want to see Apple return to anything approaching that. But there could be a bit of an expansion, right?
"The natural progress of things is for liberty to yield and government to gain ground." TJ
Posting Junkie
Join Date: Mar 2004
Location: UK
Status:
Offline
There was something else that happened after the 8600M. I can't remember what it was, but I'm sure it seriously dented Apple's faith in nVidia. The 8600 was the worst part; I swapped out hundreds of those myself. Then something else happened, or didn't happen, or was going to happen, and Apple shifted to AMD.
I have plenty of more important things to do, if only I could bring myself to do them....
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status:
Offline
Originally Posted by ajprice
It's because very little is new. The 13"ers have got 100 MHz more. Some of the models have a slightly bumped GPU, but mostly they dropped the low-end model of both the 15" and 17" and dropped the price of the next one up to fill that slot.
Originally Posted by Waragainstsleep
There was something else that happened after the 8600M. I can't remember what it was but I'm sure it seriously dented Apple's faith in nVidia. The 8600 was the worst part, I swapped out hundreds of those myself, then something else happened or didn't happen or was going to happen and Apple shifted to AMD.
nVidia has had supply problems; I suspect that was what broke it. Meanwhile, AMD has been hitting it out of the park from the 4850 on, while nVidia has only been competitive at the very top of the line. Take a look at a round-up like this one, for instance: except for one tie with an older card, nVidia loses everything until you get into the seriously expensive cards.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status:
Offline
Originally Posted by P
mostly they dropped the low-end model of both the 15" and 17" and dropped the price of the next one up to fill that slot.
Yes, but they didn't just drop the models down in price. They shifted the graphics down and upped the CPU frequency. The 8MB-cache quad i7 (now at 2.5 GHz) is still only available BTO.
Banned
Join Date: Mar 2005
Status:
Offline
I bought a new MacBook Pro 15" last week. Just took it back tonight and exchanged it for the latest entry-level 15". This has double the graphics memory, faster Bluetooth, and a faster processor.
Screaming along... except Time Machine is bloody slow for backing up. Jesus. Still a slug after all these years.
Addicted to MacNN
Join Date: Jul 2004
Location: Toronto
Status:
Offline
The new base 15" does look really nice to me. I was gonna wait for the 13" Ivy Bridge, but this looks like a good fit for me. Thinking....
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status:
Offline
Ivy Bridge improvements seem to be mostly in the GPU, while the CPU is unchanged. Ivy is also a bit late, starting production now for a launch in the March-April timeframe.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
Moderator
Join Date: May 2001
Location: Hilbert space
Status:
Offline
I don't suffer from insanity, I enjoy every minute of it.
Banned
Join Date: Mar 2005
Status:
Offline
I can attest that the just-released entry 15" MacBook Pro screams. I'm comparing it to the entry 15" directly preceding it. The slightly faster processor and double the graphics memory must be what's causing such a performance boost. Everything in Lion is now buttery smooth. Entry 15": highly recommended.
Clinically Insane
Join Date: Oct 2000
Location: Los Angeles
Status:
Offline
So it takes the absolute latest hardware to make Lion run "buttery smooth"? That's really lame.
"The natural progress of things is for liberty to yield and government to gain ground." TJ
Posting Junkie
Join Date: Dec 2000
Status:
Offline
Lion runs great on my 2008 MBP with an SSD. The latter, I think, makes much more of a difference than the CPU as far as smoothness goes.
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status:
Offline
Originally Posted by Big Mac
So it takes the absolute latest hardware to make Lion run "buttery smooth"? That's really lame.
No. My Thunderbolt 13" runs it wonderfully.
Posting Junkie
Join Date: Jan 2006
Location: Colorado
Status:
Offline
Originally Posted by Big Mac
So it takes the absolute latest hardware to make Lion run "buttery smooth"? That's really lame.
My late-2008 MBP runs it perfectly smoothly.
Banned
Join Date: Mar 2005
Status:
Offline
Ya, you guys say "smooth", but what do you own to compare it to? I had a Core 2 Duo running Lion... then, a week ago, got the i7 2.0 GHz entry 15" running Lion. Now I've got the latest 2.2 GHz entry 15" running Lion. I'm in a pretty good position to compare and contrast.
There is no doubt this latest model is significantly faster than the i7 2.0 GHz 15" I was just using before it. Even on the Geekbench tests I ran.
What I'm noticing is how responsive the GUI is to my multi-touch inputs on the trackpad. It's significantly more responsive than the model preceding it, and much more than my 15" MBP Core 2 Duo.
All this without having an SSD.
OS X is a resource pig... it's like playing a video game. Need lots of RAM and graphics acceleration. I've been buying entry level 15" MBPs for years. Never have I been blown away by speed. Always a bit of lag. This time it's different. First MBP I've bought that really is fast without a bunch of expensive upgrades.
Clinically Insane
Join Date: Oct 2000
Location: Los Angeles
Status:
Offline
Something may have changed with Lion, but I've never considered Mac OS X a resource pig. It's always been a little more graphically rich with each generation of hardware, but I don't think Snow Leopard was a graphical pig of a release at all. I know Lion adds a few needless eye-candy elements like open-window zoom rects, as Siracusa mentioned. Is that the type of thing you're referring to, freudling?
"The natural progress of things is for liberty to yield and government to gain ground." TJ
Mac Elite
Join Date: Mar 2004
Location: Truckee, CA
Status:
Offline
Clearly Lion takes more advantage of RAM (or pigs on RAM, depending on your POV). Heavy apps like Aperture clearly show the increased 10.7 RAM demands.
That IMO is evolution more than resource pig. I expect and want RAM to cheapen over time, apps and OSs to use more RAM, and hardware to take advantage of more RAM as time goes on.
Similarly GPUs keep getting stronger and IMO OSs and apps should be built to use the increasingly stronger GPU power available. If that is being a resource pig it is a good thing.
Folks with other than the newest and strongest hardware should NOT assume that every new OS or app upgrade is appropriate for their setups. E.g. if one has a lesser box, stay at 10.6.8, duh.
The idea that Apple or MS should build new OS versions to suit the low end boxes is crazy. New OS and app versions should move forward - while support for legacy hardware/software is maintained.
Originally Posted by freudling
I've been buying entry level 15" MBPs for years. Never have I been blown away by speed. Always a bit of lag. This time it's different. First MBP I've bought that really is fast without a bunch of expensive upgrades.
Agreed, 100%.
-Allen
(Last edited by SierraDragon; Oct 26, 2011 at 11:38 PM.)
Clinically Insane
Join Date: Oct 2000
Location: Los Angeles
Status:
Offline
Originally Posted by SierraDragon
The idea that Apple or MS should build new OS versions to suit the low end boxes is crazy. New OS and app versions should move forward - while support for legacy hardware/software is maintained.
To its credit though, Microsoft has a built-in benchmark (the Windows Experience Index) that the OS uses to disable graphical features that lesser hardware can't handle. Windows also has a lot more user configurability in that area (allowing one to selectively turn off various UI features through a standard graphical user tool), whereas with OS X that certainly isn't the case (due in large part, obviously, to not wanting to overwhelm neophytes with a multitude of configuration details).
(Last edited by Big Mac; Oct 27, 2011 at 11:25 AM.)
"The natural progress of things is for liberty to yield and government to gain ground." TJ
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status:
Offline
Apple has a history of simply not enabling things on lesser hardware to reduce load (see the Dashboard "ripple" when adding a widget).
Clinically Insane
Join Date: Oct 2000
Location: Los Angeles
Status:
Offline
True, but that's the only real example of a feature being turned off because of lesser hardware that I can think of on OS X. Windows goes much further. Quartz Extreme wouldn't activate without a certain level of GPU. But other than that I can't really think of other things selectively disabled according to hardware present, and the user has very few options as far as manually turning things off is concerned.
(Last edited by Big Mac; Oct 27, 2011 at 11:33 AM.)
"The natural progress of things is for liberty to yield and government to gain ground." TJ
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status:
Offline
Well, what other relevant things would there be, beyond graphics acceleration?
Obviously, there are things like AirDrop not being available if the Wi-Fi chip doesn't support it.
Posting Junkie
Join Date: Oct 2005
Location: Houston, TX
Status:
Offline
Originally Posted by SierraDragon
Similarly GPUs keep getting stronger and IMO OSs and apps should be built to use the increasingly stronger GPU power available.
We've heard that song for years. Meanwhile, the apps that the majority of the market is using most of the day...
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status:
Offline
Originally Posted by Big Mac
True, but that's the only real example of a feature being turned off because of lesser hardware that I can think of on OS X. Windows goes much further. Quartz Extreme wouldn't activate without a certain level of GPU. But other than that I can't really think of other things selectively disabled according to hardware present, and the user has very few options as far as manually turning things off is concerned.
There are more - the translucent menubar is disabled on some GPUs, and Core Image won't work on others - but the UI does not let you control most features, no.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
Professional Poster
Join Date: Dec 2000
Location: UK
Status:
Offline
My white MacBook doesn't do the multitouch stuff because the trackpad doesn't support it, and a few Keynote transitions are disabled. I do get the translucent menubar, though (OS X Lion, Core 2 2.16 and GMA 950).
It'll be much easier if you just comply.
Mac Elite
Join Date: Mar 2004
Location: Truckee, CA
Status:
Offline
Originally Posted by mduell
We've heard that song for years. Meanwhile, the apps that the majority of the market is using most of the day...
...can run on old hardware just fine, no need to buy new hardware or upgrade the OS.
This is tech. Stick with old hardware and a 10-year-old OS for a decade if your apps are stagnant anyway. Most of IT (including me) did exactly that with XP.
But do not suggest that OS and hardware development should stagnate just to suit an admittedly large segment of stagnant users.
-Allen
Posting Junkie
Join Date: Oct 2005
Location: Houston, TX
Status:
Offline
Originally Posted by SierraDragon
...can run on old hardware just fine, no need to buy new hardware or upgrade the OS.
This is tech. Stick with old hardware and a 10-year-old OS for a decade if your apps are stagnant anyway. Most of IT (including me) did exactly that with XP.
But do not suggest that OS and hardware development should stagnate just to suit an admittedly large segment of stagnant users.
-Allen
I'm not suggesting stagnation. I'm saying GPGPU is mostly fueled by hype with little to show outside very small niches. Your post was that "OSs and apps should be built to use the increasingly stronger GPU power available" and I'm observing that's not happening in any sort of significant way. The "power" of a GPU is very different than a CPU and not broadly useful like a faster CPU is.
Banned
Join Date: Mar 2005
Status:
Offline
Originally Posted by mduell
I'm not suggesting stagnation. I'm saying GPGPU is mostly fueled by hype with little to show outside very small niches. Your post was that "OSs and apps should be built to use the increasingly stronger GPU power available" and I'm observing that's not happening in any sort of significant way. The "power" of a GPU is very different than a CPU and not broadly useful like a faster CPU is.
BS... Do you understand how computers work? Do the words OpenGL, Core Image, Quartz Extreme, and so forth mean anything to you? I just got way better performance on my MBP because I now have double the graphics memory. Buttery smooth. It wasn't like that before, because you need very good GPUs to handle all the interactivity in the OS.
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status:
Offline
freudling, GPGPU means doing general-purpose calculations on the GPU, i.e. using the GPU as a very specialized CPU. It has nothing to do with the smoothness of the graphics interface.
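To make the distinction concrete, this is the classic data-parallel kernel that GPGPU frameworks are built around, sketched in plain Python. It runs on the CPU here purely to show the shape of the workload; a real GPGPU version would be an OpenCL or CUDA kernel, and none of this has anything to do with UI compositing.

```python
def saxpy(a, x, y):
    """y <- a*x + y, elementwise: the "hello world" of GPGPU.
    Every element is computed independently of the others, which is
    exactly what lets a GPU spread the work across thousands of lanes.
    Plain Python list comprehension here, just to illustrate the shape."""
    return [a * xi + yi for xi, yi in zip(x, y)]

saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 10.0, 10.0])  # [12.0, 14.0, 16.0]
```

The point is that GPGPU pays off only when the work decomposes into many independent per-element operations like this; dragging a window around is not that kind of work.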
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
Mac Elite
Join Date: Mar 2004
Location: Truckee, CA
Status:
Offline
Originally Posted by mduell
I'm not suggesting stagnation. I'm saying GPGPU is mostly fueled by hype with little to show outside very small niches. Your post was that "OSs and apps should be built to use the increasingly stronger GPU power available" and I'm observing that's not happening in any sort of significant way. The "power" of a GPU is very different than a CPU and not broadly useful like a faster CPU is.
We disagree. What my post called for, that "OSs and apps should be built to use the increasingly stronger GPU power available", has IMO been happening in a significant way. On the OS side, graphics acceleration is pretty obvious, like freudling points out, and IMO that does qualify as broadly useful. Even PS takes advantage.
On the app side some apps like Aperture clearly thrive on GPU power. Given that Aperture is an Apple app and Apple has since v1 designed it to not only use but demand strong GPU support I have to think that is an engineering direction that Apple is encouraging.
The "power" of a GPU is very different than a CPU and not broadly useful like a faster CPU is.
True, maybe, as regards broadly useful. But all 2011 boxes, with or without discrete GPUs, include integrated graphics, which seems to me to blur the distinction between GPU and CPU. But I confess ignorance as to just how the CPU, integrated graphics, and discrete graphics all interact.
-Allen
Posting Junkie
Join Date: Oct 2005
Location: Houston, TX
Status:
Offline
Originally Posted by freudling
BS... Do you understand how computers work? Do the words OpenGL, Core Image, QuartzExtreme, and so forth mean anything to you? I just got way better performance on my MBP because I now have double the graphics acceleration. Buttery smooth. Wasn't like that before because you need very good GPUs to handle all the interactivity in the OS.
I'm familiar. Even not-very-good GPUs, like the ones integrated in every modern Intel CPU, handle the OS compositing fine. Powerful GPUs are pointless for most of the market.
Originally Posted by SierraDragon
On the app side some apps like Aperture clearly thrive on GPU power. Given that Aperture is an Apple app and Apple has since v1 designed it to not only use but demand strong GPU support I have to think that is an engineering direction that Apple is encouraging.
Aperture is a great example of a niche use. I'm sure it's really important and great to you if you're a pro in that field, but there aren't that many of those. Also note there are significant (in terms of time to complete) tasks in Aperture that don't leverage the GPU, like exporting TIFF/JPEG/etc.
Originally Posted by SierraDragon
True maybe as regards broadly useful. But all 2011 boxes with or without discrete GPUs use integrated graphics, which seems to me blurs the distinction between GPU and CPU. But I confess ignorance as to just how the CPU/integratedG/discreteG all interact.
An integrated GPU just means there's a GPU on the chipset or CPU module or die. There's no shared execution hardware; it's just bolting extra transistors on the side and maybe sharing some level of cache (as Sandy Bridge does with L3). There's still a very clear CPU/GPU distinction even if the hardware is colocated. The GPU transistors can't help out with tasks programmed for the CPU's instruction set.
Banned
Join Date: Mar 2005
Status:
Offline
Originally Posted by mduell
I'm familiar. Even not-very-good GPUs, like the ones integrated in every modern Intel CPU, handle the OS compositing fine. Powerful GPUs are pointless for most of the market.
You don't know what you're talking about. OS X is coded and optimized around hardware acceleration. It makes a huge difference.
GPUs do matter for the market. For everyone. They make modern computers way faster. From playing hi def video to OS actions, behaviours, and animations. There's no question about this and no 'argument' to pursue. It's a fact of life.
Posting Junkie
Join Date: Mar 2004
Location: UK
Status:
Offline
Originally Posted by freudling
You don't know what you're talking about. OS X is coded and optimized around hardware acceleration. It makes a huge difference.
It's eye candy, so it really isn't all that important, but Apple doesn't ship Macs with GPUs that can't handle the workload of the basic OS well enough. Those are made obsolete via installation restrictions for the OS.
Originally Posted by freudling
GPUs do matter for the market. For everyone.
No, they don't matter for everyone. You don't need them to send emails, use Facebook or watch DVDs. You don't need them to watch cats playing pianos on YouTube. Or to watch porn. This is all most people do.
Originally Posted by freudling
They make modern computers way faster. From playing hi def video to OS actions, behaviours, and animations. There's no question about this and no 'argument' to pursue. It's a fact of life.
Well if you say so that must be true.
I have plenty of more important things to do, if only I could bring myself to do them....
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status:
Offline
Originally Posted by freudling
GPUs do matter for the market. For everyone. They make modern computers way faster. From playing hi def video to OS actions, behaviours, and animations. There's no question about this and no 'argument' to pursue. It's a fact of life.
I dunno, Logic sounds about the same to me.
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status:
Offline
Originally Posted by freudling
You don't know what you're talking about. OS X is coded and optimized around hardware acceleration. It makes a huge difference.
I don't know what your experience comes from, but...
All modern GPUs work fine with necessary tasks like compositing, but they are not generally useful for even all graphics tasks. Case in point: 2D drawing in a window using Quartz (the most common case there is) is by default not accelerated in Mac OS X today. The reason is that it is not appreciably faster to do it that way, likely because the increased performance from the parallel nature of the GPU is lost in the bandwidth crunch when moving the data back and forth between CPU and GPU.
What is left is taking certain specific operations and accelerating those. Right now, in OS X, that is things like animations and effects, which - quite frankly - are eye candy rather than useful content. Doesn't mean that they can't BE useful, but they aren't. OS X doesn't even have the useful animation of showing where a window came from that was there in Classic Mac OS - they just pop up.
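The bandwidth point above can be put in numbers with a toy model. All the rates below are made-up, illustrative figures (not measurements of any real machine); the shape of the trade-off is what matters.

```python
def cpu_draw_time(n_bytes, cpu_rate=2e9):
    """Seconds for the CPU to render a buffer, at an assumed 2 GB/s."""
    return n_bytes / cpu_rate

def gpu_draw_time(n_bytes, gpu_rate=40e9, bus_rate=4e9):
    """Seconds for the GPU path: rendering assumed 20x faster than the
    CPU, but the buffer crosses the bus twice (send the data over,
    read the result back), at an assumed 4 GB/s each way."""
    transfer = 2 * n_bytes / bus_rate
    return transfer + n_bytes / gpu_rate

window = 1280 * 800 * 4  # one window-sized ARGB buffer, ~4 MB
# With these numbers the CPU path wins (~2.05 ms vs ~2.15 ms): the
# round trip over the bus swallows the GPU's raw drawing advantage
# for a small 2D workload like this.
```

With these assumed rates the GPU only starts paying off when the compute saved per byte exceeds the cost of moving that byte twice, which is exactly why only selected, heavyweight operations get offloaded.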
Originally Posted by freudling
GPUs do matter for the market. For everyone. They make modern computers way faster. From playing hi def video to OS actions, behaviours, and animations. There's no question about this and no 'argument' to pursue. It's a fact of life.
There is a saying: When the facts are on your side, you bang on the facts. When the law is on your side, you bang on the law. When neither is, you bang on the table. I'm not sure this table can take any more. If you have any facts, pointing to specific, necessary operations that NEED a high-powered GPU, please state them. Note that in Macs today, things like "playing high-def video" are accelerated on low-end GPUs, but not on the higher-end models like the Radeon 4850 in my iMac, because those are paired with CPUs strong enough to do the work.
(Last edited by P; Nov 2, 2011 at 08:03 AM.)
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
Banned
Join Date: Mar 2005
Status:
Offline
Originally Posted by P
I don't know what your experience comes from, but...
There is a saying: When the facts are on your side, you bang on the facts. When the law is on your side, you bang on the law. When neither is, you bang on the table. I'm not sure this table can take any more. If you have any facts, pointing to specific, necessary operations that NEED a high-powered GPU, please state them. Note that in Macs today, things like "playing high-def video" are accelerated on low-end GPUs, but not on the higher-end models like the Radeon 4850 in my iMac, because those are paired with CPUs strong enough to do the work.
There is a saying: people on MacNN like to argue for the sake of arguing. This fits the profile of trolling.
I like the article AnandTech ran a few years back about GPUs and OS X: explanations of how the GPU is utilized in the OS and some programs, plus benchmark testing.
The contents of each window and the windows themselves are drawn by the GPU and stored in video memory. Previous versions of OS X either drew windows in system memory and then composited all of them in video memory, or did everything in system memory and just outputted the final scene to the video card. Ever since OS X 10.4, the entire drawing and display process happens on the GPU and in video memory. Ars Technica’s John Siracusa has an excellent explanation of the whole process.
Each window gets treated as a 2D OpenGL surface and all of the character and image rendering, blending and display happens on the GPU. The GPU is much faster at all of this than the CPU so it made sense. The result is much lower CPU and system memory usage. What it also means is that the amount of video memory you have matters.
AnandTech - EVGA's GeForce GTX 285 Mac Edition: The Best for OS X?
Mac OS X 10.4 Tiger
OpenCL from Apple:
Core Technologies - Mac OS X Technology Overview - Apple Developer
Some programs utilizing graphics acceleration for better performance:
FCP X
Photoshop
Quicktime
Safari
OpenCL in OSX Lion to give FCP X huge performance hike | EOSHD.com
Hardware acceleration in Quicktime:
I've been playing around with Quicktime X's supposed hardware acceleration of h264 video. Of course the best test is 1080p files to see if it actually works properly. My tests have shown that there is a remarkable 15% CPU usage with some really high bitrate 1080p h264 videos in an MP4 container which is amazing! Great for HD watching while keeping the mbp a lot more free to do other tasks. Unfortunately it seems only the lowly AAC-LC 2 channel audio codec allows for this acceleration rather than AC3 Dolby Digital or any other 5.1 (encoding with 5.1 spikes the CPU to the old clearly hardware acceleration-less 60% CPU) audio format which is a darn shame..
Hardware acceleration in Safari:
Web pages that use the HTML5 Canvas element can tap into the graphics processing unit on your Mac to display graphics and animations. With improved hardware acceleration for Canvas, games and interactive web applications render faster and smoother in the browser.
Graphics acceleration matters for all users as it's built into the core of OS X and has real world effects on how fast and smooth things run. It's absurd to think GPUs don't matter.
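The AnandTech excerpt's claim that "the amount of video memory you have matters" can be roughed out with a back-of-envelope sketch, since every window is backed by an ARGB surface in VRAM. The window sizes and counts below are hypothetical, chosen only to show the scale.

```python
def vram_for_windows(sizes_px, bytes_per_px=4):
    """Total bytes needed to back each window with a 32-bit ARGB
    surface in video memory, per the compositing model described above."""
    return sum(w * h * bytes_per_px for w, h in sizes_px)

# A hypothetical desktop: three browser windows, two document windows,
# and four small palettes.
desktop = [(1440, 900)] * 3 + [(1024, 768)] * 2 + [(400, 600)] * 4
mb = vram_for_windows(desktop) / 2**20
# Roughly 24 MB just for window backing stores, before textures, video
# frames, or any double buffering: one reason the gap between a 256 MB
# and a 512 MB card can show up in everyday use.
```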
Moderator
Join Date: May 2001
Location: Hilbert space
Status:
Offline
@freudling
You vastly overestimate the benefit of hardware acceleration and getting a beefy graphics card. I agree with you on the trend, but I think you have a wrong idea about the slope and our current position.
You quote examples without understanding them properly: hardware acceleration ≠ hardware acceleration. If you take the hardware acceleration found in QuickTime, for instance, then this means hardware decoding of videos. There is dedicated hardware for this in most GPUs these days (and in ARM CPUs). Getting a faster GPU here will help you zilch; all that matters is that the work is offloaded from the CPU to the GPU. My iPod touch's A4 is capable of decoding a high-def 720p h.264 video stream.
Ditto for Photoshop; Adobe is not very good at adopting hardware acceleration. If you have a look at which functions are accelerated, it's only the most primitive of operations (zooming, moving, rotating the canvas), and even then the performance benefit is rather minuscule (don't be misled by the first graph; the difference is at most 10% in the zoom test and 20% in the rotation test). (I'm aware of the section titled CUDA, but CUDA is nVidia-specific and most Macs ship with AMD graphics these days.)
In fact, I still haven't seen comparative tests where people show how much of a performance gain you get on Aperture if you use a more powerful GPU on an otherwise identical machine. (You could do that by running a set of benchmarks on a recent MacBook Pro, once with the discrete GPU enabled, once with the Intel GPU enabled.)
Programming your app to take advantage of a powerful GPU is not easy. That's the reason it is rarely used, and when it is, it's mostly for bottlenecks that really shave off a lot of time (e.g. video encoding).
I don't suffer from insanity, I enjoy every minute of it.
Banned
Join Date: Mar 2005
Status:
Offline
Originally Posted by OreoCookie
@freudling
You vastly overestimate the benefit of hardware acceleration and getting a beefy graphics card. I agree with you on the trend, but I think you have a wrong idea about the slope and our current position.
You quote examples without understanding them properly: hardware acceleration ≠ hardware acceleration. If you take the hardware acceleration found in QuickTime, for instance, then this means hardware decoding of videos. There is dedicated hardware for this in most GPUs these days (and in the ARM cpus). Getting a faster GPU here will help you zilch, all that matters is that it is offloaded from the CPU to the GPU. My iPod touch's A4 is capable of decoding a high-def 720p h.264 video stream.
Ditto for Photoshop, Adobe is not very good at adopting hardware acceleration. If you have a look at which functions, then it's only the most primitive of operations (zooming, moving, rotation of the canvas), and even then, the performance benefit is rather miniscule (don't be misled by the first graph, the difference is at most 10 % in the zoom test and 20 % in the rotation test). (I'm aware of the section titled CUDA, but CUDA is nVidia-specific and most Macs ship with AMD graphics these days.)
In fact, I still haven't seen comparative tests where people show how much of a performance gain you get on Aperture if you use a more powerful GPU on an otherwise identical machine. (You could do that by running a set of benchmarks on a recent MacBook Pro, once with the discrete GPU enabled, once with the Intel GPU enabled.)
Programming your app to take advantage of a powerful GPU is not easy. That's why it's rarely done, and when it is, it's mostly for bottlenecks where it really shaves off a lot of time (e.g. video encoding).
Anybody who believes "GPUs don't matter" is ignorant, trolling, or simply not credible. That's all I'm responding to; it was said earlier in the thread.
It's been demonstrated that GPUs "matter". Any further discussion on this is just a waste of time. If someone really believes they don't matter, why not just throw their GPU out the window? And why not just chalk up the fact that Apple includes GPUs in its machines to marketing? Why not just ignore how OpenCL is integrated into OS X, provides real-world performance gains, has APIs for developers, and is used by several apps to real benefit?
But watch, y'all will continue to try to pigeonhole this topic into some geeky, obscure realm that has little basis in reality.
|
|
|
|
|
|
|
|
|
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status:
Offline
|
|
Originally Posted by freudling
But watch, ya'll will continue and try to pigeon hole this topic into some geeky, obscure realm that has little basis in reality.
That is exactly what you are doing.
The others are posting real data and making real points from an engineering and programming background.
Note that nobody is saying that GPUs don't matter; it's just that they don't matter much.
|
|
|
|
|
|
|
|
|
Moderator
Join Date: May 2001
Location: Hilbert space
Status:
Offline
|
|
Originally Posted by freudling
Anybody who believes "GPUs don't matter" is incredulous, ignorant, trolling, all of these things, and other things. That's all I'm responding to, which was said earlier in the thread.
Sigh.
Nobody has claimed GPUs don't matter; they do matter. But it's not just CPU performance that's growing, it's also GPU performance, meaning a current-gen iMac or MacBook Pro (15" and up, obviously) has more GPU power than a Mac Pro from a few years back. And at a certain point, they're simply fast enough to do everything we ask of them. Even the GPUs built into CPUs these days are more than sufficient to cover the needs of most consumers and pros. Nobody needs a special graphics card anymore to drive two displays at high resolution. Nobody needs a special graphics card to have a fluid OS interface (well, at least from 10.2 on with a Quartz Extreme-capable graphics card, that is).
That's the reason the discrete graphics space is shrinking: Intel and AMD (and also ARM, but that's a different story) put GPU cores in most of their new CPUs, and those cores are powerful enough to satisfy the graphics subsystem of a modern OS (be it Windows or OS X). Most apps won't push the need for faster GPUs (the only »mass-market« exception being games), just as Photoshop no longer pushes the need for ever-faster CPUs.
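The arithmetic behind that point is just Amdahl's law: if only a small fraction of an app's work is GPU-accelerated, the overall speedup is capped no matter how fast the GPU is. A sketch with illustrative numbers, not measurements from any real app:

```python
def amdahl_speedup(accelerated_fraction, acceleration_factor):
    """Overall speedup when only part of the work is accelerated."""
    serial = 1.0 - accelerated_fraction
    return 1.0 / (serial + accelerated_fraction / acceleration_factor)

# Suppose (hypothetically) 10% of a Photoshop-like workload runs
# on the GPU. Even an infinitely fast GPU caps out near 1.11x:
print(amdahl_speedup(0.10, 2))      # 2x faster GPU path -> ~1.05x overall
print(amdahl_speedup(0.10, 1000))   # near-infinite GPU  -> ~1.11x overall
```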
|
I don't suffer from insanity, I enjoy every minute of it.
|
|
|
|
|
|
|
|
Mac Elite
Join Date: Mar 2004
Location: Truckee, CA
Status:
Offline
|
|
Originally Posted by OreoCookie
...I still haven't seen comparative tests where people show how much of a performance gain you get on Aperture if you use a more powerful GPU on an otherwise identical machine.
We saw dozens of reports on the Aperture forums in the PPC days. Even the strongest fully-equipped G5 towers delivered essentially unusable Aperture performance with the stock video card; when a better card was added, performance improved dramatically, from unusable to functional. A similar thing happened with my 2.66 GHz 2006 Mac Pro: Aperture performance was much improved when I upgraded from the stock card. Even today only the top MBPs and iMacs have GPUs strong enough for best Aperture performance.
BareFeats.com has tests showing large benefits from stronger graphics in a range of situations: games, obviously, but also Aperture and other apps. From BareFeats.com:
"DOES GPU MATTER?
Using tools like OpenGL Driver Monitor and atMonitor, we determined that the Mac Pro's GPU was a factor in the import processing (CPU waiting on GPU) and that 410MB of video memory was in use. In the export test, 573MB of VRAM was in use. That implies that the MacBook Pro (13") and MacBook Air with integrated GPU are both going to "rob" from main memory leaving less for Aperture and the OS. That's another reason to choose a "muscular" Mac with a dedicated GPU (and at least 1GB of VRAM) for running Pro Apps like Aperture."
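The "robbing main memory" point is simple arithmetic: an integrated GPU has no VRAM of its own, so whatever the driver carves out comes straight off the RAM available to Aperture and the OS. A back-of-the-envelope sketch using BareFeats' measured export figure (the 4 GB total is an assumed configuration, typical of a 2011 13" MBP or Air, not something BareFeats tested):

```python
total_ram_mb = 4096    # assumed 4 GB machine
vram_needed_mb = 573   # BareFeats' measured VRAM use during export

# Dedicated GPU: the 573 MB lives on the card; main RAM is untouched.
ram_left_dedicated = total_ram_mb

# Integrated GPU: the same 573 MB is carved out of main memory.
ram_left_integrated = total_ram_mb - vram_needed_mb

print(ram_left_integrated)  # 3523 MB left for Aperture and the OS
```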
"For serious laptop gamers and 3D animators, you'll want a MacBook Pro with a dedicated GPU.
Though more than adequate for mere mortal tasks (Safari, Mail, etc.), the 2011 MacBook Air remains at the bottom of the Mac "food chain" when running apps that stress the CPU, GPU, and memory. This will be further illustrated with soon-to-be posted tests using After Effects, Aperture, Final Cut Pro, etc."
...even the GPUs built into CPUs these days are more than sufficient to cover the needs of most consumers and pros.
"most" remains the whole point here I think. No one suggests that most users' needs are not mundane and easily filled by basic Macs. However "most" is not "all" and therefore there is a need to fill that space between "most" users and all the rest.
-Allen
(
Last edited by SierraDragon; Nov 2, 2011 at 04:47 PM.
)
|
|
|
|
|
|
|
|
|
Banned
Join Date: Mar 2005
Status:
Offline
|
|
Good points Sierra.
Also, I think people need to be careful about the assumptions they make about "most people". I don't have the data in front of me, so I don't know what "most people" are doing with their computers. But I suspect a lot of Mac users use Photoshop, surf the Web, blog, and watch videos. I'm certain they'd want the best GPU they could get for the best performance. Money is a factor, yes, but when you see the gains in performance across these tasks with a better GPU, you'll obviously want it. Whether you can afford it is another thing.
I myself am blown away by how much better my i7 2.2 GHz 512 MB VRAM MBP performs in comparison to my i7 2.0 GHz 256 MB VRAM rig. It's significantly more fluid with its animations and OS behaviours, and Photoshop is more buttery smooth than I could have ever imagined. Safari is better too. I ran Geekbench and have isolated performance improvements to the GPU.
|
|
|
|
|
|
|
|
|
Posting Junkie
Join Date: Oct 2005
Location: Houston, TX
Status:
Offline
|
|
Hardware acceleration (particularly by the GPU) is really really good in the very very few places where it can be effectively used. The OS compositing is great - and takes very little GPU power (by modern standards) to achieve.
Aperture and Final Cut X are probably the best (and nearly only) examples of apps that can make great use of more powerful GPUs for a significant portion (but not all) of their usage.
Despite the GPGPU hype machine I don't see this changing much for the majority of users in the foreseeable future.
|
|
|
|
|
|
|
|
|
Banned
Join Date: Mar 2005
Status:
Offline
|
|
Originally Posted by mduell
Hardware acceleration (particularly by the GPU) is really really good in the very very few places where it can be effectively used. The OS compositing is great - and takes very little GPU power (by modern standards) to achieve.
Aperture and Final Cut X are probably the best (and nearly only) examples of apps that can make great use of more powerful GPUs for a significant portion (but not all) of their usage.
Despite the GPGPU hype machine I don't see this changing much for the majority of users in the foreseeable future.
You have not read this thread, I guess. There are several apps that take advantage of GPU acceleration; Safari and QuickTime, as pointed out, do. Tests demonstrate the benefits for all users. Any further discussion on this is just a waste of time.
|
|
|
|
|
|
|
|
|
Posting Junkie
Join Date: Mar 2004
Location: UK
Status:
Offline
|
|
Originally Posted by freudling
You have not read this thread, I guess. There are several apps that take advantage of GPU acceleration; Safari and QuickTime, as pointed out, do. Tests demonstrate the benefits for all users. Any further discussion on this is just a waste of time.
You are confusing 'can' with 'do'.
Most users don't know or care.
|
I have plenty of more important things to do, if only I could bring myself to do them....
|
|
|
|
|
|
|
|
Banned
Join Date: Mar 2005
Status:
Offline
|
|
Originally Posted by Waragainstsleep
You are confusing 'can' with 'do'.
Most users don't know or care.
Show me the data that supports the claim that "most users don't know or care".
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|