New vs old C2D iMacs benched: Radeon 2600 Pro slower than GeForce 7300 GT
Eug
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 10, 2007, 03:07 PM
 
iMac 24" 2.16 GHz C2D with 667 MHz bus and 7300 GT - 89 fps
iMac 24" 2.4 GHz C2D with 800 MHz bus and 2600 Pro - 86 fps

Macworld: First Look: From the Lab: iMac benchmarks

So, despite the new iMacs having a faster bus (albeit not faster memory) and an 11% faster CPU, the old lower end 7300 GT iMac edges out the new top end iMac in 3D.

It would have been interesting had they tested the old 2.33 GHz 24" with the 7600 GT, because the 7600 GT is MUCH faster than the 7300 GT.
     
Grrr
Grizzled Veteran
Join Date: Jun 2001
Location: London'ish
Aug 10, 2007, 03:13 PM
 
Interesting test, thanks for that. But yes, a comparison with an older Mac with the 2.33 GHz CPU and 7600 GT would have been better/more interesting, I'm sure.

Just how many fps do folks need, though? I have to say, graphics performance on my new metal iMac doesn't disappoint in the slightest. I don't play many games, mind, but early indications say it's more than adequate.
The worst thing about having a failing memory is..... no, it's gone.
     
conmon
Junior Member
Join Date: Nov 2006
Location: Edinburgh
Aug 10, 2007, 04:06 PM
 
You gotta remember though that the 2600 Pro is just out, so the drivers probably aren't as good as the ones for the 7300.
White Macbook 2ghz core duo 2gb 60gig hdd superdrive
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 10, 2007, 04:15 PM
 
Originally Posted by conmon View Post
You gotta remember though that the 2600 Pro is just out, so the drivers probably aren't as good as the ones for the 7300.
Yeah but the high end one from last year was the 7600 GT.





Originally Posted by Grrr View Post
Interesting test. Thanks for that But yes a comparison with an older mac with the 2.33ghz cpu and 7600gt would have been better/interesting im sure.

Just how much fps do folks need though? I have to say, graphics performance on my new metal iMac doesn't disappoint in the slightest. I don't play many games mind, but early indications say its more than adequate.
I don't play games on the iMac, but I'd wonder if this translates to performance on stuff like Aperture or Motion. Both those apps need a fair bit of CPU performance, but they are also heavily dependent on GPU performance.
     
mduell
Posting Junkie
Join Date: Oct 2005
Location: Houston, TX
Aug 10, 2007, 08:50 PM
 
The whole HD 2000 series is a disaster. I'm surprised they didn't go with nVidia's mobile chips after using them in the MBP.
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 10, 2007, 11:32 PM
 
Well if the chips are truly and FULLY compliant with HDCP, then it's not a complete disaster, but it's quite unfortunate that their performance is so mediocre.
     
CIA
Mac Elite
Join Date: Dec 1999
Location: Utah
Aug 11, 2007, 01:50 PM
 
And note that the 2xxx series has dedicated H.264 hardware. (Although I don't think the 2900 does, IIRC; not that that applies here.) I don't know if Apple is utilizing it yet, or if they plan to, but it's a nice addition. Hardware encoding/decoding would be nice for the new iMacs.
Work: 2008 8x3.2 MacPro, 8800GT, 16GB ram, zillions of HDs. (video editing)
Home: 2008 24" 2.8 iMac, 2TB Int, 4GB ram.
Road: 2009 13" 2.26 Macbook Pro, 8GB ram & 640GB WD blue internal
Retired to BOINC only: My trusty never-gonna-die 12" iBook G4 1.25
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Aug 11, 2007, 02:42 PM
 
Is 3fps really a significant difference?
     
CheesePuff
Professional Poster
Join Date: Jan 2002
Location: Rochester, NY
Aug 11, 2007, 03:41 PM
 
Originally Posted by Waragainstsleep View Post
Is 3fps really a significant difference?
Yes, considering this is a NEWER model put up against the previous generation's LOWER END graphics card, and the new system has a faster processor and system bus.
     
Meritocracy
Junior Member
Join Date: Feb 2004
Location: Earth
Aug 11, 2007, 03:56 PM
 
Originally Posted by CIA View Post
And note that the 2xxx series has dedicated H.264 hardware. (Although I don't think the 2900 does, IIRC; not that that applies here.) I don't know if Apple is utilizing it yet, or if they plan to, but it's a nice addition. Hardware encoding/decoding would be nice for the new iMacs.
Regrettably, Apple hasn't written suitable drivers to take advantage of GPU encoding/decoding, period. I'd like to see that change, but suspect it won't with the 2xxx series here. The main selling point of the X1600 in the earlier iMacs was the inclusion of AVIVO (MPEG-2, MPEG-4, DivX, WMV9, VC-1, and H.264 decoding and transcoding, etc.), and Apple's drivers didn't support it. Considering the chips here are just AVIVO w/HD, I regret to say we should expect the same.
What exactly is rotten in Denmark?
     
Grrr
Grizzled Veteran
Join Date: Jun 2001
Location: London'ish
Aug 11, 2007, 04:37 PM
 
Originally Posted by Meritocracy View Post
Regrettably, Apple hasn't written suitable drivers to take advantage of GPU encoding/decoding period.
Could it be possible it's being saved for 10.5?

Anyway, I still think some folks are kicking up too much of a stink over the graphics spec. It really isn't bad.
     
Meritocracy
Junior Member
Join Date: Feb 2004
Location: Earth
Aug 11, 2007, 05:00 PM
 
Originally Posted by Grrr View Post
Could it be possible it's being saved for 10.5?
One can only hope. They've been ignoring this aspect of the GPU for years (ever since they dropped hardware DVD decoding). It's something I'd really like to see some emphasis on.

Originally Posted by Grrr View Post
Anyway, I still think some folks are kicking up too much of a stink over the graphics spec. It really isnt bad.
Perhaps, but the fact remains that with the preliminary benchmarks we've seen thus far, the 7600 GT with the previous top end 24" likely blows away the sole ATi option we're being offered now. Not exactly something I'd consider worthy of an upgraded model.
( Last edited by Meritocracy; Aug 11, 2007 at 05:07 PM. )
     
AceWilfong
Fresh-Faced Recruit
Join Date: Apr 2007
Location: San Francisco
Aug 11, 2007, 08:01 PM
 
Glad I opted for the 7600 now for my four-month-old, white "Classic", but my "Fancodger" side is dismayed at all the criticism leveled at what I consider a stunning new design.
(I like a matte screen, but I bet I'd fall in love with a glossy within minutes...and I'd probably use gravity to hold the remote to the table...works fine with a Comcast remote.)
Never played games, but Doom 3 is arriving Monday.
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 12, 2007, 12:54 AM
 
Originally Posted by AceWilfong View Post
Glad I opted for the 7600 now for my four-month-old, white "Classic", but my "Fancodger" side is dismayed at all the criticism leveled at what I consider a stunning new design.
I like the new design, but I prefer the white. I'd be happy enough with the aluminum one though.

(I like a matte screen, but I bet I'd fall in love with a glossy within minutes...and I'd probably use gravity to hold the remote to the table...works fine with a Comcast remote.)
I went from a matte iBook to the glossy MacBook. The MacBook is better in every way, and the screen overall is better, but the fact that it's glossy still really annoys me sometimes. I really notice the glare in some lighting. If I were to buy a MacBook Pro, I'd get the matte for sure.

I can sort of understand the choice of glossy in a MacBook, since it seems to me that lower end buyers often like to purchase machines with glossy screens. However, I don't quite understand it with the 24" iMac, since I don't consider a 24" iMac a low end machine by any stretch of the imagination. Of course, that's one reason I'm annoyed with the graphics performance of the new Radeons. The 7600 GT upgrade in the previous 24" was very nice, and that's what I got. It's curious that Apple has chosen to get rid of the upgrade option and now forces everyone to take that mediocre GPU or something even slower (in the low end 20").
     
awaspaas
Mac Elite
Join Date: Apr 2001
Location: Minneapolis, MN
Aug 12, 2007, 10:01 PM
 
WOE IS ME! WOE! WOE!

3 whole FPS, yeah that's enough to ruin my day. Get a life, really.
     
mduell
Posting Junkie
Join Date: Oct 2005
Location: Houston, TX
Aug 12, 2007, 11:15 PM
 
Originally Posted by awaspaas View Post
WOE IS ME! WOE! WOE!

3 whole FPS, yeah that's enough to ruin my day. Get a life, really.
In the computer hardware industry, where everything is getting faster every 6 months, they took a whole year and ended up with a downgrade. Knowing they had better options (nV 8600) that had already been implemented (MBP) makes it even worse.
     
CheesePuff
Professional Poster
Join Date: Jan 2002
Location: Rochester, NY
Aug 13, 2007, 12:11 AM
 
Originally Posted by awaspaas View Post
WOE IS ME! WOE! WOE!

3 whole FPS, yeah that's enough to ruin my day. Get a life, really.
Thank you for your completely idiotic comment.
     
d0GGii
Fresh-Faced Recruit
Join Date: Dec 2004
Aug 13, 2007, 12:38 AM
 
Originally Posted by mduell View Post
In the computer hardware industry, where everything is getting faster every 6 months, they took a whole year and ended up with a downgrade. Knowing they had better options (nV 8600) that had already been implemented (MBP) makes it even worse.
Totally agree. I was on the edge of buying one. But the downgraded graphics card just doesn't fit in there.
     
wolfen
Mac Elite
Join Date: Jul 2002
Location: On this side of there
Aug 13, 2007, 12:47 AM
 
It's a strange age we live in.

I don't play a lot of 3d games, but the iMac is perfectly fine for everything I do. I think that describes the majority of potential purchasers.
Do you want forgiveness or respect?
     
awaspaas
Mac Elite
Join Date: Apr 2001
Location: Minneapolis, MN
Aug 13, 2007, 06:54 AM
 
Originally Posted by CheesePuff View Post
Thank you for your completely idiotic comment.
Well, I think this whole discussion is idiotic, but that's just my obviously-misguided opinion.
     
Big Mac
Clinically Insane
Join Date: Oct 2000
Location: Los Angeles
Aug 13, 2007, 07:46 AM
 
There's no way to justify poorer GPU performance in what's called an upgrade to the line. There just isn't, and consumers shouldn't fall for it - just as they shouldn't fall for integrated graphics in Apple's crap consumer lines. As for hardware decoding support, I totally agree there's almost no chance Apple will change its mind and end up using it after rejecting hardware decoding for years now. In modern times Apple only codes for GPU features that will apply to all the GPUs in its computers going forward - for example, Quartz Extreme. Apple will not provide narrow support for decoding features found in only a few computer models, features that will be lost when Apple decides to switch back to Nvidia for the next revision (which appears very likely given the backlash this revision is bound to provoke).

"The natural progress of things is for liberty to yield and government to gain ground." TJ
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 13, 2007, 07:51 AM
 
Why do people keep saying 3 fps? The bench compares today's top of the line iMac GPU to the last iMac's slower GPU, the 7300 GT. The 7300 GT is quite slow.

I repeat... Here are benches comparing a C2D 2.33 iMac with 7600 GT vs a Mac Pro 3.0 with 7300 GT:

(benchmark charts)
     
Tegeril
Fresh-Faced Recruit
Join Date: Feb 2007
Aug 13, 2007, 10:35 AM
 
This may be a case of poor drivers at the moment. Also realize that both UT2K4 and Doom 3 were created under nVidia's developer programs rather than ATI's (note the splash at the beginning of both). Those titles historically perform better on nVidia hardware; similarly, if we were able to compare Half-Life 2 on both, we might find that the HD series comes out ahead.

Please reference the Tom's Hardware VGA charts for accurate benchmarks; the HD 2600 (Pro or XT; only the XT is on the chart, but even the 2400 XT does well) is vastly superior to the 7300 GT:

VGA Charts 2007 | Tom's Hardware
     
Simon
Posting Junkie
Join Date: Nov 2000
Location: in front of my Mac
Aug 13, 2007, 11:07 AM
 
Originally Posted by Tegeril View Post
Please reference the Tom's Hardware VGA charts for accurate benchmarks, the HD 2600 (Pro or XT, only the XT is on the chart, but even the 2400XT does well) is vastly superior to the 7300GT...
The 2600 XT is significantly better than the iMac's HD 2600 PRO.

The XT's core clock is 800 MHz rather than 550 MHz and the mem clock is 1100 MHz compared to the PRO's 700 MHz. Correspondingly the XT's bandwidth is 35.2 GB/s compared to the PRO's 22.4 GB/s and the XT's fillrate is 6400 MT/s rather than 4800 MT/s on the PRO. The XT would also support 512 MB VRAM compared to the PRO's 256 MB limit, but that's irrelevant for the iMac which comes with 256 MB anyway.

And the HD 2400 XT is even worse than the HD 2600 PRO.
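Those bandwidth figures fall straight out of the memory clock; a minimal sketch, assuming the 128-bit bus and double-data-rate memory both cards use (the function name is mine, for illustration):

```python
def gddr_bandwidth_gb_s(mem_clock_mhz, bus_width_bits=128, transfers_per_clock=2):
    """Peak memory bandwidth: effective transfers per second times bus width in bytes."""
    bytes_per_transfer = bus_width_bits / 8
    return mem_clock_mhz * 1e6 * transfers_per_clock * bytes_per_transfer / 1e9

print(gddr_bandwidth_gb_s(1100))  # HD 2600 XT at 1100 MHz -> 35.2 GB/s
print(gddr_bandwidth_gb_s(700))   # HD 2600 PRO at 700 MHz -> 22.4 GB/s
```

So the XT's roughly 57% bandwidth advantage is exactly the ratio of the two memory clocks.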
( Last edited by Simon; Aug 13, 2007 at 11:15 AM. )
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 13, 2007, 11:46 AM
 
Originally Posted by mduell View Post
In the computer hardware industry, where everything is getting faster every 6 months, they took a whole year and ended up with a downgrade. Knowing they had better options (nV 8600) that had already been implemented (MBP) makes it even worse.
Indeed.

It's also rather interesting that Apple posted benchmarks for the MacBook Pro 2.4 GHz with nV 8600m GT (compared to the CD 2.16 MBP with Radeon X1600):

Quake 4: 57% faster
Doom 3: 50% faster
Motion: 37% faster

Apple has neglected to post any graphics benchmarks for the new iMac.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Aug 13, 2007, 12:25 PM
 
While this is certainly interesting, I'd be even more interested in a test that stresses the GPU. 86 or 89 or 120 fps is pointless if the display only refreshes 60 times per second - I'd like to see stats at something closer to native resolution.
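The refresh-cap point is a one-liner; a trivial sketch, assuming a 60 Hz panel with vsync on (the function is mine, for illustration):

```python
def visible_fps(rendered_fps, refresh_hz=60):
    # With vsync enabled, the panel cannot show more frames than it refreshes.
    return min(rendered_fps, refresh_hz)

print(visible_fps(89), visible_fps(86))  # both iMacs cap at 60: indistinguishable at this setting
```

Which is why the interesting numbers are the ones below 60 at native resolution, not the ones above it.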

Apple is using ATi graphics in this one because the iMac and the MBP are their two biggest lines, and keeping the same graphics supplier in both means that the Mac volume for the other supplier becomes essentially zero. It could also be a saving designed to take away some of the margin they lose by transitioning to a bigger display size (20" at the 17" price, etc). They could have - and should have - offered the 2600 XT as a BTO though.
     
CheesePuff
Professional Poster
Join Date: Jan 2002
Location: Rochester, NY
Aug 13, 2007, 12:42 PM
 
The ATI Radeon 2600 XT looks like one heck of a card, and it's only marginally more expensive than the 2600 PRO... too bad they didn't go that route, or at least provide it as an option.
     
Andhee
Mac Enthusiast
Join Date: Feb 2007
Location: UK
Aug 13, 2007, 01:02 PM
 
Originally Posted by AceWilfong View Post
...and I'd probably use gravity to hold the remote to the table...works fine with a Comcast remote.

Aha, but it does have a place to magnetize it: bottom right-hand side of the screen. Found that out in't Apple Store this afternoon.
     
imacman
Baninated
Join Date: Aug 2007
Aug 13, 2007, 04:34 PM
 
It's pretty sad. The 7600GT was always MUCH MUCH faster than the 7300, and if the new cards are even slower than that... ouch... I feel sorry for anybody who buys a new iMac.
     
bernt
Forum Regular
Join Date: Apr 2001
Location: Europe
Aug 14, 2007, 05:41 AM
 
Seems like people are overlooking this part (from http://www.macworld.com/2007/08/revi...ac/index.php):

"The Unified Shader Architecture touted by Apple and ATI/AMD will make it easier for game developers and others to show off fancy new special effects in their software. The new chips can also perform 128-bit High Dynamic Range (HDR) rendering, which will give games more intense, realistic lighting and shadows."

Fps is one thing, but what about other features the new cards offer?
PowerBook 15" 1.25G/1G/80G | iMac G5 17" 1.6G/1.5G/300G | MacBook Pro 15" CD2.0G/1.5G/120G | MacBook C2D 2.2G/4G/160G
     
rnicoll
Forum Regular
Join Date: Aug 2003
Aug 14, 2007, 07:52 AM
 
Originally Posted by awaspaas View Post
WOE IS ME! WOE! WOE!

3 whole FPS, yeah that's enough to ruin my day. Get a life, really.
It's not the 3 fps it's slower by; it's the 80-90 fps it isn't faster by. For those of us looking at replacing a gaming PC with a 24" iMac... well, Apple just lost a sale. I'd have really liked a Mac desktop system (I have a laptop already), but I don't have the physical space for two desktops (and don't want to have to keep two systems updated).

For a company that's been talking about how EA will now release all their games on OS X, and had id Software at WWDC demonstrating their new games, they seem to have forgotten to actually provide anything sensible for gamers.
     
Tegeril
Fresh-Faced Recruit
Join Date: Feb 2007
Aug 14, 2007, 10:00 AM
 
Originally Posted by Simon View Post
The 2600 XT is significantly better than the iMac's HD 2600 PRO.

The XT's core clock is 800 MHz rather than 550 MHz and the mem clock is 1100 MHz compared to the PRO's 700 MHz. Correspondingly the XT's bandwidth is 35.2 GB/s compared to the PRO's 22.4 GB/s and the XT's fillrate is 6400 MT/s rather than 4800 MT/s on the PRO. The XT would also support 512 MB VRAM compared to the PRO's 256 MB limit, but that's irrelevant for the iMac which comes with 256 MB anyway.

And the HD 2400 XT is even worse than the HD 2600 PRO.
This is all inconsequential because the 2400XT is superior to the 7300GT...and the 2600 Pro is superior to the 2400XT. The titles we're looking at in this thread are designed to run better on nVidia hardware.
     
D.O.G.S. CEO
Baninated
Join Date: Aug 2007
Aug 14, 2007, 10:37 AM
 
Originally Posted by Tegeril View Post
This is all inconsequential because the 2400XT is superior to the 7300GT...and the 2600 Pro is superior to the 2400XT. The titles we're looking at in this thread are designed to run better on nVidia hardware.
Why? Because they use the graphics card? If so, I guess you're right. Anything that uses the graphics (OpenGL or Quartz) will run FASTER on the previous 24" iMac with the 7600 GT than it will on the newer iMacs.
     
CheesePuff
Professional Poster
Join Date: Jan 2002
Location: Rochester, NY
Aug 14, 2007, 11:37 AM
 
Originally Posted by Tegeril View Post
This is all inconsequential because the 2400XT is superior to the 7300GT...and the 2600 Pro is superior to the 2400XT. The titles we're looking at in this thread are designed to run better on nVidia hardware.
Uhhh, want to try that one again? The ATI Radeon 2600 XT is *much* faster than the 2600 PRO... it has much faster core and VRAM clocks. Sadly, it's also not much more expensive.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Aug 14, 2007, 11:56 AM
 
Has anyone put their grubby little hands on an iMac yet and checked how the GPU and memory are clocked? I just noticed that there is no Mobility 2600 Pro (just XT and "regular"), so I'm not sure exactly what to expect. Is it clocked like the desktop model?

Checking a few Wintel benchmarks, the 2400 XT is on the same performance rung as the 7300 GT, and the 2600 XT on the same as the 7600 GT. The 2600 Pro is somewhere in between. This all shifts up and down a bit with various games, but that's the general picture. In all, Apple improved video decoding capabilities significantly and game performance slightly, but removed the BTO option. Don't forget that they also dropped prices while increasing the clockspeed - if they had kept a decent BTO option, the noise in all the forums would be much lower.
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 14, 2007, 01:55 PM
 
Bare Feats

(benchmark charts)

That's an iMac 2.8 GHz 2600 Pro vs the old 2.33 GHz 7600 GT.

Ouch.
     
hsl
Dedicated MacNNer
Join Date: Jun 2001
Location: the netherlands
Aug 14, 2007, 02:01 PM
 
Does anyone know if the video card in the 24" model is replaceable, just as in the old 24" model?

It would be great to just swap in the nVidia card if you like gaming.
15,4" MBP (late 2008), 2,53Ghz, 4GB RAM, 256GB SSD | 27" ACD | 11" MBA, 1.6Ghz, 4GB RAM , 128GB | 16GB iPhone4 | 32GB iPad

The biggest fan of JoliOriginals MacBook, iPad and iPhone Sleeves!
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 14, 2007, 02:09 PM
 
Even if you could, where would you get the GPU?

You'd be better off getting the old model and just swapping out the CPU. Plus you'd get a better wireless keyboard and a better screen at the same time.
     
PaperNotes
Registered User
Join Date: Jan 2006
Aug 14, 2007, 02:21 PM
 
Originally Posted by Eug View Post

That's an iMac 2.8 GHz 2600 Pro vs the old 2.33 GHz 7600 GT.

Ouch.
It's sad they didn't put a fully clocked 8600 GT in the iMac. Any way you look at it, those numbers are still good. 60 fps in those games at those settings is very, very playable. The numbers would be higher still in Windows. For a general purpose computer, the iMac is totally usable for gaming and better than its direct competition at everything else. I don't want to sound like an apologist, but Apple pitches the iMac as a digital lifestyle hub computer, not an all-round gaming machine.
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 14, 2007, 02:30 PM
 
Originally Posted by PaperNotes View Post
It's sad they didn't put a fully clocked 8600GT in the iMac. Any way you look at it those numbers are still good. 60FPS in those games at those settings is very very playable. The numbers would be higher still in Windows. For a general purpose computer the iMac is totally usable for gaming and better than its direct competition for everything else. I don't want to sound like an apologist but Apple pitches the iMac as a digital lifestyle hub computer and not an all round gaming machine.
Except that even at medium screen resolution, the tricked-out 24" iMac is getting frame rates like 29 and 40 fps, not 60 fps. And remember, some Apple apps are heavily dependent upon 3D performance too.

Quite frankly, the pairing of the 2.8 GHz C2D with the Radeon 2600 Pro seems like some sort of sick joke.
     
PaperNotes
Registered User
Join Date: Jan 2006
Aug 14, 2007, 02:33 PM
 
Originally Posted by Eug View Post
Except they're getting FPS rates like 29 and 40 fps in some stuff and just medium resolution, not 60 fps.
There's 58 and 59 listed there. 1280x800 is not medium resolution to most people. There aren't THAT many people who play higher than that.

And remember, some Apple apps are heavily dependent upon 3D performance too.
They don't demand the amount of power those games above do.
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 14, 2007, 02:38 PM
 
Originally Posted by PaperNotes View Post
1280x800 is not medium resolution to most people.
You'd be right... if it was 2002.

Originally Posted by PaperNotes View Post
They don't demand the amount of power those games above do.
For even just Core Image, the GPU makes a humongous difference:

(Core Image benchmark chart)
     
D.O.G.S. CEO
Baninated
Join Date: Aug 2007
Aug 14, 2007, 02:47 PM
 
Originally Posted by PaperNotes View Post
There's 58 and 59 listed there. 1280x800 is not medium resolution to most people. There aren't THAT many people who play higher than that.

They don't demand the amount of power those games above do.
Uh, 1280x800 is definitely medium resolution when your screen is 1920x1200. That is the point. I would have no problem if the LCD were a 17" or MAYBE a 20" widescreen, but that graphics card coupled to a 1920x1200 widescreen LCD is just a POOR choice, as it won't be able to run much of anything well at native resolution. If you've ever seen an LCD run at non-native res, you know it's ugly, and it's slower too, since the GPU has to do extra work to scale the image up to the panel.
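The scaling complaint is easy to quantify: 1280x800 stretched to a 1920x1200 panel is a 1.5x factor per axis, a non-integer ratio, so pixels can't be cleanly doubled and have to be interpolated. A quick check (the helper is mine, for illustration):

```python
def scale_ratio(native, rendered):
    """Per-axis stretch factor when a lower resolution is displayed full-screen."""
    return (native[0] / rendered[0], native[1] / rendered[1])

ratio = scale_ratio((1920, 1200), (1280, 800))
print(ratio)                                # (1.5, 1.5)
print(all(r.is_integer() for r in ratio))   # False: non-integer, so the scaler must interpolate
```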
     
PaperNotes
Registered User
Join Date: Jan 2006
Aug 14, 2007, 02:53 PM
 
Originally Posted by Eug View Post
You'd be right... if it was 2002.
The majority of games are played on consoles at 720p tops. Consider that gamers all around the world are enjoying 30-60 fps at 720p or less on their consoles. 1280x800 is plenty for most people on a general purpose computer that offers more CPU power and a bigger screen for less money than the last generation iMac. Apple should have offered BTO options for better graphics for those who want it. They could still do so in the future.


For even just Core Image, the GPU makes a humungous difference.
The numbers you are showing there don't represent realistic real-world use (Morph? How much morphing is done on a daily basis?). And ALL those numbers are high. Even the lowest numbers in the Core Image tests are going to be good enough for a long time yet. I have a measly GeForce FX 5200 and it handles just about everything easily except Motion and Aperture. The ATI 2600 can easily handle realtime effects in those apps and will do so for 2-3 years to come. Once more, it is a general purpose computer.
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 14, 2007, 02:58 PM
 
Originally Posted by PaperNotes View Post
The numbers you are showing there don't present realistic real world use (Morph, like how much morphing is done on a daily basis?). And ALL those numbers are high. Even the lowest numbers in the CoreImage tests are going to be good enough for a long time yet.
Core Image filters are often stacked. You don't just run them one at a time.

I have a measly GeForce FX 5200 and it handles just about everything easily except Motion and Aperture. The ATI 2600 can easily handle realtime effects in those apps and will do so for 2-3 years to come. Once more, it is a general purpose computer.
I have used machines with the 5200. In my opinion it royally sucks, even for light 3D usage.

I use Aperture on my iMac 2.33 with 7600 GT by the way. It is acceptable, but not blazing fast. The 2.16 GHz with 7300 GT is quite a bit slower in comparison. A 2.8 GHz iMac with 8600 GT would do quite well.
     
Si-man
Fresh-Faced Recruit
Join Date: Aug 2007
Aug 14, 2007, 03:17 PM
 
It strikes me that this sort of thing happens over and over again. I've been watching the new releases very closely over the last few years. Am I the only one to notice this:

Whenever there is an upgrade, Apple also takes a little something back. I think it's a very clever strategy. Why make your latest offering so amazing that the previous incarnation becomes obsolete? How could the shops with old stock sell your last model at all? Wouldn't the people who bought a white imac a week before the alu one came out feel a bit gutted otherwise?

Has anyone else noticed this?

There must be a market research analyst somewhere saying: "Listen Steve, you can get away with underpowering the gfx card a little because people will want a new iMac anyway. You'll sell XXXX units. In three months they'll buy an iMac because of Leopard, so you're still in the clear with XXXX units. Once the novelty of both has worn off, say in six months, revamp the graphics as an upgrade and increase sales by XX%."

Hey, I'm not knocking it. But maybe we should drop the shock.

Here's some previous examples to back up what I'm saying:

iBook - screen size went down a size when it became the MacBook
Mac mini - went to Intel chips but started using onboard gfx
     
Tegeril
Fresh-Faced Recruit
Join Date: Feb 2007
Aug 14, 2007, 03:56 PM
 
Originally Posted by CheesePuff View Post
Uhhh, want to try that one again? The ATI Radeon 2600 XT is *much* faster then the 2600 PRO... it has a much faster processor and VRAM. Sadly, it's also not much more expensive, either.
Pretty sure you misread. I stated that the 2400 XT is faster than (or equivalent to) the 7300 GT on average; it will lose some and win some, but it wins more. And then I stated that the 2600 Pro is faster than the 2400 XT. I made no claims about the 2600 XT being faster or slower than anything specifically, except in my original post where I stated that it was quite fast =)

Now, the addition of DX10 doesn't matter much on the Mac side of things, but the HD series has superior HD video decoding, which will pave the way for Blu-ray/HD DVD to come to the Mac (setting aside the fact that the Core 2 Duos in Macs can now more or less do the decoding on their own, except for a few select H.264-encoded titles, at least until more of those titles become prevalent).
( Last edited by Tegeril; Aug 14, 2007 at 04:06 PM. )
     
Meritocracy
Junior Member
Join Date: Feb 2004
Location: Earth
Aug 14, 2007, 06:08 PM
 
Originally Posted by Tegeril
but the HD series has superior HD video decoding which will pave the way for Blu-Ray/HDDVD to come to the Mac (setting aside the fact that the Core 2 Duos in Macs now can more or less do the decoding on their own except a few select h.264 encoded titles - until more of those titles become prevalent).
True. However, as I've mentioned before, this aspect shouldn't be set aside, considering Apple continues not to write drivers leveraging the GPU's video decoding abilities, never mind the HD content the latest series of cards can help accelerate. Currently, all playback on OS X is still CPU-based. As such, that advantage really isn't one, at least as far as playback on OS X is concerned.
( Last edited by Meritocracy; Aug 14, 2007 at 06:14 PM. )
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Aug 14, 2007, 11:55 PM
 
The other point is that the 8600 GT is supposed to be pretty good at H.264 decoding too (if drivers existed on the Mac, that is).

nVidia PureVideo HD

Revolutionary New Video Processing Architecture
NVIDIA GeForce 8400, 8500, and 8600 GPUs for Desktop and GeForce 8400M and 8600M for Notebooks, incorporate a revolutionary new video processing architecture, making them the world’s first GPU video processors to offload 100% of Blu-ray and HD DVD H.264 video decoding from the CPU.** This added processing power gives PureVideo HD technology the ability to support more complex features as they are added to Blu-ray and HD DVD movies, including “picture-in-picture” movies, interactive games and menus, and higher bit-rate / higher quality movie pictures.
**Currently supported in Windows Vista only


     
Simon
Posting Junkie
Join Date: Nov 2000
Location: in front of my Mac
Aug 15, 2007, 03:40 AM
 
Originally Posted by Eug View Post
Bare Feats
...
That's an iMac 2.8 GHz 2600 Pro vs the old 2.33 GHz 7600 GT.

Ouch.
Yep, that's definitely a bummer. If you're a gamer or a Core Image user, you'd probably be better off getting the old 24" 2.33/7600GT iMac. I really don't understand what Apple was thinking here. At the very least they should have offered a HD 2600XT BTO option. I'm wondering if they might end up adding it later on if new iMac sales indeed reflect the poor GPU choice.

Quite frankly, the pairing of the 2.8 GHz C2D with the Radeon 2600 Pro seems like some sort of sick joke.
I agree. Unless you are entirely GPU-independent, the 2.8 GHz Merom XE is overkill for a 24" machine with that GPU.

This strange GPU spec does lend credibility to the notion that the GPU choice was mainly the result of market/business considerations rather than performance analysis. The MBP and iMac are the two best-selling lines when it comes to Macs with dedicated GPUs. With the MBP going NVIDIA across the line, maybe Apple simply wanted to put ATI into the iMac to keep them aboard. ATI GPUs have been in iMacs and MBPs/PBs a lot; in the portable line they were used almost exclusively for the past 4 years. And now NVIDIA has taken over the top-selling MBP. If Apple had put the 8600 into the iMac, ATI might simply have left the Mac altogether.
     
 