Welcome to the MacNN Forums.



Why most G5 buyers should avoid the GeForce FX 5200 like the plague
Eug
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Status: Offline
Sep 14, 2003, 12:44 AM
 


Now this is testing on a PC (P4 3.0C), and not everyone is a gamer, but for $50 you can't go wrong by upgrading the stock GeForce FX 5200 Ultra on the G5 1.6 and 1.8 to the Radeon 9600 Pro, even if only for Quartz Extreme.
     
Cipher13
Registered User
Join Date: Apr 2000
Status: Offline
Sep 14, 2003, 12:47 AM
 
nVidia graphics cards blow right now, and the GeForce FX blows real hard.
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Status: Offline
Sep 14, 2003, 12:50 AM
 
Yeah, I always knew the 5200 was slow, but I just didn't quite realize how slow. I had thought it would be more competitive with the Radeon 9200, but even that card spanks the 5200 good.
     
benh57
Senior User
Join Date: Aug 2001
Location: CA
Status: Offline
Sep 14, 2003, 01:06 AM
 
They aren't that slow. This is due to bad drivers and DX interaction.

See the Slashdot story and the NVIDIA response:

Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATIs Shader Day.

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half-Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half-Life 2 and other new games are included in our Rel. 50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half-Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation. It is not a cheat or an over-optimization. Our current drop of Half-Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.

We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.

Brian Burke NVIDIA Corp.
Dual 800 - GF3 - 1.5GB
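NVIDIA's point above about demoting 32-bit shader math to 16-bit floats can be illustrated numerically. A minimal Python sketch of fp16 rounding using the standard library's IEEE 754 half-precision pack format (an illustration of the precision loss in general, not actual shader code):

```python
import struct

def to_half(x: float) -> float:
    """Round a float through IEEE 754 half precision, the same
    16-bit format an fp16 shader register stores."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# 1.0 and 65504.0 (the half-precision maximum) survive exactly;
# 0.1 picks up error in the fourth decimal place.
for v in (1.0, 0.1, 65504.0):
    print(f"{v} -> {to_half(v)}")
```

Here 0.1 comes back as 0.0999755859375; whether that size of error is visible on screen is exactly what the image-quality debate in this thread is about.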
     
GeoMac
Fresh-Faced Recruit
Join Date: Aug 2003
Location: Boston
Status: Offline
Sep 14, 2003, 01:11 AM
 
Originally posted by benh57:
They aren't that slow. This is due to bad drivers and DX interaction.

See the Slashdot story and the NVIDIA response:

it's called spin control. Nvidia lost the performance crown the day ATI came out with the radeon 9700, and the gap has only been getting wider since then.
iBook G3 800 MHz/640mb RAM/30
GB HD/CDROM/Airport/12.1" LCD
     
Catfish_Man
Mac Elite
Join Date: Aug 2001
Status: Offline
Sep 14, 2003, 01:36 AM
 
Originally posted by GeoMac:
it's called spin control. Nvidia lost the performance crown the day ATI came out with the radeon 9700, and the gap has only been getting wider since then.
True, but the FX doesn't do NEARLY as badly in other benchmarks. Yes, it's slower, but not THAT much slower. Still, the ATI is definitely better, and if you want to play half life 2, it looks like you're going to need the ATI card.
     
mac freak
Mac Elite
Join Date: Jun 2000
Location: Highland Park, IL / Santa Monica, CA
Status: Offline
Sep 14, 2003, 02:45 AM
 
Actually, if you head over to AnandTech or Tom's Hardware and look at reviews of the GeForce FX 5900, you'll find that, in the end, the GFFX 5900 and R9800 Pro are basically equal speed-wise.

The bad rep the GFFX has is the result of the 5800 -- nVidia's initial flagship offering, which frankly STUNK.

Oh, and I believe nVidia when they say the benchmarks aren't accurate. The same thing happened with the initial Doom 3 marks (except flipped), with ATI complaining that they had not properly "optimized" their drivers yet.
Be happy.
     
Commodus
Mac Elite
Join Date: Feb 2002
Location: Ottawa, Canada
Status: Offline
Sep 14, 2003, 07:26 AM
 
The problem here isn't just that nVidia performs badly in Half-Life 2, but in most any Pixel Shader 2.0 test. It's questionable whether new drivers will fix that.

Even if they do, the FX 5200 is a better parallel to the Radeon 9200 than the 9600 (which is a mid-range card to the FX 5200's low-end). The only reason I'd keep the nVidia card in a config is if I wasn't really concerned with absolute 3D performance. But if you play games much at all, you owe it to yourself to get a 9600.
24-inch iMac Core 2 Duo 2.4GHz
     
Eriamjh
Addicted to MacNN
Join Date: Oct 2001
Location: BFE
Status: Offline
Sep 14, 2003, 08:22 AM
 
If it's that bad on a PC it will be worse on a Mac.

I'm a bird. I am the 1% (of pets).
     
GeoMac
Fresh-Faced Recruit
Join Date: Aug 2003
Location: Boston
Status: Offline
Sep 14, 2003, 10:28 AM
 
Originally posted by mac freak:
Actually, if you head over to AnandTech or Tom's Hardware and look at reviews of the GeForce FX 5900, you'll find that, in the end, the GFFX 5900 and R9800 Pro are basically equal speed-wise.

The bad rep the GFFX has is the result of the 5800 -- nVidia's initial flagship offering, which frankly STUNK.

Oh, and I believe nVidia when they say the benchmarks aren't accurate. The same thing happened with the initial Doom 3 marks (except flipped), with ATI complaining that they had not properly "optimized" their drivers yet.
Huh? Did we read the same review? If you look at the benches again, the 5900 Ultra loses EVERY benchmark by 20-100%, sometimes losing to even the 9700 & 9600 Pro!



Nvidia owned
iBook G3 800 MHz/640mb RAM/30
GB HD/CDROM/Airport/12.1" LCD
     
Link
Professional Poster
Join Date: Jun 2003
Location: Hyrule
Status: Offline
Sep 14, 2003, 10:43 AM
 
This has been known on PC boards for years now. I assumed Mac users knew this well too!!



Well IT'S TRUE. But for ~$250 you can get a GeForce 4 Ti, or for another $200 you can have a shiny new Radeon 9800 for the Mac.

yeah sure maybe twice the performance but:
1. dual monitors suck on ATI cards (but I've only heard this.. tho I am not terribly surprised.. watch the flames come lol)
2. That "HUGE DIFFERENCE" won't be visible in real life.. not even in game FPS marks.. at least as I've heard...
3. Even if that huge difference is there.. if it's no more than 50%, the price difference isn't worth it. <then again the ~$250 nVidia cards are refurbs>
Aloha
     
GeoMac
Fresh-Faced Recruit
Join Date: Aug 2003
Location: Boston
Status: Offline
Sep 14, 2003, 11:12 AM
 
Originally posted by Link:
This has been known on PC boards for years now. I assumed mac users knew this well too!!



Well IT'S TRUE. But for ~250 you can get a geforce 4ti or for another 200 bucks you can have a shiny new Radeon 9800/mac

yeah sure maybe twice the performance but:
1. dual monitors suck on ati cards (but I've only heard this.. tho I am not terribly surprised.. watch the flames come lol)
2. That "HUGE DIFFERENCE" won't be visible in real life.. not even in game FPS marks.. at least as I've heard...
3. Even if that huge difference is there.. if it's no more then 50%, the price difference isn't worth it. <then again the ~250 nvidia cards are refurbs>

I've seen 9800 Pros for $350 for the Mac, and I would never buy a refurbed video card for any reason (where did you see the Ti for $250 anyway? I keep seeing it for $399, the same as the retail price of the 9800 Pro). You will notice the difference in the newer games, like Doom 3 and HL2. And don't believe what you've heard: ATI's HydraVision is second only to Matrox for dual-monitor setups. I know, because I had a dual-monitor 9800 Pro setup on my PC a few months back. Trust me, if you're a gamer, there is NO comparison between the Ti and the 9800 Pro at all.
iBook G3 800 MHz/640mb RAM/30
GB HD/CDROM/Airport/12.1" LCD
     
Link
Professional Poster
Join Date: Jun 2003
Location: Hyrule
Status: Offline
Sep 14, 2003, 11:33 AM
 
lol. Of course you know. It's the price of opinion.

Personally I don't think it matters, as the two cards are supposed to be as good as their hardware capabilities allow in dual-monitor setups.

:shrug: prices, you're right, and I wouldn't buy a refurb card for the same reasons, but I doubt it's refurbed because it blew a circuit or something... more like it's a pull from a G4 someone traded in and they dusted it off.

I believe it was from "We Love Macs" but anyway, yeah. ok. *recalls all the lack of driver problems from ATI too*
Aloha
     
GeoMac
Fresh-Faced Recruit
Join Date: Aug 2003
Location: Boston
Status: Offline
Sep 14, 2003, 02:50 PM
 
Originally posted by Link:
lol. Of course you know. It's the price of opinion.
I don't know what you mean by that, but you have to make a choice. Are you going to believe the black-and-white benchmarks, which you have chosen not to, or are you going to buy into consumer opinion, which you seem to be mocking too? If you ignore both, then you run into the token "fanboyism" demographic, which you seem not to be part of, but I may be wrong. The fact is that at the present moment, there is no level at which nVidia is the better buy, whether you're looking at 5200 vs. 9200, 5600 vs. 9600, or 5900 vs. 9800. ATI has the performance and price advantage at every level. And up until the 9700 Pro I always owned nVidia, including a Ti 4600, so I think I have seen enough of the alternatives to make an educated argument...
iBook G3 800 MHz/640mb RAM/30
GB HD/CDROM/Airport/12.1" LCD
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Status: Offline
Sep 14, 2003, 03:04 PM
 
1) Drivers could be an issue, but if they aren't, then the 5200 totally sucks.

2) If drivers are an issue, then the 5200 might be as fast as a Radeon 9200. If so, for most people I'd still recommend spending the extra $50 to upgrade to the 9600 Pro on a new machine. It only adds 2% to the cost of a new Power Mac.

3) ATI has very good dual head capabilities on the PC. Also, on average, ATI image quality via VGA is superior to nVidia image quality on PCs, partially because with nVidia, the VGA output quality really varies from brand to brand, and sometimes even model to model within the same brand. With ATI, you consistently get good VGA quality. With nVidia on PCs, you have to know the specific brand and model or you could get burned.

4) The Ti on the Mac compares favourably with the Radeon 9600 Pro in general, but neither are even in the same league as the 9800 Pro.
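The "adds 2%" figure above is easy to sanity-check. A quick Python sketch; the G5 list prices used here ($1,999 and $2,399) are assumptions about 2003 pricing, not figures from this thread:

```python
# Only the $50 upgrade figure comes from the thread; the G5 list
# prices below are assumed for illustration.
upgrade = 50
for price in (1999, 2399):
    share = upgrade / price
    print(f"${upgrade} on a ${price} G5 is {share:.1%} of the machine's price")
```

Either way the upgrade lands in the 2-2.5% range of the machine's total cost.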
     
anonx
Fresh-Faced Recruit
Join Date: Jun 2002
Location: Cincinnati, OH
Status: Offline
Sep 14, 2003, 06:06 PM
 
Is it worth putting a Radeon 9800 in anything less than the newest systems? I was thinking of upgrading from my 9000, but the 867MHz G4 may be just as big of a problem.

Originally posted by Link:
This has been known on PC boards for years now. I assumed mac users knew this well too!!



Well IT'S TRUE. But for ~250 you can get a geforce 4ti or for another 200 bucks you can have a shiny new Radeon 9800/mac

yeah sure maybe twice the performance but:
1. dual monitors suck on ati cards (but I've only heard this.. tho I am not terribly surprised.. watch the flames come lol)
2. That "HUGE DIFFERENCE" won't be visible in real life.. not even in game FPS marks.. at least as I've heard...
3. Even if that huge difference is there.. if it's no more then 50%, the price difference isn't worth it. <then again the ~250 nvidia cards are refurbs>
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Status: Offline
Sep 14, 2003, 06:36 PM
 
Originally posted by anonx:
Is it worth putting a Radeon 9800 in anything less than the newest systems? I was thinking of upgrading from my 9000, but the 867MHz G4 may be just as big of a problem.
It will speed it up to a certain extent, but your CPU is the bottleneck. A 9800 Pro would be a total waste in a G4 867.
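The CPU-bottleneck point can be put as a toy frame-time model (a sketch with illustrative numbers, not measurements; real pipelines overlap CPU and GPU work, so this is a simplification):

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """A frame can't finish faster than the slower of the CPU and GPU work."""
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=16.0, gpu_ms=12.0))  # CPU-bound: 62.5 fps
print(fps(cpu_ms=16.0, gpu_ms=6.0))   # a 2x-faster GPU: still 62.5 fps
print(fps(cpu_ms=8.0, gpu_ms=6.0))    # only a faster CPU moves the needle: 125.0 fps
```

Once the CPU is the slower side, swapping in a faster card changes nothing, which is why a 9800 Pro is wasted on a slow G4.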
     
rhogue islander
Dedicated MacNNer
Join Date: Apr 2002
Location: rodeo island
Status: Offline
Sep 14, 2003, 07:03 PM
 
I had a GF4 Ti 4600 in my DP 1000 w/ 1.5 GB of RAM and recently 'upgraded' to an ATI 9800 Pro Mac Edition.

It actually runs about 10 fps slower than the GF4 Ti but with much better image quality. This is true for high-resolution desktop modes as well as for games.

The ATI Displays control panel allows one to customize FSAA and anisotropic filtering for any game app, which is a welcome addition to Mac OS X display controls.

I'm quite happy with it but am still mystified by the slower performance. On my Pentium 4 WinXP box, a 9800 Pro is generally about 50% faster than a GF4 Ti 4600 at any resolution in-game.
     
Simon
Posting Junkie
Join Date: Nov 2000
Location: in front of my Mac
Status: Offline
Sep 15, 2003, 09:28 AM
 
In another thread I wanted to get an idea of how GPUs that are available for Macs compare. Maybe this thread is a good place to try to update the list.

The list is meant for people like me who don't have the time to read 300 gamer threads on some gamer site, but still want to get an idea of how different GPUs compare in general performance. It's by no means scientific. It's just supposed to be a general rule of thumb.

Please update the list, add or correct. It's rather small at the moment and certainly not 100% correct.

Desktop:
GeForce 2MX
Radeon 7500
GeForce 4MX
GeForce FX 5200
Radeon 9000
GeForce 3Ti
GeForce 4Ti 4200
GeForce FX 5600
Radeon 9600
GeForce 4Ti 4600
Geforce FX 5900
Radeon 9700
Radeon 9800


Notebook:
Radeon 7500 Mobility
GeForce 420 Go
GeForce 440 Go
Radeon 9000 Mobility
GeForce 460 Go
Radeon 9600 Mobility


Any other corrections/suggestions?
     
Judge_Fire
Mac Elite
Join Date: Jan 2001
Location: Helsinki, Finland
Status: Offline
Sep 15, 2003, 02:56 PM
 
So there is obviously a lot of information on the performance of the Windows drivers, but what about the quality of the ATI and nVidia Mac drivers?

J
     
mac freak
Mac Elite
Join Date: Jun 2000
Location: Highland Park, IL / Santa Monica, CA
Status: Offline
Sep 16, 2003, 04:52 PM
 
Originally posted by GeoMac:
Huh? Did we read the same review? If you look at the benches again, the 5900 Ultra loses EVERY benchmark by 20-100%, sometimes losing to even the 9700 & 9600 Pro!



Nvidia owned
You misread my message.

Indeed, nVidia's performance in HL2 has been poor. This is however not consistent with general FX 5900 reviews, such as:
http://anandtech.com/video/showdoc.html?i=1821
or
http://www.tomshardware.com/graphic/20030512/index.html .

While I do prefer ATI as a company, nVidia deserves more credit than you give them. One pre-beta (pre-alpha?) game which one company has not optimized for while the other has, does not accurately represent the performance of a given video card.
Be happy.
     
Eug  (op)
Clinically Insane
Join Date: Dec 2000
Location: Caught in a web of deceit.
Status: Offline
Sep 18, 2003, 04:28 PM
 
John Carmack agrees the GeForce FX ain't so good.

Hi John,

No doubt you heard about GeForce FX fiasco in Half-Life 2. In your opinion, are these results representative for future DX9 games (including Doom III) or is it just a special case of HL2 code preferring ATI features, as NVIDIA suggests?


Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.

John Carmack
     
Socially Awkward Solo
Professional Poster
Join Date: Jul 2002
Location: Hanging on the wall at Jabba's Palace
Status: Offline
Sep 18, 2003, 04:36 PM
 
I have always liked ATI. It is strange to watch companies go up and down in quality like this.

"Laugh it up, fuzz ball!"
     
GeoMac
Fresh-Faced Recruit
Join Date: Aug 2003
Location: Boston
Status: Offline
Sep 18, 2003, 05:55 PM
 
Originally posted by mac freak:

While I do prefer ATI as a company, nVidia deserves more credit than you give them. One pre-beta (pre-alpha?) game which one company has not optimized for while the other has, does not accurately represent the performance of a given video card.
HL2 is hardly "pre-beta" or "pre-alpha". It's shipping in November. And even so, you really need to judge the latest high-end cards on the games that ARE in beta, since you're pretty much buying the high end for the games of tomorrow.
iBook G3 800 MHz/640mb RAM/30
GB HD/CDROM/Airport/12.1" LCD
     
olePigeon
Clinically Insane
Join Date: Dec 1999
Status: Offline
Sep 18, 2003, 09:15 PM
 
Couple questions:

1) Anyone know how soon the 256MB 9800 Pro will be available for the Mac? If ever?

2) Why are we arguing over Half-Life 2? Valve and Microsoft are having kids already, you're never going to see the game on anything other than Windows... ever.
"…I contend that we are both atheists. I just believe in one fewer god than
you do. When you understand why you dismiss all the other possible gods,
you will understand why I dismiss yours." - Stephen F. Roberts
     
Ken Masters
Banned
Join Date: Mar 2003
Location: In your backyard!!!
Status: Offline
Sep 18, 2003, 09:52 PM
 
I guess all the cheating that NVIDIA has done in the past has caught up with them!!!

HEHEHE.....

Drivers? YES, that is all that NVIDIA has to do: release a driver that runs hell fast, at the expense of image quality.

THE REAL REASON WHY NVIDIA SUCKS SO BAD is that they didn't follow standards.

They're doing what 3DFX did: individual games must have software tailor-written for each manufacturer's card to work properly.

I hope they end up like 3DFX, those cheaters (they're claiming 16-bit precision has no loss of image quality, when every image comparison test shows that it does!)
( Last edited by Ken Masters; Sep 18, 2003 at 09:58 PM. )
     
   
 
All contents of these forums © 1995-2017 MacNN. All rights reserved.