Welcome to the MacNN Forums.



When the HELL is apple going to make an iMac with a decent video card?!?!?! (Page 6)
jdevalk
Fresh-Faced Recruit
Join Date: Nov 2003
Status: Offline
Sep 11, 2004, 04:54 AM
 
Fact 1: This iMac is much improved over the previous version. That is not in question. However, certain expectations come with a $2,000 computer, and a graphics card that is not a year and a half old is one of them.

Fact 2: Apple believes that gaming is important and devotes a good portion of two pages primarily to this fact - http://www.apple.com/imac/processor.html and http://www.apple.com/imac/graphics.html. However, 'unparalleled 2D and 3D graphics' is simply not true.

Fact 3: Heat is not the reason they stuck with this chipset. They included the power supply in the machine, which itself takes up a fair amount of space and gives off heat. It could have been external, leaving a ton of flexibility for an improved graphics card. Maybe not ideal, but still an option.

Fact 4: Even so, from what I have seen in laptops, they could have done better even without such a change.

Fact 5: Many computer users are casual gamers, and a new machine like this should be more than capable without having to be gaming-specific. To contradict some earlier posts, I am fairly sure that more than 0.1% of computer users play games, and most of them would notice a difference with an improved card.

Fact 6: Arguing that a user should buy a G5 tower and stop complaining about the iMac in this regard is ridiculous. After all, should Mac users really need the 'world's fastest desktop computer' just to play a first-person shooter?

Fact 7: Gaming is part of the consumer computer experience, and the iMac is targeted at the consumer market. If Apple had a Cube-like machine to fill the void (at a decent price) this would not be as much of an issue, but they do not. The iMac needs to step up. It is just too obvious an omission.

Fact 8: Graphics is constantly an issue with the iMac, as pointed out by many of the reviews out there. By the way, most of these reviews are gushing with praise except for the graphics. Forget PCI slots; most consumers who upgrade do so with memory, hard drives, and their graphics card. If the card cannot be upgraded later, it needs to be pretty damn good from the start.

Fact 9: This is not just about games. Apple has bragged that Tiger, with Core Image/Video, will offload more to the graphics card, having stated that graphics cards have gotten so powerful over the last couple of years. The iMac is again missing the boat here. Also, with DVD editing, managing thousands of photos, OS X eye candy, etc., a powerful graphics card can and does help with more than just games.

Fact 10: I so much want to buy this machine, but cannot pull the trigger because of this problem. I am not even asking for a free ride: I will pay for the upgrade, if only I had the option.
     
Thain Esh Kelch
Mac Enthusiast
Join Date: May 2001
Location: Denmark
Status: Offline
Sep 11, 2004, 07:26 AM
 
It would have been better if Apple had gone with the Radeon 9700 Mobility instead of the CrapForce 5200 in the iMac...
I mean, the only point on which it doesn't beat it is price... but speed- and heat-wise this baby kicks the CrapForce's butt ten times over.
     
jamesa
Grizzled Veteran
Join Date: Sep 2000
Location: .au
Status: Offline
Sep 11, 2004, 07:53 AM
 
Originally posted by jdevalk:
Fact 10: I so much want to buy this machine, but cannot pull the trigger because of this problem. I am not even asking for a free ride: I will pay for the upgrade, if only I had the option.
Good post - I agree with pretty much everything, and like you I want to buy one, but not with this GPU.

-- james
     
pliny
Mac Elite
Join Date: Apr 2001
Location: under about 12 feet of ash from Mt. Vesuvius
Status: Offline
Sep 11, 2004, 08:34 AM
 
The G5 iMac has overtaken the iPod and moved to the top sales spot at the Apple Store.

Apple are probably laughing at us.

I was thinking about the GPU in my iMac DV from 1999. It is a paltry little 8MB ATI Rage, and the graphics, pictures, websites, and text on my screen look better (read: crisper, clearer, sharper, cleaner) than the same do on some of the much more recent PCs I also use, with faster GPUs and chip frequencies in excess of 2GHz.

No doubt this is due not to advanced or superior hardware in my machine, because it's not, but to what Apple's software engineers have done with the Macintosh OS.

This is the true underlying strength of any Macintosh computer, and in the end what sets it apart from a Windows PC. It's not the hardware or price points, because for a long time now the PC side has been able to roll out cheaper, more expandable machines in much greater volumes.

I certainly don't like that I can't upgrade or even BTO a GPU in the new iMac. Some posters say it might not matter outside of frame rates for some games and Motion, and they may be right, because we haven't seen Tiger running on one yet, or what the engineers are doing under the hood.
i look in your general direction
     
loren s
Senior User
Join Date: Jul 2001
Status: Offline
Sep 11, 2004, 11:07 AM
 
Extras...

While waiting for innovation from Apple and their long-rumored tablet, I had to simply buy a real tablet PC running Windows. And you know what? It is the very best graphics tool there is. It lets you go where you want and affect what you need for graphics: Photoshop, ZBrush, 3D, and Flash layout. Apple is really missing out by not listening.

Also, why are there no external video cards? Avid has the MOJO system, which is really just what I want: a graphics accelerator. Sounds smart for quick upgrades, and FireWire 800 is plenty fast.
     
Chulo
Junior Member
Join Date: Sep 2003
Status: Offline
Sep 11, 2004, 12:50 PM
 
I have got to agree, the video card is the sore point on the new iMac. It would be nice to have a few extra items (gigabit Ethernet and FW800), but if you want them (or really need them that badly), the PowerMac is the system for it, since it is geared for professionals. I really don't think it would have cost too much more to have even an older ATI 9600, which is a much better card and is already a couple of generations old. Obviously the ATI 9700 would be even better, but for the sake of "saving" a couple of dollars in production, let's just say it came with the ATI 9600. It can definitely play a large number of games at somewhat decent frame rates (which is something the kids would want) and process your digital pictures in iPhoto and home videos in iMovie (which parents and teens will want to do) better than the FX 5200. Isn't that the purpose of a family computer, to do these things? You are using some of the latest technology (a G5 CPU, a faster bus, a design built around an LCD monitor), so why cheapen it with this GPU?

I completely agree with jdevalk that if the iMac is to work correctly with the upcoming OS X release, it can't have an outdated video card that cannot handle the operations the new OS needs to operate properly. I also agree that Apple should offer an upgrade option (BTO), which would at least satisfy most people's desire and ultimately increase their sales (you're right, jdevalk, we're not asking for a free ride). If I could afford one (which I can't right now), I probably wouldn't purchase this one, strictly because of the video card. Once I pay off my Rev. A 15" Alu PBook I will definitely get a new iMac (but only if they update the video card or offer an option to order a different GPU).

I have no beef with Apple; I just think they sold themselves a bit short and need to rethink this.
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Sep 11, 2004, 01:49 PM
 
Originally posted by Chulo:
It can definitely play a large number of games at somewhat decent frame rates (which is something the kids would want) and process your digital pictures in iPhoto and home videos in iMovie (which parents and teens will want to do) better than the FX 5200.
The difference between the 9700 and 5200 is largely in the supported 3D acceleration functions. You're correct that this has a significant impact on gaming performance. However, it will make no difference at all on iPhoto or iMovie.
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Sep 11, 2004, 03:02 PM
 
Originally posted by jdevalk:
Fact 1: This iMac is much improved over the previous version. [...]

Fact 10: I so much want to buy this machine, but cannot pull the trigger because of this problem. I am not even asking for a free ride: I will pay for the upgrade, if only I had the option.
Fact 11: Macs make lousy gaming machines. The variety of games is pitiful, the performance-to-price ratio is lousy compared with a PC, the number of supported video cards is embarrassing, and the number of gaming controllers equally disastrous. If you want to game, get a PC and get over it.

Fact 12: Other graphics chipsets not only run hotter, they require more power and hence a larger power supply, which in itself generates more heat. Unless you somehow have access to Apple's thermal design calculations I wonder how you can state Fact 3 with a straight face.

Fact 13: Despite everyone talking about Core Video and Core Image with great authority, I have yet to see anyone appreciate that these are simply programming APIs, or show any evidence that any mainstream application (other than games), or Quartz Extreme itself, would be unable to perform perfectly adequately. Name the applications that fail, not some vague reference to a programming API.

Fact 14: More powerful 3D graphics engines make absolutely no difference to DVD editing, photo editing or managing photographs.

Fact 15: What evidence is there to show that the 5200 cards in the existing PowerMac range are even slightly taxed by Quartz, and hence why the certainty that Tiger will impose this huge load? Exactly what massive eye candy are you expecting from Tiger?

Fact 16: Although not the greatest graphics card, the 5200 Ultra actually got reasonable reviews. I'm not trying to claim this is the greatest graphics card on earth - and you wouldn't want to do heavy gaming on it - but it is not the disaster many in this thread would suggest. Most of these reviews are about 12 months old. Given that no serious gamer would ever buy a Mac, it seems reasonably well matched to the iMac G5's likely audience. Read on:

"2D graphics Quality is superb. Both cards [5200U and 5600U] excellently work at 1600x1200@85Hz." - Digit-Life.com

"No wonder that graphics card makers are especially excited about the low-cost GeForce FX 5200 graphics chips. One of the first mass graphics cards based on NVIDIA GeForce 5200 chip, which we managed to test, a solution from Albatron, is a real proof to the point. This graphics card combines not very high price with very high quality, good performance and contemporary 3D technologies support. Of course, GeForce FX 5200 based graphics cards, just like Albatron’s solution, will feature the same power consumption and heat dissipation, alongside with low noise level. These qualities will make graphics cards like Albatron GeForce FX 5200 an excellent choice for low-cost home gaming systems and as a worthy replacement to morally outdated ATI RADEON 9000 Pro and NVIDIA GeForce4 MX440." XBit-Labs.com

"Considering the performance of the Albatron GeForceFX 5200 Ultra and considering it's a budget videocard which retails for $200 CDN ($155 US), we're very pleased with the results. Often the GeForceFX 5200 Ultra is able to nip at the heels of the much more expensive Albatron GeForceFX 5600P Turbo!" - PCStats.com

"Well, overall we liked the card. We honestly weren't expecting much from it in terms of performance, but it did do well in our tests. Watching the benchmarks, most users could theoretically play games that have great graphics like UT2K3 but if you are purchasing this card for you gaming machine, perhaps you should spend the extra money to get something like the FX5600 Ultra. Overall the card is good for end-users seeking decent performance at a budget price. The major features that make this card appealing is its DirectX9 support ... There are probably some people reading this review thinking, "These guys are idiots. How could they possibly have liked this card?" It's simple, the card does exactly what it was designed to do. When NVIDIA released the FX5200, they only had a few goals in mind. They wanted to release a card that was affordable to the average user that support the latest technology and gave decent performance with some nice features. This card fulfills those goals. It's not intended to be the fastest video card or a mind blowing performer, it's all about cost vs. performance." - ExtremeOverclocking.com

"By contrast, we were more pleased by the FX 5200 Ultra. With respect to performance, the card can hold its own quite well against the Radeon 9000 PRO/ 9200. As a DirectX 9 card, it is already superior to the others on paper, and it also offers multi-sampling FSAA and relatively fast anisotropic filtering. Because of ATi's driver problems, it wasn't possible to make direct comparisons with the 9000/ 9200 series, which still use the slow SuperSampling, but from past experiences with the 9000/ 8500 FSAA test, it can be concluded that the FX 5200 Ultra should be the better performer here. The gains compared to the GeForce4 MX440-8x are quite clear as well. In the entry-level segment, the FX 5200 Ultra is therefore a good choice." - Tom's Hardware
     
funkboy
Professional Poster
Join Date: May 2001
Location: North Dakota, USA
Status: Offline
Sep 11, 2004, 03:48 PM
 
Originally posted by PEHowland:
Fact 11: Macs make lousy gaming machines. The variety of games is pitiful, the performance-to-price ratio is lousy compared with a PC, the number of supported video cards is embarrassing, and the number of gaming controllers equally disastrous. If you want to game, get a PC and get over it.
This is a chicken-and-egg problem: if Macs had more powerful hardware, would we see better games? Who knows.
I want to pay Apple more for the opportunity to have a good graphics card. This is not an unreasonable request, to want to pay more money.

I like playing games occasionally on my Mac simply because it's my Mac - I have a GameCube, yes, but I like to play a game on my Mac sometimes, too.
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Sep 11, 2004, 05:07 PM
 
Originally posted by funkboy:
This is a chicken-and-egg problem: if Macs had more powerful hardware, would we see better games? Who knows.
I want to pay Apple more for the opportunity to have a good graphics card. This is not an unreasonable request, to want to pay more money.

I like playing games occasionally on my Mac simply because it's my Mac - I have a GameCube, yes, but I like to play a game on my Mac sometimes, too.
Maybe, but I think it is actually more subtle. One of the reasons the Mac is so stable and well-designed is that it supports only a very limited range of hardware and has a relatively slow update cycle. To satisfy the gaming market, the Mac would need to support an update cycle measured in weeks, not months. If we imagine, for a moment, that the Mac suddenly started supporting several hundred different graphics cards, dozens of different sound cards, and many different game controllers, Mac OS would suddenly be in the position of Microsoft Windows, having to support a bewildering range of possible combinations of hardware and drivers from many manufacturers. This would inevitably lead to the kind of stability issues that Windows faces. Furthermore, if Mac hardware became as expandable as the PC, it would also lose the elegance of its design, as it suddenly becomes just a generic assembly of parts rather than an engineered solution. It then becomes difficult to see how the Mac would differentiate itself from a PC.

Also, more to the point, it would take much more than a change in graphics card for the Mac to take on the gaming market. Gamers are interested in low-cost raw power not premium-priced elegant design. They want overclocking options, not classy engineering. They need DirectX, not OpenGL. To take on the gaming market would mean too great a departure from Apple's core values.

Has Apple ever supported the gamer? Why the sudden surprise that the new iMac ignores this market too?
     
jamesa
Grizzled Veteran
Join Date: Sep 2000
Location: .au
Status: Offline
Sep 12, 2004, 12:22 AM
 
Originally posted by PEHowland:
Fact 11: Macs make lousy gaming machines. The variety of games is pitiful, the performance-to-price ratio is lousy compared with a PC, the number of supported video cards is embarrassing, and the number of gaming controllers equally disastrous. If you want to game, get a PC and get over it.
It's not just the games, stupid. Quartz Extreme (do you know what this is, and how it works?) and now Core Video are both non-gaming technologies that rely on the GPU. Quartz Extreme determines in part how fast your windowing system works. Which is important.

And you might want to let Apple know that games aren't important; they feature prominently on two of the (five?) iMac pages that Apple has put up.


Fact 12: Other graphics chipsets not only run hotter, they require more power and hence a larger power supply, which in itself generates more heat. Unless you somehow have access to Apple's thermal design calculations I wonder how you can state Fact 3 with a straight face.
Well, gee, the fact that Apple can put better graphics cards in their laptops, which are thinner and don't have a guaranteed power source, might have something to do with justifying the complaints. Along with the fact that they managed to fit in a power supply.


Fact 13: Despite everyone talking about Core Video and Core Image with great authority, I have yet to see anyone appreciate that these are simply programming APIs, or show any evidence that any mainstream application (other than games), or Quartz Extreme itself, would be unable to perform perfectly adequately. Name the applications that fail, not some vague reference to a programming API.


There might have been discussions about failure, but more important is just outright performance.

If outright performance doesn't matter, then there's no need for a G5 - it's just a waste of money. If it does, then applications relying on the GPU are going to suck, which negates the point of putting in a decent CPU in the first place.


Fact 14: More powerful 3D graphics engines make absolutely no difference to DVD editing, photo editing or managing photographs.


Hands up if you do more than just that with your computer.

::hand up::

Plus, with Quartz Extreme, the 3D part of the GPU is often involved in rendering these 2D windows.


Fact 15: What evidence is there to show that the 5200 cards in the existing PowerMac range are even slightly taxed by Quartz, and hence why the certainty that Tiger will impose this huge load? Exactly what massive eye candy are you expecting from Tiger?
Well, none of them is taxed by Quartz. It's Quartz Extreme that everyone is talking about.


Fact 16: Although not the greatest graphics card, the 5200 Ultra actually got reasonable reviews. I'm not trying to claim this is the greatest graphics card on earth - and you wouldn't want to do heavy gaming on it - but it is not the disaster many in this thread would suggest. Most of these reviews are about 12 months old. Given that no serious gamer would ever buy a Mac, it seems reasonably well matched to the iMac G5's likely audience. Read on:
Let me just recap that bit:
Most of these reviews are about 12 months old. Given that no serious gamer would ever buy a Mac, it seems reasonably well matched to the iMac G5's likely audience.

That is the most outlandish bit of logic I've heard in my life.

You expect 18-month-old reviews to placate people about a crap GPU? That people were saying the card was an OK value-for-money proposition 18 months ago should pretty much say everything that needs to be said. What happens if I take an 18-month look back at the "value" end of the Apple line, the iMac: well, gee, they were shipping a 1GHz G4 at the top of the line and an 800MHz G4 in the cheaper one.

If Apple shipped those CPUs in your machine NOW, would you try to justify them as adequate for today by digging up 18-month-old reviews?


"By contrast, we were more pleased by the FX 5200 Ultra. With respect to performance, the card can hold its own quite well against the Radeon 9000 PRO/ 9200. As a DirectX 9 card, it is already superior to the others on paper, and it also offers multi-sampling FSAA and relatively fast anisotropic filtering. Because of ATi's driver problems, it wasn't possible to make direct comparisons with the 9000/ 9200 series, which still use the slow SuperSampling, but from past experiences with the 9000/ 8500 FSAA test, it can be concluded that the FX 5200 Ultra should be the better performer here. The gains compared to the GeForce4 MX440-8x are quite clear as well. In the entry-level segment, the FX 5200 Ultra is therefore a good choice." - Tom's Hardware
The others are all OK sites, but Tom's Hardware normally tells it like it is, so I did a bit of digging into what they think now about the GeForce 5200:
"... as well as a series of inexpensive entry-level graphics cards: NVIDIA GeForce FX 5200 64MB (64-bit), GeForce FX 5200 128MB (128-bit)..."
"...FX 5200 and 5600 are still horribly slow so Valve decided to set the DX8 codepath as default for those cards..."
This one I really like. It's from April 2003:
"...You could say that NVIDIA has already "played its cards" with the introduction of its new product line-up - the high-end FX 5800, the mid-range FX 5600, and the entry-level FX 5200."
It was considered entry-level 18 months ago!
From July 2003:
"Considering the chip's limited performance and its comparatively low clock speeds, the DirectX 9 support is more of a paper feature than a real bonus. In practice, the chip is simply too slow for DirectX 9 calculations at resolutions of 1024x768 and above."

And that's just a cursory examination. I certainly don't think this is how Apple would like the GPU in their $2000 consumer machine described, a machine they would consider anything but "entry-level", especially when the GPU was considered "entry-level" 18 months ago.

Oh, and for a bit of fun, type "GeForce 5200 sucks" into Google.

-- james
     
Chulo
Junior Member
Join Date: Sep 2003
Status: Offline
Sep 12, 2004, 01:34 AM
 
PEHowland,
The review from Tom's Hardware was good to see and it sheds some light as to why possibly Apple choose it but that review was pitted against the ATI 9000 cards. My Alu 15" PBook uses the ATI 9600 and I think it works quite well. It's not a powerhouse like today's high tech cards but it's more capable than the 5200 and shouldn't cost much more. My point is, if a better GPU is much better for games and to a lesser extent the video and digital photo arena wouldn't offering a better video card just be a win-win situation for everyone?

None-the-less, thanks for the explanation. I also agree with you on your position concerning the gaming market. I'm not a big gamer myself but of course my son is (I do dabble in it every so often).
     
hldan
Mac Elite
Join Date: May 2003
Location: Somewhere
Status: Offline
Sep 12, 2004, 02:45 AM
 
Damn, a lot of peeps here are rough on the iMac. After doing some homework, I found that it really takes a custom-built PC to be a gaming PC. Several of the current Sony Vaios use the GeForce FX 5200, some with 64MB and some with 128MB, like that really matters. I also noticed that several other Vaios, Dells, and mid-priced PCs use integrated graphics (IGP). Now that's horrible to play games with.
The top-end Vaio towers had a pro GPU in them.
I don't think Apple is out of line building the iMac with the 5200 GPU.
I understand that a PC can be custom-built by anyone and a Mac can only be built by Apple, so there's no way to make it exactly how you want it, but out of the box most PC manufacturers give about the same as Apple, pound for pound.
iMac 24" 2.8 Ghz Core 2 Extreme
500GB HDD
4GB Ram
Proud new Owner!
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Sep 12, 2004, 03:32 AM
 
Originally posted by jamesa:

The others are all ok sites but tom's hardware normally tells it like it is, so I did a bit of digging as to what they think now about the GeForce 5200: [...]

And that's just a cursory examination. I certainly don't think it is how Apple would like the GPU described in their $2000 consumer machine, which they would think is anything but "entry-level", especially when considering it as "entry-level" 18 months ago.
I haven't the energy to go and find the originals of the quotes you give above, but it seems that all of these are referring to the FX5200, not the FX5200 Ultra. If you look at the reviews, you'll see that the FX5200 got universally bad reviews, whereas the FX5200 Ultra actually got quite positive reviews - when judged as a low/mid range card. The main criticism of it was usually its pricing.

Well anyway, this is going round in circles. Frankly, the 5200 Ultra is more than enough for my needs and I suspect for many. I have a Radeon 9700 Pro in my PC, and if I want to game, I use that. I think it highly unlikely that Quartz Extreme will be beyond the capabilities of the card - I'm not sure what kind of eye candy you are expecting from Tiger, but you clearly have a different vision than me. Only time will tell however.
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Sep 12, 2004, 05:44 AM
 
Originally posted by jamesa:
It's not just the games, stupid. Quartz Extreme (do you know what this is, and how it works?) and now Core Video are both non-gaming technologies that rely on the GPU. Quartz Extreme determines in part how fast your windowing system works. Which is important.
Are you normally this rude and patronising? Or just when you're sitting safely behind your keyboard? Try not to get emotional over a piece of hardware, James; it's very undignified.

I know perfectly well what Quartz Extreme is. QE is an OpenGL acceleration of the Quartz Compositor (the bit of Mac OS that assembles each application's window onto the screen to form the final image) - essentially an OpenGL window server. It means the graphics card's GPU is used to composite the final image on the screen, rather than the CPU. It doesn't accelerate the generation of the window contents itself, just their final composition on the screen. It has been in Mac OS X since 10.2, and, more pertinent to this discussion, I haven't yet seen anyone provide a link suggesting that the GeForce FX 5200 Ultra (as supplied in the PowerMacs) can't handle it. Perhaps you could explain why it works fine in the PowerMacs but isn't going to work in the iMac?
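The compositing step described above can be sketched in miniature. This is a toy CPU-side model (plain Python, a single grey channel; all names here are invented for illustration, not Apple's API): each window owns a backing store, and the compositor blends the stores back-to-front with the standard "over" operation. Under Quartz Extreme the same blend runs on the GPU, with each backing store uploaded as an OpenGL texture.

```python
# Toy model of a window compositor: each window renders into its own
# backing store, and the compositor alpha-blends the stores back-to-front
# into the framebuffer. Under Quartz Extreme this blend runs on the GPU
# (each backing store becomes an OpenGL texture); the idea is the same.

def blend(dst, src, alpha):
    """Standard 'over' blend of one channel value."""
    return src * alpha + dst * (1.0 - alpha)

def composite(windows, width, height):
    """windows: list of (x, y, w, h, alpha, color), drawn back-to-front."""
    fb = [[0.0] * width for _ in range(height)]   # one-channel framebuffer
    for (x, y, w, h, alpha, color) in windows:
        for row in range(y, min(y + h, height)):
            for col in range(x, min(x + w, width)):
                fb[row][col] = blend(fb[row][col], color, alpha)
    return fb

# Two overlapping "windows": an opaque one, then a half-transparent one.
frame = composite(
    [(0, 0, 4, 4, 1.0, 0.8),     # opaque window in the corner
     (2, 2, 4, 4, 0.5, 0.2)],    # translucent window overlapping it
    width=8, height=8)
```

Where the translucent window overlaps the opaque one, the result is an even mix of the two values; outside any window the framebuffer keeps its background value. That per-pixel blend over every exposed window is exactly the work QE moves off the CPU.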

I suggest you learn what Core Image and Core Video are. They are programming APIs. They introduce nothing that isn't already available. They are simply a way of allowing the programmer to develop hardware-accelerated applications at a higher level of abstraction. If the programmer has already implemented an effect at the pixel level, introducing Core Image/Video will make no difference whatsoever; the program will just be easier to develop and maintain, and less bloated in terms of source code. Again, if you could explain why switching to Core Video would suddenly cause an application to deteriorate on the 5200 Ultra, I'd be grateful.
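A minimal sketch of that distinction, in Python: the same effect written once as a hand-rolled per-pixel loop and once through a toy filter-chain API. `FilterChain` and these filter stages are invented for illustration (they are not the real Core Image API); the point is that the API route produces identical output, it just abstracts the per-pixel code:

```python
# The same gain/bias/clamp effect, implemented two ways. A filter API is
# an abstraction over per-pixel code, not a new hardware capability.

def hand_rolled(pixels, gain, bias):
    # The "old way": the programmer implements the effect pixel by pixel.
    return [min(1.0, p * gain + bias) for p in pixels]

class FilterChain:
    """The 'API way': compose reusable named filters. A real
    implementation could compile this chain to a GPU fragment program."""
    def __init__(self):
        self.stages = []
    def add(self, fn):
        self.stages.append(fn)
        return self
    def render(self, pixels):
        for fn in self.stages:
            pixels = [fn(p) for p in pixels]
        return pixels

image = [0.0, 0.25, 0.5, 1.0]
chain = (FilterChain()
         .add(lambda p: p * 1.5)          # "gain" filter
         .add(lambda p: p + 0.1)          # "bias" filter
         .add(lambda p: min(1.0, p)))     # "clamp" filter

# Both routes perform the same per-pixel arithmetic, so outputs match.
assert chain.render(image) == hand_rolled(image, 1.5, 0.1)
```

Nothing about the hardware requirements changes between the two versions; the API only changes how the programmer expresses the effect, which is the argument above about why Core Image alone can't make an adequate card inadequate.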

The implication that the 5200 Ultra can support MacOS 10.3 but won't be able to handle 10.4 (or will in some way deteriorate) simply demonstrates a misunderstanding of what these technologies are all about.
     
acknak
Fresh-Faced Recruit
Join Date: Sep 2004
Location: Australia
Status: Offline
Sep 12, 2004, 06:01 AM
 
I thought only the Apple towers had separate video cards. Every other Mac model incorporates the graphics card as chips integrated onto the main logic board. This holds true for every laptop and every iMac/eMac. So for Apple to make a build-to-order iMac G5 with a better GPU would mean a different logic board, with different power consumption, cooling requirements, etc. Not a trivial problem.
Don't whinge to me, just do it.
     
jamesa
Grizzled Veteran
Join Date: Sep 2000
Location: .au
Status: Offline
Reply With Quote
Sep 12, 2004, 07:32 AM
 
Originally posted by PEHowland:
Are you normally this rude and patronising? Or just when you're sitting safely behind your keyboard? Try not to get emotional over a piece of hardware, James, it's very undignified.
I suggest that you focus on your grammar instead of using big words like "patronising". No, I am not normally rude and patronising, but neither was I then. Using the turn of phrase "it's the XXX, stupid" is really quite common and not meant to be patronising.

And I'm not getting emotional over a piece of hardware. But I am getting frustrated at all the people who seem to have all this technical knowledge but don't seem to understand what many of these technologies are predicated on - the GPU.


I know perfectly well what Quartz Extreme is. QE is an OpenGL acceleration of the Quartz Compositor (the bit of MacOS that assembles each application's window onto the screen to form the final image). Essentially an OpenGL window server. It means the graphics card's GPU is used to render the final image on the screen, rather than the CPU. It doesn't accelerate the generation of the window contents itself, just their final composition on the screen. It has been in MacOS X since 10.2 and, more pertinently for this discussion, I haven't yet seen anyone provide a link that suggests the GeForce FX 5200 Ultra (as supplied in the PowerMacs) can't handle it. Perhaps you could explain why it works fine in the PowerMacs but isn't going to work in the iMac?
Sorry, where did I say that it wasn't going to work on the iMac?

My beef is simple: there is no point charging people $2000 for a computer and putting in a fast CPU when one of the components the computer is becoming increasingly reliant on (the GPU) is hamstrung. If it's worthy of having a decent CPU, then it's worthy of a decent GPU as well. Having described the workings of Quartz Extreme to us all, will you admit that its performance depends on the performance of the GPU in the system it is running on?


I suggest you learn what Core Image and Core Video are. They are programming APIs. They introduce nothing that isn't already available. They are simply a way of allowing the programmer to develop hardware-accelerated applications with a higher level of programming abstraction.
So much for not being "rude and patronising".

I am fully aware of what Core Image and Core Video are. Yes, they are APIs that will be implemented by programmers, but no, they are not only going to be used by programmers. They will be implemented by programmers and then used by the masses.

The way Apple has implemented these technologies? A reliance on the GPU. They don't tax the CPU as much as has been traditionally done, because again Apple have realised that the GPU is sitting pretty for these tasks. Therefore, many of the tasks that are implemented by these APIs (and which many programmers will use because it's already done for them, and done fast) will be used by ordinary people using ordinary apps.

And guess what? Because they're GPU reliant, GPU performance is going to be important.

And guess what else? The GPU in the new iMac sucks, so it's going to be dead slow on these machines despite having a G5 processor in it.


If the programmer has already implemented an effect at the pixel level, introducing Core Image/Video will make no difference whatsoever. It's just that the program will be easier to develop and maintain, and less bloated in terms of source code. Again, if you could explain why switching to Core Video would suddenly cause an application to deteriorate with the 5200 Ultra, I'd be grateful.
Because the new bottleneck becomes the GPU, not the CPU. The CPU on the iMac is quite reasonable for a $2000 machine - but the GPU is not. Therefore all Core Video tasks will be slower, to the tune of the benchmarks that were posted on these forums earlier.


The implication that the 5200 Ultra can support MacOS 10.3 but won't be able to handle 10.4 (or will in some way deteriorate) simply demonstrates a misunderstanding of what these technologies are all about.
no, you're the one demonstrating the misunderstanding - you replied to my post, but I never said that.

What I said was that in a $2k machine, with an increasing number of technologies relying on the GPU, as well as the number of people who would like to play 3D games (enough for Apple to make numerous mentions of them on their site), shipping a $30 GPU in the machine does not make sense and is insufficient, especially in light of the fact that the GPU cannot be changed, either at the time of purchase or any time after it.

This is also with regard to the fact that it's Apple's only mainstream, affordable desktop machine. I can identify very few other hardware manufacturers (which is what Apple still remains) that lock users into this same limited choice.

-- james
     
jamesa
Grizzled Veteran
Join Date: Sep 2000
Location: .au
Status: Offline
Reply With Quote
Sep 12, 2004, 07:34 AM
 
Originally posted by acknak:
I thought only the Apple towers had separate video cards. Every other Mac model incorporates the graphics card as chips integrated into the main logic board. This holds true for every laptop and every iMac/eMac. So for Apple to make a build-to-order iMac/G5 with a better GPU would mean a different logic board, with different power consumptions, cooling requirements, etc. Not a trivial problem.
absolutely not, but one that Apple have already managed to surmount. As posted by JustinD in the previous page of this thread:


Please note; iMac, different graphics cards.

Enough excuses for Apple already!

-- james
     
loren s
Senior User
Join Date: Jul 2001
Status: Offline
Reply With Quote
Sep 12, 2004, 09:39 AM
 
Wow, so much to read...

What I gathered: Tiger will suck on Core Image and Core Video. The computer as it is now will be fine in 10.3, but when 10.4 comes around all of the graphics stuff will not hold up...

Apple is just not in the business of giving hard core fans what they want, just of gathering more market share and saving where it can.

Eh...

Stuff will work, not as fast as it could, but it will work.
     
pliny
Mac Elite
Join Date: Apr 2001
Location: under about 12 feet of ash from Mt. Vesuvius
Status: Offline
Reply With Quote
Sep 12, 2004, 10:04 AM
 
Originally posted by jamesa:


Therefore all Core Video tasks will be slower, to the tune of the benchmarks that were posted on these forums earlier.
This is a bit of an overstatement. The bulk of your post suggests that every OS task or application running on Tiger will require Core Video or Core Image hardware acceleration.

You also suggest that every machine running Tiger will have to run fully hardware-accelerated effects, and run them at the highest level, in order to perform well or even acceptably.

As has been pointed out, Core Image and Core Video are APIs which will allow developers to easily access these features. Many applications obviously will not require them. Tiger is not going to be doing all these fancy effects and transitions and blurs and ripples every time you open and close a window. (Although I suppose I can see all sorts of ways to integrate eye-popping effects when checking my email.)

It is not reasonable to suggest that Tiger expects maximum hardware acceleration for everything, and that therefore the g5 iMac will run unacceptably in Tiger because of Core Image or Core Video API availability, because this is not the case.

If it were, it would require very expensive systems for everyone. And every company with an application would have to insert these calls into their program code blah blah blah. The ability to call these functions is available but not every developer will know or even care about them and they are not part of every function of the OS.

These APIs obviously seem designed for higher-end machines with higher-end cards. In time maybe there will be trickle-down, and at some point many more applications, perhaps, will feature some of these calls. And no doubt Apple will be in the lead in this area.

And then there are the other enhancements to Tiger that can reasonably be expected to deliver improved performance outside of these API's, which the iMac will certainly take advantage of.

And all of this is not meant to suggest I'd not prefer a gpu BTO or even <gasp> a swappable version.
( Last edited by pliny; Sep 12, 2004 at 10:10 AM. )
i look in your general direction
     
pliny
Mac Elite
Join Date: Apr 2001
Location: under about 12 feet of ash from Mt. Vesuvius
Status: Offline
Reply With Quote
Sep 12, 2004, 10:08 AM
 
edit: double post sorry
i look in your general direction
     
loren s
Senior User
Join Date: Jul 2001
Status: Offline
Reply With Quote
Sep 12, 2004, 10:08 AM
 
Wow, go read the forums at Apple; it is much worse there. More like a minefield of yelling.

My only last thought on all this is a small one.


One feature, just add one more feature, built to order or as an upgrade. So simple, so small, with their great skillz.

Now the only problem with all of this is the sad fact that the computer will sell, and it will sell so well that Apple will not even give it a second thought, just like the price of the iPod mini, ...
     
JustinD
Dedicated MacNNer
Join Date: Jan 2001
Location: NYC
Status: Offline
Reply With Quote
Sep 12, 2004, 12:11 PM
 
Originally posted by acknak:
I thought only the Apple towers had separate video cards. Every other Mac model incorporates the graphic-card as chips integrated into the main logic board. This holds true for every laptop and every iMac/eMac. So for Apple to make a build-to-order iMac/G5 with a better GPU would mean a different logic board, with different power consumptons, colling requirements, etc. Not a trivial problem.
In fact, as jamesa mentioned above, I'm pretty sure there are three different LoBos - one for the edu model, one for the 17", and a slightly larger one for the 20". They should have just made a fourth (and maybe a fifth), or just modified them to work with an expansion slot so they could very very easily offer BTO options.
*justin

Isn't logic swell? It gives answers without really answering anything!
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Reply With Quote
Sep 12, 2004, 12:37 PM
 
Originally posted by JustinD:
In fact, as jamesa mentioned above, I'm pretty sure there are three different LoBos - one for the edu model, one for the 17", and a slightly larger one for the 20". They should have just made a fourth (and maybe a fifth), or just modified them to work with an expansion slot so they could very very easily offer BTO options.
It's not that simple.

We already know that Apple had thermal problems with the iMac which have delayed its release. Whilst introducing a slower GPU for the education market would help thermal issues, introducing a faster GPU as a BTO option would increase them. My guess is they don't have any margin to play with. Faster graphics cards generate significant extra heat (you only have to compare the cooling solutions on top-end cards with those on budget cards to see that) and also draw significant extra power (adding a Radeon 9800 to some PCs can cause them to stop booting, simply because of the additional power drawn). So, adding a GPU that is significantly better than the 5200U may well have been simply beyond the thermal design of the iMac - and possibly beyond the power supply as well (and larger PSUs also generate more heat). Talking about better graphics cards in the Powerbook is somewhat of a red herring - Apple have already said they are having difficulties getting a G5 Powerbook made due to - again - thermal issues. I don't think it's any accident that the GPU is the 5200U. I think anything faster was beyond the current thermal design of the iMac.

Who knows? We're both guessing - although I do have direct experience of both thermal and power supply problems with the Radeon 9700 Pro in a small form factor PC, when a GeForce Ti4200 worked fine. But saying that Apple could "very very easily offer BTO options" seems like a gross underestimation of the engineering problems. Perhaps they could offer a BTO option which also involved souped-up high-speed fans, but then other design characteristics, such as 25dBA sound levels, would be sacrificed.

The iMac is an all-in-one machine. If you buy an all-in-one machine, you need to accept certain compromises. Just like when you buy a laptop. If you don't like those compromises, you buy a tower - be it a Powermac or a PC. I don't think shouting at Apple will change the physics or make a faster graphics card any easier to provide.
     
Lateralus
Moderator Emeritus
Join Date: Sep 2001
Location: Arizona
Status: Offline
Reply With Quote
Sep 12, 2004, 12:42 PM
 
Originally posted by PEHowland:
It's not that simple.

We already know that Apple had thermal problems with the iMac which have delayed its release.
That's the first I've heard of it then.

As far as I know, the G5 iMacs were delayed due to G5 supply problems.
I like chicken
I like liver
Meow Mix, Meow Mix
Please de-liv-er
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Reply With Quote
Sep 12, 2004, 12:50 PM
 
Originally posted by PowerMacMan:
That's the first I've heard of it then.
As far as I know, the G5 iMacs were delayed due to G5 supply problems.
Sorry, I think you're correct. I thought I read this online, but maybe I was getting confused with the reasons for the lack of a G5 Powerbook.

Nevertheless, whether the new iMac was delayed due to thermal issues or not, I think the basic argument that the thermal design of the iMac has little headroom remains correct. The fact that the iMac is very similar in design to a laptop, and yet they can't produce a G5 laptop, tells me that thermal design is obviously a limitation.
     
Simon
Posting Junkie
Join Date: Nov 2000
Location: in front of my Mac
Status: Offline
Reply With Quote
Sep 12, 2004, 12:57 PM
 
Originally posted by PEHowland:
Nevertheless, whether the new iMac was delayed due to thermal issues or not, I think the basic argument that the thermal design of the iMac has little headroom remains correct. The fact that the iMac is very similar in design to a laptop, and yet they can't produce a G5 laptop, tells me that thermal design is obviously a limitation.
Sorry, but that's pure bull.

How many times does it have to be stated on this forum that it's not true until everybody stops posting this kind of FUD?

1. There are faster GPUs that don't produce more heat.

2. Apple itself uses such GPUs in even tighter enclosures (like the ultra-thin 17" PB)

3. If they could fit a big fat hot PS in there instead of using an external brick, they could surely sacrifice it for a decent GPU.

Dammit, some people here just need to stop defending Apple and instead learn to use their brains. All people here are asking for is a simple BTO option. Nobody needs to take it, but the choice should be offered. It's certainly not too much to ask when Apple thinks an upper-middle-range priced computer should have the lowest-end $50 GPU. Just because Apple makes something doesn't mean it's a good idea. Learn to live with that.
     
Jablabla
Fresh-Faced Recruit
Join Date: Jan 2000
Location: mars,ca,usa
Status: Offline
Reply With Quote
Sep 12, 2004, 01:11 PM
 
Just buy a dual G4 tower for less and pop in a high-end video card. End of story.

I have to sympathise a little with the iMac's lacking graphics. Ya, the parents don't need it, but the kids need it! When pop is at the store, the kid wants the game!

Apple should have gone to the video card manufacturer and said: look, guys, we are going to sell many of these; can you give us a discount? Then pass on the savings...

On the other hand, a frame rate of over 28 per second is not going to be noticed by the human eye. Newer games, though, are just going to have models with more polygons, the result being that they move more hardware. It's a scam in a way...

I can also see the evil Apple in this. They're thinking: well, if it does video well enough, then people might not buy the tower...

     
bbxstudio
Grizzled Veteran
Join Date: Sep 2002
Location: Canada
Status: Offline
Reply With Quote
Sep 12, 2004, 01:12 PM
 
Originally posted by turtle777:
Shut up, Ca$h !
There is a bitchin' thread for folks like you, please rant out there !
The self-appointed forum police have spoken.
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Reply With Quote
Sep 12, 2004, 01:28 PM
 
Originally posted by Simon:
1. There are faster GPUs that don't produce more heat.
Such as?
2. Apple itself uses such GPUs in even tighter enclosures (like the ultra-thin 17" PB)
With a G5 processor inside?
3. If they could fit a big fat hot PS in there instead of using an external brick, they could surely sacrifice it for a decent GPU.
True.
Damit, some people here just need to stop defending Apple and instead learn to use their brains. All people here are asking for is a simple BTO option. Nobody needs to take it, but the choice should be offered. It's certainly not too much to ask if Apple thinks a upper middle range priced computer should have the lowest end $50 GPU. Just because Apple makes something doesn't mean it's a good idea. Learn to live with that.
I quite agree. In fact, I don't even own an Apple (yet) although I do have a Powerbook 1.5GHz and iMac 1.8GHz on order. I'm no apologist for Apple - in fact, I've regarded their systems as overpriced, underpowered machines for a long time. The iMac G5 actually changed that, believe it or not. I'm just unconvinced adding in a significantly faster GPU is as trivial as you suggest.
     
Lateralus
Moderator Emeritus
Join Date: Sep 2001
Location: Arizona
Status: Offline
Reply With Quote
Sep 12, 2004, 01:34 PM
 
Originally posted by PEHowland:
Such as?

...the Radeon 9600.
I like chicken
I like liver
Meow Mix, Meow Mix
Please de-liv-er
     
Simon
Posting Junkie
Join Date: Nov 2000
Location: in front of my Mac
Status: Offline
Reply With Quote
Sep 12, 2004, 01:35 PM
 
Originally posted by PEHowland:
Such as?
All mobile versions of the 9600, 9700 or even the 9800.

With a G5 processor inside?
Of course not. But the 1.5GHz 7445A runs hotter than a 1.8GHz 970fx anyway. So that actually supports my argument even more so.

I'm just unconvinced adding in a significantly faster GPU is as trivial as you suggest.
I can't judge how trivial this task is. I'm not an engineer and I don't work for Apple. But as a physicist I'm used to gathering facts and then using simple deduction. The facts I gather here tell me Apple already can do it, they just chose to not give us the choice. And that, in one word, just sucks.
     
pliny
Mac Elite
Join Date: Apr 2001
Location: under about 12 feet of ash from Mt. Vesuvius
Status: Offline
Reply With Quote
Sep 12, 2004, 02:21 PM
 
I wish somebody would post some specs for heat and cooling for various cards, and maybe as they've been used in various current Apple configurations. That might give us a better idea of what some of the ranges and design problems might be.

Even without this, I still think the best reason I've heard for not offering a BTO is that doing so might cut into PowerMac sales. It would also be a first for the iMac.

Still, the G5 iMac at least offers the chance to swap out the HDD and optical drive in a much easier way than past iMacs (and RAM seems easier to deal with than in the lamp); and also the LCD.
i look in your general direction
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Reply With Quote
Sep 12, 2004, 02:47 PM
 
Originally posted by PowerMacMan:
...the Radeon 9600.
Really? So what is the power consumption of the 5200 Ultra and the Radeon 9600?

Same question to Simon on the mobile 9700 and 9800.

I'm genuinely curious to learn how much less power these cards draw.
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Reply With Quote
Sep 12, 2004, 02:49 PM
 
Originally posted by Simon:
Of course not. But the 1.5GHz 7445A runs hotter than a 1.8GHz 970fx anyway. So that actually supports my argument even more so.
I bow to your greater knowledge here. But why then would Apple consistently claim that we shouldn't expect a G5 Powerbook any time soon due to thermal difficulties?

Why not just whip out the G4 and slap in a 1.8GHz G5. Seems too good to be true...
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Reply With Quote
Sep 12, 2004, 02:52 PM
 
Originally posted by Simon:
I can't judge how trivial this task is. I'm not an engineer and I don't work for Apple. But as a physicist I'm used to gathering facts and then using simple deduction. The facts I gather here tell me Apple already can do it, they just chose to not give us the choice. And that, in one word, just sucks.
I am an engineer (PhD in EE) but don't work for Apple either. As an engineer I'm very familiar with the reality that "the facts" and the reality of actually implementing something in hardware are often two very different beasts. That's why I'm prepared to give Apple the benefit of the doubt on this one. The facts I've heard so far don't convince me that the problem is necessarily as trivial as people are hoping.
     
dws
Forum Regular
Join Date: Apr 2001
Location: Minneapolis, MN, USA
Status: Offline
Reply With Quote
Sep 12, 2004, 03:19 PM
 
A note about the Core Image and Core Video API sets, since there seems to be a great deal of confusion about this subject...

The API sets are extremely intelligent. They look at the combination of CPU and GPU and act accordingly. If the GPU is incapable of handling a certain action by itself, then the API attempts to use the GPU and CPU in combination. If neither is capable, then special fail-safe routines occur which allow the program to keep running, even though the effect is not actually accomplished. For example, many have seen the Dashboard ripple effect. On a G5 iMac, the GPU would handle the effect alone. On a G4 iMac, a combination of the CPU and GPU might make the ripple do its thing. On a G3 iMac, the API call is bypassed entirely, with no CPU or GPU usage spike from the failed attempt - and no ripple effect.

Basically, what this means for the G5 iMac is that all Tiger eye-candy will appear, though some effects might utilize the CPU in addition to the GPU.

How about other programs? Let's say that Apple puts out a new version of iMovie that incorporates the Core Video APIs for transitions. Depending on the complexity of the transitions, some of them might utilize the CPU in addition to the GPU, while most of them will be handled by the GPU alone. Everything will still work, but some transitions will simply be a little slower to render than others (though much faster than on a G4 iMac).

What about programs that fully utilize the bleeding-edge functions within Core Image and Core Video? Again, the CPU will take up the slack. Eventually, programs will come out that will be so dog-slow (due to huge CPU utilization and the 5200 GPU) that they might not be a lot of fun to use on the G5 iMac; but these ultra-high-end video processing applications will certainly be meant for an audience that has something other than a consumer desktop.

One question that nobody outside of Apple knows the answer to is how much of a hit the CPU will take if it is used too heavily by APIs that look at the 5200 GPU and shake their heads in disgust! I'm sure that Apple will not include anything in the Tiger GUI that would create CPU usage spikes that would hamper the functionality of other programs, but other programs might easily do so. This is especially true for Core Image APIs that offer the coolest (and very easy to program) compositing functions. Too many of these calls, in too short a time period, just so your app looks way cool, could show significant degradation in overall G5 iMac performance.

The bottom line is that the combination of the G5, the faster FSB, and the (somewhat skimpy) GPU in the new iMac will create a desktop computer that will handle everything Core Image and Core Video throw at it, though possibly at speeds that won't thrill some people.
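The graceful degradation described above (GPU alone, then GPU+CPU, then a fail-safe skip) can be sketched roughly in Python. The class, method, and numeric "capability levels" here are entirely hypothetical illustrations, not Apple's actual API, which inspects real hardware features rather than a single number.

```python
# Hypothetical sketch of the capability-based dispatch described above:
# try the GPU alone, fall back to GPU+CPU, and skip the effect entirely
# (without a usage spike) when neither path is capable.

class Renderer:
    def __init__(self, gpu_level, cpu_level):
        self.gpu_level = gpu_level  # stand-in for GPU feature support
        self.cpu_level = cpu_level  # stand-in for CPU horsepower

    def apply_effect(self, effect_cost):
        if self.gpu_level >= effect_cost:
            return "gpu"            # GPU handles the effect alone
        if self.gpu_level + self.cpu_level >= effect_cost:
            return "gpu+cpu"        # CPU takes up the slack
        return "skipped"            # fail-safe: no effect, program keeps running

g5_imac = Renderer(gpu_level=5, cpu_level=8)  # modest GPU, fast CPU
g3_imac = Renderer(gpu_level=0, cpu_level=2)

print(g5_imac.apply_effect(effect_cost=4))    # gpu
print(g5_imac.apply_effect(effect_cost=10))   # gpu+cpu
print(g3_imac.apply_effect(effect_cost=4))    # skipped
```

The "gpu+cpu" branch is exactly the case dws worries about: everything still works, but heavy use of that path eats CPU time other programs would like to have.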
     
jamesa
Grizzled Veteran
Join Date: Sep 2000
Location: .au
Status: Offline
Reply With Quote
Sep 12, 2004, 09:33 PM
 
Originally posted by PEHowland:
I bow to your greater knowledge here. But why then would Apple consistently claim that we shouldn't expect a G5 Powerbook any time soon due to thermal difficulties?

Why not just whip out the G4 and slap in a 1.8GHz G5. Seems too good to be true...
I'm not claiming to know that one. But there's an interview floating around (I think it was originally on Macworld) with an Apple guy who says that the G5 was no big problem to get into the iMac, simply because of the additional thickness compared to the Powerbooks. They can set up their cooling "zones".

If they can fit a goddamn PSU in there, they can work out a way to put a better GPU in there as well. I think heat is a distraction from the issue.

-- james
     
george68  (op)
Banned
Join Date: Jun 2004
Status: Offline
Reply With Quote
Sep 12, 2004, 10:42 PM
 
Originally posted by Jablabla:

On the other hand, a frame rate of over 28 per second is not going to be noticed by the human eye. Newer games, though, are just going to have models with more polygons, the result being that they move more hardware. It's a scam in a way...
But guess what? If you run UT 2004 and you ONLY get 28fps, as soon as you start driving around a vehicle shooting at stuff you're going to be getting 10-15fps. This is why you want 50-60fps, so when the action starts happening it doesn't get all laggy.
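The headroom argument is just frame-time arithmetic: at 28fps the budget is about 36ms per frame, so a busy scene that doubles render time drops you to 14fps, while starting from 60fps the same spike still leaves a playable 30fps. A quick sketch (the 2x spike factor is only an assumption for illustration):

```python
def fps_after_spike(avg_fps, frametime_multiplier):
    """Resulting fps when a heavy scene multiplies the per-frame render time."""
    frame_ms = 1000.0 / avg_fps
    return 1000.0 / (frame_ms * frametime_multiplier)

print(round(fps_after_spike(28, 2.0)))  # drops to 14 fps: visibly laggy
print(round(fps_after_spike(60, 2.0)))  # drops to 30 fps: still playable
```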

- Rob
     
Simon
Posting Junkie
Join Date: Nov 2000
Location: in front of my Mac
Status: Offline
Reply With Quote
Sep 13, 2004, 03:35 AM
 
Originally posted by PEHowland:
But why then would Apple consistently claim that we shouldn't expect a G5 Powerbook any time soon due to thermal difficulties?
Well, they claimed this regarding the iMac after the PowerMac intro as well. The G5 PowerBook is on its way. It's just going to take some time. However, this has nothing to do with the GPU in the iMac.

Why not just whip out the G4 and slap in a 1.8GHz G5. Seems too good to be true...
Because the G5 comes along with a new board design, including a much accelerated bus and a new type of system controller that produces large amounts of heat. Eug posted some info on this chip in another thread, but I'm too lazy to go and find it. And then, as we all know, the PowerBook doesn't offer two inches of thickness like the iMac does. And since Apple chose to deal with the heat limitations of a narrow case, they should maybe also have looked at mobile GPUs instead of desktop versions. I'm ready to guess that a mobile 9800 would still slaughter a 5200. Makes me wonder if, once again, Apple just chose the cheaper over the better. Even if BTO would have catered to both issues...

The issue here is not that it's easier to build a thin G5 computer; the issue is that some people on this board, in their crusade to defend Apple by all means, are trying to tell me that a 5200 FX Ultra plus an internal PS run cooler than a 9700 alone. And that is just pure baloney. But what is Apple telling us with this? They'd rather force us to buy sucky 3D performance on a $2000 computer than an external PS brick. I'd say that speaks for itself.
( Last edited by Simon; Sep 13, 2004 at 03:45 AM. )
     
Simon
Posting Junkie
Join Date: Nov 2000
Location: in front of my Mac
Status: Offline
Reply With Quote
Sep 13, 2004, 03:49 AM
 
Originally posted by PEHowland:
The facts I've heard so far don't convince me that the problem is necessarily as trivial as people are hoping.
So, as an EE, are you ready to tell us that a 5200 ultra and an internal PS together run less hot than a 9700 alone? I don't really hope so.
     
Spliffdaddy
Posting Junkie
Join Date: Oct 2001
Location: South of the Mason-Dixon line
Status: Offline
Reply With Quote
Sep 13, 2004, 06:42 AM
 
heat dissipation/power requirement for most common video cards:

http://www.xbitlabs.com/articles/vid...v-power_9.html


There is no heat issue that's stopping Apple from incorporating a decent GPU in the iMac.
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Reply With Quote
Sep 13, 2004, 07:26 AM
 
Originally posted by Simon:
So, as an EE, are you ready to tell us that a 5200 ultra and an internal PS together run less hot than a 9700 alone? I don't really hope so.
No, I already agreed with you on that point, a few posts up. But given that Apple decided not to use an external power brick, the question remains whether it is thermally possible to use anything much better than a 5200 Ultra. You have just noted yourself that Apple had thermal difficulties with the iMac and that the G5 requires a heat-producing controller that is making a laptop difficult. Given that a laptop does have an external power source and the iMac doesn't - and Apple still can't get the laptop working - it suggests that the G5 probably uses up much of the advantage of the 2" case, and then some.

So we agree that an external PSU would have been a better option. Perhaps this will arrive in Rev B. But given that Rev A uses an internal PSU I remain confident that thermal issues prevented Apple from providing anything faster.

I note that no one has actually been able to show that there are any faster GPUs that run cooler than the 5200 Ultra - and Spliffdaddy's link above is a good indication of how power consumption scales with performance ... not well.
     
SpeedRacer
Senior User
Join Date: Jul 2000
Location: Istanbul
Status: Offline
Reply With Quote
Sep 13, 2004, 08:00 AM
 
As an original iMac owner who upgraded to a Voodoo 2 and competed toe-to-toe with PC freaks running 80fps faster back in the day, I have to say I feel for the cheap gaming Mac perspective.

Admittedly the iMac line has never met the needs of the state-of-the-art FPS shooter market; however, "gaming" is a very broad market/description, and the iMac (new and old) DOES cater to a good enough portion of it relative to Apple's overall drive/interest in the market. Apple is not a gaming company. They cannot now (and doubtfully ever will) compete with an aftermarket PC industry with a dozen different video card manufacturers, hundreds of different video card models, and tens of thousands of driver programmers. Financially ignorant or not, Apple is not pursuing the elite FPS market in gaming.

However, as I said, I feel for the cheap gaming Mac perspective, so for the purposes of contributing positively to this discussion I do feel there IS a compact, quiet Mac with upgrade possibilities that (complete at under $1k) has better gaming prowess, better upgrade possibilities, and (arguably) better looks than the iMac line...
  • $400 eBay purchase price
  • $200 video card upgrade
  • $300 processor upgrade
  • $99 for whatever else you like!
  • ------------------------------------
  • $999 total

It's hip to be square. Here's even a couple links to get you started...

Second Chance PC

Low-End Mac's Page

Sonnet Upgrades

PowerLogix Clear Case

CubeOwer

Kemplar Aftermarket Mac Systems

Wired News Article

Speed
( Last edited by SpeedRacer; Sep 13, 2004 at 08:07 AM. )
     
pantalaimon
Forum Regular
Join Date: Sep 2003
Status: Offline
Reply With Quote
Sep 13, 2004, 09:13 AM
 
Originally posted by SpeedRacer:
However, as I said, I feel for the cheap gaming Mac perspective, so for the purposes of contributing positively to this discussion I do feel there IS a compact, quiet Mac with upgrade possibilities that (complete at under $1k) has better gaming prowess...
"has better gaming prowess"

haha what? the cube faster than the G5 at gaming??
1.33GHz G4 iBook 12"
     
jamesa
Grizzled Veteran
Join Date: Sep 2000
Location: .au
Status: Offline
Reply With Quote
Sep 13, 2004, 09:58 AM
 
Originally posted by SpeedRacer:
However, as I said, I feel for the cheap gaming Mac perspective, so for the purposes of contributing positively to this discussion I do feel there IS a compact, quiet Mac with upgrade possibilities that (complete at under $1k) has better gaming prowess, better upgrade possiblities, and (arguably) better looks than the iMac line...
thank you for the positive contribution

... but that it should come to us having to go to those lengths to get what we want...

-- james
     
Sparkletron
Forum Regular
Join Date: May 2004
Status: Offline
Reply With Quote
Sep 13, 2004, 10:30 AM
 
Originally posted by PEHowland:
No, I already agreed with you on that point, a few posts up. But given that Apple decided not to use an external power brick, the question remains whether it is thermally possible to use anything much better than a 5200 Ultra.
Well there you go; remove the internal power supply; heat problem solved.

One of the problems with the Cube is that it leaves no headroom for upgrades that add heat unless a fan is added. And given that fanlessness, not merely the cubish form factor, was the overriding engineering design constraint, adding a fan would have made the Cube an engineering failure. In other words: even if the Cube had sold like crazy, there is no way Apple could have improved it without substantially altering the design and almost certainly adding some form of active cooling. As a fanless engineering platform, then, the Cube had nowhere to go.

I bring this up because I find it hard to believe that Apple would make the same mistake twice: designing a platform whose heat tolerance is so close to the margin that a GPU upgrade would push it over. If that is indeed the case, then the next revision will more likely be a complete and costly redesign.

-S
     
Spliffdaddy
Posting Junkie
Join Date: Oct 2001
Location: South of the Mason-Dixon line
Status: Offline
Reply With Quote
Sep 13, 2004, 12:20 PM
 
Originally posted by PEHowland:
I note that noone has actually been able to show that there are any faster GPU's that run cooler than the 5200 Ultra - and Spliffdaddy's link below is a good indication of how power consumption scales with performance ... not well.
Looks rather linear to me.

You go three times as fast - you use three times the power.

Or, Apple could have upgraded the 5200 to a Radeon 9600XT and saved one watt.

If heat was any sort of an issue for Apple, they would have used a 'mobile' version of a GPU, an external power brick, and maybe a low-power chipset (with a slower FSB).

Excluding poor thermal design, I don't see how 25 extra watts of dissipated heat would create a problem. Heck, each occupied RAM slot contributes just as much heat.

That being said, a remote power brick would eliminate about one-quarter of the internal heat, figuring that most power supplies are about 75% efficient, give or take.

Having an internal power supply is typically the most appealing solution - but not if the product's design must suffer for it, and certainly not if it says Apple on it. Maybe they should offer a chinless BTO option that includes a (suddenly attractive) power brick and a Radeon 9800 Pro.
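For what it's worth, the one-quarter figure is easy to sanity-check. A minimal sketch (the 75% efficiency and the 75 W component load are the post's rough assumptions, not measured specs):

```python
def internal_heat_watts(load_watts, efficiency=0.75, external_brick=False):
    """Estimate heat dissipated inside the case.

    Assumes every watt delivered to components eventually becomes heat,
    and the PSU's conversion loss is dissipated wherever the PSU lives.
    The 75% efficiency figure is a rough estimate, not a spec.
    """
    input_watts = load_watts / efficiency  # draw from the wall
    psu_loss = input_watts - load_watts    # heat wasted in the PSU itself
    # With an external brick, the PSU loss is dumped outside the case.
    return load_watts if external_brick else load_watts + psu_loss

# Hypothetical 75 W component load:
inside = internal_heat_watts(75)                      # internal PSU
brick = internal_heat_watts(75, external_brick=True)  # external brick
savings = 1 - brick / inside  # fraction of case heat moved outside
```

At 75% efficiency the PSU's loss is 25% of the wall draw, so moving it outside the case does indeed shed about a quarter of the total internal heat, whatever the actual load.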
( Last edited by Spliffdaddy; Sep 13, 2004 at 12:37 PM. )
     
PEHowland
Forum Regular
Join Date: Sep 2004
Status: Offline
Reply With Quote
Sep 13, 2004, 01:46 PM
 
Originally posted by Spliffdaddy:
Looks rather linear to me.

You go three times as fast - you use three times the power.

Or, Apple could have upgraded the 5200 to a Radeon 9600XT and saved one watt.

If heat was any sort of an issue for Apple, they would have used a 'mobile' version of a GPU, an external power brick, and maybe a low-power chipset (with a slower FSB).

Excluding poor thermal design, I don't see how 25 extra watts of dissipated heat would create a problem. Heck, each occupied RAM slot contributes just as much heat.

That being said, a remote power brick would eliminate about one-quarter of the internal heat, figuring that most power supplies are about 75% efficient, give or take.

Having an internal power supply is typically the most appealing solution - but not if the product's design must suffer for it, and certainly not if it says Apple on it. Maybe they should offer a chinless BTO option that includes a (suddenly attractive) power brick and a Radeon 9800 Pro.
Or you could buy the Sony Vaio W. That's a nice all-in-one 17" LCD machine costing a mere $1900. It even features integrated SiS651 graphics with a whopping 32MB of video RAM plus external power supply. Compared to the nearest competition - and Sony are no fools when it comes to elegant PC design - Apple seem to have done rather well for a $1500 machine. If it's so easy to include a 9800 in an LCD-based computer one wonders why Sony opted for integrated graphics...
     
jamesa
Grizzled Veteran
Join Date: Sep 2000
Location: .au
Status: Offline
Reply With Quote
Sep 13, 2004, 09:15 PM
 
Originally posted by PEHowland:
Or you could buy the Sony Vaio W. That's a nice all-in-one 17" LCD machine costing a mere $1900. It even features integrated SiS651 graphics with a whopping 32MB of video RAM plus external power supply. Compared to the nearest competition - and Sony are no fools when it comes to elegant PC design - Apple seem to have done rather well for a $1500 machine. If it's so easy to include a 9800 in an LCD-based computer one wonders why Sony opted for integrated graphics...
$$$$, maybe? Being even more el cheapo than Apple was with the GPU?

And maybe that's why they're doing so well?

-- james
     