
Virtual PC 7 DOES NOT have Native Video Card Support (Page 2)
RonnieoftheRose
Registered User
Join Date: Jun 2004
Sep 19, 2004, 02:11 PM
 
Originally posted by goMac:
No, Ronnie. It's NewWorld, the motherboard design introduced with the first iMac (also called Columbus).

GWorld is a Carbon graphics world you open for drawing.

Before you start throwing insults, please go get informed. I'm starting to doubt your expertise in this considering I actually do OpenGL programming, and know what a NewWorld Mac is.
I misunderstood based on your rambling.

GWorld has nothing to do with VPC. All the graphics in Windows are rendered on the CPU with no Open GL calls possible. Everyone agrees on this but you.

As for the motherboard, Mac hardware is even more PC-like now than ever. The only thing that separates a Mac and a PC is the CPU and OS, so I don't know why you brought up this 'NewWorld', which didn't exist back then. If anything, Windows NT for PPC would be easier to code for Macs now than back then.

No doubt you'll disagree with that, but then nobody, including MS, seems to agree with you. You were adamant that VPC7 had hardware graphics support and still won't admit you were wrong.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Sep 19, 2004, 02:20 PM
 
Originally posted by RonnieoftheRose:
I misunderstood based on your rambling.

GWorld has nothing to do with VPC. All the graphics in Windows are rendered on the CPU with no Open GL calls possible. Everyone agrees on this but you.

As for the motherboard, Mac hardware is even more PC-like now than ever. The only thing that separates a Mac and a PC is the CPU and OS, so I don't know why you brought up this 'NewWorld', which didn't exist back then. If anything, Windows NT for PPC would be easier to code for Macs now than back then.

No doubt you'll disagree with that, but then nobody, including MS, seems to agree with you. You were adamant that VPC7 had hardware graphics support and still won't admit you were wrong.
No, Ronnie. You don't get it. The image is rendered by the Mac OS. Yes, it is rendered on the CPU, but by Mac OS X. It is rendered on the CPU because a GWorld isn't hardware accelerated by Quartz Extreme. Using the same idea you could forward OpenGL calls made within Virtual PC to the native Mac OS X drivers, and those would be hardware accelerated.

Your strange idea of Virtual PC rasterizing the image is completely insane. You're trying to say that Windows within Virtual PC keeps its own graphics buffer, which is then rasterized and sent to the Mac OS. If this were the case the Mac OS would have to unrasterize it from the rasterized form, and then re-rasterize it for display within the Mac OS. This is completely stupid, and unless the Connectix team was on crack when they first wrote Virtual PC, they would not have done this.
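[Editor's note: the forwarding idea described in this post can be sketched in a few lines of Python. This is purely illustrative; `HostGL` and `GuestGLForwarder` are invented names, not anything from Virtual PC or Mac OS X. The point is only that the guest's drawing calls reach the host intact, rather than being rasterized guest-side first.]

```python
# Illustrative sketch of forwarding guest OpenGL-style calls to the host.
# All class and method names are invented, not real VPC internals.

class HostGL:
    """Stand-in for the host-side (Mac OS X) OpenGL implementation."""
    def __init__(self):
        self.executed = []

    def execute(self, call, *args):
        # In a real system this would hit the native, hardware-accelerated
        # driver; here we just record what arrived.
        self.executed.append((call, args))


class GuestGLForwarder:
    """Fake guest-side 'driver' that forwards calls instead of rendering."""
    def __init__(self, host):
        self.host = host

    def gl_call(self, name, *args):
        # No guest-side rasterization: the call goes straight to the host.
        self.host.execute(name, *args)


host = HostGL()
guest = GuestGLForwarder(host)
guest.gl_call("glClear", 0x4000)  # GL_COLOR_BUFFER_BIT
guest.gl_call("glDrawArrays", "GL_TRIANGLES", 0, 3)
```

Because the host receives the original calls rather than pixels, nothing ever has to be "unrasterized" on the Mac side.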

A Mac motherboard is nothing like a PC motherboard. Each different kind of Mac uses a different motherboard requiring a different driver. I can't imagine MS wanting to spend time writing a new motherboard driver every time a new Mac comes out. PCs use a standardized motherboard format that does not change.

In addition, power management and startup process on Mac motherboards is a lot different.

I'm really getting the feeling you have no technical experience about what you're talking about.
8 Core 2.8 ghz Mac Pro/GF8800/2 23" Cinema Displays, 3.06 ghz Macbook Pro
Once you wanted revolution, now you're the institution, how's it feel to be the man?
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Sep 19, 2004, 03:06 PM
 
Originally posted by goMac:
No, Ronnie. You don't get it.
You still haven't got it yourself. Nobody agrees with you.

The image is rendered by the Mac OS. Yes it is rendered on CPU, but by Mac OS X
OSX's graphic subsystem is only rendering its own Quartz-based Aqua interface. It doesn't render anything inside VPC's working window.

It is rendered on CPU because a GWorld isn't hardware accelerated by Quartz Extreme. Using the same idea you could forward OpenGL calls made within Virtual PC to the native Mac OS X drivers, and those would be hardware accelerated.
You can't issue Open GL calls from VPC to OSX if the motherboard being emulated in VPC can't see AGP and the graphics card being emulated barely supports 3D acceleration. The S3 Trio isn't even an Open GL spec'd chip.


Your strange idea of Virtual PC rasterizing the image is completely insane. You're trying to say that Windows within Virtual PC keeps its own graphics buffer, which is then rasterized and sent to the Mac OS.
Why do you think we have to allocate emulated graphics memory in VPC? And it isn't sent to Mac OSX. It all stays within VPC's working window.

A Mac motherboard is nothing like a PC motherboard. Each different kind of Mac uses a different motherboard requiring a different driver. I can't imagine MS wanting to spend time writing a new motherboard driver every time a new Mac comes out.
This is regarding NT/XP for PPC. Well, there are far fewer variations of PPC motherboards than Intel and AMD ones.

PCs use a standardized motherboard format that does not change.
PCs have standardized motherboards? This is a new one. Several different chipsets supporting many different Intel and AMD CPUs come out from AMD, NVidia, Intel, VIA and other manufacturers just about every 6-8 weeks.

In addition, power management and startup process on Mac motherboards is a lot different.
Are you saying they could never support that in NT/XP for PPC?

I'm really getting the feeling you have no technical experience about what you're talking about.
An easy accusation to make but nobody will buy it, especially when nobody agrees with you.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Sep 19, 2004, 03:13 PM
 
Originally posted by RonnieoftheRose:
You still haven't got it yourself. Nobody agrees with you.

OSX's graphic subsystem is only rendering its own Quartz-based Aqua interface. It doesn't render anything inside VPC's working window.

You can't issue Open GL calls from VPC to OSX if the motherboard being emulated in VPC can't see AGP and the graphics card being emulated barely supports 3D acceleration. The S3 Trio isn't even an Open GL spec'd chip.

Why do you think we have to allocate emulated graphics memory in VPC? And it isn't sent to Mac OSX. It all stays within VPC's working window.

This is regarding NT/XP for PPC. Well, there are far fewer variations of PPC motherboards than Intel and AMD ones.

PCs have standardized motherboards? This is a new one. Several different chipsets supporting many different Intel and AMD CPUs come out from AMD, NVidia, Intel, VIA and other manufacturers just about every 6-8 weeks.

Are you saying they could never support that in NT/XP for PPC?

An easy accusation to make but nobody will buy it, especially when nobody agrees with you.
Ronnie, are you dense? P has said the exact same thing as me. He said "1. Sending OpenGL commands directly to the OS X Open GL and translating Direct X commands to OpenGL.". I think no one agrees with you, and P has repeated exactly what I said.

Anything rendered on the screen has to be rendered by OS X. VPC can't magically render its own view. You can tell it's being rendered by OS X because when you minimize the window it holds its image, unlike something like an OpenGL program, which is bypassing Quartz and going right to OpenGL.

And while there are different manufacturers of PC motherboards, they all follow the same standard, which came from the original IBM PC motherboards.

Buy yourself a clue stick.

And yes, they could never support NT for the PPC; that's why it was DISCONTINUED.
     
Person Man
Professional Poster
Join Date: Jun 2001
Location: Northwest Ohio
Sep 19, 2004, 04:10 PM
 
Originally posted by goMac:
Ronnie, are you dense? P has said the exact same thing as me. He said "1. Sending OpenGL commands directly to the OS X Open GL and translating Direct X commands to OpenGL.". I think no one agrees with you, and P has repeated exactly what I said.

Anything rendered on the screen has to be rendered by OS X. VPC can't magically render its own view. You can tell it's being rendered by OS X because when you minimize the window it holds its image, unlike something like an OpenGL program, which is bypassing Quartz and going right to OpenGL.

And while there are different manufacturers of PC motherboards, they all follow the same standard, which came from the original IBM PC motherboards.

Buy yourself a clue stick.

And yes, they could never support NT for the PPC; that's why it was DISCONTINUED.
What they are trying to tell you is that in order to forward Open GL calls to the graphics card from VPC, you need to emulate an Open GL graphics card in the virtual machine. To do that you need to emulate an AGP-capable chipset, as well as a newer graphics card. It is this combination that will allow you to either translate DirectX or forward Open GL calls to the Mac OS.

You can't do it directly. To be the most compatible with all software, the PC needs to think it has AGP (which it does not, because Virtual PC emulates a motherboard chipset THAT CAME BEFORE AGP EXISTED) and think it has an Open GL (or DirectX) capable card installed in it in order for it to even SEND commands to the graphics card.

Once you emulate an AGP-capable PC motherboard and emulate a newer video card (or at least provide a Windows Driver that can forward the calls to the Mac's video card), THEN you can send the calls to the Mac's card. Not before.
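[Editor's note: the gating described in this post reduces to a one-line decision, sketched here with invented names. This is a toy model, not actual Virtual PC behavior: the guest OS only produces forwardable 3D calls if the emulated hardware advertises acceleration in the first place.]

```python
# Toy sketch of why the emulated card matters: the guest OS chooses its
# render path from what the emulated hardware advertises. Hypothetical
# names, not Virtual PC internals.

def guest_render_path(emulated_card_advertises_gl):
    if emulated_card_advertises_gl:
        return "forward-to-host"    # 3D calls exist and can be passed on
    return "software-rasterize"     # S3 Trio-era setup: CPU-only fallback

print(guest_render_path(False))  # emulated S3 Trio: nothing to forward
print(guest_render_path(True))   # newer emulated card: calls can be sent on
```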
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Sep 19, 2004, 04:13 PM
 
Originally posted by Person Man:
What they are trying to tell you is that in order to forward Open GL calls to the graphics card from VPC, you need to emulate an Open GL graphics card in the virtual machine. To do that you need to emulate an AGP-capable chipset, as well as a newer graphics card. It is this combination that will allow you to either translate DirectX or forward Open GL calls to the Mac OS.

You can't do it directly. To be the most compatible with all software, the PC needs to think it has AGP and think it has an Open GL (or DirectX) capable card installed in it in order for it to even SEND commands to the graphics card.
But you don't. The bus just acts as a bridge between the motherboard and the GPU. If you have a GPU on a PCI bus receiving OpenGL commands you can send those into OS X.

There is no reason you need to emulate an AGP card to do OpenGL.
     
Person Man
Professional Poster
Join Date: Jun 2001
Location: Northwest Ohio
Sep 19, 2004, 04:17 PM
 
Originally posted by goMac:
But you don't. The bus just acts as a bridge between the motherboard and the GPU. If you have a GPU on a PCI bus receiving OpenGL commands you can send those into OS X.

There is no reason you need to emulate an AGP card to do OpenGL.
No, you don't need to, but to be the most compatible with today's software (which is probably written to expect AGP), it's probably a good idea.

Even so, the fact remains that you would need to emulate a PCI graphics card that is newer than the S3 trio before you can forward calls.
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Sep 19, 2004, 05:16 PM
 
Originally posted by Person Man:

Even so, the fact remains that you would need to emulate a PCI graphics card that is newer than the S3 trio before you can forward calls.
I and others have been telling him this for almost a month; now he thinks everyone agrees with him and not me. The S3 Trio 32/64 barely had video acceleration support, let alone Open GL. The card cost 50 bucks back when Open GL supporting cards cost a grand or more. Even the Matrox Millennium barely supported Open GL's full set of features and cost ten times more than the S3 Trio.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Sep 19, 2004, 05:38 PM
 
There are PCI cards for which adequate OpenGL drivers exist - the Radeon 7000 for one. To do 1 on my list, you don't actually need to emulate a graphics card at all when rendering 3D. VPC would install a fake driver that doesn't do anything in OpenGL mode, it just sends the OpenGL calls to Mac OS X. This fake driver would also do the DirectX to Open GL translation and - why not? - 2D to OpenGL translation and send the translated commands on. The problem with approach 1 is the DirectX to OpenGL translation.
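[Editor's note: approach 1 as described here can be sketched as a guest-side fake driver. OpenGL calls pass through untouched; DirectX calls go through a translation table first. The call names and the tiny mapping below are invented for illustration, and the deliberately incomplete table is exactly where the stated difficulty (full DirectX-to-OpenGL translation) lives.]

```python
# Sketch of a "fake driver" that forwards OpenGL calls as-is and
# translates DirectX-style calls to OpenGL-style ones. All names are
# illustrative, not a real API.

DX_TO_GL = {
    "Clear": "glClear",
    "DrawPrimitive": "glDrawArrays",
    # A real translator would need to cover the whole DirectX API;
    # that coverage is the hard part of this approach.
}

class FakeDriver:
    def __init__(self):
        self.sent_to_host = []

    def submit_opengl(self, call, *args):
        # OpenGL calls need no translation; forward unchanged.
        self.sent_to_host.append((call, args))

    def submit_directx(self, call, *args):
        gl_call = DX_TO_GL.get(call)
        if gl_call is None:
            raise NotImplementedError(f"no GL translation for {call!r}")
        self.sent_to_host.append((gl_call, args))

drv = FakeDriver()
drv.submit_opengl("glClear", 0x4000)
drv.submit_directx("DrawPrimitive", "TRIANGLELIST", 0, 3)
```

Everything the host receives is OpenGL, regardless of which API the guest application spoke.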

To do approach 3, you would have to emulate a new chipset with an AGP bus, simply because most of today's boards do not come in PCI versions, so the driver on the Windows side would not know what was happening. The driver on the Windows side would be a stock driver, and the ATI drivers (say) would not expect to find a Radeon 9600 in the emulated PCI slot.

On the subject of who is rendering what... I think you're both actually saying basically the same thing. The process that owns a window is responsible for what shows up inside the window (the WindowServer is responsible for drawing the window frame etc.). The program does not in itself have actual access to the hardware - of course it doesn't, it would need graphics drivers for every single board - but rather calls on built-in OS X APIs that send commands to the driver and then on to the graphics board. Whether that counts as the program or the operating system rendering the image, I will leave to you.

If MS wants to do 3, which is what I think they're doing, they will have to violate this model. They will bypass the entire OS X graphics subsystem and have the Windows driver send its commands directly to the board in the Mac. This would have to be done in fullscreen mode only, because the Windows side of the system won't know what the Mac side has drawn on the remainder of the screen.

Please note that this would not be the first time anyone has done this. XFree86 for Mac OS X does the same thing in fullscreen mode, its own drivers and everything.
     
Person Man
Professional Poster
Join Date: Jun 2001
Location: Northwest Ohio
Sep 19, 2004, 06:06 PM
 
Originally posted by P:
There are PCI cards for which adequate OpenGL drivers exist - the Radeon 7000 for one. To do 1 on my list, you don't actually need to emulate a graphics card at all when rendering 3D. VPC would install a fake driver that doesn't do anything in OpenGL mode, it just sends the OpenGL calls to Mac OS X. This fake driver would also do the DirectX to Open GL translation and - why not? - 2D to OpenGL translation and send the translated commands on. The problem with approach 1 is the DirectX to OpenGL translation.
Yes, but because most software on the PC today uses DirectX, to be the most compatible, you need to do your approach 3.

If MS wants to do 3, which is what I think they're doing, they will have to violate this model. They will bypass the entire OS X graphics subsystem and have the Windows driver send its commands directly to the board in the Mac. This would have to be done in fullscreen mode only, because the Windows side of the system won't know what the Mac side has drawn on the remainder of the screen.
This may or may not work. Remember that you have Mac ROMs and PC ROMs. A PC card doesn't just work in a Mac unless you are able to flash it with a Mac ROM. There will need to be some layer of translation between the two. A Windows driver will NOT be able to directly drive a Mac card (unless they write special drivers for each type of Mac card, but this would require you to write drivers for each new card as they come out, and would be much more work than emulating one card and using the stock Windows drivers for that card, regardless of what graphics card is actually installed in the Mac).
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Sep 19, 2004, 06:20 PM
 
Originally posted by Person Man:
No, you don't need to, but to be the most compatible with today's software (which is probably written to expect AGP), it's probably a good idea.

Even so, the fact remains that you would need to emulate a PCI graphics card that is newer than the S3 trio before you can forward calls.
Software is not written for AGP. It is either written in OpenGL or DirectX. An application never does direct hardware access and has no clue what bus the card is on. The bus the card is on is irrelevant. When I code OpenGL I never take into account whether the card is on AGP or PCI.

The point I raised with DirectX is that you would need a DirectX -> OpenGL wrapper to translate DirectX to OpenGL. This is most likely what MS is stalled on.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Sep 19, 2004, 06:22 PM
 
Originally posted by RonnieoftheRose:
I and others have been telling him this for almost a month; now he thinks everyone agrees with him and not me. The S3 Trio 32/64 barely had video acceleration support, let alone Open GL. The card cost 50 bucks back when Open GL supporting cards cost a grand or more. Even the Matrox Millennium barely supported Open GL's full set of features and cost ten times more than the S3 Trio.
Don't you get it? It's an emulated graphics card. They could throw OpenGL support on and they'd be fine.

You keep acting like it's emulating a real S3 Trio. It's not.
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Sep 19, 2004, 06:36 PM
 
Originally posted by goMac:
Don't you get it? It's an emulated graphics card. They could throw OpenGL support on and they'd be fine.

You keep acting like it's emulating a real S3 Trio. It's not.
*cough*
     
arekkusu
Mac Enthusiast
Join Date: Jul 2002
Sep 19, 2004, 08:22 PM
 
This thread is pointless, but just to correct some info (and play devil's advocate a little):

Originally posted by goMac:
You're trying to say that Windows within Virtual PC keeps its own graphics buffer, which is then rasterized and sent to the Mac OS. If this were the case the Mac OS would have to unrasterize it from the rasterized form, and then re-rasterize it for display within the Mac OS. This is completely stupid
But this is exactly what happens if you use the GDI software GL renderer in VPC now!
* emulated app submits GL commands
* GDI renders into Windows-managed buffer, Windows composites to its framebuffer
* VPC passes the framebuffer to Quartz to draw into the VPC window

At no point is there any hardware acceleration in this (excepting Quartz Extreme for the general OS X window management) and there very well may be some "unrasterize" steps (i.e. endian swaps) in there, depending what pixel format all of this happens in. (Probably not, if Connectix used a regular ARGB GWorld.)
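[Editor's note: the three-step software path above can be mocked end to end. Everything here is a stand-in, with no real GDI or Quartz involved: the guest fills a plain pixel buffer on the CPU, an optional endian swap models the possible "unrasterize" step, and the host side just blits whatever buffer it is handed.]

```python
import struct

def guest_render(width, height, color):
    # Guest-side software rasterization: fill a framebuffer with one color.
    return [color] * (width * height)

def maybe_swap_endianness(framebuffer, swap):
    # The only host-side pixel transformation that might be needed:
    # reversing the byte order of each 32-bit pixel.
    if not swap:
        return framebuffer
    return [struct.unpack("<I", struct.pack(">I", px))[0] for px in framebuffer]

def host_blit(framebuffer):
    # Host (Quartz stand-in): draw the pixels exactly as handed over.
    return list(framebuffer)

fb = guest_render(2, 2, 0x00FF00FF)                       # CPU rendering
shown = host_blit(maybe_swap_endianness(fb, swap=False))  # buffer hand-off
```

With matching pixel formats (`swap=False`) the host shows the guest's buffer byte for byte; no re-rendering happens anywhere.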


Originally posted by goMac:
You can tell its being rendered by OS X because when you minimize the window it holds its image, unlike something like an OpenGL program, which is bypassing Quartz and going right to OpenGL.
This quirk of the Dock is easily worked around in any well-written GL app though. So you can't tell that way if VPC is using GL (but, it isn't.)
( Last edited by arekkusu; Sep 19, 2004 at 08:28 PM. )
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Sep 19, 2004, 09:39 PM
 
Originally posted by arekkusu:
But this is exactly what happens if you use the GDI software GL renderer in VPC now!
* emulated app submits GL commands
* GDI renders into Windows-managed buffer, Windows composites to its framebuffer
* VPC passes the framebuffer to Quartz to draw into the VPC window
That's because it's software rendering. I wasn't referring to GL in Windows, I was referring to general 2D drawing. Obviously Windows is going to draw with the GL software renderer. That's composited below the graphics card.

My point is hardware GL could use the same route normal 2D data uses going out. Obviously software GL is rasterized first, then follows the normal display data path.
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Sep 19, 2004, 09:49 PM
 
Originally posted by goMac:
That's because it's software rendering. I wasn't referring to GL in Windows, I was referring to general 2D drawing.
All Windows 2D operations in VPC are software rendered on the CPU. People have been telling you this but you won't accept it. It doesn't hurt to admit being wrong even if you've tried so hard to be right.

     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Sep 19, 2004, 09:58 PM
 
Originally posted by RonnieoftheRose:
All Windows 2D operations in VPC are software rendered on the CPU. People have been telling you this but you won't accept it. It doesn't hurt to admit being wrong even if you've tried so hard to be right.

Ronnie...

He was talking about software OpenGL in Windows. I don't disagree on that. Software OpenGL in Windows is rendered in Windows and then in Mac OS X again.

He said nothing about 2D operations.

Please read the comments.

And once again, P's point #1 is exactly what I'm saying. I am not wrong. He is saying the exact same thing.
8 Core 2.8 ghz Mac Pro/GF8800/2 23" Cinema Displays, 3.06 ghz Macbook Pro
Once you wanted revolution, now you're the institution, how's it feel to be the man?
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Sep 19, 2004, 10:39 PM
 
Originally posted by goMac:
Ronnie...

He was talking about software OpenGL in Windows. I don't disagree on that. Software OpenGL in Windows is rendered in Windows and then in Mac OS X again.
This is going to be my last post about this. First of all, Open GL doesn't exist in Windows on VPC anyway. If you try to run any Open GL app like Lightwave or 3DS Max, the app will show you that it can't find any Open GL hardware. Any app that has a 3D object will be rendered using a software z-buffer on the CPU alone, no graphics acceleration at all. Just try to run 3DS Max and read the dialog box that appears when the app launches.

Therefore it is illogical to assume that Open GL calls in Windows will be sent to OSX. Nothing will be, because there is nothing to send in the first place.

Lastly, and I really mean it, you used the analogy of thumbnails minimizing to the Dock to support your idea that Windows has Open GL support. That has nothing to do with the emulation engine or Windows. This is simply a feature of Quartz to capture a working window's image and then shrink or enlarge it from the Dock.

End of story.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Sep 20, 2004, 01:58 AM
 
Originally posted by RonnieoftheRose:
This is going to be my last post about this. First of all, Open GL doesn't exist in Windows on VPC anyway. If you try to run any Open GL app like Lightwave or 3DS Max, the app will show you that it can't find any Open GL hardware. Any app that has a 3D object will be rendered using a software z-buffer on the CPU alone, no graphics acceleration at all. Just try to run 3DS Max and read the dialog box that appears when the app launches.

Therefore it is illogical to assume that Open GL calls in Windows will be sent to OSX. Nothing will be, because there is nothing to send in the first place.

Lastly, and I really mean it, you used the analogy of thumbnails minimizing to the Dock to support your idea that Windows has Open GL support. That has nothing to do with the emulation engine or Windows. This is simply a feature of Quartz to capture a working window's image and then shrink or enlarge it from the Dock.

End of story.
What are you talking about? Did you completely miss the point?

I said OpenGL runs in software mode on VPC. That's not hardware accelerated. You just repeated what I said in regard to software rendering. We're not talking about software rendering, we're talking about hardware rendering. I have agreed twice now that software rendering happens within Windows. I KNOW VPC has no hardware acceleration right now. I really don't get where you think I said you can run programs that require OpenGL hardware acceleration. If you enabled OpenGL on the emulated graphics card you could forward the calls to Mac OS. That is the whole point of this discussion. This is P's point #1 (as I've said before). This is what I'm arguing, and what you seem to have a problem with.

Minimizing to the dock can be accurate because OpenGL is composited directly by the graphics card and not the window manager. The bit of code posted actually copies the current OpenGL image back to Quartz as simply a placeholder; it's not an actual OpenGL rendering.

And if the Mac OS is not drawing the VPC window, what is? Magic? The Mac OS must draw it. VPC can't magically draw itself.

It just seems like you've taken a bunch of random facts and are trying to tie them together. I've done OpenGL programming for the past two years. I have a technical understanding of how this works. I'm beginning to doubt you do.
     
zen jihad
Registered User
Join Date: May 2004
Location: Just a groove in "G"
Sep 22, 2004, 06:11 AM
 
From AppleInsider:

"Feeling pressure from both Apple and G5 customers, Microsoft this summer cut several key enhancements from its Virtual PC 7.0 Windows emulation software in order to deliver a G5 compatible solution without further delays.

One of the features reportedly shelved until a future release was native graphics card support. But precisely what is delaying this feature remains a mystery to even some members of the Virtual PC team, as they are not the ones responsible for the implementation.

According to sources, Virtual PC's native graphics card support is being handled exclusively by Microsoft's Xbox team. Though not expected for several months, the feature will reportedly demand a graphics card that meets the same level of graphics sophistication required for Apple's Core Image and Video technology.

For Macintosh systems that sport a compatible ATI graphics card, future versions of Virtual PC will emulate an original Radeon with up to 32MB of virtual video memory. Likewise, for Macs equipped with a compliant Nvidia graphics card, sources said that the emulated chipset will be a Geforce 3 with up to 32MB of virtual video memory."
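[Editor's note: if the report is accurate, the vendor-matching rule it describes amounts to a small dispatch table. The sketch below is a guess at that logic, not Microsoft's code; the function and error message are invented.]

```python
# Hypothetical sketch of the reported card-matching rule: emulate a card
# from the same vendor family as the host GPU, capped at 32 MB of
# virtual video memory.

def pick_emulated_card(host_vendor):
    table = {
        "ATI": ("Radeon", 32),        # original Radeon, up to 32 MB VRAM
        "Nvidia": ("GeForce 3", 32),  # GeForce 3, up to 32 MB VRAM
    }
    if host_vendor not in table:
        # No compliant host GPU: native graphics support unavailable.
        raise ValueError(f"unsupported host GPU vendor: {host_vendor!r}")
    return table[host_vendor]

print(pick_emulated_card("ATI"))
print(pick_emulated_card("Nvidia"))
```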
     
arekkusu
Mac Enthusiast
Join Date: Jul 2002
Sep 22, 2004, 10:11 AM
 
That's very interesting, if nonsensical.

Why would they pick a Radeon and GF3 as the emulated cards? A Radeon 8500 would be the nearest GF3 equivalent.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Sep 22, 2004, 01:38 PM
 
Originally posted by arekkusu:
That's very interesting, if nonsensical.

Why would they pick a Radeon and GF3 as the emulated cards? A Radeon 8500 would be the nearest GF3 equivalent.
The XBox had a GF3. I'm assuming this is why. It sounds like they are doing some sort of XBox 1 emulation and handing it over to VPC team.

This is kind of confusing though. I don't see how the XBox team would be dabbling in OpenGL, which is needed for Virtual PC for Mac. The PC version wouldn't need any OpenGL stuff... but the Mac version...
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Sep 22, 2004, 06:25 PM
 
Originally posted by arekkusu:
That's very interesting, if nonsensical.

Why would they pick a Radeon and GF3 as the emulated cards? A Radeon 8500 would be the nearest GF3 equivalent.
No. The Radeon 8500 was concurrent with the late GF4 series. The original Radeon competed with the Geforce 3.

And of course none of these cards are compatible with an old HX or FX chipset so expect a total rewrite of VPC to emulate a far newer motherboard.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Sep 22, 2004, 07:12 PM
 
Originally posted by RonnieoftheRose:
And of course none of these cards are compatible with an old HX or FX chipset so expect a total rewrite of VPC to emulate a far newer motherboard.
: sighs... slams head on table :

You're back again...

They don't even need to re-write it (and you wouldn't need a complete re-write anyway; that's a huge overstatement). I'm assuming they might add AGP, but the original Radeon was PCI. The GeForce 3 had a PCI version.

Oh... and the original Radeon competed with the original GeForce.

Either way, since they're using your real hardware, it won't be as slow as the emulated card.

The Radeon was around in 1999...
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Sep 22, 2004, 07:50 PM
 
Originally posted by goMac:
: sighs... slams head on table :

You're back again...
Looks like I have to slap you down again. Watch this, baby.


They don't even need to re-write it (and you wouldn't need a complete re-write anyway, thats a huge overstatement). I'm assuming they might add AGP, but the original Radeon was PCI. The GeForce 3 had a PCI version.
An emulated HX-FX chipset will still have problems communicating with a modern graphics card, especially one that supports Core Graphics.

Oh... and the original Radeon competed with the orignal GeForce.
The Radeon wasn't out at the time. The original Geforce came out around the time of the Rage 128 and beat it. The Geforce was the first GPU-class graphics card with transform and lighting.

ATI then released the ATI Rage 128 MaXX, which had two graphics chips on board but still wasn't a GPU with T+L. It still did very well in benchmarks, though. These cards were also competing with the last of the 3dFX Voodoo 3 series at the time.


Then the Geforce 2, Geforce 2 Pro, Geforce 2 Ultra and MX came out.

3dFX then released its last generation of cards which were a catastrophic failure. Nvidia bought 3dFX out.

ATI was taking its time with the Radeon design, which came out roughly around the time the Geforce 3 was released.

ATI was losing the battle. But then they released the Radeon 8500. This challenge was answered by the Geforce 4 family. The 8500 went through a few more revisions and beat the Geforce 4 in most benchmarks.

That brings us to the ATI Radeon 9x00 series and the Geforce 5x00 series. Again ATI beat Nvidia in many benchmarks.

Now we're on ATI's X800 and X600 family and Nvidia's 6x00 series.

Next time don't play smart with sentences beginning with "Oh... and..." I bought just about every graphics card and motherboard in the Wintel world prior to moving to Macs full time in 2002. I know every model, release date, and performance rating of all of it.

You just got slapped down. Now shut it, baby.
( Last edited by RonnieoftheRose; Sep 22, 2004 at 08:15 PM. )
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Sep 22, 2004, 09:56 PM
 
Originally posted by RonnieoftheRose:
Looks like I have to slap you down again. Watch this, baby.




An emulated HX-FX chipset will still have problems communicating with a modern graphics card, especially one that supports Core Graphics.



The Radeon wasn't out at the time. The original Geforce came out around the same time as, and beat, the Rage 128. The Geforce was the first GPU-class graphics card with transform and lighting.

ATI then released the ATI Rage 128 MaXX, which had two graphics chips on board but still wasn't a GPU with T+L. It still did very well in benchmarks, though. These cards were also competing with the last of the 3dFX Voodoo 3 series at the time.


Then the Geforce 2, Geforce 2 Pro, Geforce 2 Ultra and MX came out.

3dFX then released its last generation of cards which were a catastrophic failure. Nvidia bought 3dFX out.

ATI was taking its time with the Radeon design, which came out roughly around the time the Geforce 3 was released.

ATI was losing the battle. But then they released the Radeon 8500. This challenge was answered by the Geforce 4 family. The 8500 went through a few more revisions and beat the Geforce 4 in most benchmarks.

That brings us to the ATI Radeon 9x00 series and the Geforce 5x00 series. Again ATI beat Nvidia in many benchmarks.

Now we're on ATI's X800 and X600 family and Nvidia's 6x00 series.

Next time don't play smart with sentences beginning with "Oh... and..." I bought just about every graphics card and motherboard in the Wintel world prior to moving to Macs full time in 2002. I know every model, release date, and performance rating of all of it.

You just got slapped down. Now shut it, baby.
Dude... what are you talking about? They aren't doing direct hardware access. Core Image has nothing to do with Windows.

If you want to see smack down... here it is...

Apple was the first company to offer the GeForce3. They had been using the Radeon before this. If the Radeon came out at the same time as the GeForce3, how was Apple using it for a few years prior? The GeForce3 REPLACED the Radeon as the graphics on the Power Mac G4.

The GeForce3 was released in about Oct of 2001. The aftermarket Radeon came out FOR MAC in about September of 2000. This ran notably later than the PC version, which was already in its second incarnation.

Your timeline is wrong. Go get the facts.

Edit: I can find reviews of the DDR version of the Radeon dating back to April of '00. Still looking for the exact introduction date....

Edit: The Radeon 8500 was the card that was released to combat the GeForce3:

http://www20.graphics.tomshardware.c...on8500-09.html

By then the Radeon was already a well established brand.
( Last edited by goMac; Sep 22, 2004 at 10:05 PM. )
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Status: Offline
Reply With Quote
Sep 22, 2004, 10:02 PM
 
Originally posted by goMac:


Apple was the first company to offer the GeForce3. They had been previously using the Radeon before this. If the Radeon came out at the same time as the GeForce3, how was Apple using it for a few years prior? The GeForce3 REPLACED the Radeon as the graphics on the Power Mac G4.
As per your question above, read what I wrote carefully. Quote:

"ATI was taking its time with the Radeon design, which came out roughly around the time the Geforce 3 was released."

Also, Apple might have been the first to announce they would use the GF3 but they weren't able to ship it for a while. By the time they did PC buyers had a far wider and cheaper selection. Most still opted for the Radeon and later revisions of the Radeon.

The aftermarket Radeon came out FOR MAC in about September of 2000. This ran notably later than the PC version, which was already in its second incarnation.
I bought a Dual G4-450 in 2001. It came with a Rage 128 with no Radeon option.

Can you stop making a damn fool out of yourself? Not one person agrees with you on any point you made. You're living in a fantasyland where you are always right, I'm always wrong, and yet you still won't take back your claim that VPC 7 has hardware graphics support. For ****'s sake.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Sep 22, 2004, 10:10 PM
 
Originally posted by RonnieoftheRose:
As per your question above, read what I wrote carefully. Quote:

"ATI was taking its time with the Radeon design, which came out roughly around the time the Geforce 3 was released."

Also, Apple might have been the first to announce they would use the GF3 but they weren't able to ship it for a while. By the time they did PC buyers had a far wider and cheaper selection. Most still opted for the Radeon and later revisions of the Radeon.

Can you stop making a damn fool out of yourself? Not one person agrees with you on any point you made. You're living in a fantasyland where you are always right, I'm always wrong, and yet you still won't take back your claim that VPC 7 has hardware graphics support. For ****'s sake.
I've already said VPC7 doesn't have graphics card support, but as always, you never read my posts.

It's quite obvious you've switched over because you have no idea how Mac OS X works, and how you'd do graphics acceleration. You have some horrible idea that you'd give VPC direct hardware access, or even stranger, that a port of Windows to Mac would be a good idea. While these are wonderful things to imagine, in practice they fall apart.

And Microsoft does not have to do a major re-write to get AGP graphics working. The drivers are already in place for them. Motherboard designers didn't have to start from scratch when they went to AGP. All they have to do is add an emulated AGP port, which is nowhere near as hard as, say, writing an x86 emulator.

I already posted the release dates. They speak for themselves. You can argue with me all you want, but I don't think you can argue with the release dates.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Sep 22, 2004, 10:11 PM
 
Originally posted by RonnieoftheRose:
I bought a Dual G4-450 in 2001. It came with a Rage 128 with no Radeon option.
That's very odd, because ATI was selling them in 2001.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Sep 22, 2004, 10:16 PM
 
The ATI Radeon shipped July 17th, 2000. The card NVidia was offering at the time was the GeForce GTS.

http://www.geek.com/news/geeknews/q2...0717001881.htm
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Status: Offline
Reply With Quote
Sep 22, 2004, 10:29 PM
 
Originally posted by goMac:
The ATI Radeon shipped July 17th, 2000. The card NVidia was offering at the time was the GeForce GTS.

http://www.geek.com/news/geeknews/q2...0717001881.htm
The GTS was a very late revision of the original Geforce series. Like I said, when the Geforce came out it beat the Rage 128. ATI's response was the Rage 128 MaXX. Tom's Hardware reviewed the Maxx in December '99.

http://graphics.tomshardware.com/graphic/19991230/

I quote my previous post

The original Geforce came out around the same time as, and beat, the Rage 128. The Geforce was the first GPU-class graphics card with transform and lighting.

ATI then released the ATI Rage 128 MaXX, which had two graphics chips on board but still wasn't a GPU with T+L. It still did very well in benchmarks, though. These cards were also competing with the last of the 3dFX Voodoo 3 series at the time.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Sep 22, 2004, 10:32 PM
 
Originally posted by RonnieoftheRose:
The GTS was a very late revision of the original Geforce series. Like I said, when the Geforce came out it beat the Rage 128. ATI's response was the Rage 128 MaXX. Tom's Hardware reviewed the Maxx in December '99.

http://graphics.tomshardware.com/graphic/19991230/

I quote my previous post
But as I have shown the Radeon came well before the GeForce3, and was available in 2000.

In addition, it was PCI. The GeForce3 had a PCI version also. Microsoft does not have to add AGP, and skipping it would cost no performance. Both are easily capable of OpenGL.
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Status: Offline
Reply With Quote
Sep 22, 2004, 10:33 PM
 
Originally posted by goMac:
Motherboard designers didn't have to start from scratch when they went to AGP. All they have to do is add an emulated AGP port
I've had enough. You don't add an 'emulated' AGP port to a motherboard. And you can't add AGP to a motherboard whose chipset doesn't support it.
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Status: Offline
Reply With Quote
Sep 22, 2004, 10:34 PM
 
Originally posted by goMac:
But as I have shown the Radeon came well before the GeForce3, and was available in 2000.
Look, just read my post carefully for one ****ing time. I didn't say it came out AFTER the Geforce 3. I said it came out around the time of the Geforce 3. I even mentioned it between the GF2 and GF3.

Just ****ing grow up. I've had enough of talking to you.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Sep 22, 2004, 10:36 PM
 
Originally posted by RonnieoftheRose:
Look, just read my post carefully for one ****ing time. I didn't say it came out AFTER the Geforce 3. I said it came out around the time of the Geforce 3. I even mentioned it between the GF2 and GF3.

Just ****ing grow up. I've had enough of talking to you.
That's great... But as I've shown... it came out even before the GeForce2.

A year and a half's difference isn't "around the time". There's a two-generation difference there on both the Radeon and NVidia sides.
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Status: Offline
Reply With Quote
Sep 22, 2004, 10:46 PM
 
Originally posted by goMac:
That's great... But as I've shown... it came out even before the GeForce2.

A year and a half's difference isn't "around the time". There's a two-generation difference there on both the Radeon and NVidia sides.
One last post. Read carefully. Radeon was the response to the Geforce 2. You say above it came out before the Geforce 2 and competed with the original Geforce. Wrong, wrong, wrong. Again, again, again.

http://www.guru3d.com/review/ati/radeon/

Thus enters the Radeon, ATI's latest and greatest. At Radeon's heart is the Rage6C. Make no mistake, the core is beyond anything that ATI has ever constructed; it's not an updated version of the Rage128 cores. The chip's .18-micron architecture is somewhat more comparable to that of the GeForce2 than the Voodoo 5500 and its dual .25-micron VSA-100 chips. However, it has two rendering pipelines, each with three texture units per pipeline, which adds up to a total of 6 texels per clock, 2 less than that of the GeForce2, which has 4 pipelines each capable of passing two textures per clock; so in theory the Radeon starts to seem inferior, but we'll see if that's true or not.


So what we have here is ATI's response to the Voodoo 5500 and the Geforce II, and this time, ATI managed to get our attention by actually releasing the board earlier than previously promised. How does it stack up, you ask? Let's have a look.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Sep 22, 2004, 11:00 PM
 
Originally posted by RonnieoftheRose:
One last post. Read carefully. Radeon was the response to the Geforce 2. You say above it came out before the Geforce 2 and competed with the original Geforce. Wrong, wrong, wrong. Again, again, again.

http://www.guru3d.com/review/ati/radeon/

Thus enters the Radeon, ATI's latest and greatest. At Radeon's heart is the Rage6C. Make no mistake, the core is beyond anything that ATI has ever constructed; it's not an updated version of the Rage128 cores. The chip's .18-micron architecture is somewhat more comparable to that of the GeForce2 than the Voodoo 5500 and its dual .25-micron VSA-100 chips. However, it has two rendering pipelines, each with three texture units per pipeline, which adds up to a total of 6 texels per clock, 2 less than that of the GeForce2, which has 4 pipelines each capable of passing two textures per clock; so in theory the Radeon starts to seem inferior, but we'll see if that's true or not.


So what we have here is ATI's response to the Voodoo 5500 and the Geforce II, and this time, ATI managed to get our attention by actually releasing the board earlier than previously promised. How does it stack up, you ask? Let's have a look.
Perhaps you should read the review. That's the second-generation 64 MB Radeon. The first one was 32 MB. The 64 MB one did not ship until later.
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Status: Offline
Reply With Quote
Sep 22, 2004, 11:22 PM
 
Originally posted by goMac:
Perhaps you should read the review. That's the second-generation 64 MB Radeon. The first one was 32 MB. The 64 MB one did not ship until later.
Oh boy. Look at the specs on the second page. The chip always supported up to 128 MB, and the version reviewed is a 64 MB version. 32 MB and 64 MB versions were available immediately, depending on the manufacturer.

Did you even notice this review is the introduction of the Radeon chip and that no previous revision existed?
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Sep 22, 2004, 11:42 PM
 
Originally posted by RonnieoftheRose:
Oh boy. Look at the specs on the second page. The chip always supported up to 128 MB, and the version reviewed is a 64 MB version. 32 MB and 64 MB versions were available immediately, depending on the manufacturer.

Did you even notice this review is the introduction of the Radeon chip and that no previous revision existed?
The chips supported up to 128 MB, yes, but they certainly did not ship with 128 MB initially. The first ones were only 32 MB.

I quote from my earlier link:

"The first Radeon board features 32 MB of DDR (Double Data Rate) memory and is available for US$279."

64 MB started shipping later. It was announced but shipped after the 32 MB version. Much like how the Mac GeForce 6800 was announced 2 months ago.
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Status: Offline
Reply With Quote
Sep 22, 2004, 11:50 PM
 
Originally posted by goMac:


64 MB started shipping later. It was announced but shipped after the 32 MB version. Much like how the Mac GeForce 6800 was announced 2 months ago.
Slapping you down again. I had a 64 MB Geforce 2 Pro. When the Radeon came out I bought an Asus Radeon 64 MB with TV I/O, which I put in a machine with a PIII. Some manufacturers shipped them with 64 MB immediately. That review SHOWS one example.

Now cut it out. You don't know anything.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Sep 23, 2004, 01:44 AM
 
Originally posted by RonnieoftheRose:
Slapping you down again. I had a 64 MB Geforce 2 Pro. When the Radeon came out I bought an Asus Radeon 64 MB with TV I/O, which I put in a machine with a PIII. Some manufacturers shipped them with 64 MB immediately. That review SHOWS one example.

Now cut it out. You don't know anything.
I don't know what you bought, but all the reviews say 32 MB initially, 64 MB later. One review stated that they had tested a 64 MB version but there were no 64 MB cards on the market.

You seem to have issues with buying things. Apparently you were having trouble buying a Radeon with your G4 in 2001. I have a feeling the order of things you have in your head is slightly off.

How old are you exactly, and what does the amount of memory on the Radeon have to do with VPC? We've already established that the Radeon is at least a year and a half older than the GeForce3, which was the point of this discussion. We've established that porting Windows to Mac is a technical nightmare, and that it is a stupid idea to give VPC direct access to your graphics. We've also established that the best way to do things is to pass the OpenGL commands from the virtual GPU to the real one, which was backed up by P.

We can spend all day long debating how much memory came on the Radeon, but the way I see it, my point is already proven. The GeForce3 and the original Radeon are from different eras, and by the time the GeForce 3 was out, we had the Radeon 8500. Your assertion that the Radeon came out at the same time as the GeForce 3 is false. From what I see, the only reason a GeForce 3 is being emulated is because (a) the Xbox uses the same GPU and (b) both it and the Radeon are the bare minimum standard for a modern graphics feature set, creating low overhead and allowing the native graphics to reach full functionality.

In addition, all this article does is back me up that Windows does not need to know what your real graphics card is because of abstraction. Obviously a GeForce 3 is not the same as a GeForce 6800 but that doesn't matter. And it won't matter when VPC emulates AGP (if it does) and Macs move to PCI Express. It's called hardware abstraction.

You can spend all day getting riled up about how much VRAM the Radeon had, but every point you've made so far has fallen in on itself, to the point where your arguments are childish and not even on topic anymore. You've never done any graphics programming, or any programming at all, and it's painfully obvious. I'm willing to bet you're still in high school.
     
hldan
Mac Elite
Join Date: May 2003
Location: Somewhere
Status: Offline
Reply With Quote
Sep 23, 2004, 02:33 AM
 
I don't understand why Microsoft still advertises on their web site that VPC 7 has better graphics handling if the software doesn't support it. Wouldn't that be false advertising?
It should do what Microsoft says it does by the time it hits the shelves next month.
iMac 24" 2.8 Ghz Core 2 Extreme
500GB HDD
4GB Ram
Proud new Owner!
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Status: Offline
Reply With Quote
Sep 23, 2004, 02:44 AM
 
Originally posted by goMac:
I don't know what you bought but all the reviews say 32 MB initially, 64 MB later. One review stated that they had tested a 64 MB version but there were no 64 MB cards on the market.
Show these reviews in quotes and I'll post ten times more evidence to counter you. You keep proving that you're a liar, someone who claims to be a programmer yet nobody believes you. You constantly say things that fly in the face of all evidence, such as saying 64 MB Radeon cards didn't exist when the chip was released. I HAD ONE!

Give everybody a break, please.
     
RonnieoftheRose
Registered User
Join Date: Jun 2004
Status: Offline
Reply With Quote
Sep 23, 2004, 02:46 AM
 
Originally posted by goMac:
I'm willing to bet you're still in high school.
You'd be shocked to know where I am, who I am and what my profession is. Put it this way, it's a miracle and also stupidity on my part for even wasting my time here.
     
goMac
Posting Junkie
Join Date: May 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Sep 23, 2004, 11:22 AM
 
Originally posted by RonnieoftheRose:
You'd be shocked to know where I am, who I am and what my profession is. Put it this way, it's a miracle and also stupidity on my part for even wasting my time here.
I certainly hope your profession isn't computer related...
     
fiesta cat
Forum Regular
Join Date: Nov 2003
Location: US
Status: Offline
Reply With Quote
Sep 23, 2004, 01:45 PM
 
Originally posted by hldan:
I don't understand why Microsoft still advertises on their web site that VPC 7 has better graphics handling if the software doesn't support it. Wouldn't that be false advertising?
It should do what Microsoft says it does by the time it hits the shelves next month.
If it has better graphics handling than 6, they can get away with it.

They could revise their site by the time it's available commercially as well.

I'm thinking 7.1 will have the fixes from the X-Box people. Hopefully they won't charge us for them.
www.macgenealogy.org - Genealogy on the Mac
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Sep 25, 2004, 04:48 PM
 
Originally posted by goMac:
Thats very odd, because ATI was selling them in 2001.
Not odd at all. This was the time when ATI pre-announced some Power Macs and the Radeons were cut at the last minute. They were not available there for a while.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Sep 25, 2004, 05:10 PM
 
Originally posted by goMac:
Thats great... But like what I've shown... It came even before the GeForce2.

A year and a half difference isn't "around the time". There's a two generation difference there on both the Radeon and a NVidia sides.
I probably should let this lie, but in the interest of getting this topic put to rest...

The first Geforce board was called Geforce 256 - codename NV10. It competed mainly with 3dFX's offerings. At this point ATI couldn't care less; they made lots of money selling cheap chips to OEMs. Well, OK, there were boards like the Rage Fury MAXX (yuck, what a name), but this wasn't anywhere near the focus of the company. nVidia followed up with the Geforce2 - NV15 - then the Geforce2 GT and GTS. The last two did nothing except up the clockspeed of the core and the memory interface. One of these introduced DDR RAM on graphics boards. These boards were the top of the heap, but nVidia realised that if they wanted to make money, they should aim for ATI. They released the Geforce2 MX, without the DDR memory, to have a low-priced upgrade and possible OEM customers. It would still run all over anything ATI had at the time.

During all of this, the graphics chip market was involved in a consolidation process. Everyone bought everyone, and nVidia - as has been noted - bought out 3dFX to settle a lawsuit. They killed the line and the Glide API, but they kept the people. ATI, on the other hand, was stirring. They saw that the Geforce-type boards were the future and realised that they needed something similar. They bought a company called ArtX - mainly to get its lucrative contract for making chips for the Nintendo Gamecube, but they got some talented people at the same time. These people released the Radeon. It was a chip very similar to the Geforce2 (and nothing at all like the Rage128), clocked about the same as the GTS edition (the fastest) and with more or less the same amount of memory. The kicker was that it outperformed the Geforce boards by a small but significant margin. nVidia was no longer top of the heap - and that burned.

What ATI had done was to add some very simple tricks to the memory interface - things like fast Z-clear and rudimentary texture compression - that meant it used the memory bandwidth better. At this point, this was all that counted in graphics speed. nVidia's response was to launch the Geforce2 Ultra - an overclocked GTS with newer memory chips that could take a higher memory clock. By brute force alone, the paper-launched nVidia board could take back the top of the heap and hold it until the NV20 - the Geforce3 - could come out, with a new memory interface, shaders (with DirectX 8 support), etc.

OK, does this put the various boards in a timeframe? If you need to check my memory, I'd suggest you read the articles here, in order:

http://graphics.tomshardware.com/graphic/2000.html
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Sep 25, 2004, 05:14 PM
 
Originally posted by fiesta cat:
If it has better graphics handling than 6, they can get away with it.

They could revise their site by the time it's available commercially as well.

I'm thinking 7.1 will have the fixes from the X-Box people. Hopefully they won't charge us for them.
Let's put it this way: which was the last non-free Microsoft application to get a .1 version? Unless I miss my guess, it was Word 5.1 - which was a while ago. Microsoft makes huge .0 updates and Service Packs to fix bugs. They don't do .1. I'd love to be proved wrong, but...
     
 
 