
RIP iMac Pro
Thorzdad
Moderator
Join Date: Aug 2001
Location: Nobletucky
Mar 8, 2021, 04:33 PM
 
     
Doc HM
Professional Poster
Join Date: Oct 2008
Location: UKland
Mar 8, 2021, 05:39 PM
 
I am sure all seven of the buyers will hold an online requiem for it. They might have to invite the three trash can Mac Pro buyers to make up the numbers, though.
This space for Hire! Reasonable rates. Reach an audience of literally dozens!
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Mar 8, 2021, 06:52 PM
 
I think the iMac Pro was overall a much better machine than the trashcan Mac Pro, though. I hear people who have one actually love it. It’s just that Intel’s CPU offerings are impossible to cool. Intel’s new Rocket Lake munches up to 294 W (at a “TDP” of 125 W). Basically, it wasn’t worth it for Apple to pour resources into putting their cooling system on steroids when their normal ARM-based iMac will likely be faster anyway (at least CPU-wise).
I don't suffer from insanity, I enjoy every minute of it.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Mar 8, 2021, 07:00 PM
 
The only real problem with the trash can was they wouldn’t update it. Pro users care about “speeds and feeds”.

I would have bought one, but it was so old when the time came to buy, I got an iMac. Would have gotten an iMac Pro, but that hadn’t been introduced yet.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Mar 8, 2021, 10:06 PM
 
Originally Posted by subego View Post
The only real problem with the trash can was they wouldn’t update it. Pro users care about “speeds and feeds”.
I had a Trashcan Mac Pro, and it was a great machine. And I agree with that sentiment: the feeling I remember most is that no matter what I did to my computer, it just wouldn't slow down. Eventually, it fell victim to the stupendous increase in single-core performance: my 16" MacBook Pro was simply much faster. Still, I rather liked the machine. The only bad thing about it was that Apple didn't make a Retina display to go along with it. I kept an older Thunderbolt Display until it broke. The new LG doesn't feel nearly as nice.
Originally Posted by subego View Post
I would have bought one, but it was so old when the time came to buy, I got an iMac. Would have gotten an iMac Pro, but that hadn’t been introduced yet.
Yeah, the iMac is a great machine, and I reckon that 8+4 cores are enough for most people (in the sense that adding more cores wouldn't make their machine faster).
I don't suffer from insanity, I enjoy every minute of it.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Mar 8, 2021, 10:25 PM
 
Like I mentioned in the M1 thread, I can use an arbitrarily large number of cores, so I’m exactly who they were aiming the trashcan at.

If I was going to spend that much coin, though, I would have gotten way better performance by switching to Windows and building a PC. At the time I wasn’t that desperate, so I went with an iMac, which is definitely usable.

I miss it, actually. It’s on loan to my partner for WFH, and I’ve been limping along with my MBP in clamshell. In hindsight, I could have made better use of it than him, but the original plan had him doing a lot of mixing with 192 kHz audio, which can use the horsepower.
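As a rough sketch of the arithmetic (the track count and bit depth here are assumptions for illustration, not anything stated in the thread):

[code]
# Ballpark raw data rate of a 192 kHz mix session (Python).
# Track count and bit depth are assumed values for the example.
sample_rate, bit_depth, tracks = 192_000, 24, 64
mbit_per_s = sample_rate * bit_depth * tracks / 1e6
print(f"{mbit_per_s:.0f} Mbit/s of raw audio")  # ~295 Mbit/s, before any DSP
[/code]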
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Mar 9, 2021, 01:59 AM
 
Originally Posted by subego View Post
Like I mentioned in the M1 thread, I can use an arbitrarily large number of cores, so I’m exactly who they were aiming the trashcan at.
I just inherited one. It replaced a cheesegrater Mac Pro, which I also inherited. Left to my own devices, I would have either bought an iMac or just kept my MacBook Pro.
Originally Posted by subego View Post
If I was going to spend that much coin, though, I would have gotten way better performance by switching to Windows and building a PC. At the time I wasn’t that desperate, so I went with an iMac, which is definitely usable.
AMD's release of its Zen 2-based CPUs was devastating to the Mac Pro, because you could no longer say that this is the fastest computer, period, and that, once you look at top-end workstations, prices are actually not insane. Zen 2 gave Intel a very thorough spanking. It's kind of a pity, because otherwise the Mac Pro is an impressive feat of engineering. Not that I'd buy one.
Originally Posted by subego View Post
I miss it, actually. It’s on loan to my partner for WFH, and I’ve been limping along with my MBP in clamshell. In hindsight, I could have made better use of it than him, but the original plan had him doing a lot of mixing with 192 kHz audio, which can use the horsepower.
The Mac Pros reminded me of the niceties that desktops offer. With a notebook (M1-based machines excluded) you can tell how hard your machine is working: you hear the fan spinning up, your machine gets warm, etc. With a desktop, you mostly can't. That's what I liked about the trashcan. Plus, I really liked the color.
I don't suffer from insanity, I enjoy every minute of it.
     
MacNNFamous
Senior User
Join Date: Jul 2020
Mar 10, 2021, 06:24 PM
 
I wanted and still want an iMac Pro. **** if I am paying that much for something not upgradeable tho. But... it's a good package. Very fast machine... just not worth $3–4k... imho.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Mar 12, 2021, 11:58 AM
 
Originally Posted by OreoCookie View Post
It’s just that Intel’s CPU offerings are impossible to cool. Intel’s new Rocket Lake munches up to 294 W (at a “TDP” of 125 W).
If you disable the power limits, it can draw that much. Apple doesn’t disable the power limits on its computers (no OEM does), so this is about as relevant as shopping for a daily driver by comparing fuel consumption at 250 mph. Leave the power limits where they are, and it will perform about the same, except when running AVX-512.

The problem isn’t cooling the CPU; the problem is cooling the GPU. A top-tier GPU can make good use of 300 W (as in, if you drop the power limit to 200 W, it will measurably reduce performance), and Apple can’t cool that in the stupidly slim case. Which is why they should return to the 2009 design, but I’m repeating myself here.

(I’ve been amusing myself by thinking about how I would design an iMac. Just taking the 2009 model, updating it to 2021 internals, removing the optical drive, and using the space to double up the GPU heatsink and fan is a pretty good start. Give me a little more freedom, and I would ditch the 3.5” HDD and move the PSU up into that space. That leaves space near the bottom to let the display grow into 16:10 format and reduce the chin.)

Originally Posted by OreoCookie View Post
AMD's release of its Zen 2-based CPUs was devastating to the Mac Pro, because you could no longer say that this is the fastest computer, period, and that, once you look at top-end workstations, prices are actually not insane. Zen 2 gave Intel a very thorough spanking.
There is something to this, however. Intel being stuck on Skylake for its CPUs for so long meant that single-core performance is probably behind the latest iPhone on most code. That is simply embarrassing.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Mar 13, 2021, 10:00 PM
 
@P great post. Don’t mind my nitpicks.
Originally Posted by P View Post
If you disable the power limits, it can draw that much. Apple doesn’t disable the power limits on its computers (no OEM does), so this is about as relevant as shopping for a daily driver by comparing fuel consumption at 250 mph. Leave the power limits where they are, and it will perform about the same, except when running AVX-512.
I see your point, but like you wrote, Intel’s CPUs are designed* to run with power limits disabled, and enforcing the official power limits leads to reduced performance. I use an asterisk because essentially all mainboard manufacturers ignore the limits, and Intel is clearly turning a willful blind eye (I wouldn’t be surprised if they helped behind the scenes). Even without all that, the payoff seems rather limited: in single-core performance, the M1 is faster, and you’d need very specific workloads (many threads, AVX-512) for Intel’s CPUs to be faster.

Even when you enable the power limit, current Intel CPUs draw way more than their “TDP”, and as you correctly point out, this takes away valuable watts from the GPU. And you point to another weakness, which is power density (especially with AVX-512-heavy workloads) as opposed to just plain power draw.
Originally Posted by P View Post
The problem isn’t cooling the CPU; the problem is cooling the GPU. A top-tier GPU can make good use of 300 W (as in, if you drop the power limit to 200 W, it will measurably reduce performance), and Apple can’t cool that in the stupidly slim case. Which is why they should return to the 2009 design, but I’m repeating myself here.
Yes, but I’ll point back to the power draw of Intel CPUs: Apple would have to be very careful about how to divvy up its power budget. This would be plainly unnecessary if it used, e.g., AMD CPUs, which have half the peak power draw. Even when you include power limits and look at average power instead, using an Intel CPU is a big liability that costs you 50–100 W of cooling capacity, which you could use for other things.
Originally Posted by P View Post
(I’ve been amusing myself by thinking about how I would design an iMac. Just taking the 2009 model, updating it to 2021 internals, removing the optical drive, and using the space to double up the GPU heatsink and fan is a pretty good start. Give me a little more freedom, and I would ditch the 3.5” HDD and move the PSU up into that space. That leaves space near the bottom to let the display grow into 16:10 format and reduce the chin.)
iMac or iMac Pro?
Even an entry-level regular iMac with 4+4 cores would be fine, methinks. It’d need beefier graphics than the M1, so I think the best solution might actually be for Apple to use the rumored M1X with 8+4 cores and more graphics. The differentiating factor might be a combination of graphics performance and a higher TDP. Seeing how small the Mac mini’s logic board is, I think Apple would just have to make a display, i.e. they could get rid of the chin.

The iMac Pro is another story. IMHO there should be at least four differentiating factors: ECC RAM, much higher graphics performance, much more IO bandwidth and more cores. I reckon the cooling system would have to be specced to match the GPU. While the CPU and IO complex will draw power, I think we are talking about 30–50 W if run at full tilt. (I’m too lazy to dig out the numbers, but I think AMD’s IO chiplet for its larger CPUs draws about 15–20 W (correct me if I am wrong), and we can reserve another 15–30 W for, say, 16+4 cores.) That is a fraction of the total power draw of current Intel and AMD chips, so it seems the current iMac Pro cooling system would be up to the job.
Originally Posted by P View Post
There is something to this, however. Intel being stuck on Skylake for its CPUs for so long meant that single-core performance is probably behind the latest iPhone on most code. That is simply embarrassing.
And Intel’s newest chips don’t fundamentally change that, unless you take extreme measures.
I don't suffer from insanity, I enjoy every minute of it.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Mar 14, 2021, 04:39 PM
 
I'm guessing that new Mac Pros, the MacBook Pro 16", and high-end iMacs are going to need an Apple SoC with much better PCI-E than the M1. Am I right in thinking the M1 is PCI-E 3 and not 4? One would hope an M1X or M2 etc. would have a stack of PCI-E 4 lanes, or even PCI-E 5.

Assuming it does, that opens up the option of new AMD GPUs. I can't see Apple making their own discrete GPUs yet, can you?
I have plenty of more important things to do, if only I could bring myself to do them....
     
EmilyCanham
Fresh-Faced Recruit
Join Date: Mar 2021
Location: Switzerland
Mar 18, 2021, 05:39 AM
 
Well, everyone has their own opinion. My experience with iMac Pro was just fabulous. No one can beat its smoothness, and I still can't digest that it was discontinued only three years after its release.
     
reader50
Administrator
Join Date: Jun 2000
Location: California
Mar 18, 2021, 01:18 PM
 
Originally Posted by EmilyCanham View Post
My experience with iMac Pro was just fabulous. No one can beat its smoothness ...
I'm sad to hear you've been petting your iMac Pro. An unusual habit. My condolences.
     
Thorzdad  (op)
Moderator
Join Date: Aug 2001
Location: Nobletucky
Mar 18, 2021, 02:07 PM
 
Originally Posted by EmilyCanham View Post
Well, everyone has their own opinion. My experience with iMac Pro was just fabulous. No one can beat its smoothness...
I dunno. I always felt it sounded constrained in the midrange.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Mar 18, 2021, 06:58 PM
 
Originally Posted by Waragainstsleep View Post
I'm guessing that new Mac Pros, the MacBook Pro 16", and high-end iMacs are going to need an Apple SoC with much better PCI-E than the M1. Am I right in thinking the M1 is PCI-E 3 and not 4? One would hope an M1X or M2 etc. would have a stack of PCI-E 4 lanes, or even PCI-E 5.
I don’t think the PCIe protocol version is an issue, because AFAIK the M1 just uses PCIe for Thunderbolt, and Thunderbolt limits your throughput anyway. I have seen quite a few applications where the PCIe protocol version doesn’t matter; the number of lanes is just halved when going from v3 to v4 (i.e. the total throughput is the same).
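To put rough numbers on that, a back-of-the-envelope sketch that counts only line-coding overhead:

[code]
# Per-lane PCIe throughput in GB/s, ignoring everything but the
# 128b/130b line coding - enough to see that x16 Gen3 and x8 Gen4
# land in the same place.
lane_gbps = {
    3: 8.0 * 128 / 130 / 8,    # 8 GT/s  -> ~0.985 GB/s per lane
    4: 16.0 * 128 / 130 / 8,   # 16 GT/s -> ~1.969 GB/s per lane
}
print(f"x16 Gen3: {16 * lane_gbps[3]:.1f} GB/s")  # ~15.8 GB/s
print(f"x8  Gen4: {8 * lane_gbps[4]:.1f} GB/s")   # ~15.8 GB/s
[/code]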

Except for the Mac Pro, the number of externally accessible PCIe lanes seems limited (since PCIe is essentially just used for Thunderbolt connectivity). Graphics cards need not be attached via PCIe, and I reckon Apple can cover the needs of everything but the iMac Pro and the Mac Pro in house. Perhaps there will be an iMac that gets an option for a beefier CPU.

The more interesting question is whether Apple will opt for user-replaceable memory in the iMacs. My money is on no. But they could nevertheless offer larger capacities and more bandwidth. The latter will be crucial, I think, if they put in more powerful graphics, which is necessary to drive 5K and 8K displays at speed.
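For scale, the raw pixel rates involved (a sketch assuming 60 Hz and 10-bit-per-channel color, blanking intervals ignored):

[code]
# Raw pixel bandwidth needed to drive 5K/8K panels.
def gbit_per_s(w, h, hz=60, bits_per_px=30):
    return w * h * hz * bits_per_px / 1e9

print(f"5K (5120x2880): {gbit_per_s(5120, 2880):.1f} Gbit/s")  # ~26.5
print(f"8K (7680x4320): {gbit_per_s(7680, 4320):.1f} Gbit/s")  # ~59.7
[/code]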
Originally Posted by Waragainstsleep View Post
Assuming it does, that opens up the option of new AMD GPUs. I can't see Apple making their own discrete GPUs yet, can you?
First of all, we should get away from integrated = slow, discrete = fast. The current generation of consoles has “integrated graphics”, where memory is shared between CPU and GPU, and they are not slow. In fact, for many applications this model is faster, since data does not have to be pushed between RAM and video RAM.

The more interesting question is whether Apple wants to build a video card that is beefy enough to replace AMD’s and nVidia’s fastest cards. At least for the first generation, I am a bit skeptical.
I don't suffer from insanity, I enjoy every minute of it.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Mar 23, 2021, 08:50 PM
 
Originally Posted by OreoCookie View Post
@P great post. Don’t mind my nitpicks.

I see your point, but like you wrote Intel’s CPUs are designed* to run with power limits disabled, and enforcing the official power limits leads to reduced performance. I use an asterisk, because essentially all mainboard manufacturers ignore it, and Intel is clearly turning a willful blind eye (and I wouldn’t be surprised if they helped behind the scenes). Even without all that, the payoff seems rather limited: in single-core performance, the M1 is faster. And you’d need very specific workloads (many threads, AVX-512) for Intel’s CPUs to be faster.
But motherboard manufacturers ignoring the power limits is not the same as computer OEMs ignoring them. I am not aware of that happening - in fact, Intel and AMD both sell models with reduced power limits to OEMs.

Even when you enable the power limit, current Intel CPUs draw way more than their “TDP”, and as you correctly point out, this takes away valuable watts from the GPU. And you point to another weakness, which is power density (especially with AVX-512-heavy workloads) as opposed to just plain power draw.
Leave the power limits as they are, and the CPU will, on average over a medium amount of time (say, a minute), draw no more than TDP. It can draw 25% more than TDP for a period of tau seconds, but it must then compensate for this by drawing less than TDP. It really does this, and you can fiddle with MSR 610 to prove it if you want.
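For anyone who wants to try the MSR 610 bit, a minimal sketch for Linux (needs root and the msr kernel module loaded; field layout per Intel's SDM description of MSR_PKG_POWER_LIMIT):

[code]
import os, struct

def rdmsr(reg, cpu=0):
    # Read a 64-bit model-specific register via the Linux msr driver.
    with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
        return struct.unpack("<Q", os.pread(f.fileno(), 8, reg))[0]

# MSR_RAPL_POWER_UNIT (0x606), bits 3:0: power unit = 1/2^N watts.
unit = 1.0 / (1 << (rdmsr(0x606) & 0xF))
# MSR_PKG_POWER_LIMIT (0x610): PL1 in bits 14:0, PL2 in bits 46:32.
val = rdmsr(0x610)
print(f"PL1 (sustained): {(val & 0x7FFF) * unit:.0f} W, "
      f"enabled: {bool((val >> 15) & 1)}")
print(f"PL2 (turbo):     {((val >> 32) & 0x7FFF) * unit:.0f} W, "
      f"enabled: {bool((val >> 47) & 1)}")
[/code]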

I don’t think the CPU takes away cooling capacity from the GPU in a desktop setting. The 2009 iMac that I had used completely separate cooling systems for the two, as does any tower desktop. For a laptop, sure, but that’s not what we’re talking about here.

Yes, but I’ll point back to the power draw of Intel CPUs: Apple would have to be very careful how to divvy up its power budget. This is plainly unnecessary if e. g. it used AMD CPUs, which have half the peak power draw. Even when you include power limits and look at average power instead, using an Intel CPU is a big liability that costs you 50–100 W of cooling capacity, which you could use for other things.
Well, I don’t want a design that shares the cooling capacity between CPU and GPU, so this is moot.

iMac or iMac Pro?
Even an entry-level regular iMac with 4+4 cores would be fine, methinks. It’d need beefier graphics than the M1, so I think the best solution might actually be for Apple to use the rumored M1X with 8+4 cores and more graphics. The differentiating factor might be a combination of graphics performance and a higher TDP. Seeing how small the Mac mini’s logic board is, I think Apple would just have to make a display, i.e. they could get rid of the chin.
I just wish Apple made an iMac that was a credible casual gaming box. It only has to handle 1080p with an upscaling algorithm to be fine, and it can be loud while doing it, but I do wish we could at least do 1440p on a top model. For that purpose, 4 cores is actually beginning to be a problem today.

The iMac Pro is another story. IMHO there should be at least four differentiating factors: ECC RAM, much higher graphics performance, much more IO bandwidth and more cores. I reckon the cooling system would have to be specced to match the GPU. While the CPU and IO complex will draw power, I think we are talking about 30–50 W if run at full tilt. (I’m too lazy to dig out the numbers, but I think AMD’s IO chiplet for its larger CPUs draws about 15–20 W (correct me if I am wrong), and we can reserve another 15–30 W for, say, 16+4 cores.) That is a fraction of the total power draw of current Intel and AMD chips, so it seems the current iMac Pro cooling system would be up to the job.

And Intel’s newest chips don’t fundamentally change that, unless you take extreme measures.
My concern with the iMac Pro cooling system is the “thermal corner” Apple found itself in with the trashcan Mac Pro. Any combined cooling system has to be based around an idea of the relative power draw of the chips it cools, and all of them seem to be based around a mostly equal power draw from CPU and GPU. This design wouldn’t have that - the GPU would draw way more. I want the 2009 iMac cooling system, with a beefy heatsink and fan for the GPU and the GPU alone.

On-die ECC is part of the DDR5 standard. It will be interesting to see what this means for the market - it technically doesn’t protect the transfer between CPU package and DIMM like current ECC does, but maybe that could be validated by the manufacturer to always be OK?

I also think that the need for expansion will always lead to a tower case being superior. It could clearly be organized better than ATX, but the basic idea of loose expansion cards has merit.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Mar 24, 2021, 09:24 AM
 
@P
Nice post, as usual.
Originally Posted by P View Post
Leave the power limits as they are, and the CPU will, on average over a medium amount of time (say, a minute), draw no more than TDP.
That’s not what I think happens, though: for tau seconds the CPU is allowed to exceed TDP substantially, and after tau seconds the CPU throttles so as not to exceed TDP. Have a look:
[graph omitted]
Originally Posted by P View Post
I don’t think the CPU takes away cooling capacity from the GPU in a desktop setting. The 2009 iMac that I had used completely separate cooling systems for the two, as does any tower desktop. For a laptop, sure, but that’s not what we’re talking about here.

Well, I don’t want a design that shares the cooling capacity between CPU and GPU, so this is moot.
You are right that a tower will always be more flexible, but that’s by definition not the iMac Pro form factor. I think there is a chance that Apple can bring the price of a Mac Pro down to where the entry-level Intel-based Mac Pro was when it was first introduced — I’d call them quite affordable for what they were. So in my ideal world, Apple would introduce such an entry-level ARM-based Mac Pro for these purposes, and the iMac Pro for another audience.

I don’t think a combined cooling system for this machine is a net negative, and I think the cooling system can deal with lopsided TDP demands between CPU and GPU. As for a thermal corner: at least going by what I hear from current iMac Pro customers, the cooling system seems to work very well and to have plenty of thermal headroom. I did a little bit of googling, and according to Apple’s spec page the iMac Pro has a max power draw (≠ TDP, I know) of 370 W. So I reckon the TDP of CPU and GPU taken together is at least 300 W, and it seems the cooling system has been over-engineered and could handle more.
Originally Posted by P View Post
I just wish Apple made an iMac that was a credible casual gaming box. It only has to handle 1080p with an upscaling algorithm to be fine, and it can be loud while doing it, but I do wish we could at least do 1440p on a top model. For that purpose, 4 cores is actually beginning to be a problem today.
My brother’s gaming PC has 4 cores, and he told me as much: he frequently maxes out the CPU in games.
Originally Posted by P View Post
My concern with the iMac Pro cooling system is the “thermal corner” Apple found itself in with the trashcan Mac Pro. Any combined cooling system has to be based around an idea of the relative power draw of the chips it cools, and all of them seem to be based around a mostly equal power draw from CPU and GPU. This design wouldn’t have that - the GPU would draw way more. I want the 2009 iMac cooling system, with a beefy heatsink and fan for the GPU and the GPU alone.
What TDPs would you aim for? Given that the current iMac Pro cooling system can handle at least 300 W, the question is how much higher you want to go. Higher-end desktop GPUs draw on the order of 300 W just on their own, and current x86 CPUs another 100–150 W. It stands to reason that Apple’s ARM-based CPUs will have a lower TDP than that (50–100 W?), so a cooling system that can handle 300–400 W seems feasible within the current enclosure.

Plus, I think Apple is agnostic as to whether to have two cooling fans or just one. I can see advantages and disadvantages for both.
Originally Posted by P View Post
On-die ECC is part of the DDR5 standard. It will be interesting to see what this means for the market - it technically doesn’t protect the transfer between CPU package and DIMM like current ECC does, but maybe that could be validated by the manufacturer to always be OK?
IMHO ECC should be a defining difference between Pro and non-Pro desktops. I hope Apple does not skimp on this.
Originally Posted by P View Post
I also think that the need for expansion will always lead to a tower case being superior. It could clearly be organized better than ATX, but the basic idea of loose expansion cards has merit.
Although I’d say this is the job of the Mac Pro successor, not the follow-on product to the iMac Pro.
I don't suffer from insanity, I enjoy every minute of it.
     
reader50
Administrator
Join Date: Jun 2000
Location: California
Mar 24, 2021, 12:51 PM
 
Originally Posted by OreoCookie View Post
Although I’d say this [internal expandability] is the job of the Mac Pro successor, not the follow-on product to the iMac Pro.
This sounds like an endorsement of Apple's recent status quo, where internal expandability is a luxury option reserved for only the most expensive model of the lineup. And those mortals who cannot spend the price of a car on a computer must do without.

In my opinion, expandability should be offered wherever practical: in all desktops that are not minimum-sized, and even in larger laptops. We should not adopt the mindset that expandability must be excluded from everything costing less than $6K.
     
MacNNFamous
Senior User
Join Date: Jul 2020
Mar 24, 2021, 01:43 PM
 
I literally just junked a 24" iMac because the mobo died, and I couldn't use it as a display. It sucks that Apple refuses to put video inputs on any of their iMacs. Perfectly fine display, into the bin.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Mar 24, 2021, 08:19 PM
 
Actually, you can buy controller boards that convert bare LCD panels into standalone monitors, but you need to search by the model number on the panel itself. From memory, I'm guessing the 24" had an LG panel of some kind.
Since iMacs are so common, their displays are often the best candidates for these conversions.
I ought to look into this; I have four 21" displays with cracked glass but perfectly good panels.
I have plenty of more important things to do, if only I could bring myself to do them....
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Mar 25, 2021, 05:33 AM
 
Originally Posted by reader50 View Post
This sounds like an endorsement of Apple's recent status quo, where internal expandability is a luxury option reserved for only the most expensive model of the lineup. And those mortals who cannot spend the price of a car on a computer must do without.
No, I think it is more complicated than that. The biggest issue is finding a good screen. Outside of Mac land (LG included), there are basically next to no high-quality, high-DPI screens. In my mind, this is what the iMac Pro offers. Apple could mitigate that to some degree by offering an official Apple-branded display that doesn’t cost $7k, but I’d still say that form factor is an issue. Like I said, in my ideal world, Apple would offer a $2k-ish Mac Pro configuration for people who prioritize expandability over the iMac Pro.

Offering an expandable iMac Pro doesn’t make sense to me, because if you make your iMac Pro a tad expandable, it’ll be a kludgy compromise without good expandability. And once you get to good expandability, you have arrived at a tower, aka the Mac Pro.
Originally Posted by reader50 View Post
In my opinion, expandability should be offered wherever practical. All desktops that are not minimum sized, and even larger laptops. We should not adopt the mindset that expandability must be excluded from everything costing less than $6K.
Agreed. Although I’d rather push for an expandable Mac mini than an expandable iMac Pro. So in my ideal world, Apple would offer a relatively cheap Mac Pro and an expandable Mac mini (one that accepts, say, two M.2 SSDs and memory upgrades), but also an iMac Pro that sticks pretty close to the current recipe.
I don't suffer from insanity, I enjoy every minute of it.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Mar 28, 2021, 04:17 PM
 
Originally Posted by OreoCookie View Post
No, I think it is more complicated than that. The biggest issue is finding a good screen. Outside of Mac land (LG included), there are basically next to no high-quality, high-DPI screens. In my mind, this is what the iMac Pro offers. Apple could mitigate that to some degree by offering an official Apple-branded display that doesn’t cost $7k, but I’d still say that form factor is an issue. Like I said, in my ideal world, Apple would offer a $2k-ish Mac Pro configuration for people who prioritize expandability over the iMac Pro.
There are plenty of 4K 27" displays. By Apple's own definition of Retina quality, these displays are Retina. Some of them are also HDR, though of course not all. The issue is if you want even higher resolution - 5K at 27", for instance - but that is because the rest of the market decided that that PPI is too high. Since I happen to think that Apple is in the wrong here, I don't think anyone else will change.

Originally Posted by OreoCookie View Post
@P
That’s not what I think happens, though: for tau seconds the CPU is allowed to exceed TDP substantially, and after tau seconds the CPU throttles so as not to exceed TDP. Have a look:

I know that I have seen the exact explanation from Intel for what is happening with Turbo, but for some reason I can't find it again. My memory is that it works like this: the CPU measures the actual power draw (retrospectively - it measures what it used last cycle) and integrates over this to find the power draw over each second. When it is under PL1, the result of this integral must be limited to PL1. When it is in PL2, the limit becomes PL2. The issue is what happens when a CPU is under PL2 but tau runs out without the task being completed. The CPU will then go back to PL1, but it has a power debt that it must pay back before it can enable turbo again. The question is how quickly it will "pay off" this debt. It will do that over a long time - if you run a load that hits the power limit, it will eventually drop slightly below base clock to pay off the debt - but it takes at least a minute, if not more.
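A toy model of that accounting, for illustration only (the real implementation uses an exponentially weighted average, and all numbers here are made up):

[code]
# Burst to PL2 while the running average over the last tau seconds is
# below PL1; otherwise drop below PL1 to pay the "debt" back.
PL1, PL2, TAU, DT = 125.0, 250.0, 28.0, 0.1   # W, W, s, timestep (s)
history = [10.0] * int(60 / DT)               # a minute of near-idle
for _ in range(int(120 / DT)):                # then two minutes of load
    window = history[-int(TAU / DT):]
    avg = sum(window) / len(window)
    history.append(PL2 if avg < PL1 else 0.95 * PL1)
steady = history[-int(60 / DT):]              # skip the initial burst
print(f"steady-state average: {sum(steady)/len(steady):.1f} W")  # ~PL1
[/code]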

You are right that a tower will always be more flexible, but that’s by definition not the iMac Pro form factor. I think there is a chance that Apple can bring down the price of a Mac Pro to where the entry-level Intel-based Mac Pro was when it was first introduced — I’d call them quite affordable even for what they were. So if my ideal world, Apple would introduce such an entry-level ARM-based Mac Pro for these purposes and the iMac Pro for another audience.
Will Apple make two pro models, though? I doubt there is space in the market for them.

I don’t think a combined cooling system for this machine is a net negative, and I think the cooling system can deal with lopsided TDP demands between CPU and GPU. As for a thermal corner: at least going by what I hear from current iMac Pro customers, the cooling system seems to work very well and to have plenty of thermal headroom. I did a little bit of googling, and according to Apple’s spec page the iMac Pro has a max power draw (≠ TDP, I know) of 370 W. So I reckon the TDP of CPU and GPU taken together is at least 300 W, and it seems the cooling system has been over-engineered and could handle more.
The issue with that on an Intel system is that if you're running the GPU hard and the CPU not at all, the GPU will heat up the CPU. This means the CPU will think it has a broken cooling system - basically, temperature is high despite the clock not being maxed out, ergo the cooling system is broken, ergo it will throttle. An Apple CPU can be programmed not to think that, but it will still mean the CPU has to handle being at 100 °C just because it shares a cooling system with the GPU.

My brother’s gaming PC has 4 cores, and he told me as much: he frequently maxes out the CPU in games.
A 5600X is a great option right now. A 5900X, like I have, is patently unnecessary (but great when playing strange strategy games like I do).

What TDPs would you aim for? Given that the current iMac Pro cooling system can handle at least 300 W, the question is how much higher you want to go. Higher-end desktop GPUs draw on the order of 300 W just on their own, and current x86 CPUs another 100–150 W. It stands to reason that Apple’s ARM-based CPUs will have a lower TDP than that (50–100 W?), so a cooling system that can handle 300–400 W seems feasible within the current enclosure.
It seems power levels for GPUs are slowly creeping up. We are at 220 W for the 3070/6700 XT now, and there are no signs that it will drop again. Nvidia has broken past the 300 W limit from the PCIe spec for its top cards. They're now close to the wink-wink limit of 375 W, leaving no headroom for third-party manufacturers. Apple needs to design a box that can handle that sort of power limit.
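For reference, the arithmetic behind those ceilings (connector budgets from the PCIe CEM spec):

[code]
# PCIe card power budgets: slot plus auxiliary connectors.
slot, six_pin, eight_pin = 75, 75, 150       # watts each
print(slot + eight_pin + six_pin)   # 300 W - the official spec ceiling
print(slot + 2 * eight_pin)         # 375 W - the "wink-wink" limit
[/code]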

Plus, I think Apple is agnostic as to whether to have two cooling fans or just one. I can see advantages and disadvantages for both.
I'm sure there are, but the fact remains that the 2009 iMac was better at cooling than the 2012 one. If the iMac Pro got them back to where they should have been all along, then great, but Apple has shown every intention of using those improvements in cooling to make its computers pointlessly thin instead of powerful and quiet. I think they need to stop doing that.

IMHO ECC should be a defining difference between Pro and non-Pro desktops. I hope Apple does not skimp on this.
IMHO, ECC should be standard across the line. If DDR5 gives us most of that, maybe we can live with not protecting the bits in transfer.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Mar 28, 2021, 07:55 PM
 
I heard somewhere that nVidia 3000-series drivers are significantly less efficient when the CPU is bottlenecking. The Radeon 5900 beats the 3090 under such conditions, apparently.

The old PowerMac G5 2.7 GHz dual-CPU had a 1000 W PSU. That ought to be plenty for a couple of maxed-out GPUs and a piece or two of Apple Silicon.

The 2012+ iMacs are sooooo much lighter than the 2009 models. I do not mind this at all. And Apple loves it because it means they can get more on a plane.
I have plenty of more important things to do, if only I could bring myself to do them....
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Mar 29, 2021, 12:57 AM
 
“I’ve had it with these motha****in’ iMacs not on this motha****in’ plane!”

-Tim Cook
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Mar 29, 2021, 04:06 AM
 
Originally Posted by P View Post
There are plenty of 4K 27" displays. By Apple's own definition of Retina quality, these displays are Retina. Some of them are also HDR, though of course not all. The issue is if you want even higher resolution - 5K at 27", for instance - but that is because the rest of the market decided that that PPI is too high. Since I happen to think that Apple is in the wrong here, I don't think anyone else will change.
No, I definitely, definitely prefer 5K on 27” displays.
Originally Posted by P View Post
I know that I have seen the exact explanation from Intel for what is happening with Turbo, but for some reason I can't find it again. My memory is that it works like this: the CPU measures the actual power draw (retrospectively - it measures what it used last cycle) and integrates over this to find the power draw over each second. When it is under PL1, the result of this integral must be limited to PL1. When it is in PL2, the limit becomes PL2. The issue is what happens when a CPU is under PL2 but tau runs out without the task being completed. The CPU will then go back to PL1, but it has a power debt that it must pay back before it can enable turbo again. The question is how quickly it will "pay off" this debt.
I think your technical description is spot on, but the difficult bit seems to be which settings are used. I dug a little deeper, and it seems that apart from the official Intel spec there is also Intel guidance; some of these settings are part of the spec, others are part of the guidance. Some reviewers (e.g. Gamers Nexus) run their benchmarks following spec and guidance, while others ignore the guidance to eke out some extra performance. Essentially, it is overclocking by default. I don’t know what settings Anandtech used, though; I tried looking for that in the article, but in my coffee-starved state I was unable to find it. So I don’t know what settings were used to produce the graph I showed you. The article does not specify the mainboard yet, as that is under NDA until tomorrow.
Originally Posted by P View Post
Will Apple make two pro models, though? I doubt there is space in the market for them.
I don’t know. They have until recently, though, and I reckon the iMac Pro sold better than the Mac Pro. I have no definite numbers, mind you.
Originally Posted by P View Post
The issue with that on an Intel system is that if you're running the GPU hard and the CPU not at all, the GPU will heat up the CPU. This means the CPU will think it has a broken cooling system - basically, temperature is high despite the clock not being maxed out, ergo the cooling system is broken, ergo it will throttle. An Apple CPU can be programmed not to think that, but it will still mean the CPU has to handle being at 100 °C just because it shares a cooling system with the GPU.
I understand. You could mitigate that with liquid cooling, and the advantage of a single cooling system as I see it is that you can use one much larger, quieter fan rather than two noisier ones. But in any case, I think Apple could go back to a two-fan solution if they had to.
Originally Posted by P View Post
It seems power levels for GPUs are slowly creeping up. We are at 220 W for the 3070/6700 XT now, and there are no signs that it will drop again. Nvidia has broken past the 300 W limit from the PCIe spec for its top cards. They're now close to the wink-wink limit of 375 W, leaving no headroom for third-party manufacturers. Apple needs to design a box that can handle that sort of power limit.
I think we are close to a turning point for GPUs that are designed as GPUs (rather than as coin-mining accelerator cards). I reckon Apple will want to trade performance for efficiency here by e.g. reducing clock speeds. But I take your point: we know roughly what the thermals of mid-range and high-end discrete GPUs are and can extrapolate from there.
Originally Posted by P View Post
I'm sure there are, but the fact remains that the 2009 iMac was better at cooling than the 2012 one. If the iMac Pro got them back to where they should have been all along, then great, but Apple has shown every intention of using those improvements in cooling to make its computers pointlessly thin instead of powerful and quiet.
Cooling gets easier when you reduce the thermal load, so I reckon the switch to Apple Silicon will greatly simplify things. And if we step away from the Intel-based line-up and look at the iPad line-up, I can see Apple introducing a two-tier iMac line-up with a regular iMac and an iMac Pro, where the moniker Pro just means faster/better. The iMac Pro could sport faster graphics and a better screen (e.g. using the tech Apple pioneered in the new Apple Display).

As I wrote before, I am totally cool with such a tiered setup, provided Apple keeps the Mac Pro in the line-up.
Originally Posted by P View Post
IMHO, ECC should be standard across the line. If DDR5 gives us most of that, maybe we can live with not protecting the bits in transfer.
Seconded, thirded and fourthed. ECC should just be a standard feature on all devices. The amount of memory is increasing and structure sizes are decreasing, so the expected number of bit errors must surely have increased over time.
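To illustrate the scaling argument with a toy number (the error rate here is an assumed ballpark, purely for illustration, not a measured figure):

[code]
# Expected soft errors scale linearly with capacity. FIT rate =
# failures per billion device-hours; the per-Mbit figure is assumed.
fit_per_mbit = 25                       # assumption for the example
mbits = 64 * 1024 * 8                   # a 64 GiB machine
errors_per_year = fit_per_mbit * mbits * 8760 / 1e9
print(f"~{errors_per_year:.0f} expected bit flips per year")  # ~115
[/code]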
I don't suffer from insanity, I enjoy every minute of it.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Mar 29, 2021, 04:06 AM
 
Yeah, when they changed the form factor last time they also went from squared to tapered boxes. Again, more Macs per plane.
I have plenty of more important things to do, if only I could bring myself to do them....
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Mar 29, 2021, 10:24 AM
 
Originally Posted by OreoCookie View Post
No, I definitely, definitely prefer 5K on 27” displays.
Feel free to, but Apple has a definition of Retina, and 27" 4K does meet it. In fact, it is not as if PC makers have refused to try higher resolutions - they have, and the market has rejected them. Dell made a 24" 4K (P2415Q) for a long time, but it is now gone. I think that was the last one outside the LG Ultrafine.

I think your technical description is spot on, but the difficult bit seems to be which settings are used. I dug a little deeper, and it seems that apart from the official Intel spec there is also Intel guidance; some of these settings are part of the spec, others are part of the guidance. Some reviewers (e.g. Gamers Nexus) run their benchmarks following spec and guidance, while others ignore the guidance to eke out some extra performance. Essentially, it is overclocking by default. I don’t know what settings Anandtech used, though; I tried looking for that in the article, but in my coffee-starved state I was unable to find it. So I don’t know what settings were used to produce the graph I showed you. The article does not specify the mainboard yet, as that is under NDA until tomorrow.
That is the standard thing - the motherboard increases PL1 and PL2 to 4096 W so that it never power-throttles. No OEM does that. Anandtech has written about it - they generally test with the motherboard default settings.

(Note that many motherboards let you control whether they remove the power limits, but since it always defaults to on, that is what most people use.)

I don’t know. They have until recently, though, and I reckon the iMac Pro sold better than the Mac Pro. I have no definite numbers, mind you.
I don't think that was ever the plan. The plan was for the iMac Pro to replace the Mac Pro, until that early-2017 realization from Apple that it actually had to make a Mac Pro. At that point, the iMac Pro was already late in development and needed to be shipped to fill, at least to some extent, the gap left by the trash can.

I understand. You could mitigate that with liquid cooling, and the advantage of a single cooling system as I see it is that you can use one much larger, quieter fan rather than two noisier ones. But in any case, I think Apple could go back to a two-fan solution if they had to.
I don't think liquid cooling is any better than fixed heatpipes - just add more heatpipes if you need to. And whether you can make a good cooling system with a single fan is sort of beside the point - when Apple tried that for the 2012 model, they clearly failed to make something as good as the 2009 model (which had three fans: CPU, GPU and PSU). The iMac Pro fixed that by ditching the internal HDD to claw that space back, but I don't see how it is better than the 2009 design. It is just thinner.

I think we are close to a turning point for GPUs that are designed as GPUs (rather than as coin-mining accelerator cards). I reckon Apple will want to trade performance for efficiency here by e.g. reducing clock speeds. But I take your point: we know roughly what the thermals of mid-range and high-end discrete GPUs are and can extrapolate from there.
It is hard to say where gaming GPUs go from here. Apple clearly has its idea with TBDR. Nvidia has its idea with raytracing+DLSS. I think that with the consoles just having been refreshed, we are locked into rasterization for the next generation, but it might change after that. Since the consoles are all built to put out lots of power without being loud, and the minimum bar is being able to match the consoles, I have a hard time seeing a stupidly thin computer being competitive as a gaming box any time soon.

Cooling gets easier when you reduce the thermal load, so I reckon the switch to Apple Silicon will greatly simplify things. And if we step away from the Intel-based line-up and look at the iPad line-up, I can see Apple introducing a two-tier iMac line-up with a regular iMac and an iMac Pro, where the moniker Pro just means faster/better. The iMac Pro could sport faster graphics and a better screen (e.g. using the tech Apple pioneered in the new Apple Display).
I forget if I have said this here or not, but I think that that was the plan and that Apple abandoned it. The plan was to have iMac and iMac Pro, and MacBook and MacBook Pro, alongside iPad Pro and iPhone Pro. The reception of the 2016 MBP killed that plan, and Apple updated the mini and launched a new Mac Pro and MBA.

As I wrote before, I am totally cool with such a tiered setup, provided Apple keeps the Mac Pro in the line-up.
Sure, but I don't think Apple's volumes are that large. If the Mac Pro is going to pay for itself, it has to capture a lot of the pro users. An iMac Pro, while profitable in itself, would just cannibalize that.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Mar 31, 2021, 03:43 AM
 
Originally Posted by P View Post
Feel free to, but Apple has a definition of Retina, and 27" 4K does meet it. In fact, it is not as if PC makers have refused to try higher resolutions - they have, and the market has rejected them. Dell made a 24" 4K (P2415Q) for a long time, but it is now gone. I think that was the last one outside the LG Ultrafine.
I have a 27” 5K display on my office desk; there is no way 4K would be acceptable for me.
Originally Posted by P View Post
That is the standard thing - the motherboard increases PL1 and PL2 to 4096 W so that it never power-throttles. No OEM does that. Anandtech has written about it - they generally test with the motherboard default settings.
As far as I understand, that is the crux of the matter when it comes to benchmarks: some mainboards ignore the guidelines/overclock by default (depending on your point of view), others do not.
Originally Posted by P View Post
I don't think that was ever the plan. The plan was for the iMac Pro to replace the Mac Pro, until that early-2017 realization from Apple that it actually had to make a Mac Pro. At that point, the iMac Pro was already late in development and needed to be shipped to fill, at least to some extent, the gap left by the trash can.
Sure, but that’s the past, I think.
Originally Posted by P View Post
I don't think liquid cooling is any better than fixed heatpipes - just add more heatpipes if you need to. And whether you can make a good cooling system with a single fan is sort of beside the point - when Apple tried that for the 2012 model, they clearly failed to make something as good as the 2009 model (which had three fans: CPU, GPU and PSU). The iMac Pro fixed that by ditching the internal HDD to claw that space back, but I don't see how it is better than the 2009 design. It is just thinner.
Yes, and I’d say the iMac Pro’s cooling system has by and large been well regarded. So I think it just boils down to there being no one-size-fits-all solution, and to there being good and bad implementations of any particular cooling solution. The only problems I’ve heard of concern dust build-up: in my experience, all radial fans (such as the ones in iMacs, laptops and the like) get crudded up with dust over time, whereas regular fans such as the ones you find in tower cases are much more resilient.
Originally Posted by P View Post
It is hard to say where gaming GPUs go from here. Apple clearly has its idea with TBDR. Nvidia has its idea with raytracing+DLSS.
Yeah, and it seems like game developers aren’t as in love with raytracing as nVidia is. I think one central obstacle is the fact that triple-A games are usually also released on consoles, which run AMD graphics. That will be, in my opinion, the big hurdle for nVidia to overcome. Plus, the pay-off doesn’t seem so clear.
Originally Posted by P View Post
I think that with the consoles just having been refreshed, we are locked into rasterization for the next generation, but it might change after that. Since the consoles are all built to put out lots of power without being loud, and the minimum bar is being able to match the consoles, I have a hard time seeing a stupidly thin computer being competitive as a gaming box any time soon.
The console manufacturers have been very happy with AMD from the looks of things, since they stuck with the same general recipe for the current generation as for the previous ones: 6+ cores (even if at lower clocks) coupled to a fast, unified memory architecture.

However, I think it is quite easy for Apple to build a machine in a small form factor that can rival consoles — if it wants to. Even in its lowest-power implementation, Apple’s CPU cores are, across a broad selection of workloads, already as fast in single-core performance as the fastest Zen 3 cores AMD can offer. Putting 8+4 cores on a chip would easily compete with the consoles (and with pretty much any x86 chip of similar core count that doesn’t run on liquid nitrogen). Graphics-wise, it seems feasible, too, and Apple has the same unified memory model. So I see no obstacles here, to be honest. To me the only question is whether Apple deems it necessary to put as much graphics power into an iMac as the consoles have. Personally, I hope customers will at least have the option.
Originally Posted by P View Post
I forget if I have said this here or not, but I think that that was the plan and that Apple abandoned it. The plan was to have iMac and iMac Pro, and MacBook and MacBook Pro, alongside iPad Pro and iPhone Pro. The reception of the 2016 MBP killed that plan, and Apple updated the mini and launched a new Mac Pro and MBA.
Sure, but that was then and this is now. The 16” MacBook Pro has been very well received. It is a great machine, and its biggest weakness, battery life, will be remedied with the transition to ARM. The two-tier line-up (non-Pro/Pro) seems to work great for laptops, iPads and iPhones, so I can see Apple keeping it for the iMac, too.
Originally Posted by P View Post
Sure, but I don't think Apple's volumes are that large. If the Mac Pro is going to pay for itself, it has to capture a lot of the pro users. An iMac Pro, while profitable in itself, would just cannibalize that.
True. Conversely, I think Apple has learnt that the Mac Pro has an outsized impact on the perception of the company, because of a very small but very vocal subset of customers who want/need that type of machine. The iMac Pro appeals to a larger, different audience, true, and IMHO the benefits of offering both outweigh the cons. Time will tell. Apple has the chance to change its line-up of machines if it wants to, and I don’t know what they will do. Interesting to see.
I don't suffer from insanity, I enjoy every minute of it.
     
   