macOS High Sierra (Page 3)
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 13, 2017, 09:30 PM
 
@P
I've gotta hand it to you: you were right on the money when you said that EMIB would be a big deal. Although in my defense, who could have foreseen that hell would freeze over and AMD would sell GPUs to Intel?

The new chips have Apple written all over them. Given Apple's high-profile hire of AMD GPU czar Raja Koduri, this partnership seems to be a marriage with a limited expiration date, though … 
I don't suffer from insanity, I enjoy every minute of it.
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Nov 14, 2017, 02:35 AM
 
Originally Posted by Chongo View Post
The App Store downloaded High Sierra (10.13.1.05). I've been holding off for APFS support for Fusion drives. Has anyone with a Fusion drive installed HS?
Just did an update from 10.13.0 to 10.13.1 on my HS test partition. It took longer than I expected, but there was no APFS conversion. This is a hard drive partition.

Disk Utility has a "Convert to APFS..." command, but it was disabled for all my system partitions. It was willing to convert a data partition. I don't have another HS partition to check it against. And admittedly, I didn't wipe the partition and do a fresh 10.13.1 install to be absolutely sure.

Best guess: they're not ready on Fusion yet. If they'd whipped the bugs for Fusion drives, I would expect them to go all-in with HDDs too.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Nov 14, 2017, 02:32 PM
 
Originally Posted by OreoCookie View Post
@P
I've gotta hand it to you: you were right on the money when you said that EMIB would be a big deal. Although in my defense, who could have foreseen that hell would freeze over and AMD would sell GPUs to Intel?
Intel is more customer driven than you think. If they develop something like EMIB, it is because some big customer asked for it. That big customer probably also pushed AMD. And yes, we both know who that customer is...

I think AMD sees their future as someone who makes specialized chips, much like the console SOCs. The GPU they’re selling is in all likelihood a high-margin product, and it is something to be proud of. I think AMD needs that too, for recruitment purposes.

The new chips have Apple written all over them. Given Apple's high-profile hire of AMD GPU czar Raja Koduri, this partnership seems to be a marriage with a limited expiration date, though … 
Oh, they absolutely have Apple written all over them.

This partnership has some life in it yet, though - if Intel starts making a GPU now, as seems likely, it will launch in 4 years or so. A lot can happen in four years - back in 2013, AMD's Radeon 290 was toe-to-toe with nVidia's much larger and more expensive original Titan, and it looked like AMD had the upper hand as nVidia cancelled most of their Kepler refresh and delayed the next-generation Maxwell cards. Now, AMD has finally launched their much-delayed new high-end card, after ceding that sector to nVidia alone for 18 months, and that high-end card is a bit of a dud. nVidia is riding high with record profits and seems unstoppable.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 14, 2017, 07:25 PM
 
@P
Agreed. I just couldn’t see AMD partnering with Intel on that. But that shows that Apple will stick with Intel for its Macs for at least another few years. Once Intel’s own GPU is done in about 4 years time, I reckon the shelf life of the AMD-Intel deal will expire. I’m very curious to see how good the first products will be and whether they make it into a laptop in their first revision. (It seems that the estimated combined TDP is about 55 W, which seems a bit hot for a laptop and more suitable for an iMac.)

Regarding NVIDIA vs. AMD in the GPU space, I would add that NVIDIA has made significant inroads into the server compute market, which just like with CPUs is much more profitable than the consumer market. Plus, NVIDIA has been investing in the automotive market with its autonomous driving platform whereas AMD has nothing to compete with that. AMD is in a tough spot, and their Zen-based product gave them a little reprieve — that AMD hopefully uses to improve its whole product line. For example, I haven’t heard much about their ARM-based server efforts anymore, and looking beyond the horizon a little, this is where the next battle will be. Qualcomm just released a serious ARM-based 48-core monster (in terms of die size) fabbed in Samsung’s 10 nm process, so EPYC will not just have to fend off Xeons but also ARM-based chips.
I don't suffer from insanity, I enjoy every minute of it.
     
Chongo
Addicted to MacNN
Join Date: Aug 2007
Location: Phoenix, Arizona
Status: Offline
Nov 14, 2017, 08:43 PM
 
Originally Posted by reader50 View Post
Just did an update from 10.13.0 to 10.13.1 on my HS test partition. It took longer than I expected, but there was no APFS conversion. This is a hard drive partition.

Disk Utility has a "Convert to APFS..." command, but it was disabled for all my system partitions. It was willing to convert a data partition. I don't have another HS partition to check it against. And admittedly, I didn't wipe the partition and do a fresh 10.13.1 install to be absolutely sure.

Best guess: they're not ready on Fusion yet. If they'd whipped the bugs for Fusion drives, I would expect them to go all-in with HDDs too.
OK, I updated to HS. I discovered the free Final Cut X (10.1.4) I got from my brother's mega purchase from Amazon is unusable. It looks like I will need to save my pennies and buy the latest version from the App Store.
"The blood of the martyrs is the seed of the church" Saint Tertullian, 197 AD
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Nov 15, 2017, 04:42 AM
 
Originally Posted by OreoCookie View Post
@P
Agreed. I just couldn’t see AMD partnering with Intel on that. But that shows that Apple will stick with Intel for its Macs for at least another few years. Once Intel’s own GPU is done in about 4 years time, I reckon the shelf life of the AMD-Intel deal will expire. I’m very curious to see how good the first products will be and whether they make it into a laptop in their first revision. (It seems that the estimated combined TDP is about 55 W, which seems a bit hot for a laptop and more suitable for an iMac.)
The Intel/AMD deal will have to die in some way in four years time, but Intel will still need a license for GPU patents for some time longer.

TDP: The combined TDP for the CPU and GPU of the current 15" MBP is 75W. Now, I would argue that it can't actually manage that, as there are situations where running both at max will make one throttle; it probably only works because max GPU power is rarely combined with max CPU power in the form of heavy vector code. But OTOH... one of the early leaked samples has 24 CUs, a 50% increase from the Radeon 560 in the current model. The amount of memory also implies that it is a pretty decent GPU. I could maybe see that all fitting inside 75W, but not 55W.

At the same time, I think that we will see a 100W model (because that is what the older MXM standard could support), but even that won't be enough for the iMac. Remember that the iMacs are at 175-200W combined TDP right now, and the iMac Pro seems set to exceed that.

Originally Posted by OreoCookie View Post
Regarding NVIDIA vs. AMD in the GPU space, I would add that NVIDIA has made significant inroads into the server compute market, which just like with CPUs is much more profitable than the consumer market. Plus, NVIDIA has been investing in the automotive market with its autonomous driving platform whereas AMD has nothing to compete with that. AMD is in a tough spot, and their Zen-based product gave them a little reprieve — that AMD hopefully uses to improve its whole product line. For example, I haven’t heard much about their ARM-based server efforts anymore, and looking beyond the horizon a little, this is where the next battle will be. Qualcomm just released a serious ARM-based 48-core monster (in terms of die size) fabbed in Samsung’s 10 nm process, so EPYC will not just have to fend off Xeons but also ARM-based chips.
Oh yes, NVidia is doing great in the server market. Maxwell, the design that they revealed in 2014, was the step in the right direction that they needed, and they're building on that now with Pascal and Volta. They do appear to have trouble working with other people though - they have been kicked out of several self-driving car projects, and the latest I heard the reason their GPUs are banned from future Macs is because they refuse to share driver code with Apple.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 15, 2017, 05:01 AM
 
Originally Posted by P View Post
The Intel/AMD deal will have to die in some way in four years time, but Intel will still need a license for GPU patents for some time longer.
Becoming an IP shop is not a good position to be in unless you are also competitive in the long term. AMD might get squeezed out by Intel and nVidia.
Originally Posted by P View Post
TDP: The combined TDP for the CPU and GPU of the current 15" MBP is 75W. Now, I would argue that it can't actually manage that, as there are situations where running both at max will make one throttle; it probably only works because max GPU power is rarely combined with max CPU power in the form of heavy vector code. But OTOH... one of the early leaked samples has 24 CUs, a 50% increase from the Radeon 560 in the current model. The amount of memory also implies that it is a pretty decent GPU. I could maybe see that all fitting inside 75W, but not 55W.
Don't forget, though, that the CPU and GPU are now in close proximity, so it may actually be harder to cool 55 W with one fan than 75 W with two fans. They will probably implement some dynamic cooperative throttling so that, say, 55 W is not exceeded even if the CPU and GPU can each convert, say, 35 W into heat on their own.
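
Just to make that idea concrete, here is a toy sketch of such a shared budget. The 55 W cap, the per-chip minimums and all the names are numbers I made up for illustration; nothing like this is documented by Apple or Intel.

```cpp
#include <algorithm>

// Hypothetical sketch of a shared CPU+GPU power budget, NOT a documented
// Apple or Intel mechanism. Cap and minimums are invented for illustration.
struct PowerBudget {
    double total_w   = 55.0;  // combined cap for the whole package
    double cpu_min_w = 10.0;  // never starve the CPU completely
    double gpu_min_w = 10.0;  // nor the GPU
};

// Split the cap according to what each side is currently asking for.
// The demand values would come from each chip's own power telemetry.
void split_budget(const PowerBudget& b, double cpu_demand_w, double gpu_demand_w,
                  double& cpu_limit_w, double& gpu_limit_w) {
    const double demand = cpu_demand_w + gpu_demand_w;
    if (demand <= b.total_w) {            // both fit under the cap: no throttling
        cpu_limit_w = cpu_demand_w;
        gpu_limit_w = gpu_demand_w;
        return;
    }
    const double scale = b.total_w / demand;                     // scale both down proportionally...
    cpu_limit_w = std::max(b.cpu_min_w, cpu_demand_w * scale);   // ...but respect the minimums
    gpu_limit_w = std::max(b.gpu_min_w, gpu_demand_w * scale);
}
```

The point is only in the last few lines: whichever chip asks for more gets proportionally more of the cap, but neither one is starved completely.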
Originally Posted by P View Post
At the same time, I think that we will see a 100W model (because that is what the older MXM standard could support), but even that won't be enough for the iMac. Remember that the iMacs are at 175-200W combined TDP right now, and the iMac Pro seems set to exceed that.
Once we go past 100 W, I don't see the advantage of putting CPU, GPU and memory onto one chip. The fact that it is smaller might now become a disadvantage, and with desktop designs, you are much less constrained by space anyway. On the desktop, I see AMD's discrete GPU taking over from Intel's weaker integrated GPUs, slotting in between a “proper” discrete GPU (with more TDP headroom) and Intel's integrated graphics.
Originally Posted by P View Post
Oh yes, NVidia is doing great in the server market. Maxwell, the design that they revealed in 2014, was the step in the right direction that they needed, and they're building on that now with Pascal and Volta.
Don't forget the software side of the story: it seems CUDA has really been adopted by a few important niche communities to accelerate computations.
Originally Posted by P View Post
They do appear to have trouble working with other people though - they have been kicked out of several self-driving car projects, and the latest I heard the reason their GPUs are banned from future Macs is because they refuse to share driver code with Apple.
I hadn't heard that tidbit, although I am not surprised given their strained relationship with the Linux community. In the automotive sector, I do hope they get their act together, because the SoCs they have on offer are infinitely more powerful than the “entertainment systems” you find even in high-end cars these days.
I don't suffer from insanity, I enjoy every minute of it.
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Nov 15, 2017, 05:34 AM
 
Originally Posted by P View Post
... and the latest I heard the reason their GPUs are banned from future Macs is because they refuse to share driver code with Apple.
I was under the impression Apple is pissed at nVidia due to patent trolling lawsuits. And they'd presumably be welcomed back as soon as they stopped trying to shake down other parties for unearned cash.

Do you have a link for them refusing to share driver code? That sounds interesting, and distinctly concerning for any OS vendor.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 15, 2017, 08:09 AM
 
Originally Posted by reader50 View Post
Do you have a link for them refusing to share driver code? That sounds interesting, and distinctly concerning for any OS vendor.
This is especially troubling given how closely Apple integrates its hard- and software stack, and I can see how this would be a deal breaker for Apple.
I don't suffer from insanity, I enjoy every minute of it.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Nov 15, 2017, 09:08 AM
 
Originally Posted by OreoCookie View Post
Becoming an IP shop is not a good position to be in unless you are also competitive in the long term. AMD might get squeezed out by Intel and nVidia.
Nope, that is certainly not good. It is simply a way to delay the inevitable.

Don't forget, though, that the CPU and GPU are now in close proximity, so it may actually be harder to cool 55 W with one fan than 75 W with two fans. They will probably implement some dynamic cooperative throttling so that, say, 55 W is not exceeded even if the CPU and GPU can each convert, say, 35 W into heat on their own.
Right now, the 13" MBP with Touchbar has two fans and two radiators, each radiator connected by a heatpipe to the same heatsink over the CPU. This is what Apple could do with these chips. The 15" has a very similar setup, except that there are two heatsinks - one each for CPU and GPU. This means that if the CPU is going full blast and the GPU is idling, all of the cooling capacity can go to the CPU - somewhat similar to how it works in the trashcan Mac Pro.

But this setup has a major problem, which I think is what caused Apple to abandon the trashcan design: Intel's thermal throttling algorithm. Intel will not throttle at a fixed temperature - it will actually throttle sooner if the temperature rises while the CPU uses only a little power. This may sound insane, but Intel's logic is that if the cooling system cannot keep the CPU cool when it is using very little power, then the cooling system is broken and it needs to throttle to compensate. The problem is when something else is heating up the heatsink. If the CPU is mostly idling while the GPU is running hot, the temperature on the heatsink can increase to the point where the CPU starts to throttle. This will starve the GPU, causing it to slow down from a lack of data to work on. The only way to make this work is to make sure that the temperature of the heatsink always stays below the LOWER throttling temperature, which means a lot of extra cooling.
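
To spell the effect out, here is a toy model of the behaviour I'm describing. It is emphatically not Intel's actual algorithm, and the temperatures and TDP are invented:

```cpp
#include <algorithm>

// Toy model of the throttling behaviour described above; NOT Intel's real algorithm.
// Idea: the less power the CPU itself draws, the lower the heatsink temperature it
// tolerates, because heat it did not generate is taken to mean a broken cooler.
bool should_throttle(double die_temp_c, double cpu_power_w) {
    const double hard_limit_c      = 100.0;  // absolute junction limit (illustrative)
    const double low_power_limit_c =  80.0;  // assumed limit when the CPU is nearly idle
    const double tdp_w             =  45.0;  // nominal CPU TDP used for the sketch

    // The allowed temperature rises with power draw, from the low-power limit
    // up to the hard limit at full TDP.
    const double frac      = std::min(cpu_power_w / tdp_w, 1.0);
    const double allowed_c = low_power_limit_c + frac * (hard_limit_c - low_power_limit_c);
    return die_temp_c > allowed_c;
}
```

With a shared heatsink, a GPU running full blast pushes die_temp_c up while cpu_power_w stays small, so this check trips early even though the CPU is barely doing anything - which is exactly the starvation scenario above.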

The fix is probably exactly what you say - have the two chips communicate with each other about their power usage and thermal needs. EMIB can be one way to do that.

Once we go past 100 W, I don't see the advantage of putting CPU, GPU and memory onto one chip. The fact that it is smaller might now become a disadvantage, and with desktop designs, you are much less constrained by space anyway. On the desktop, I see AMD's discrete GPU taking over from Intel's weaker integrated GPUs, slotting in between a “proper” discrete GPU (with more TDP headroom) and Intel's integrated graphics.
This idea is for laptops only for now, agreed. Intel will put them in a NUC as well (Mac mini-sized computer), but that's it.

Don't forget the software side of the story: it seems CUDA has really been adopted by a few important niche communities to accelerate computations.
I know. I never could understand why it is used over OpenCL, but it seems people began learning CUDA and using nVidia cards and just didn't want to switch when - surprise, surprise - OpenCL ran worse on nVidia cards.
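
Part of CUDA's pull is simply that a kernel is barely more than annotated C. A minimal SAXPY, using nothing beyond the standard CUDA runtime API, looks like this:

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// y = a*x + y : the classic "hello world" of GPU compute.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory keeps the example short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // one thread per element
    cudaDeviceSynchronize();

    printf("y[0] = %f (expected 4.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

The equivalent OpenCL host code is several times longer before you even get to the kernel, which I suspect is a big part of why people who started with CUDA never bothered to switch.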
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Nov 15, 2017, 09:15 AM
 
Originally Posted by reader50 View Post
I was under the impression Apple is pissed at nVidia due to patent trolling lawsuits. And they'd presumably be welcomed back as soon as they stopped trying to shake down other parties for unearned cash.

Do you have a link for them refusing to share driver code? That sounds interesting, and distinctly concerning for any OS vendor.
Rumor only; Apple would never share that information.

The general idea was that patent trolling was what kicked this off, but nVidia has stopped trolling now (because they lost badly against Samsung), so that argument doesn't apply anymore. If nVidia were OK again, Apple would offer one of their cards as an option in the iMac to keep them happy - especially as they could fit something like a 1070 in there on thermals, and AMD didn't have anything to match that until the very delayed Vega release. There must be something else holding it up, and this has emerged as the new most likely answer.

As Oreo says, nVidia's actions regarding Linux drivers lend some support to this. Part of the secret sauce of how Maxwell achieved its efficiency increase (dynamic tiling) was figured out by David Kanter of RealWorldTech. AMD has now tried to implement the same thing in Vega with limited success, and nVidia may be worried that their driver source might leak details of how they did it. Note that Apple never used Maxwell graphics. Even when they stuck by nVidia, they used the previous generation (Kepler) for much longer than made sense.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 15, 2017, 07:50 PM
 
@P
Regarding AMD, I am still surprised that Apple didn’t just buy them a while back. Even the CPU- and GPU-related IP seems completely worth it on its own.

About NVIDIA drivers, I can’t help but think that Apple’s venture into building its own GPU also contributed to that: surely there was scuttlebutt in the industry about Apple’s efforts long before they became accepted as plausible by the mainstream (e. g. when friends and family were hired by Apple). NVIDIA might have felt that sharing driver code would hand Apple valuable ideas on how to make its own graphics chips and graphics drivers faster.

Not to reheat the Macs-on-ARM discussion, but rumor is that the iPad Pro gets an eight-core SoC with 3 fast and 5 slow cores. I would really like someone to run the new SPECmarks on the A11 or A11X and compare that to Intel’s offerings. (Why doesn’t Anandtech do that anymore? Are there any other sites that do?)
I don't suffer from insanity, I enjoy every minute of it.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Nov 16, 2017, 08:40 AM
 
Originally Posted by OreoCookie View Post
@P
Regarding AMD, I am still surprised that Apple didn’t just buy them a while back. Even the CPU- and GPU-related IP seems completely worth it on its own.
It is totally worth it. The even more likely buyer is MS, actually (both for the Xbox and for the Surface business), but they won't do it. There is a poison pill in the x86 licensing agreement, so AMD loses its x86 license if they change owners. The possible workaround is to spin off RTG (Radeon Technology Group, who makes the graphics cards) with a license back to AMD and sell that to someone, but I think AMD keeps that as the last resort.

There is also a small chance that someone could buy AMD if they made a deal with Intel first so that Intel does not exercise its right to terminate the x86 license. I don't know what sort of leverage you'd need to do that, though.

About NVIDIA drivers, I can’t help but think that Apple’s venture into building its own GPU also contributed to that: surely there was scuttlebutt in the industry about Apple’s efforts long before they became accepted as plausible by the mainstream (e. g. when friends and family were hired by Apple). NVIDIA might have felt that sharing driver code would hand Apple valuable ideas on how to make its own graphics chips and graphics drivers faster.
Oh, absolutely. Raja Koduri, former head of RTG and star of the first post on this page, was at Apple for a few years before moving back to AMD. I'm pretty sure nVidia keeps track of guys like him.

Not to reheat the Macs-on-ARM discussion, but rumor is that the iPad Pro gets an eight-core SoC with 3 fast and 5 slow cores. I would really like someone to run the new SPECmarks on the A11 or A11X and compare that to Intel’s offerings. (Why doesn’t Anandtech do that anymore? Are there any other sites that do?)
I want that too. I'm not sure why Anandtech stopped, but I suspect it is because Anand Lal Shimpi himself left and sold the company (to go do competitive analysis at Apple) and the new owners don't want to spend the reporter time it would take to do one of his deep analysis posts. It is a real hole in the marketplace right now.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Nov 16, 2017, 09:50 AM
 
Originally Posted by P View Post
It is totally worth it. The even more likely buyer is MS, actually (both for the Xbox and for the Surface business), but they won't do it. There is a poison pill in the x86 licensing agreement, so AMD loses its x86 license if they change owners.
If Apple were to buy AMD, they could just do it for the talent and the patents. Just imagine an ARM-based server chip; that'd be right up AMD's alley. The GPU guys could be sprinkled in where needed: Apple's GPU division or the groups who design the various other coprocessors.
Originally Posted by P View Post
The possible workaround is to spin off RTG (Radeon Technology Group, who makes the graphics cards) with a license back to AMD and sell that to someone, but I think AMD keeps that as the last resort.
What about a cross-licensing deal where a post-acquisition AMD hands over all (or at least the important bits) of its GPU-related IP and in return retains the right to make x86 cores?
Originally Posted by P View Post
I want that too. I'm not sure why Anandtech stopped, but I suspect it is because Anand Lal Shimpi himself left and sold the company (to go do competitive analysis at Apple) and the new owners don't want to spend the reporter time it would take to do one of his deep analysis posts. It is a real hole in the marketplace right now.
It's quite sad, because those were the reviews I looked forward to the most. Especially now I'd like to see a better, more nuanced comparison than Geekbench between a MacBook Pro and an iPhone 8/X. (That's a pretty insane sentence to write.)
I don't suffer from insanity, I enjoy every minute of it.
     
And.reg
The Mighty
Join Date: Feb 2004
Location: Well the sports issue was within arm's reach but they closed up shop and kicked me out. And I'm out of toilet paper.
Status: Offline
Yesterday, 10:45 AM
 
And here is another example of the limited power of Siri:

This one time, at Boot Camp, I stuck a flute up my PC.
     
 