
Mac transition to ARM to be announced at WWDC? (Page 4)
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Jul 8, 2020, 09:30 AM
 
Originally Posted by P View Post
The only downside LPDDR has over the regular is that it is always soldered. Other than that, it is essentially “DDR Pro” now.
Now I'm kinda pissed my 16" doesn't have it since they soldered the RAM down.


Originally Posted by P View Post
The thing is - being better at performance per watt is fine on paper, but when you make a new iMac, it needs to be faster than the old one in actual terms. Whatever CPU Apple puts in there, it is going to have to run faster than the 2.6 GHz or whatever the iPad does. I think we’re talking 4 GHz at least, as well as more cores, and that will draw some power. Maybe the idea is to make something that can handle 65W in a pinch but runs cooler and quieter at 35W, or whatever the new CPUs will run at.
The A12Z runs at 15W, doesn't it? The current base iMac has a 65W CPU, plus a Radeon Pro 555X. Actually, given that the iMac Pro enclosure can handle ~350W of CPU+GPU, Apple could shrink the power budget down toward what the 13" MBP's Intel gear uses and an Apple Silicon based iMac Pro would still comfortably thrash the Intel one.

Originally Posted by P View Post
They’re not negligible, they’re effectively the A6 core from iPhone 5 with a higher clock and more modern cache system. But yes, Geekbench doesn’t use them because Rosetta 2 apparently doesn’t use them. Remember that these scores are run in emulation.
Yes, I was aware of the emulation but not certain about those extra cores. It sounds to me then that the native version of Geekbench should score significantly higher on multicore, assuming Apple allows developers to thrash all 8 cores and doesn't lock the efficiency cores to background tasks only.
I have plenty of more important things to do, if only I could bring myself to do them....
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Jul 8, 2020, 06:51 PM
 
Scaling GPUs is not necessarily hard. You can add more shaders rather easily, and the advantage with the tile-based model is that you can just do a “stupid” scaling, make all the tiles equally sized and then just pass them off to different units.

The GPU in the A13 is 15.28 mm2. The 5700 XT, an upper-midrange GPU from AMD on the same node, is 251 mm2. The memory controllers amount to effectively 50 mm2 (sidenote - I love that this is something that you can google up), so the GPU proper is about 200 mm2. You can fit 13 of Apple’s GPUs in that space. Yes, you would need a wider memory controller, but this is one thing TBDR does great - you need less memory bandwidth, and you can cover it with cache more easily.
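For what it's worth, the arithmetic checks out; here's the back-of-the-envelope version in Python, using the figures quoted in this post (note the ~50mm2 memory-controller figure is the post's eyeball estimate, not an official number):

```python
# Rough die-area comparison, all figures in mm^2.
# Assumption: subtracting the memory-controller area isolates the
# "compute" portion of the 5700 XT die; real floorplans are messier.

a13_gpu_area = 15.28      # Apple A13 GPU block
navi10_die = 251.0        # AMD 5700 XT (Navi 10) total die
mem_controller = 50.0     # estimated memory-controller area

compute_area = navi10_die - mem_controller   # ~201 mm^2
copies = compute_area / a13_gpu_area         # A13 GPU blocks that fit

print(f"compute area: {compute_area:.0f} mm^2")
print(f"A13 GPU blocks that fit: {copies:.1f}")
```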

Or... you pay AMD to develop a TBDR GPU. They could certainly sell that to Nintendo for the next Switch as well.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Jul 8, 2020, 06:57 PM
 
Originally Posted by Waragainstsleep View Post
Now I'm kinda pissed my 16" doesn't have it since they soldered the RAM down.
Blame Intel. The latest Skylake refresh (=what you have) doesn’t support it, and the model of that CPU that does, Comet Lake, is just barely out, and it only supports lower clocks anyway.

Or consider that the 13” needs that memory bandwidth to support both CPU and GPU, while your setup has that DDR4 for the CPU only and then a massive chunk of GDDR6 for the GPU to play with.

Yes, I was aware of the emulation but not certain about those extra cores. It sounds to me then that the native version of Geekbench should score significantly higher on multicore, assuming Apple allows developers to thrash all 8 cores and doesn't lock the efficiency cores to background tasks only.
Apple allows devs to use all 8 cores simultaneously in the iPad, and has done so ever since the A11 (the A10 is only two cores max at any time, even though it has four), so it makes sense that they will in the Mac as well. Probably Apple didn’t bother writing Rosetta 2 for that, as four cores is plenty for emulating Intel chips anyway.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Jul 9, 2020, 03:26 AM
 
Originally Posted by P View Post
The GPU in the A13 is 15.28 mm2. The 5700 XT, an upper midrange GPU from AMD on the same node, is 251mm2. The memory controller amount to effectively 50mm2 (sidenote - I love that this is something that you can google up), so that GPU is 200mm2. You can fit 13 of Apple’s GPU in that space. Yes you would need a wider memory controller, but this is one thing TBDR does great - you need less memory bandwidth, and you can cover it with cache more easily.
Nice to see some numbers attached to this. So the GPU in the A13 is positively tiny.
Originally Posted by P View Post
Or... you pay AMD to develop a TBDR GPU. They could certainly sell that to Nintendo for the next Switch as well.
There is also the added problem of patents — as soon as Apple ventures further into GPU territory, they might have to partner with AMD to avoid being sued.

(Patents have become so “cheap” that a lot of engineering work consists of engineering around patents of other companies rather than do the “obviously” best thing.)
I don't suffer from insanity, I enjoy every minute of it.
     
ghporter
Administrator
Join Date: Apr 2001
Location: San Antonio TX USA
Status: Offline
Jul 11, 2020, 02:46 PM
 
With some thought, it’s come to me that these new machines are likely to be “user ready” out of the box, and if Rosetta 2 is as effective as the original, current software should “just run,” rather than needing any sort of fancy shenanigans to make it run.

Which makes me now think about this: has anybody heard or seen anything about price points yet? My 2015 MBP is a great machine, but my iMac is running on unofficial, unsupported upgrades. I can see getting a new iMac in the near future. As long as it doesn’t cost me two mortgage payments...

Glenn -----OTR/L, MOT, Tx
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Status: Offline
Jul 11, 2020, 03:19 PM
 
The trend has been higher prices. I would like them to lower them, but it's doubtful.
     
reader50
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Jul 11, 2020, 03:33 PM
 
The last transition, we went from limited-market PPC chips and custom motherboards to industry-dominant x86 chips + custom motherboards. Apple talked about increased performance, and prices went up.

Apple now plans to go nearly all-proprietary, including GPUs. They've been using proprietary SSDs for a while. Near as I can tell, the only standard parts they still use are RAM sticks, though most are soldered down.

Apple is talking about performance increases again. I expect prices to go north.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Jul 11, 2020, 05:55 PM
 
RAM chips and Flash chips are standard. Network baseband chips seem likely to be as well. There isn’t a lot else.

Overall, I think Apple tried to push prices up for a while but have backed off. The reintroduction of the MacBook Air at $999 is the best sign of this.

I have to wonder about the path forward, though. All talking heads are assuming touchscreens on the laptops. I pray not, but the redesigned UI in Mac OS 11 is suggestive. I also think that LTE networking and high refresh rate displays are likely, with the latter being very easy to do now that the bandwidth limitation of DisplayPort 1.2 is gone. All of these things cost money, and it’s not like Apple is going to cut margins.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Jul 11, 2020, 06:32 PM
 
The UI is for AR, not touch.
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Status: Offline
Jul 11, 2020, 07:04 PM
 
Any guesses?

Assuming rumors of 14”, touchscreens and more performance...

Mini: $999
12/13” MBA: $1099
13/14” MBP: $1599
16” MBP: $2799
iMac: $
iMac Pro: $5999
MP: $kidney ($8999?)

I wonder if Apple keeps the Intel Macs around as cheaper options or not.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Jul 12, 2020, 04:54 AM
 
I think Apple wants the $999 price point for MBAs around. They may have to cut the storage and RAM to make that happen, but have you seen the RAM figures on iOS devices? They can make that work with small amounts of RAM.

Other than that, Apple likes its price points, so I would rather think that the current $1299 point becomes MBA only and the MBP starts at $1499 etc.

I don’t think the MP gets updated any time soon, and I’m not sure the iMac Pro ever gets updated again.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Jul 12, 2020, 05:43 AM
 
Originally Posted by reader50 View Post
Apple now plans to go nearly all-proprietary, including GPUs. They've been using proprietary SSDs for awhile. Near as I can tell, the only standard parts they still use are RAM sticks. Though most are soldered down.
Apple makes a decent iPad for $350 and even the iPad Pro starts at $800. I'm not saying Apple will make their products cheaper, but the products should be cheaper for Apple to make. If we assume that Apple will keep its profit margin constant, then Macs should get cheaper.

I think Apple was quite happy when you could pick up a MacBook Air for $999. (The price was one of the MacBook's three big issues: apart from the keyboard and having only one USB port, it just wasn't cheap enough.) Edit: P beat me to it.

And Apple sells these strange 13" MacBook semi-Pros with only two ports, which sit uncomfortably between the Air and the “proper” Pros. In an ideal world, I think Apple would want to get rid of those, too.
Originally Posted by reader50 View Post
Apple is talking about performance increases again. I expect prices to go north.
From a marketing point of view, you have a point. But more performance does not mean the chips will cost Apple more, quite the opposite.

If I had to guess, here is what I expect Apple to do as far as starting prices for mobile Macs are concerned:

MacBook Air: $800–$1,000
MacBook Pro 13"/14": $1,300–$1,500
MacBook Pro 16": $2,200–$2,400

It all depends on how much additional tech Apple puts in the higher-end models and how aggressive Apple wants to be on the pricing, especially with regards to its iPad line-up. If Apple insists on putting 120 Hz displays in its Pro models, for example, that'll cost them.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Jul 12, 2020, 02:33 PM
 
Apple has released tools for Windows to help devs port games to Metal and the Apple ecosystem. I take this as an indicator that they are going to go for a bigger market share.
If that's the case, I expect to see some aggressive pricing paired with performance that no other builder can match. Things are going to get interesting.
     
mindwaves
Registered User
Join Date: Sep 2000
Location: Irvine, CA
Status: Offline
Jul 13, 2020, 09:29 PM
 
I am wishing for an MBA with integrated 4G, just like their iPads. I get unlimited internet here for around $35 a month, and having an always-on connection would be extremely nice for me.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Jul 14, 2020, 01:12 AM
 
Originally Posted by mindwaves View Post
I am wishing for a MBA with integrated 4G just like their iPads. I get unlimited internet here for around $35 a month and having an always on connection would be extremely nice for me.
I think they want you to use your phone as a hotspot.

That hoses me, though, since I’ll lose my AT&T unlimited if I activate it.

At least it used to be that way. I haven’t checked in years.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Jul 14, 2020, 08:07 AM
 
The reason we don’t have integrated mobile networking on Macs is Qualcomm’s monopolistic behavior regarding its patents. That is slowly clearing up, and I think it is coming whether Apple moves to ARM or not.
     
reader50
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Jul 14, 2020, 12:41 PM
 
Recall that after Apple settled with Qualcomm, Intel sold its competing modem-chip business. To Apple. We haven't heard anything since.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Jul 14, 2020, 12:52 PM
 
Oh, I remember. I also seem to remember that Intel’s problems were related to two things: power consumption in general not being competitive, and their own 10nm process. The second is obviously not an issue anymore - Apple couldn’t fab their modems there if they wanted to - and the first seems like it should be less of an issue for a laptop with its much larger battery and less of an expectation for background updates.

The only issue with that is the timing. The purchase went through almost exactly a year ago, and Apple would have to design a chip for some other process friendly to analog circuits. I don’t think they can be ready with that work for a launch this year.

One caveat in that I don’t know exactly how these chips are usually divided up. I know that it is common to put the analog bits on a larger process, but I don’t know how that would affect anything here. Apple may have had some work done before the purchase, at least if Intel was OK with using some other process for the analog bits.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Jul 14, 2020, 07:58 PM
 
Originally Posted by P View Post
The only issue with that is the timing. The purchase went through almost exactly a year ago, and Apple would have to design a chip for some other process friendly to analog circuits. I don’t think they can be ready with that work for a launch this year.
AFAIK the timeline from concept to product for an SoC is about four years. If we take the same timeline for modems, then we should expect something in two or three years. Although Apple had a team working on modems before the purchase, if memory serves. Of course, a lot of the engineering is engineering around patents, I reckon, so perhaps Apple did not get very far without Intel’s/Infineon’s patent portfolio.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Jul 24, 2020, 04:23 AM
 
Intel 7nm not coming to market until 2022 or 2023.

Woops!

https://www.tomshardware.com/news/in...d-expectations
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Jul 24, 2020, 04:44 AM
 
Intel could find themselves in some trouble if they aren't careful. Some think that if Apple Silicon really blows x86 out of the water, Microsoft will be forced to improve Windows on ARM as PC makers switch their machines over to keep up with Apple.
If Nvidia buys ARM, it might be a strong indicator that they believe this and have ambitions to be the ones filling the semiconductor void as AMD and Intel clamour to get working ARM CPU products to market.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Jul 24, 2020, 11:29 AM
 
This is a very dangerous time for Intel. They're making money hand over fist, but their future looks bleak.

I think MS is going to keep working on Office for ARM no matter what. Their new plan is Office on everything, and if the ARM market is growing, MS will work on it. It is already growing with the Mac going ARM. What is interesting here is Windows on ARM. If MS starts to put some real work into that, and a platform standard is established so that it is easy to make an ARM box that can run Windows, businesses can start to switch. That would be a disaster for Intel.

The ARM sale is interesting, because who can buy it? Apple or Qualcomm or Samsung won’t fly with competition authorities. NVidia does not play well with others, and I’m sure everyone would oppose that. The only sane thing I can see is a Symbian redux, with a consortium of partners each owning a part.
     
Thorzdad  (op)
Moderator
Join Date: Aug 2001
Location: Nobletucky
Status: Offline
Jul 24, 2020, 12:52 PM
 
Intel buying ARM would help brighten its future. Not sure if they're interested, though.
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Status: Offline
Jul 24, 2020, 08:20 PM
 
Let’s not forget Apple used to own 50% of ARM either.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Jul 24, 2020, 08:48 PM
 
I was quite worried when ARM sold to Softbank, and it seems my worries are now coming to pass. I can’t think of a good shepherd to sell ARM to: Intel and Apple shouldn’t get approval due to anti-trust concerns. (Honestly, I don’t think Apple would actually abuse its power; I think it is genuinely uninterested in buying ARM.) Intel would likely eff things up; ARM’s business model is antithetical to its own, and a lot of ARM-based products compete directly with Intel’s. nVidia is one of the front runners, and they seem to have a problem playing nice. AMD probably doesn’t have the money, and even if it did, I’m not sure they are a good fit either.
     
reader50
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Jul 24, 2020, 10:01 PM
 
AMD has plenty of dough, they're grabbing market share from Intel right and left. The Zen core is scaling better than Intel's Core design, Intel remains (mostly) stuck at 14nm, and AMD's pricing is better. Especially since they don't nickel-and-dime every CPU feature, the way Intel does.

However, I can't see ARM being sold to AMD or Intel. ARM-based chips increasingly compete with existing computer platforms, so antitrust issues abound. Same for selling to Apple or Samsung or Google. It would give them licensing control over their competitors. nVidia might work, or selling ARM stock directly on the stock market.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Jul 24, 2020, 10:13 PM
 
Qualcomm?
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Jul 24, 2020, 10:47 PM
 
Originally Posted by Waragainstsleep View Post
Qualcomm?
… also has an earned reputation for abusing its market power. While they might be a better fit than Intel, I don’t think it’ll work out well for the market.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Jul 24, 2020, 10:56 PM
 
Originally Posted by reader50 View Post
AMD has plenty of dough, they're grabbing market share from Intel right and left. The Zen core is scaling better than Intel's Core design, Intel remains (mostly) stuck at 14nm, and AMD's pricing is better. Especially since they don't nickel-and-dime every CPU feature, the way Intel does.
True, and to be honest, that’s part of the reason why I don’t see it as a good fit: AMD has been on a streak of great products recently, sold at great prices. In basically every category they are superior to Intel’s offerings, save AVX512 performance: better IPC, built on a better process, more cores for less money, more PCIe lanes (sometimes more than Intel offers at any price), etc. AMD could pivot and put their eggs into the ARM basket, but that’d be quite risky at this very point. They could make both in tandem, but then there’d be in-house competition — even if the honchos at AMD decided that ARM is the future in the long run.
Originally Posted by reader50 View Post
However, I can't see ARM being sold to AMD or Intel. ARM-based chips increasingly compete with existing computer platforms, so antitrust issues abound. Same for selling to Apple or Samsung or Google. It would give them licensing control over their competitors. nVidia might work, or selling ARM stock directly on the stock market.
IMHO, if regulatory agencies were working (and they are not), I’d exclude all of these companies (plus nVidia) from purchasing ARM. (And it seems Apple really isn’t interested anyway.) Of course, that’s a big and unrealistic assumption. Apple at least stands to profit the most from ARM being ubiquitous, i.e. everywhere from desktops to iPads to servers.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Jul 25, 2020, 04:41 AM
 
Originally Posted by reader50 View Post
AMD has plenty of dough, they're grabbing market share from Intel right and left.
AMD has to develop both CPUs and GPUs, and they have a development budget of $1.6B.

Intel makes mainly CPUs, and have a budget of $3.3B.

NVidia makes only GPUs, and have a budget of $2.9B.

That AMD can compete as well as they do is amazing, but they do not have plenty of dough. They are extremely cash-strapped. They don’t even develop a complete GPU family anymore - they develop half of one and let the old one tide them over in the higher segments while they develop the other half. Right now, they don’t have a high-end card - the 5700 series cards are midrange models. Since the shrink was extra large this time, they actually exceeded the last high-end models (Vega 64 and 56), so AMD used a compute card, the Radeon VII, to hold the high end. Next generation there is “Big Navi”, which should be high end, but it may very well be the same situation as the Vega launch - the old midrange cards get to hold the midrange again, because AMD can’t afford a new midrange design.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Jul 25, 2020, 04:51 AM
 
Originally Posted by OreoCookie View Post
… also has an earned reputation for abusing its market power. While they might be a better fit than Intel, I don’t think it’ll work out well for the market.
They’re also no better than Apple or Samsung in that they are one of the competitors that use the instruction set. No, if they’re selling to one existing company, it will have to be a semi-conductor company that isn’t in the mobile market right now. Other threads have suggested TI or even IBM. There are companies on the outskirts of that segment that are possible (Cisco, Ericsson, probably others) but I don’t see why - and it is a lot of money.
     
reader50
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Jul 25, 2020, 05:22 AM
 
AMD ended Q1 2020 with $1.385B of cash on hand. Yes, Intel had $20.8B cash for the same period. But in my books, cash in the Billions range is still plenty. Even when the competition has more. And AMD has no fabrication costs. All their money can go to development, while Intel has to budget billions towards debugging their 10nm (and now 7nm) processes. Not to mention covering all those "Intel-Inside" bribes to OEMs. If those are still running.

Leaving your previous models on the market, with price cuts, is a smart move. AMD has done it with CPUs as well as GPUs. It doesn't help the high end, but most customers don't buy the high end. Instead, it hits the value segments squarely. Enough so that nVidia launched the GTX 1650 while withholding drivers from reviewers until the day it launched. Because it was dead on arrival. It cost $20 more than an RX 570, which beat it almost across the board.

I don't see smart allocation of resources as a negative. If nVidia spends more, developing a new midrange card to replace cards they already have, let them. nVidia should do a die shrink and price cut on their previous models too.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Jul 25, 2020, 07:25 AM
 
Originally Posted by P View Post
They’re also no better than Apple or Samsung in that they are one of the competitors that use the instruction set.
Sure, although unlike Apple (who simply hasn’t been in that position), Qualcomm has used its market power in the modem market to strong-arm competitors into compliance. I guess I am saying that Qualcomm would be even worse than, say, Samsung.
Originally Posted by P View Post
No, if they’re selling to one existing company, it will have to be a semi-conductor company that isn’t in the mobile market right now. Other threads have suggested TI or even IBM. There are companies on the outskirts of that segment that are possible (Cisco, Ericsson, probably others) but I don’t see why - and it is a lot of money.
Hmmm, I don’t really see much interest from IBM’s side. They seem to have almost completed their transmogrification into a services company.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Jul 25, 2020, 08:03 AM
 
Why is Softbank even selling? Are they in trouble?

It seems there are a few folks who believe that ARM has the capacity to explode if Apple stampedes to market with performance that blows away the two long-established overlords of the industry. At the very least, the value of ARM will skyrocket as others start to believe ARM is the future. And if MS follows AS Macs by opening up Windows licensing (I figure this is a near dead cert regardless), that's another value-boosting milestone. Then you get another one when Dell or HP launches their first Windows ARM devices. Selling now seems crazy to me.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Jul 25, 2020, 08:41 AM
 
Originally Posted by Waragainstsleep View Post
Why is Softbank even selling? Are they in trouble?
Yes, they made some questionable investments, such as WeWork.
Originally Posted by Waragainstsleep View Post
It seems there are a few folks who believe that ARM has the capacity to explode if Apple stampedes to market with performance that blows away the two long established overlords of the industry. At the very least the value of ARM will skyrocket as others start to believe the ARM is the future. And if MS follows AS Macs by opening the Windows licensing (I figure this is near a dead cert regardless), that's another value boosting milestone. Then you get another one when Dell or HP launches their first Windows ARM devices. Selling now seems crazy to me.
Yeah, agreed, it’s stupid to view ARM as a short-term investment. If things continue as they are, they’d control the vast majority of the market for anything that can claim to be a CPU.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Jul 25, 2020, 05:50 PM
 
Softbank’s trouble isn’t so much a few failed bets like WeWork as one massive bet: Uber. Uber keeps bleeding money, but may still become a massive profit maker. SoftBank needs to keep Uber afloat, and that costs money.

More generally, Softbank has an issue in that it isn’t getting its money back because big VC-backed companies aren’t making any IPOs lately, so they can’t “make an exit” and realize those increased valuations.

As for the value of ARM specifically: it is all in whether an instruction set has value. After PowerPC failed to gain consumer market share and Itanium flopped massively, people took the lesson that ISAs don’t matter. 64-bit ARM in particular is showing that that isn’t true - a better ISA is an advantage. These things are hard, and maybe Intel can still come back - history has shown them coming back from worse odds than this - but right now, ARM is looking like it has a lot of value.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Jul 25, 2020, 09:29 PM
 
Originally Posted by P View Post
Softbank’s trouble isnt so much a few failed bets like WeWork as one massive bet: Uber. Uber keeps bleeding money, but may still be a massive profit maker. SoftBank needs to keep Uber afloat, and that costs money.
Masayoshi-san (Softbank’s CEO) hasn’t had much luck investing money since hitting the jackpot with Alibaba. I was just mentioning WeWork because that actually hit the fan this year, whereas Uber is still in the air. Uber is massively overvalued, but has salvageable parts. Masayoshi actually got really lucky with the Covid pandemic; it has likely saved him millions, if not billions, on WeWork alone.
Originally Posted by P View Post
As for the value of ARM specifically: it is all in whether an instruction set has value. After PowerPC failed to gain consumer market share and Itanium flopped massively, people took the lesson that ISAs don’t matter. 64-bit ARM in particular is showing that that isn’t true - a better ISA is an advantage. These things are hard, and maybe Intel can still come back - history has shown them coming back from worse odds than this - but right now, ARM is looking like it has a lot of value.
Agreed, and unlike PowerPC that was supported by one bigger company (IBM) and two smaller companies (Motorola and Apple), ARM is backed by many, many companies which have poured billions each in the ecosystem (e. g. to have software ported or for compiler optimizations). Because when you say ISA, you don’t just mean the IP on the actual instruction set, but the whole ecosystem built around it. That’s why x86 is in trouble: PowerPC was always the scrappy, underfunded underdog whereas ARM is decidedly not.
I don't suffer from insanity, I enjoy every minute of it.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Sep 1, 2020, 09:24 AM
 
Finally took the time to watch the WWDC videos on the GPU, and I think that the TBDR issue is not going to be a big problem right now...but maybe later.

Long story short: Apple GPUs will support the same current Metal feature set that current Macs do. Any app written for them will compile and run. It may not work perfectly, because Apple has made some changes to improve performance, so you may need to fix a few things, but it should work. You can then also have multiple code paths for different feature sets. This sounds like what Apple will do in Quartz - add code paths optimized for Apple GPUs while keeping the old ones, so it can continue to work on both immediate-mode renderers and TBDRs.
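For what it's worth, the per-family dispatch described above might look something like this in app code. Minimal Swift sketch: supportsFamily(_:) and the MTLGPUFamily cases are real Metal API (macOS 10.15+), but the render-path labels are placeholders I made up for illustration.

```swift
import Metal

// Pick a render path based on which GPU family the device reports.
// .apple6 = Apple-designed TBDR GPU; .mac2 = modern Intel/AMD
// immediate-mode GPU. Path names here are hypothetical.
func chooseRenderPath(for device: MTLDevice) -> String {
    if device.supportsFamily(.apple6) {
        // TBDR: can lean on tile memory, memoryless render targets
        return "apple-tbdr"
    } else if device.supportsFamily(.mac2) {
        // Immediate-mode: keep the existing desktop code path
        return "immediate-mode"
    }
    return "fallback"
}

if let device = MTLCreateSystemDefaultDevice() {
    print(chooseRenderPath(for: device))
}
```

The interesting case is the one raised below: a machine that has devices answering true for both checks.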

There are now two questions. First, what happens if there is both an integrated TBDR renderer and a discrete immediate-mode renderer? One may be more efficient, but the other so powerful that it may be used instead anyway. Apple didn’t say, and this is clearly something that needs to work if Apple is going to ship Macs with AMD graphics in them.

Second, will there be a new feature set for discrete GPUs, newer than the current one? Because I can see situations where Vega- and Navi-class GPUs would be better served by running a code path written for Apple GPUs, but that wouldn’t necessarily even work on Intel GPUs.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Sep 2, 2020, 03:29 AM
 
This is a very sensible take.
I’m curious how far Apple wants to scale up its GPUs. But at least for the intermediate term, i. e. 5-6ish years (= the support period of Intel Macs), I reckon Apple will have to deal with two code/render paths anyway, so we won’t know before then, me thinks.
I don't suffer from insanity, I enjoy every minute of it.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Sep 2, 2020, 05:21 AM
 
Reading more about the details of TBDR and how it works in a modern setup, it seems that there is no problem in running code intended for a TBDR on an immediate-mode renderer - it will just be much less efficient. If you try to run generic code on the TBDR chip, performance will crash if it does certain things - basically, running geometry shaders and certain advanced tessellation functions. The TBDR optimization relies on the geometry being fixed very early in the rendering pipeline, so it can figure out exactly which triangles it needs to keep in memory for each tile.
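To illustrate why the geometry has to be fixed early: the up-front binning step a TBDR performs could be sketched like this. Toy Swift of my own, not Apple's implementation - real hardware bins post-transform clip-space triangles in dedicated units, not axis-aligned boxes on the CPU.

```swift
// Each triangle is assigned to every screen tile its bounding box
// overlaps, so the GPU only has to keep a short per-tile triangle list
// in fast on-chip memory while shading that tile. A geometry shader
// that invents or moves triangles *after* this step would invalidate
// the bins - which is the whole problem described above.
struct Triangle {
    let minX, minY, maxX, maxY: Int  // screen-space bounding box
}

func binTriangles(_ tris: [Triangle], screenW: Int, screenH: Int,
                  tile: Int) -> [[Int]] {
    let tilesX = (screenW + tile - 1) / tile
    let tilesY = (screenH + tile - 1) / tile
    var bins = Array(repeating: [Int](), count: tilesX * tilesY)
    for (i, t) in tris.enumerated() {
        let tx0 = max(0, t.minX / tile), tx1 = min(tilesX - 1, t.maxX / tile)
        let ty0 = max(0, t.minY / tile), ty1 = min(tilesY - 1, t.maxY / tile)
        for ty in ty0...ty1 {
            for tx in tx0...tx1 {
                bins[ty * tilesX + tx].append(i)  // triangle i touches this tile
            }
        }
    }
    return bins
}
```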

Apple's solution to making sure that this isn't a problem is simply not to let people use geometry shaders in Metal. If you want to hack them up yourself using compute shaders, you can, but it is hard. In the same way, they enable basic fixed-function tessellation, but nothing more complex.

Remember how Apple only ever supported OpenGL 4.1? We all assumed it was because they just didn't care. On a hunch, I went back and read the changelogs for the newer OpenGL versions. 4.2 was very minor, but 4.3 added:

* shader storage buffer objects that enable vertex, tessellation, geometry, fragment and compute shaders to read and write large amounts of data and pass significant data between shader stages;
Yeah, that is going to kill performance on a TBDR renderer. The more detailed spec sheet has more of these. I'm willing to bet that Apple stayed away from OpenGL 4.3 (and newer) because supporting these functions would be hard in Rosetta 2.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
All contents of these forums © 1995-2017 MacNN. All rights reserved.