Mac transition to ARM to be announced at WWDC? (Page 2)
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 16, 2020, 10:21 PM
 
I assume he means OS updates will be forced, which would in turn force users to buy new versions of outdated apps. It's difficult to see users accepting this. Microsoft absolutely cannot do this as things stand. While they have made some progress towards where Apple has been for years (a place where they can just say "We changed this, deal with it"), Microsoft still has all manner of old tech running critical systems: ATMs running Win XP, power stations running DOS, etc. You can't obsolete the hardware controlling nuclear cooling rods. I'm exaggerating (probably), but I suspect such a move would be highly dangerous. Plenty of users don't want to be forced into everyone's upgrade cycles and subscription models. That's a bigger ask than Apple has ever made before. You can get away with it on iOS, where apps top out at $20-30. You can't do it with $1000+ Creative Suite or $10k AutoCAD setups. You'll drive away users and devs.
I have plenty of more important things to do, if only I could bring myself to do them....
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Jun 16, 2020, 11:32 PM
 
Originally Posted by Waragainstsleep View Post
I assume he means OS updates will be forced, which would in turn force users to buy new versions of outdated apps. It's difficult to see users accepting this.
Many apps I pay for have become subscription services, so I'd have to pay either way and I don't think this creates as much pressure as in the past. (I'm not philosophically opposed to software-as-a-subscription-service, although others are.)
Originally Posted by Waragainstsleep View Post
Microsoft absolutely cannot do this as things stand.
Microsoft has already made the switch. Most Windows and Office licenses are subscription services these days. Our uni has a site license, which it pays for on a yearly basis, and most companies also pay Microsoft regularly. Yes, you can still buy regular licenses, but AFAIK once you pay for Windows 10, you get all updates in perpetuity.

Microsoft is quite active in making sure that bigger companies pay their share. My brother's former employer (he was in the IT department) didn't believe my brother when he told them they owed Microsoft a large six-digit amount in licensing fees. People had taken all sorts of, ahem, liberties. Microsoft audited the company (which, I was surprised to find out, was part of the licensing agreement), and indeed, the jaw of my brother's former boss dropped when Microsoft's auditor came up with essentially the same number (I think it was within 5-10%). So this change is already happening in medium-sized companies and up.
Originally Posted by Waragainstsleep View Post
While they have made some progress towards where Apple has been for years (a place where they can just say "We changed this, deal with it"), Microsoft still has all manner of old tech running critical systems: ATMs running Win XP, power stations running DOS, etc.
While many companies are slow to update to newer versions of Windows, this is a problem that time takes care of. Especially once you get new hardware, you basically need to get a new version of Windows as well.
Originally Posted by Waragainstsleep View Post
That's a bigger ask than Apple has ever made before. You can get away with it on iOS, where apps top out at $20-30. You can't do it with $1000+ Creative Suite or $10k AutoCAD setups. You'll drive away users and devs.
This has already happened for a lot of specialized software. In Japan, an educational license for Mathematica costs about $2k per year. There are no other, cheaper (legal) options. Another pro-level package, the simulation software COMSOL, also costs $2k per year for an edu license, but with severe restrictions on the number of cores you can use. (The licensing mechanism is a forking nightmare; it literally took two professors and a support staffer on the phone to get the license to work.) I haven't used AutoCAD in decades (I dabbled with it when I was a teenager), but they, too, seem to have switched to a subscription model.

Consumers have gotten used to “free” software, so anything >0 seems like a big ask initially. On the business side, software costs are part of doing business. Some software companies are milking their customers. (Microsoft has been doing this for years by charging per socket, then per CPU core, then per user. Companies like COMSOL are doing the same with inane restrictions on their licenses.)
I don't suffer from insanity, I enjoy every minute of it.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 17, 2020, 03:22 PM
 
I didn't mean Microsoft's own charging model; I meant they can't cut off support for old, critical, expensive apps and just tell their users to deal with it like Apple does every few years. People have come to expect that Apple will make them throw their way of life out the window periodically, and that in the long run it will generally have been the right thing to do. Likewise, MS customers have come to expect that if they want to run the DOS 1.X control program for their 1974 desktop X-ray accelerator on Windows 10, they'll be able to do so. That landscape is shifting, but it's not there yet. I take your point about subscription models, but there are more than a few holdouts sticking with the $3000 standalone copies of CS6 they bought because they don't want to subscribe. I'm sure there are plenty of obscure, specialist, expensive software licenses that other people need to keep running too. I used to deal with sports video analysis software that needed a dongle to work, and the licenses for those were £8-10k each. In small businesses, people keep that stuff running as long as they can.
I have plenty of more important things to do, if only I could bring myself to do them....
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Jun 17, 2020, 07:40 PM
 
Originally Posted by Waragainstsleep View Post
I didn't mean Microsoft's own charging model; I meant they can't cut off support for old, critical, expensive apps and just tell their users to deal with it like Apple does every few years. People have come to expect that Apple will make them throw their way of life out the window periodically, and that in the long run it will generally have been the right thing to do.
Can you explain that in more detail? In what way is Apple periodically changing what people use? Are you thinking of something like the Final Cut Pro —> Final Cut Pro X transition? (Curiously, that is still sold as a stand-alone piece of software.)
Originally Posted by Waragainstsleep View Post
Likewise, MS customers have come to expect that if they want to run the DOS 1.X control program for their 1974 desktop X-ray accelerator on Windows 10, they'll be able to do so.
Not always, and not necessarily so. A friend of mine worked in a lab on an electron microscope that sells for €500k. The controller card only worked with non-NT-kernel versions of Windows, and they had to scour eBay for old PCs to keep their microscope running. But I take your larger point and totally agree.
Originally Posted by Waragainstsleep View Post
That landscape is shifting, but it's not there yet. I take your point about subscription models, but there are more than a few holdouts sticking with the $3000 standalone copies of CS6 they bought because they don't want to subscribe. I'm sure there are plenty of obscure, specialist, expensive software licenses that other people need to keep running too. I used to deal with sports video analysis software that needed a dongle to work, and the licenses for those were £8-10k each. In small businesses, people keep that stuff running as long as they can.
At least in my experience there is a clear trend towards a subscription model. I agree the transition is not complete, but I'd say the curve is flattening and subscriptions have become the new normal. In other areas of business computing, “subscriptions” have been the norm for years. Support contracts come to mind; they are technically not software licensing subscriptions, but they are recurring revenue nonetheless. That's essentially why Dell is in business: companies love their support.

I am familiar with plenty of businesses that basically run their setup into the ground, but even they will have to accept the new reality once they are forced to upgrade. (Doctors' offices, for example.) Once CS6 stops working, people either need to look for alternatives (which exist, and some are really, really good) or bite the bullet and get a CC subscription. Plus, there are such good alternatives to most of Adobe's software products these days that the “pry CS6 from my cold dead hands” attitude that some people seem to have is quite counterproductive.
I don't suffer from insanity, I enjoy every minute of it.
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Jun 17, 2020, 08:10 PM
 
What I mean is more like this:

It used to be that software updates were optional, and many people did not install them.

So companies started offering opt-in auto-updates. App Stores etc. offer this.

Now, take Win 10 for example: there are opt-out mechanisms for updating the OS and, sometimes, apps.

It does not seem to be that much of a leap in logic to remove the ability to turn them off and have the OS force updates to itself & applications automatically with no user intervention.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Jun 17, 2020, 09:34 PM
 
Originally Posted by Brien View Post
What I mean is more like this:

It used to be that software updates were optional, and many people did not install them.

So companies started offering opt-in auto-updates. App Stores etc. offer this.
True, but there were sometimes good reasons for that. Updates had to be done by hand; you sometimes had to get physical media or pay for the updates. OS updates were not free for most of my life. Updating software ranged from being a chore in the best case to being complicated, because things could break.

Software distribution changed. I remember having to go to an Apple reseller to get the (free) Mac OS X 10.1 update. The idea of auto-updates now feels natural: websites and web services are updated automatically, and I think this way of thinking migrated from there.
Originally Posted by Brien View Post
Now, take Win 10 for example: there are opt-out mechanisms for updating the OS and, sometimes, apps.

It does not seem to be that much of a leap in logic to remove the ability to turn them off and have the OS force updates to itself & applications automatically with no user intervention.
I don't know. Many IT organizations cannot allow auto-updates, sometimes for good reasons (and many bad ones, too). Many Mac users cannot allow auto-updates either, because they have to wait until software they rely on supports a new OS, or at least they want to wait until they are sure that the update does not cause any problems.

IMHO the current way — autoupdate is opt-out, i. e. on by default, but you can deactivate it — seems like the right compromise.
I don't suffer from insanity, I enjoy every minute of it.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Jun 17, 2020, 10:42 PM
 
Updates are a massive pain for long term projects. One bug in the wrong spot can crash your entire workflow.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 18, 2020, 01:58 AM
 
Originally Posted by OreoCookie View Post
Can you explain that in more detail? In what way is Apple periodically changing what people use? Are you thinking of something like the Final Cut Pro —> Final Cut Pro X transition? (Curiously, that is still sold as a stand-alone piece of software.)
Apple just decides that something has run its course and mercilessly kills it off when it suits them. I think someone mentioned Aperture recently; Classic went too early for some people (I recall having a couple of stubborn users running Photoshop 7 on G4 PowerMacs when that happened); a great recent example is killing 32-bit app support. I haven't updated to Catalina precisely because I have a raft of old apps I use for data recovery or other infrequent but handy tasks that I've picked up over the years and can't really afford to replace.
Microsoft, on the other hand, killed off support for XP after a dozen years or so and then found they had to keep issuing security updates just for the version running on ATMs. For all I know they still are. I heard the US Navy paid them a boatload(!) of cash for that support too, as their fleet had essential systems running XP that weren't scheduled for refit yet. Maybe it was the UK Navy. I have no idea how true that one is, tbh, but it sounds highly plausible.

Apple removes features, apps and hardware (floppy drive, optical drive, headphone jacks on iPhones) frequently, and a few complain every time, but they keep doing it and the dissenters get less and less vocal because people have become accustomed to it. And Apple is usually right. Samsung mocked them about headphones, but look who also got rid of the 3.5mm jack after their waterproof phones turned out to be about as waterproof as a cheese grater. Just like Apple said.

Originally Posted by OreoCookie View Post
I am familiar with plenty of businesses that basically run their setup into the ground, but even they will have to accept the new reality once they are forced to upgrade. (Doctors' offices, for example.) Once CS6 stops working, people either need to look for alternatives (which exist, and some are really, really good) or bite the bullet and get a CC subscription. Plus, there are such good alternatives to most of Adobe's software products these days that the “pry CS6 from my cold dead hands” attitude that some people seem to have is quite counterproductive.
In the majority of businesses it's counterproductive, but there are a lot of outliers that matter. I doubt MS or Adobe care too much about the one-man bands who simply don't want to drop X amount every couple of years on the latest package of this or that, but it's a problem for big, lumbering, bureaucratic nightmare organisations too, and they matter a lot. The university with an undergrad lab using gear from the 70s that would cost hundreds of thousands to replace might not have factored that into the budget for several more years (if at all), but they also pay MS $2m a year for their site license so all their staff get Windows and Office covered for them. It's the same with some big companies too. A client once asked a BT engineer what would happen if he dug a trench and replaced their shitty old copper lines with brand spanking fibre and just asked them to connect it up for him. He was told that even if the engineer who came to look had paperwork saying BT were scheduled to put that exact fibre cable in that exact place tomorrow, he would rip it all out and put copper back, and then another engineer would come back and do the fibre upgrade as planned. For all the long-term planning, there is a gargantuan lack of initiative and therefore flexibility in such organisations.


Originally Posted by Brien View Post

It does not seem to be that much of a leap in logic to remove the ability to turn them off and have the OS force updates to itself & applications automatically with no user intervention.
Forcing updates on iOS is different. Because the whole system is relatively young and has been forcing those updates since very early on, people are used to it. There is a clear 'system' where the devs know in advance if their app is going to stop working, and they can take care of it. Also, there are no power stations or missile targeting systems running from iPads, and if there were, it's an iPad, so you just drop a few hundred and get a new one. You don't need to spend $250k on a security-cleared team of engineers who have to rip out consoles from the 80s and put them all back again without breaking them. Or schedule a boat with 300 sailors on board to all be unavailable for duty for three days. Possibly 3000 miles from where you'd like them to be.

If you force updates on Macs or Windows that kill an app that is no longer being updated, or even just circumvent an IT department's cautious wait-and-see approach to patching their systems, you are potentially causing a spectacular amount of needless financial outlay, work, time, etc. I can't see this flying, I really can't.

Originally Posted by OreoCookie View Post
IMHO the current way — autoupdate is opt-out, i. e. on by default, but you can deactivate it — seems like the right compromise.
Is it on by default now? I think it is on Windows 10, but on a domain I imagine it defers to group policy. I'm reasonably sure the Mac I updated to Catalina yesterday asked me if I wanted to turn auto-updates on, but it might have been that I had deliberately turned them off while it was still running Sierra, because when Sierra was set up the network connection still had data limits.
I have plenty of more important things to do, if only I could bring myself to do them....
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Jun 18, 2020, 07:55 PM
 
Originally Posted by Waragainstsleep View Post
Apple just decides that something has run its course and mercilessly kills it off when it suits them. I think someone mentioned Aperture recently; Classic went too early for some people (I recall having a couple of stubborn users running Photoshop 7 on G4 PowerMacs when that happened); a great recent example is killing 32-bit app support. I haven't updated to Catalina precisely because I have a raft of old apps I use for data recovery or other infrequent but handy tasks that I've picked up over the years and can't really afford to replace.
I have little sympathy for people who insisted on running Photoshop 7 long after CS (with Mac OS X support) was available. Removing Rosetta after much of the actively developed software had moved to Intel was IMHO the right move at the right time. Apple's treatment of Aperture, on the other hand, broke my heart :sad: (I think it is one of the best pieces of software ever made, and I used it until the end.) 32-bit support, even if it killed my beloved Aperture, had to end sometime. When you keep unmaintained software around for that long, it sometimes gets a bit glitchy, too.

It gets tough when you rely on specialized, rarely updated or no longer actively developed software. But IMHO you should think about moving away from that. That's why I wasn't angry at Apple: they had stopped developing Aperture, and I was an outlier. I was using a dead product, and that is not a good idea in the long term. It gets tougher still when there is no replacement out there. For your pieces of software, are there no replacements yet? Are they still actively developed?
Originally Posted by Waragainstsleep View Post
Microsoft, on the other hand, killed off support for XP after a dozen years or so and then found they had to keep issuing security updates just for the version running on ATMs. For all I know they still are. I heard the US Navy paid them a boatload(!) of cash for that support too, as their fleet had essential systems running XP that weren't scheduled for refit yet. Maybe it was the UK Navy. I have no idea how true that one is, tbh, but it sounds highly plausible.
Keeping Windows XP on life support for that long was a disaster. My brother’s best friend’s old job was to keep a Windows XP install running — for 40 freakin’ years. (Yes, it was also for the military.)

Specialized customers sometimes have very weird requests and are willing to pay ungodly amounts of money for them. A telco company was relying on a very old version of a software product for their infrastructure. For some reason they did not want to update to the newest, certified version of the software (and certification is done only once every 7-8 years, as it is so expensive). So they paid the vendor, as you put it, a boatload of money for bug fixes and support. Those bug fixes happened to include trivial things like Spectre and Meltdown mitigations. (Which caused other problems, because early patches severely degraded performance.)
Originally Posted by Waragainstsleep View Post
Apple removes features, apps and hardware (floppy drive, optical drive, headphone jacks on iPhones) frequently, and a few complain every time, but they keep doing it and the dissenters get less and less vocal because people have become accustomed to it. And Apple is usually right. Samsung mocked them about headphones, but look who also got rid of the 3.5mm jack after their waterproof phones turned out to be about as waterproof as a cheese grater. Just like Apple said.
This is a case of bad Apple, yeah. I was ok with them removing optical drives; there was a clear benefit (laptop thickness and weight) and I haven't really missed them. But there is no rationale for being so stingy with ports. My 16" has plenty of space for an HDMI port and 2 USB-A ports. Being in dongle town is a pain.

Originally Posted by Waragainstsleep View Post
In the majority of businesses it's counterproductive, but there are a lot of outliers that matter. I doubt MS or Adobe care too much about the one-man bands who simply don't want to drop X amount every couple of years on the latest package of this or that, but it's a problem for big, lumbering, bureaucratic nightmare organisations too, and they matter a lot.
And in many cases patent mismanagement is part of the story. At my brother's old job, their central internal tools were based on ActiveX, a technology Microsoft had long deprecated. Rather than developing something new when Microsoft started giving hints (or when it discontinued the technology), they kept on truckin'. (They were experts in ActiveX, you see …) In the end my brother had to go through quite a few contortions to install IE8 (?) on new machines. Needless to say, this isn't recommended.
Originally Posted by Waragainstsleep View Post
The university with an undergrad lab using gear from the 70s that would cost hundreds of thousands to replace might not have factored that into the budget for several more years (if at all), but they also pay MS $2m a year for their site license so all their staff get Windows and Office covered for them. It's the same with some big companies too.
Don’t get me started. Budgeting is a mess. Universities and many companies cannot “save money” the way people do for bigger purchases like cars. If you need to replace your NAS or servers every X years, you should start saving as soon as you buy a new machine. Instead, it is usually a very messy construction.
Originally Posted by Waragainstsleep View Post
A client once asked a BT engineer what would happen if he dug a trench and replaced their shitty old copper lines with brand spanking fibre and just asked them to connect it up for him. He was told that even if the engineer who came to look had paperwork saying BT were scheduled to put that exact fibre cable in that exact place tomorrow, he would rip it all out and put copper back, and then another engineer would come back and do the fibre upgrade as planned. For all the long-term planning, there is a gargantuan lack of initiative and therefore flexibility in such organisations.
Yup. Many would like to install something and use it for a decade.
The time scales for software companies are much, much faster than that, even for very conservative companies like Microsoft. Microsoft in particular was bitten by this very hard. Many companies clung to XP for a very long time and only started to get the ball rolling when Windows 7 (which was much beloved) was about to be replaced by Windows 8. Many companies skipped Windows 8 and were forced to upgrade to Windows 10. Add to that the various server versions. Many organizations ran a hodgepodge of different Windows versions, and support was predictably a nightmare. Licensing still is a nightmare. (My brother wanted to kill himself when he inventoried Windows licenses. That was complicated by the fact that his company bought another company, and the two had different licensing agreements and different licensing schemes.)
Originally Posted by Waragainstsleep View Post
Is it on by default now? I think it is on Windows 10, but on a domain I imagine it defers to group policy. I'm reasonably sure the Mac I updated to Catalina yesterday asked me if I wanted to turn auto-updates on, but it might have been that I had deliberately turned them off while it was still running Sierra, because when Sierra was set up the network connection still had data limits.
Hmmm, good question. I turned autoupdate on when it became an option, so perhaps you are right and the default is still opt-in. In that case, Apple is even more conservative in this regard.
( Last edited by OreoCookie; Jun 19, 2020 at 01:53 AM. Reason: Fixed a tag.)
I don't suffer from insanity, I enjoy every minute of it.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Jun 18, 2020, 07:57 PM
 
Originally Posted by subego View Post
Updates are a massive pain for long term projects. One bug in the wrong spot can crash your entire workflow.
That depends on what you mean by long-term. If your project lasts longer than 2-3 years, I think you need to take software updates into account.
I don't suffer from insanity, I enjoy every minute of it.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Jun 18, 2020, 11:58 PM
 
Oh, absolutely. I’m usually working more in the 6 month to a year range.

Except for the project I’m working on now, where I had one phase leak into the next (in between phases is when I upgrade), and then the pandemic tacking on 3-1/2 months and counting.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 19, 2020, 12:01 AM
 
As annoying as it is when Apple decides to kill things off, I think it's served to set people's expectations that they are just going to do that when they feel like it's time and there's nothing to be done about it. As such, people get less angry when it happens now. Am I right in recalling that MS had to reinstate the Start menu in Windows 8.1 because, despite years of people laughing at them because you had to click "Start" in order to shut down your PC, they got a ton of complaints when they changed it? That's another danger of keeping things around too long: it undermines your flexibility to innovate or change.

I'm not looking forward to being stuck using USB-C adaptors all over the place either, but Apple's worst design crimes for me are the current mess of soldering everything to the logic board and the T2 chip tying components to the logic board so replacements from third parties or donor machines won't work. That kills the second-hand parts market altogether and forces everyone to scrap perfectly good hardware after ~5 years. $2500 disposable computers are a deeply shitty move, and I really wish Greenpeace or someone would get a bug up their ass about it and go after them. Apple told a congressional hearing or a grand jury or whatever it was that they don't make any money from doing repairs. If that is the case, then there is no excuse not to make stuff easier and cheaper to fix. Except it hurts sales of new stuff. And with the T2 shenanigans, people will be forced to pay Apple for repairs, so they will start making money soon enough, assuming you believe they aren't already. Which I really don't.
I have plenty of more important things to do, if only I could bring myself to do them....
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 19, 2020, 12:02 AM
 
Oh, and the criminal piece of design that saw a Lightning port on the bottom of the Magic Mouse. Garbage idea.
I have plenty of more important things to do, if only I could bring myself to do them....
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Jun 19, 2020, 02:26 AM
 
Originally Posted by subego View Post
Oh, absolutely. I’m usually working more in the 6 month to a year range.
Well, I wasn't sure what “long” meant to you. Certainly, in those situations I can see why you might want to defer updates.
Originally Posted by Waragainstsleep View Post
As annoying as it is when Apple decides to kill things off, I think it's served to set people's expectations that they are just going to do that when they feel like it's time and there's nothing to be done about it. As such, people get less angry when it happens now.
Right, and Apple has been quite consistent. For example, the 64-bit transition was over a decade in the making, and overall Apple has been doing a good job setting expectations correctly. Important transitions last several releases and developers have plenty of time to adjust, in principle. The last big botch I can remember was 64-bit Carbon, which Apple first promised and then canned. OpenGL and OpenCL will surely be next on the chopping block.* And yes, I'll be sad, because some games will stop working.

* I think it is important to separate the question whether Apple chose the right technologies to adopt or kill from how they make the transition.
Originally Posted by Waragainstsleep View Post
Am I right in recalling that MS had to reinstate the Start menu in Windows 8.1 because, despite years of people laughing at them because you had to click "Start" in order to shut down your PC, they got a ton of complaints when they changed it? That's another danger of keeping things around too long: it undermines your flexibility to innovate or change.
Windows 8 was a s-show. Control Panel was half-new, half-old, controls optimized neither for touch nor for mouse input, etc. etc. It was easy to hate on.
Originally Posted by Waragainstsleep View Post
I'm not looking forward to being stuck using USB-C adaptors all over the place either, but Apple's worst design crimes for me are the current mess of soldering everything to the logic board and the T2 chip tying components to the logic board so replacements from third parties or donor machines won't work. That kills the second-hand parts market altogether and forces everyone to scrap perfectly good hardware after ~5 years. $2500 disposable computers are a deeply shitty move, and I really wish Greenpeace or someone would get a bug up their ass about it and go after them.
That's a difficult one. In my mind, Apple computers should be more repairable. You should be able to replace the keyboard or the battery without having to replace the entire top case and/or have to deal with glue. This would aid recyclability as well. Upgradability is distinct from that.

Apple has made quite a big push to ensure privacy and security in its machines, and the tight integration between the T2 and the SSD is part of that. (For example, the microphone hardware disconnects if you close the lid of newer MacBook Pros.)

I would be happy if Apple made its machines more repairable, although I think it might be a reasonable compromise to consider the logic board as one single unit — if that makes the machines more reliable on average.
Originally Posted by Waragainstsleep View Post
Apple told a congressional hearing or a grand jury or whatever it was that they don't make any money from doing repairs. If that is the case, then there is no excuse not to make stuff easier and cheaper to fix. Except it hurts sales of new stuff. And with the T2 shenanigans, people will be forced to pay Apple for repairs, so they will start making money soon enough, assuming you believe they aren't already. Which I really don't.
If you look at it more broadly, manufacturers are trying to prevent users from repairing their stuff; they oppose the so-called right-to-repair movement. With computers that is one thing, but tractor manufacturers like John Deere have been trying to pull shenanigans like invoking the DMCA of all things to prevent their customers from repairing their gear. (As you can imagine, the machines rely on electronics; sometimes you just need to clear errors to make them work again.) There should be a right to repair, absolutely, although I am willing to accept realities. For example, I don't expect Apple to design AirPods where the battery is user-replaceable. The larger and more expensive a device, the more repairable it should be.
I don't suffer from insanity, I enjoy every minute of it.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 19, 2020, 06:48 AM
 
Maybe the AirPods themselves, but the battery case could easily have a user-serviceable battery in it. And it irks me that at £200+ for a set of earbuds, you only get the bog-standard (mandated by law) warranty and the "privilege" of paying goodness knows how much to replace a battery that costs 10% of what they charge to replace it, due to the amount of labour they have chosen to make the job entail.

Car manufacturers actually have a more valid excuse for making their vehicles more difficult to take apart: they are often left on public streets or driveways, so thieves would just steal your engine or gearbox if it popped out with a lever pull. There is no justification for locking an LCD to a specific motherboard. It's anti-competitive and needs to be stepped on hard.
I have plenty of more important things to do, if only I could bring myself to do them....
     
Laminar
Posting Junkie
Join Date: Apr 2007
Location: Iowa, how long can this be? Does it really ruin the left column spacing?
Jun 19, 2020, 08:46 AM
 
Originally Posted by OreoCookie View Post
Windows 8 was a s-show. Control Panel was half-new, half-old, controls optimized neither for touch nor for mouse input, etc. etc. It was easy to hate on.
10 is still that way. Sure you can open the "Windows Settings" pane but if you actually want to get any real info or make any real changes, you have to dig down until you find the old control panel that's hidden behind the stupid new skin. Like changing your IP address - Windows Settings > Network & Internet > Change Adapter Options > There's the Windows 7-style control panel. OH BUT YOU'RE NOT DONE. Now you have to do the typical right click on the network device > Properties > TCP/IPv4 > Properties.

That's like 8 steps just to set your IP address. OS X used to be able to get there in one step (command-space > network > boom) but I think they hide the IP address behind one more click now.

You can also go straight to the old-style Windows Control Panel by searching with Cortana. But even then it's still a stupid kludge to dig through to the IP address. Working in industrial automation there are a lot of times I have to change IP addresses, and a lot of times I've had to guide users through doing it, and it's always a big pain.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Jun 19, 2020, 09:03 AM
 
Originally Posted by Waragainstsleep View Post
Maybe the AirPods themselves, but the battery case could easily have a user-serviceable battery in it. And it irks me that at £200+ for a set of earbuds, you only get the bog-standard (mandated by law) warranty and the "privilege" of paying goodness knows how much to replace a battery that costs 10% of what they charge to replace it, due to the amount of labour they have chosen to make the job entail.
AFAIK Apple exchanges the case if you send it in for a battery replacement.
What we need is a combination of regulations and a change of thinking. Things should be made to last longer, which is hard when a 4-year-old iPhone is considered garbage by many. If we want to conserve our resources, we must make things more repairable and recyclable. Apple is the only company that talks about that, but at best it takes baby steps in its products.
Originally Posted by Waragainstsleep View Post
Car manufacturers actually have a more valid excuse for making their vehicles more difficult to take apart: they are often left on public streets or driveways, so thieves would just steal your engine or gearbox if it popped out with a lever pull. There is no justification for locking an LCD to a specific motherboard. It's anti-competitive and needs to be stepped on hard.
Right-to-repair is opposed not just by the likes of Apple; AFAIK it is a big deal for the types of people who repair their stuff themselves. The problem is the software part: manufacturers cannot prevent you from taking a wrench to your own car, but they can have the electronics throw a code and prevent the vehicle/machine from moving until you clear the code, which you can only do at an authorized dealer. So if a part fails, the electronics throws a code and blocks the machine/vehicle from operating, and then you rely on an authorized dealer to delete the code. (Which, I assume, they will only do if they do the repair themselves.) And even if you find a way to hack the electronics, you may breach the DMCA.

Apple (and I reckon most other electronics manufacturers) go another path. E. g. they order specialized electronic parts which differ from the standard ones only in small ways, such as the pin layout. And the only company these parts are sold to is Apple.
I don't suffer from insanity, I enjoy every minute of it.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Jun 19, 2020, 09:05 AM
 
Originally Posted by Laminar View Post
10 is still that way. Sure you can open the "Windows Settings" pane […]

That's like 8 steps just to set your IP address. OS X used to be able to get there in one step (command-space > network > boom) but I think they hide the IP address behind one more click now.
I touch a Windows machine about once a year, and every time I do I am reminded that despite all my misgivings with Apple and its software quality, we have it quite good over here.
I don't suffer from insanity, I enjoy every minute of it.
     
subego
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Jun 19, 2020, 02:59 PM
 
Originally Posted by OreoCookie View Post
Well, I wasn't sure what “long” meant to you. Certainly, in those situations I can see why you might want to defer updates.
Even for a much longer project, the end result is the same. It’s very important to be able to choose when updates occur, rather than have them be automatic.
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Jun 19, 2020, 05:27 PM
 
Originally Posted by OreoCookie View Post
Right-to-repair is opposed not just by the likes of Apple; AFAIK it is a big deal for the types of people who repair their stuff themselves. The problem is the software part: manufacturers cannot prevent you from taking a wrench to your own car, but they can have the electronics throw a code and prevent the vehicle/machine from moving until you clear the code, which you can only do at an authorized dealer. So if a part fails, the electronics throws a code and blocks the machine/vehicle from operating, and then you rely on an authorized dealer to delete the code. (Which, I assume, they will only do if they do the repair themselves.) And even if you find a way to hack the electronics, you may breach the DMCA.
Apple (and others) have been telegraphing for years that the road they're going down leads to sonically welded cases, 100% soldered/glued-down parts and complete disposability. For a majority of repairs, Apple already simply replaces the device, because even Apple deems it unrepairable.

As things get more and more miniaturized, I think the concept of repairs will eventually go away. Electric cars seem like a good breaking point for the manufacturers to shut that down as well.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Jun 19, 2020, 09:16 PM
 
Originally Posted by Brien View Post
Apple (and others) have been telegraphing for years that the road they're going down leads to sonically welded cases, 100% soldered/glued-down parts and complete disposability. For a majority of repairs, Apple already simply replaces the device, because even Apple deems it unrepairable.
My first computer, my beloved Amiga 500, came with the design schematics for the main board, so the company assumed you may want to do repairs yourself and honestly, back then, things were much simpler. Money-wise, I am sure Apple is 100 % correct, namely that it is cheaper to simply replace large bits or the entire enchilada rather than having a technician figure out the fault and repair it. Plus, you actually can make gadgets smaller. I don’t think “repairable AirPods” would be nearly as small as the ones Apple makes now. That’d make them much less appealing to customers.

If you compare old notebooks to new ones, new ones have gotten much simpler on the inside. I remember the amount of time it took to take my iBook apart, and sometimes having screws left over. So that should make Apple's new machines much more repairable. But they are using proprietary screws (e. g. pentalobe screws) and the like to make it a PITA for others to open up their machines.

IMHO we really need regulations that empower consumers and save the planet. But it’d be myopic to only focus on Apple. Many other manufacturers are copying Apple when it comes to construction and construction philosophy. The new Microsoft Surfaces are not much more repairable than mobile Macs, for instance.
Originally Posted by Brien View Post
As things get more and more miniaturized, I think the concept of repairs will eventually go away. Electric cars seem like a good breaking point for the manufacturers to shut that down as well.
The drivetrain of electric cars is actually much simpler than that of gas-powered vehicles. And since the number of moving parts is much smaller, it should also be much more reliable. I don’t think repairs will go away, cars are too expensive for that. But also here, I reckon most parts are replaced outright. Battery packs are usually designed to be “easily” replaceable.
I don't suffer from insanity, I enjoy every minute of it.
     
reader50
Administrator
Join Date: Jun 2000
Location: California
Jun 19, 2020, 11:42 PM
 
Cars already have a right-to-repair. Manufacturers cannot withhold parts or service manuals. Nor void warranty if something gets repaired elsewhere. This has driven a healthy market of generic replacement parts, and plenty of service centers competing with the dealers.

Independent Tesla service shops are appearing. Tesla may discourage it a bit, but they do have to provide parts when ordered. Also, Tesla opened up all their patents (provided those using the patents don't sue Tesla). So it's even legal to manufacture Tesla's parts today.

The John Deere nonsense involves software DRM locks, trying to force owners back into the dealer repair network. No right-to-repair for electronics & software yet. Also, farm tractors are technically not cars.
     
Thorzdad  (op)
Moderator
Join Date: Aug 2001
Location: Nobletucky
Jun 20, 2020, 06:46 AM
 
I always tended to prefer buying Kenmore appliances largely because they always came with detailed schematics and parts lists. You could tear down your washer/dryer/oven/etc. following the schematics and replace any broken parts (which your local Sears store usually had in stock, or could get for you in a couple of days.)
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 22, 2020, 04:45 AM
 
Last-minute rumours are saying Apple has pulled hardware announcements. Whether that means just new hardware that was ready to release, or anything hardware-related, isn't clear; I guess the former.
I have plenty of more important things to do, if only I could bring myself to do them....
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 22, 2020, 12:26 PM
 
OK,

1: iOS 14 - Smart Pages on your home screen, like smart folders for your apps. You can set pages to automatically show the apps you use most often instead of being fixed as they are now.

2: ARM-based Macs, starting with a consumer laptop and an iMac early next year;

3: MacOS and MacOS Pro;

4: Siri can listen to your phone calls, and when you're talking to other iPhone users, if you talk about setting up a meeting during the call, Siri will compare your calendars for free time slots and suggest options to you both; call recording as an option too;
I have plenty of more important things to do, if only I could bring myself to do them....
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Jun 22, 2020, 05:36 PM
 
Originally Posted by Waragainstsleep View Post
Last-minute rumours are saying Apple has pulled hardware announcements. Whether that means just new hardware that was ready to release, or anything hardware-related, isn't clear; I guess the former.
Dangit, I was hoping for a new ATV!
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Jun 22, 2020, 06:09 PM
 
Thoughts:
- macOS 11. No big deal at all, apparently. The end of an era. Wouldn't be surprised if they tweak the numbers to be in step next year: iOS 15, macOS 15, watchOS 15, etc.
- Big Sur looks like iOS. Eww.
- So much focus on having the computer tell you what to do; I'm not sure I like the direction Apple and other tech companies are going. I want a tool to get work done, not tell me what to do.
     
mindwaves
Registered User
Join Date: Sep 2000
Location: Irvine, CA
Jun 22, 2020, 06:33 PM
 
Thoughts:

1) Widgets in iOS, similar to Live Tiles on Windows Phone. I say about time! I hated launching the Weather app just to view the weather. (A rough sketch of the new WidgetKit API follows this list.)
2) ARM-based Macs. Good! Intel has been slacking a lot recently. Apple will start from the bottom up, I guess, from the Mac mini, and two years later will revamp the Mac Pro. That means a super powerful A_X chip with a deep pipeline is needed to replace Xeon processors. Possibly multiple multicore A_X chips would be needed.
3) Enhanced privacy, including having iOS give apps your approximate location instead of your exact location, is good.
4) Handwriting recognition in iOS along with native translation in Safari is good. I would use the latter a lot.
5) Safari being able to use extensions made for other browsers would be a godsend. I have hated Safari since Safari 12 came out for the Mac, because my extensions either didn't work or only worked in a heavily crippled form. Way too many annoying webpages recently.
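As promised above, here is a rough sketch of what the new WidgetKit API looks like in Swift. This is my own illustration of the announced SDK, not code from the keynote; the widget kind, entry type and clock view are made-up placeholders.

import WidgetKit
import SwiftUI

// Hypothetical entry: the data one widget refresh displays.
struct ClockEntry: TimelineEntry {
    let date: Date
}

// The provider tells the system what to show now and when to refresh.
struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry {
        ClockEntry(date: Date())
    }

    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: Date()))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        // A single entry; WidgetKit refreshes the timeline after it has been shown.
        completion(Timeline(entries: [ClockEntry(date: Date())], policy: .atEnd))
    }
}

@main
struct ClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "ClockWidget", provider: ClockProvider()) { entry in
            // The widget's content is plain SwiftUI.
            Text(entry.date, style: .time)
        }
        .configurationDisplayName("Clock")
        .description("Shows the current time without launching an app.")
    }
}

The system asks the provider for a timeline of entries and renders the view for each one, which is how a widget (weather, say) can stay current without the full app ever launching.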

edit: How will the ARM chips work with discrete GPUs? Will Nvidia be back, or will Apple not need discrete GPUs at all, considering their current chips already run games great?
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 22, 2020, 07:24 PM
 
I was wondering about the GPUs. Their test rig was running the A12Z from the iPad Pro, but with 16GB RAM. It'll be very interesting to see the first benchmarks from a dev rig.
I have plenty of more important things to do, if only I could bring myself to do them....
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Jun 22, 2020, 09:58 PM
 
I wonder about pro user support if Apple ditches discrete GPUs though
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 22, 2020, 10:39 PM
 
I don't think they'll ditch them. It's just that we aren't used to seeing Apple A-series chips with discrete graphics.
I have plenty of more important things to do, if only I could bring myself to do them....
     
mindwaves
Registered User
Join Date: Sep 2000
Location: Irvine, CA
Jun 23, 2020, 02:53 AM
 
Curious how well the A12Z benchmarks, considering it is running a different and more full-featured OS, but also in an envelope that has much higher thermal capacity and is not limited to sipping battery in a constrained mobile device.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 23, 2020, 10:44 AM
 
So I have a brand new 16" 8-core MBP sitting unopened and I don't know what to do. If Apple releases an ARM-based one 6 or even 12 months from now, that'll be annoying, but I don't really want to wait that long. On the other hand, maybe this is one of the Intel models still due for a 10th-gen CPU refresh. That'll be irritating too. That said, x86 Windows might be handy to have for a few years yet.

I have no idea but if I want to return it I don't have long.
I have plenty of more important things to do, if only I could bring myself to do them....
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Jun 23, 2020, 01:32 PM
 
16” uses 45W CPUs, so the 10th gen is just another Skylake rebadge, except with LPDDR4. Nothing to wait for.

Personally, I will absolutely not get a 1st-gen ARM Mac. The two previous transitions made it clear that waiting a generation makes all the difference.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Jun 23, 2020, 04:32 PM
 
Yes, but it's not like they haven't been building ARM-based hardware for years. That wasn't true of Intel.
I thought the 10th-gen 13" MBPs were a tidy upgrade, but were they coming from 8th-gen chips in the previous model?

I heard Intel had made a layer of the die thinner and it was enabling a surprising performance bonus due to better thermal characteristics.
I have plenty of more important things to do, if only I could bring myself to do them....
     
mindwaves
Registered User
Join Date: Sep 2000
Location: Irvine, CA
Jun 23, 2020, 11:04 PM
 
A 16" MBP will last you a long time and Apple will be supporting it for another 5 years. Additionally, the first Apple Silicon Macs will most likely be lower end Macs, such as the iMac and the Mac mini (see the developer kit Mac). This gives Apple more time to make higher end processors for their Pro line, such as your 16" machine. I would keep it.

Personally, I'll be first in the virtual line to buy a Mac mini with Apple Silicon inside (TM).
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Jun 24, 2020, 12:39 AM
 
@Waragainstsleep
For the record, I owned a first-gen 15" MacBook Pro when Apple transitioned to x86, and it was a great machine. Just about the only thing I disliked was the name (PowerBook still sounds so much better). And I completely agree with you: at least for Macs that use Apple Silicon closely resembling what has been in iPads and iPhones for over a decade, I don't expect a lot of issues.

Regarding what to do with your machine: if you can wait for another year or so, you might want to wait for ARM-based Mac laptops. But I wouldn't wait for the updated Intel chips; they won't offer you much in terms of additional performance. Personally, I had no choice; I've had my 16" for a few months now. As far as features go, the only “downgrade” is that, compared to the Mac Pro I used before, it can get a bit louder if I make the machine sweat, so to speak.
I don't suffer from insanity, I enjoy every minute of it.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Jun 24, 2020, 01:07 AM
 
Overall, I thought this was a really great Dub Dub Keynote, although when I watched the keynote with a friend, we'd usually caveat a lot of new features with “… if it works reliably.” Apart from this, a lot of separate strands converged this year, and we have a significant UI overhaul on macOS, an ISA transition for Macs, significant advances in Swift and privacy as well as many other things. The only thing that I wish Apple had made explicit was that they were focussing more on stability and reliability. Anyway, here is my list:

macOS
I really like how the new design looks, although I can't say yet whether I like how it works. In the State of the Union, Apple emphasized that their goal is not to dumb down the Mac, but to still give you access to Unix, let you manipulate your machine at a low level, continue to support virtualization, and let you boot off external devices, to name a few things. I was also surprised to learn that virtualization is already supported at a low level, although according to the friend I watched the keynote with, support was very fiddly and you had to do a lot of things by hand. (And he knows virtualization; he was a tech lead at a big hypervisor company before getting back into academia.)

Going from iOS to iPadOS to macOS, the UI design now looks as if these are true siblings. And clearly, this should also make SwiftUI more useful across platforms. Apple did essentially what I hoped it'd do: rather than unifying everything into a mush, it keeps the family resemblance above and under the hood but does not give up platform specifics. Incrementing the version number by one seems fair, given the changes.
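To make the family-resemblance point concrete, here is a tiny SwiftUI sketch (my own illustration, nothing from the keynote): the same declarative view compiles unchanged for macOS 11, iOS 14 and iPadOS 14, and each platform supplies its native controls.

import SwiftUI

// One view, three platforms; the names (GreetingView, the labels) are made up for illustration.
struct GreetingView: View {
    @State private var name = ""

    var body: some View {
        VStack(alignment: .leading, spacing: 12) {
            Text("Hello, \(name.isEmpty ? "world" : name)!")
                .font(.title)
            TextField("Your name", text: $name)
            Button("Clear") { name = "" }
        }
        .padding()
    }
}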

Things I had hoped that would be covered but weren't mentioned were stability and improvements to Time Machine.

iOS & iPadOS
If the handwriting recognition feature, Scribble, works as advertised, this will be huge. I use my iPad a lot, both with and without the Smart Keyboard cover, and very often I am just using the pen. Giving developers access to more detailed pen input will likely also enable improvements to one of my most-used apps, GoodNotes.
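For context on the pen side: the existing PencilKit framework already lets an app capture Apple Pencil strokes with very little code, and presumably the newly announced APIs expose richer data on top of that. A minimal, purely illustrative canvas (not how GoodNotes does it) looks roughly like this:

import UIKit
import PencilKit

// Hosts a PencilKit canvas; the class name and tool choice are illustrative assumptions.
final class ScratchpadViewController: UIViewController {
    private let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvasView.frame = view.bounds
        canvasView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        canvasView.tool = PKInkingTool(.pen, color: .black, width: 4)
        view.addSubview(canvasView)
        // canvasView.drawing is a PKDrawing that can be saved or rendered to an image.
    }
}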

Widgets seem like a great addition, just seeing them already gave me ideas on how to modify some of my home screens on my iOS devices.

watchOS & fitness
It's a pity that Apple still does not offer some sort of workout scheduling or libraries with some periodized workouts. (iOS) apps like Strength come with a library of individual workouts that explain how to do exercises properly. As many of us are confined to our homes for much of the day, this would have really benefited Apple Watch and iOS customers. Ditto for better integration with some of the fitness apps out there, e. g. Training Peaks, TrainerRoad or Zwift.

Moreover, there was no announcement that Apple allows other devices to use the Apple Watch's heart rate sensor. This would be very useful, because you don't need to wear a heart rate strap. Despite catching a glimpse of their gorgeous gym, it seems that Apple still doesn't want to take the fitness side more seriously.

Transition to Apple Silicon
It seems our speculation in this thread was spot-on. I would have expected a slightly shorter transition period, but I reckon that Apple wants to err on the side of underpromising and overdelivering. It's interesting that they never uttered the word ARM in the keynote (and, if memory serves, also in the State of the Union). And the only time there was something like a benchmark between Apple Silicon and an x86 Mac was when they wanted to show off the power of custom logic like the Neural Engine and compute that was offloaded to the GPU.

It was good to see confirmation that Apple has ported all of its pro apps to ARM. The same goes for mentioning partners like Adobe and Microsoft, as well as others, and showing off ARM versions of their apps. Apple really seems to have covered all the bases: virtualization, universal binaries, static and JIT translation, an inexpensive hardware kit (as a rental); it all seems to be there.
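As a small aside on the translation side: Apple documents a sysctl key, sysctl.proc_translated, that lets a process ask whether it is currently running under translation. A rough Swift sketch (my own, not Apple's sample code):

import Foundation

// Returns true if the process is being translated (e.g. an x86_64 binary on
// Apple Silicon), false if running natively, and nil if the key is unavailable
// (as on older macOS versions).
func isRunningUnderTranslation() -> Bool? {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0 else {
        return nil
    }
    return translated == 1
}

print("Translated:", isRunningUnderTranslation()?.description ?? "unknown")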

Swift
Swift is moving lower and lower down the stack; more and more bits of macOS have been rewritten in Swift. The standard library has been moved further down the API stack to allow people to write lower-level code in Swift rather than, e. g., straight C. And the pervasive use of SwiftUI indicates that this framework, too, has matured quite a bit. From what I have heard, the initial release of SwiftUI was plagued with problems.
I don't suffer from insanity, I enjoy every minute of it.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Jun 24, 2020, 06:10 AM
 
I remember my boss at the time (who has never been a coder) had a developer account because he liked to go to WWDC, and he bought the Intel dev kit. He still has it; he never sent it back.

The two-year transition was longer than I expected too, but my take is that they need to put off replacing the 2019 Mac Pro for as long as they can stand to. People dropping $50k on those would be pissed if there were a newer, better, completely different beast available inside 12 months, especially if it were much cheaper, as it ought to be (but probably won't be). I hadn't realised the 2019 Pro was still stuck on PCI-E 3.0; they really should have used AMD EPYC instead. I'm guessing Apple needs plenty of time to get their Mac Pro chips up to 64/128 cores and heaps of PCI-E 5.0 lanes by the time they release them.
I did notice a lot of places citing two years from now for the transition, but unless I'm mistaken the language Apple used was "the first Macs with Apple Silicon ship by the end of this year and the whole transition will take two years". To me that says it's going to be 2.5 years from now for the next Mac Pro.


It was fun seeing inside Apple's "top secret lab" though. I know it didn't really give much away, but it's more than they've ever really shown us before, I think. Those blade frames with the prototype hardware are cool. I'm guessing the big boxes on the shelves in the background with red LED displays are PSUs that they can adjust as needed to run the new boxes. There seemed to be a lot of XDR displays for not many Mac Pros, unless some were hooked up to that stack of rack-mount ones in the other room. I wonder what they were using them for specifically. Can't imagine board design would justify that kind of horsepower.
I have plenty of more important things to do, if only I could bring myself to do them....
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Jun 24, 2020, 06:21 AM
 
Originally Posted by OreoCookie View Post
@Waragainstsleep
For the record, I owned a first-gen 15" MacBook Pro when Apple transitioned to x86, and this was a great machine. Just about the only thing I disliked was the name (PowerBook still sounds so much better). And I completely agree with you, at least for Macs that use Apple Silicon that closely resembles what has been in iPads and iPhones for over a decade, I don't expect a lot of issues.
I'm with you all the way on the MBP/PowerBook names. I've gotten used to MBP now, but it really used to grate. It makes logical sense, which I do like, but it doesn't roll off the tongue the way PowerBook did.

Originally Posted by OreoCookie View Post
Regarding what to do with your machine, if you can wait for another year or so, you might want to wait for ARM-based Mac laptops. But I wouldn't want to wait for the updated Intel chips, they won't offer you much in terms of additional performance. Personally, I have no choice, I have my 16" for a few months now. As far as features go, the only “downgrade” is that compared to the Mac Pro I used before, it can get a bit louder if I make the machine sweat so-to-speak.
I think I'm talking myself into keeping hold of it. I have a suspicion my current, used 2013 retina 15" is slowly killing itself. I get odd graphical issues from time to time, and I think they are gradually getting more frequent. There is a tiny chip in the glass of the display, though it had that when I got it, and I've been meaning to try that glass-filler resin they use on car windscreens. The display also has an odd permanent purple patch on it, though it's only visible when the screen is very dark. I'm not sure if it's the panel or the GPU, but these machines are well known for display and graphics issues, so I can't help but think it's slowly on its way out.
As it is, it still works well, so I could get some cash for it. In a year, I suspect not so much. Might sling 10.11 on it before I sell, just to have a play with it.

Waiting seems like a risk. It could be that Apple's notebook silicon is so far ahead of Intel that they release the MBP first. After all, it would be a bit mad if they made a MacBook Air that could smoke their flagship machine costing double the price. I am betting that when Apple releases the first one, they are going to want the sort of astonished gasping and cheering they haven't had since the iPhone came out. We all know ARM is coming; the only way to get that reaction is with monster performance. Though if they did make an Air better than the current Pro, we'd know the next Pro will be a massive upgrade on that. And of course there are these mysterious still-to-be-released Intel Macs. Seems like madness to me tbh, but if one of those is a 10th-gen 16", the ARM version can reasonably safely be assumed to be another 12-18 months away. I don't want to wait that long.
Maybe I'll hope that Macs that can run Windows via Boot Camp hold their value for a certain set of users, and see if I can sell and upgrade when the time comes.
I have plenty of more important things to do, if only I could bring myself to do them....
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 24, 2020, 09:03 AM
 
The 16” is solid, and while I think Apple Silicon-based Macs will be faster, it’s still a beefy machine with 8 fast cores and discrete graphics. Yes, ARM-based Macs will probably be faster, but Apple could also decide to simply lower the TDP or allocate more of it to graphics and other specialized hardware. The transition is going to be pretty interesting.

It should easily outlive the support window for Intel on macOS. Apple stated the transition will take two years, and I reckon Apple will, at the very least, support its last-sold Intel hardware for another four years. That should give you 5-6 years, which, in my book, is the usable lifespan of a computer. I wouldn’t worry about any Intel upgrades: their 10 nm process is borked and 7 nm isn't happening for a while, so you’ll get just a few tweaks, but nothing major, with Intel’s latest *lake cores.

I could also picture a scenario where, once the transition begins, some people will want to pick up the “last” Intel Mac for cheap, so you could try and sell yours then. The only thing you'd be giving up is iOS apps running natively on it. But honestly, that’s not really the preferred solution anyway.
I don't suffer from insanity, I enjoy every minute of it.
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Status: Offline
Reply With Quote
Jun 24, 2020, 09:24 AM
 
I hope they don’t take the lower TDP as an excuse to make everything thinner. I can kind of expect it for the MBA, but please stop making the Pros smaller.

Going back to Power/i Macs/Books would be nice....
     
Brien
Professional Poster
Join Date: Jun 2002
Location: Southern California
Status: Offline
Reply With Quote
Jun 24, 2020, 09:24 AM
 
Also, they're dropping Force Touch in watchOS 7. Booooo.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Jun 24, 2020, 09:43 PM
 
This is worth a watch.

https://developer.apple.com/videos/play/wwdc2020/10686/

Explicitly states that GPUs will be built into the SoC, not discrete. Though I guess that might only apply to the early releases.
I have plenty of more important things to do, if only I could bring myself to do them....
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 24, 2020, 10:13 PM
 
Originally Posted by Brien View Post
I hope they don’t take the lower TDP as an excuse to make everything thinner. I can kind of expect it for the MBA, but please stop making the Pros smaller.
Well, if performance improves and TDP decreases, it is a win-win. And that clearly seems possible given that the iPad Pro's SoC handily outperforms anything that is in e. g. the MacBook Air. A fanless MacBook Air would be quite something.
Originally Posted by Brien View Post
Going back to Power/i Macs/Books would be nice....
I don't think the Power moniker will come back, but Pro Mac/iMac, Pro Book/iBook sounds kinda nice.
I don't suffer from insanity, I enjoy every minute of it.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 24, 2020, 11:21 PM
 
Originally Posted by Waragainstsleep View Post
This is worth a watch.

https://developer.apple.com/videos/play/wwdc2020/10686/

Explicitly states that GPUs will be built into the SoC, not discrete. Though I guess that might only apply to the early releases.
Integrated GPUs do not preclude Apple from adding discrete GPUs. Our current 16" MacBook Pros have just such a configuration: an integrated GPU that shares memory with the CPU, plus a discrete GPU. And since Apple explicitly mentions that it will support PCIe devices and how they will access memory (from 9:12 onwards in the video you linked to), it seems plausible that they will continue to support dual-GPU configurations just as they do now. Perhaps this will only apply to the iMac Pro and the Mac Pro in the future, but I don't think the publicly available information rules it out.
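For what it's worth, macOS already distinguishes these cases today. A little Metal snippet of my own (nothing from the session) lists every GPU the system sees and whether it is integrated, discrete or an eGPU:

import Metal

// Enumerate every GPU Metal can see on this Mac and classify it.
// (macOS only; MTLCopyAllDevices is not available on iOS.)
for device in MTLCopyAllDevices() {
    let kind: String
    if device.isRemovable {
        kind = "external (eGPU)"
    } else if device.isLowPower {
        kind = "integrated"
    } else {
        kind = "discrete"
    }
    print("\(device.name): \(kind), unified memory: \(device.hasUnifiedMemory)")
}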
I don't suffer from insanity, I enjoy every minute of it.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Jun 25, 2020, 09:20 AM
 
Originally Posted by OreoCookie View Post
Integrated GPUs do not preclude Apple from adding discrete GPUs. Our current 16" MacBook Pros have just such a configuration, a built-in GPU that shares the memory with the CPU and a discrete GPU. And since Apple explicitly mentions that it will support PCIe devices and how they will access memory (from 9:12 onwards of the video you linked to), it would seem plausible that they will continue to support dual-GPU configurations just like they do now. Perhaps this will only apply to the iMac Pro and the Mac Pro in the future, but I don't think the publicly available information precludes that from happening.
Having two GPUs is such a hack, though. Apple clearly wants to have one GPU and one pool of RAM, and now that LPDDR4X enables enough memory bandwidth to make that happen, they're going to go all in on integrated graphics - just making them bigger and adding more cache and whatever they need to make it work.

(Sidenote: GPU scaling algorithms from AMD and Nvidia are really good these days. You can have the game render at 1080p and let the GPU scale it up instead of the display, and it looks really good. Getting a GPU to handle 1080p@60 Hz is not hard today, so if Apple implements that kind of scaling, they could get away with a comparably weaker GPU.)
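The crude Apple-platform analogue of that idea (my sketch, and plain layer scaling rather than the sharpening filters AMD and Nvidia ship) is simply to render into a smaller drawable and let the system stretch it to the view's size:

import MetalKit
import CoreGraphics

// Render at a fixed 1920x1080 and let Core Animation scale the result up to
// whatever size the view actually is. You only shade 1080p worth of pixels.
func configureForUpscaling(_ view: MTKView) {
    view.autoResizeDrawable = false                        // stop tracking the view's size
    view.drawableSize = CGSize(width: 1920, height: 1080)  // render target stays 1080p
}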

Originally Posted by Waragainstsleep View Post
Yes, but it's not like they haven't been building ARM-based hardware for years. That wasn't true of Intel.
I thought the 10th gen 13" MBPs were a tidy upgrade but were they coming from 8th gen in the previous model?
10th gen MBP 13" are Ice Lake - 10nm process instead of 14nm and a completely new core with wider execution, wider mid-core, bigger caches. 10th gen MBP 15" is Skylake++++. Not comparable at all.

I heard Intel had made a layer of the die thinner and it was enabling a surprising performance bonus due to better thermal characteristics.
They may very well have fiddled with the packaging again, but all that means is that they can turbo up higher. You can see the performance increase in the clockspeed ratings directly, and quite frankly, they're not impressive. Top model goes from 5.0 GHz to 5.3 GHz max turbo, so 6% or so. The top i7 goes up by 500 MHz if you stay at 6 cores, although there is now an 8-core i7 as well. Not a lot has happened.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 25, 2020, 09:38 AM
 
Originally Posted by P View Post
Having two GPUs is such a hack, though. Apple clearly wants to have one GPU and one pool of RAM, and now that LPDDR4X enables enough memory bandwidth to make that happen, they're going to go all in on integrated graphics - just making them bigger and adding more cache and whatever they need to make it work.
I fully believe that this is what Apple wants to do in an ideal world, but at least for the Mac Pro and probably also the iMac Pro, I don’t think Apple will have much of a choice. While its GPU designs are solid for what they are — performant for mobile SoCs — Apple has no history of producing GPUs that consume north of 100 W. Even if it did, I am not sure it makes sense to compete with AMD and nVidia on the high end.
Originally Posted by P View Post
(Sidenote: GPU scaling algorithms from AMD and Nvidia are really good these days. You can have the game render at 1080p and let the GPU scale it up instead of the display and it looks really good. Getting a GPU to to handle 1080p@60 Hz is not hard to day, so if Apple implements that kind of scaling, they could get away with a comparably weaker CPU.)
The GPU makes a big difference if you have way too many windows open, like I frequently do. My 16”’s GPU handles my window mess noticeably better than its predecessor did.
I don't suffer from insanity, I enjoy every minute of it.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Jun 25, 2020, 09:43 AM
 
Originally Posted by OreoCookie View Post
I don’t think that contradicts what I have written before: Intel changed what TDP means in a substantial way.
...in 2008, because that is when they implemented this. Motherboard manufacturer cheating came much later, of course.

We are not talking about adding a few % around the edges, PL2 allows for double the power draw for a limited period of time. Clearly, overclockers and enthusiasts will want to fudge with tau and the voltages and run in higher-power states for longer. But Intel supports PL2 in the first place, and IMHO it is disingenuous to call what Intel dubs TDP as what TDP used to mean.
No, TDP means what it always did. If the chip is at PL1, it will draw an average of TDP watts over time. It can be higher for a short period of time, but the average over time will be TDP, and the chip will underclock to make that happen. If the chip goes to PL2, the power draw will be 25% higher for a period of time, but as soon as PL2 ends, the power draw will be reduced so that the total average eventually goes back to TDP. This means that if your chip's base clock is 3 GHz and max turbo is 5 GHz, and you have a task that loads the CPU 100% for a period longer than tau, it will drop below 3 GHz for a time to "pay back the power debt".

TDP means that "if you have a cooling system that can remove this much heat from the CPU, it will never throttle." It always meant that, and it still does.
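To put numbers on it, here is a toy calculation with invented figures (45 W PL1, the 25% PL2 bump, a 28 s tau; real chips use a rolling average, so treat this purely as an illustration of the idea):

// Invented numbers, just to illustrate the averaging described above.
let pl1 = 45.0           // sustained limit, i.e. the advertised TDP (watts)
let pl2 = 1.25 * pl1     // short-term turbo limit (watts)
let tau = 28.0           // seconds the chip may sit at PL2

// A 100-second all-core load: tau seconds at PL2, then throttled low enough
// that the overall average lands back on PL1.
let total = 100.0
let throttled = (pl1 * total - pl2 * tau) / (total - tau)
let average = (pl2 * tau + throttled * (total - tau)) / total

print("Burst: \(pl2) W for \(tau) s")            // 56.25 W
print("Then throttled to: \(throttled) W")       // about 40.6 W, below PL1 (the debt repayment)
print("Average over \(total) s: \(average) W")   // 45.0 W, i.e. exactly the TDP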

Just imagine you spec your voltage regulators for “TDP” + safety margin of, say, 30-50 %, and stick an Intel chip in it. Hilarity would ensue once you run some AVX512 workload on all of your cores.
No, because that is what PL3 and PL4 are about. They are there to prevent exactly this.

You may say that TDP stands for thermal design powers, but it has been used for years and years to spec other components such as power supplies.
Right. But that isn't really Intel's fault, is it? Every PSU calculator I have seen adds a nice 25% margin of error to handle power spikes, and that should be enough to handle PL3 and PL4.

I only keep harping on about it because we use TDP to compare chips with one another. If Intel’s PL2 numbers are closer to what other manufacturers call TDP, then we should use that.
OK, so what is the TDP for the A12X? Is it the same in an iPad as in the DTK? Probably not, right?

TDP is the average power draw over time unless someone has fiddled with the MSR. The issue is that people DO fiddle with the MSR to set PL1, PL2 and tau to something that Intel didn't intend.

I think Intel should stop calling it TDP. We all know why Intel is doing what it is doing, to try and keep up performance-wise. It’s fine if power consumption isn’t a concern, performance is performance, and higher clocks make everything faster.

That strikes me as a “Technically, I did not lie, I just did not tell you the truth.”
Intel is stuck between a rock and a hard place. They used to just advertise the base clock, but then it got to be a problem because the idiots among us didn't understand why the quad-core had a lower clock than the dual-core. So they started advertising the max turbo as well, and that's where we are.

If it were me, I would:

a) publish the turbo tables again. Intel stopped with Kaby Lake, because they needed to hide that it was really just Skylake again with new turbo tables, but we need them so people can see what speeds they get with how many cores for each chip.
b) set some sort of sanity limit on PL1, PL2 and tau. Let people adjust them, but unless it is the K models, limit the increases. I'm sure this will drive people bonkers - I don't care. 4096 W is not a sane power limit.
c) require OEMs and motherboard manufacturers to advertise when they change the power limits by default. This one will be hard to enforce, but they should try.

As for the long deprecation discussion - I get that Apple deprecates things. My issue is that if you combine this with enforced updates, things get dark very fast. If you have to do that, leave me a long-term support version where you don't remove features and just ship security updates.

(I'm also still salty about them ditching sub-pixel rendering, but that is part of a larger thing where Apple is Just. Plain. Wrong. about the resolutions they pick for their Mac displays.)
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Jun 25, 2020, 05:18 PM
 
Originally Posted by P View Post
10th gen MBP 13" are Ice Lake - 10nm process instead of 14nm and a completely new core with wider execution, wider mid-core, bigger caches. 10th gen MBP 15" is Skylake++++. Not comparable at all.
I was talking about the difference between the current and previous 13" models. I think.


Originally Posted by P View Post
They may very well have fiddled with the packaging again, but all that means is that they can turbo up higher. You can see the performance increase in the clockspeed ratings directly, and quite frankly, they're not impressive. Top model goes from 5.0 GHz to 5.3 GHz max turbo, so 6% or so. The top i7 goes up by 500 MHz if you stay at 6 cores, although there is now an 8-core i7 as well. Not a lot has happened.
Yeah, but because they perform better thermally, they get throttled much less than they used to in Apple's skinny MacBook chassis, and I gather it makes a bigger-than-expected difference.
I have plenty of more important things to do, if only I could bring myself to do them....
     
 