
macOS High Sierra
reader50
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Sep 19, 2017, 03:36 PM
 
A High Sierra GM support document has been posted by Apple, concerning APFS. Highlights:

• Fusion drives and HDDs are not converted by the GM. Fusion was supported by some of the betas.
• SSD conversion is automatic - no opt-out option. Presumably High Sierra would still run from HFS+ if cloned to an HFS+ SSD.
• APFS drives are not readable by Sierra or earlier. At least, USB drives formatted as APFS can't be. No word on Sierra updates for compatibility.
• APFS drives cannot be shared over AFP. SMB or NFS sharing only.
• Boot Camp will work, but can't read or write APFS.
• Time Machine will back up APFS as normal. As TM almost always uses HDDs, it is implied that High Sierra backs up APFS volumes to HFS+ volumes.

It is not clear if Fusion / HDD support is coming in a later update. The 27" iMacs come with Fusion drives standard today, so Apple has motivation to finish HDD support. And I expect HDD support for APFS would significantly improve Time Machine performance.

Still no word on file checksumming.
     
Ham Sandwich
Guest
Status:
Sep 19, 2017, 07:07 PM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 08:51 AM. )
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Sep 25, 2017, 06:39 PM
 
Ars has their High Sierra review posted. In particular, there is updated info from the APFS section.

• HDDs are supported by APFS. They aren't converted by default, but you can do it manually. Even for a boot drive. It's just Fusion that is MIA today.
• Boot Camp not only can't read/write APFS, it can't even see a Mac volume is present. You can't boot back to macOS from Boot Camp because you can't select an APFS volume. Keep an HFS+ boot option around. Or option-boot, then update Startup Disk.
• There is currently no de-duplication process to complement APFS cloning. Probably a good thing until more bugs are detected and worked out.
• eGPU support is here, but apparently intended more for developers than the gaming public.
• HS performs a weekly checksum of the EFI to detect tampering. Mac Pros flashed from 4,1 to 5,1 may or may not get flagged. It's done via a command-line tool, which also has the ability to dump the current firmware. While Apple intends it against hacking, it may be quite enabling for enthusiast hacking.

I spotted a couple other items of interest, but failed to take notes. And the Ars article is getting mobbed, causing server errors - I'll recheck later.
     
Chongo
Addicted to MacNN
Join Date: Aug 2007
Location: Phoenix, Arizona
Status: Offline
Sep 25, 2017, 07:55 PM
 
I guess those of us with 3TB Fusion drives will wait until the new file format is supported. (2013 27” iMac.)
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Sep 25, 2017, 10:30 PM
 
The other item I'd noticed is: APFS is undocumented outside of Apple. They plan to release documentation at some point. So High Sierra is the only OS that can address it. I wouldn't put much faith in Sierra's beta support, unless Apple updates it.

So no 3rd party support for, say, disk repair/recovery tools. No drivers for Windows or Linux. Just the one OS version that can access or repair it. Which is a .0 release.

Make backups, and keep Time Machine up to date. APFS sounds nifty, but beware the early adopter pitfalls.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Sep 26, 2017, 04:07 AM
 
Haven't read the review yet, but some comments:

Originally Posted by reader50 View Post
• There is currently no de-duplication process to complement APFS cloning. Probably a good thing until more bugs are detected and worked out.
Deduping eats RAM. As in, all of it. It is of quite limited utility on a consumer setup as well, so I don't think Apple will integrate it into the OS. Remember how they handled filesystem compression in 10.6? No live compression facility, but files were compressed on delivery to save space in the default install. Worked pretty well.

I can see a third-party utility doing offline de-duping, but as I said, the use case is limited.

Originally Posted by reader50 View Post
• HS performs a weekly checksum of the EFI to detect tampering. Mac Pros flashed from 4,1 to 5,1 may or may not get flagged. It's done via a command-line tool, which also has the ability to dump the current firmware. While Apple intends it against hacking, it may be quite enabling for enthusiast hacking.
That tool has been there for Sierra as well, it just didn't run regularly.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Sep 26, 2017, 05:13 AM
 
Deduping is so intensive because it requires heavy file comparison. But if file checksums get enabled, deduping would be easy. You'd still do a byte comparison for safety, but you'd only do it on files with identical checksums. As the chances of a collision are negligible, in practice you'd only compare files you end up deduping. No wasted passes.
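A minimal Python sketch of that scheme (illustrative only - the function names and the SHA-256 choice are mine, not anything APFS provides): hash each file once, and treat only files with matching size and checksum as candidates for the byte-for-byte comparison.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def sha256_of(path, bufsize=1 << 20):
    """Stream a file through SHA-256 so large files never sit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(bufsize):
            h.update(chunk)
    return h.hexdigest()

def dedup_candidates(root):
    """Group files by (size, checksum); only files landing in the same
    group would ever need a byte comparison before deduping."""
    groups = defaultdict(list)
    for p in Path(root).rglob("*"):
        if p.is_file() and not p.is_symlink():
            groups[(p.stat().st_size, sha256_of(p))].append(p)
    return [paths for paths in groups.values() if len(paths) > 1]
```

With per-file checksums already stored by the filesystem, the expensive `sha256_of` pass would disappear entirely - which is the point being made above.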
     
Ham Sandwich
Guest
Status:
Sep 26, 2017, 07:31 AM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 08:51 AM. )
     
Thorzdad
Moderator
Join Date: Aug 2001
Location: Nobletucky
Status: Offline
Sep 26, 2017, 08:55 AM
 
Seems there's a bit of a security issue with HS and earlier versions.
The kicker is this:
While the app in the video is unsigned—and as a result can't be installed on a default Mac installation—the vulnerability can be exploited by signed apps as well. All that's required to digitally sign an app is a membership in the Apple Developer Program, which costs $99 per year.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Sep 26, 2017, 09:32 AM
 
Originally Posted by reader50 View Post
Deduping is so intensive because it requires heavy file comparison. But if file checksums get enabled, deduping would be easy. You'd still do a byte comparison for safety, but you'd only do it on files with identical checksums. As the chances of a collision are negligible, in practice you'd only compare files you end up deduping. No wasted passes.
I know how deduping works, but if you check the system reqs for ZFS - which has de-duping and relies heavily on file checksumming in every part of the file system - RAM requirements start to run away quickly. According to tests of ZFS on FreeBSD, you need 5GB of RAM per TB of storage that you want to keep live de-duping active on:

https://wiki.freebsd.org/ZFSTuningGuide

Apple only recently stopped shipping Macs with 4GB RAM, and they do ship models with 3TB storage. This is too much for a feature that will have very limited utility in a consumer model. Better to have some form of offline de-duping.
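Plugging the FreeBSD wiki's rule of thumb into numbers makes the point concrete (the helper name is invented; 5 GB per TB is the cited community estimate, not an Apple figure):

```python
def zfs_dedup_ram_gb(storage_tb, gb_per_tb=5):
    """Rough RAM needed to keep a ZFS-style dedup table resident,
    using the ~5 GB per TB rule of thumb from the FreeBSD tuning guide."""
    return storage_tb * gb_per_tb

# A 3 TB Fusion-drive iMac would need on the order of 15 GB of RAM
# for the dedup table alone - more than many consumer Macs shipped with.
```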
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Sep 26, 2017, 09:39 AM
 
Originally Posted by Thorzdad View Post
It seems like there is a bug in the keychain. Not specifically related to High Sierra - just a pissed off security researcher who thinks they should have delayed the release and given him a bounty.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Sep 26, 2017, 09:42 AM
 
Originally Posted by And.reg View Post
However, now my Mac uses 5 GB of RAM after starting up, rather than 1.5 GB in 10.12, according to iStat Menus (now version 6). What gives?
Siri saw reader's post above yours and silently enabled dedupe.

What are you looking at when you're comparing - Wired + Active + Inactive, or Wired + Active? Because if you're including Inactive, it is a pointless measurement. That can happen if Apple has, for instance, improved the disk caching as part of the APFS introduction.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Sep 26, 2017, 09:47 AM
 
An interesting piece in the Ars review about the server features ported over: An Apple Update server (similar to WSUS for Windows), a Time Machine Server, and an updated file server over SMB. The last I can ignore, but the other two are interesting. Considering taking an old MBA together with an external drive and placing in the cabinet next to the router to act as a server for the last two. It is only so I won't have to listen to the noisy HDD, but still.
     
Ham Sandwich
Guest
Status:
Sep 26, 2017, 09:56 AM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 08:52 AM. )
     
Chongo
Addicted to MacNN
Join Date: Aug 2007
Location: Phoenix, Arizona
Status: Offline
Sep 26, 2017, 10:39 AM
 
Does anyone who has updated have a Fusion drive?
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Sep 26, 2017, 11:21 AM
 
Originally Posted by And.reg View Post
What is dedupe? I've never heard of that term before.
De-duplication. Reader and I were discussing it above your post, so I made a little joke. De-duplication is when the computer can detect that the file at /Users/user1/photo1.jpg is actually the same as the file at /Users/user2/my_great_photo.jpg, and so save space by only storing it once until one of the files is changed. Actual de-duplication even works on parts of files, but that is the principle. It is a nice feature, but it uses lots of RAM, hence my little joke.

Originally Posted by And.reg View Post
I'm looking at:

OK, so it isn't even showing the Inactive part. Interesting.

I don't know why the memory usage is higher. Go to Activity Monitor and check which app is using more data, I guess.
     
Ham Sandwich
Guest
Status:
Sep 26, 2017, 12:41 PM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 08:52 AM. )
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Sep 26, 2017, 03:10 PM
 
Originally Posted by P View Post
I know how deduping works, but if you check the system reqs for ZFS - which has de-duping and relies heavily on file checksumming in every part of the file system - RAM requirements start to run away quickly. According to tests of ZFS on FreeBSD, you need 5GB of RAM per TB of storage that you want to keep live de-duping active on: ...
They must be creating per-block checksums on the fly, for every block of every file. For maximum deduping of common blocks in unrelated files. That would produce the scaling problem.

I suggest only an identical-file process, especially for consumer installs. And like you say, leave more complete deduping to an offline utility.

With just identical files, pull existing checksums from the file system. Pick the largest files with matching sums that haven't been compared recently, and pull blocks in succession for comparison. Should require less than a MB of memory space, easily set to a low priority, safe to run in the background.
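That bounded-memory verify step could look like this (a hypothetical helper, not any Apple API; the buffer size is arbitrary - memory use stays at two buffers regardless of file size):

```python
import os

def files_identical(path_a, path_b, bufsize=256 * 1024):
    """Byte-compare two files in fixed-size blocks, so memory use is
    bounded no matter how large the files are."""
    if os.path.getsize(path_a) != os.path.getsize(path_b):
        return False
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        while True:
            block_a = fa.read(bufsize)
            block_b = fb.read(bufsize)
            if block_a != block_b:
                return False
            if not block_a:  # both files exhausted together: identical
                return True
```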
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Sep 26, 2017, 04:11 PM
 
Originally Posted by Chongo View Post
I guess those of us with 3TB Fusion drives will wait until the new file format is supported. (2013 27” iMac.)
Apple has confirmed APFS support is coming to Fusion and HDDs in a future update.

Ars confirmation

Press release for High Sierra:
Originally Posted by Apple
• APFS currently supports every Mac with all‑flash internal storage — support for Fusion and HDD Mac systems will be available in a future update.
Since HDDs are supported already, Apple presumably means they'll eventually convert boot HDDs by default.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Sep 26, 2017, 06:23 PM
 
Originally Posted by reader50 View Post
They must be creating per-block checksums on the fly, for every block of every file. For maximum deduping of common blocks in unrelated files. That would produce the scaling problem.

I suggest only an identical-file process, especially for consumer installs. And like you say, leave more complete deduping to an offline utility.

With just identical files, pull existing checksums from the file system. Pick the largest files with matching sums that haven't been compared recently, and pull blocks in succession for comparison. Should require less than a MB of memory space, easily set to a low priority, safe to run in the background.
Deduping on a server is indeed per block, and it makes sense in a multiuser system, but I just don't see that the savings are that great for a single user. Why should I put the same file in more than one spot? Mistakes happen, but they can't be that common. Unless the savings are great, why spend CPU cycles and RAM on it?
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Sep 26, 2017, 06:24 PM
 
Originally Posted by And.reg View Post
This? The sum hardly adds up to the claimed 5-6 GB in my recent usage, and that's with the same Applications that I used to have open on 10.12.

Those are just your processes, though. You're not showing system processes.
     
Ham Sandwich
Guest
Status:
Sep 26, 2017, 07:28 PM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 08:52 AM. )
     
besson3c
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Sep 26, 2017, 07:53 PM
 
Originally Posted by And.reg View Post
However, now my Mac uses 5 GB of RAM after starting up, rather than 1.5 GB in 10.12, according to iStat Menus (now version 6). What gives?

That is not necessarily a bad thing. The test is how well your Mac performs when memory resources become scarce. Simply adding up the numbers in that column doesn't tell you a ton. I don't see anything alarming in your screenshots.
     
Ham Sandwich
Guest
Status:
Sep 26, 2017, 07:57 PM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 08:52 AM. )
     
besson3c
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Sep 26, 2017, 07:59 PM
 
Is Photos showing processing photos? https://discussions.apple.com/thread...art=0&tstart=0
     
Ham Sandwich
Guest
Status:
Sep 27, 2017, 12:23 PM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 08:52 AM. )
     
besson3c
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Sep 27, 2017, 12:27 PM
 
Originally Posted by And.reg View Post
No. It shows one photo in People and no processing information.

Still really bummed that about 6 GB of RAM are being used instead of about 2 GB. I would have thought that High Sierra would have made memory management more efficient, but instead, I have about 30% less RAM available for other apps/processes.
That's not how it works though.

A higher number does not necessarily = bad. Memory is used for caching, because it is fast. Using available memory when it is available is fine, and can be a sign of smart coding. Absolutely needing/requiring that memory is another thing, which is why how things behave when memory is actually low is really what to be concerned over. The stuff that isn't actually needed should be reallocated in these conditions.
     
besson3c
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Sep 27, 2017, 12:28 PM
 
Originally Posted by And.reg View Post
Huh? What test, what is High Sierra testing my memory for? As soon as I boot up my Mac, it bloats up to 4-5 GB of RAM used, instead of less than 2 GB under 10.12. That, combined with not understanding your post about the "test," has me confused.

There is no literal test, I meant rhetorical test, as in "what is most telling". That being said, that photo service probably shouldn't be spinning out of control that way.
     
Ham Sandwich
Guest
Status:
Sep 27, 2017, 07:07 PM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 08:53 AM. )
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Sep 28, 2017, 04:18 AM
 
Originally Posted by And.reg View Post
#12. When un-muting your volume for the first time after booting the computer, the volume control will show a fraction of a dial:
In 10.10:
In 10.11:
Same in 10.12 and 10.13, and was never remedied. Either make a continuous slider option for the volume buttons, or change the volume meter display to make it continuous.
It is possible to adjust the volume fractions by holding shift and/or option when pushing the volume adjust buttons. Are you sure that you haven't done that?

Your other links are broken for me.
     
Ham Sandwich
Guest
Status:
Sep 28, 2017, 07:39 AM
 
[...deleted...]
( Last edited by Ham Sandwich; Apr 23, 2020 at 08:53 AM. )
     
ghporter
Administrator
Join Date: Apr 2001
Location: San Antonio TX USA
Status: Offline
Sep 30, 2017, 10:13 AM
 
De-duping on a consumer level? OH BOY do I need that!

I have multiple copies of a huge number of photos, multiple copies of various PDFs for reference material, etc.... Yeah, this consumer really needs duplicate identification.

Do I need it on-the-fly and in the background? Probably not. I need it occasionally, sort of as a housekeeping thing. "User Glenn has 458GB in duplicate files in 59 folders. Would you like to review these files?" That's what I'd want.

At the moment I don't know if I would prefer it to be native or third-party, but there are already third party apps that identify duplicate files. The problem is that all of them I've tried are a pain to use. I'd want something that told me about not only duplicates, but also which copy I'd opened or read most recently, and maybe even tell me about folder usage data (most recently used, most frequently used, etc.).

I know this isn't specific to the HS beta, but as an archetypal consumer I know my online storage (whatever kind of "disk" I'm using) would be much more efficient if I was more aware of, or "protected from", excessive disk use from multiple copies of files.

Glenn -----OTR/L, MOT, Tx
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Sep 30, 2017, 03:08 PM
 
Deduping on a file level seems like it should be easy if you have the duplicates identified. Just delete all but one and hard link the rest - and probably write protect that file to make it clear that it shouldn't be touched.
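A sketch of that delete-and-hard-link approach (hypothetical helper, not anything shipping in macOS; note that a chmod on one hard link affects every link, since they all share one inode - which is exactly why the write protection matters):

```python
import os
import stat

def hardlink_duplicates(keep, duplicates):
    """Replace each duplicate with a hard link to `keep`, then drop the
    write bits so an in-place edit can't silently change every copy."""
    for dup in duplicates:
        os.remove(dup)
        os.link(keep, dup)
    mode = os.stat(keep).st_mode
    os.chmod(keep, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))
```

APFS clones would be the nicer primitive here (edits to one copy wouldn't affect the other), but hard links work on plain HFS+ too.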
     
besson3c
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Sep 30, 2017, 03:11 PM
 
Isn't dedup more about individual file segments rather than complete files?

For example, docker containers operate this way with image segments.
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Sep 30, 2017, 05:16 PM
 
That's the typical server version of deduping. We discussed that somewhat earlier. But it's too memory-intensive to run in the background on consumer systems.

File-level deduping though is safe to run in the background. For example, I usually import photos into Photos. But I leave the originals, because I don't trust irreplaceable family pix to Apple's semi-closed ecosystem. And I want to cross-reference them via folder.

With automatic hardlinking (what APFS would do) I could have my separate access, without burning the gigs for full separate copies. Wouldn't burn those gigs in my Time Machine backups either.

I've upgraded systems so many times that I believe I have many duplicate files scattered about, from changing organizational design over the years. They won't always have the same names either.
     
CharlesS
Posting Junkie
Join Date: Dec 2000
Status: Offline
Sep 30, 2017, 09:16 PM
 
I got interested by the topic of dedup, and decided to make a little tool to scan for files that could be deduped on the per-file level, at least:

http://charlessoft.com/CloneClub.zip

Heavy disclaimers apply:

- This doesn't actually dedup anything; it currently just does a pre-scan to see how many duplicated files there are, and how much space is taken up by these duplicated files.

- If any files are already clones, they'll show up as false positives here. Unfortunately, there's no way for me to tell whether a file contains cloned blocks or not (well, there's ATTR_CMNEXT_PRIVATESIZE, but it's been returning 0 for me on any files that aren't brand new. According to the docs, ATTR_CMNEXT_PRIVATESIZE returns the size of any blocks that aren't part of either a clone or a snapshot; I suspect that Mobile Time Machine is probably using snapshots, causing every file on the disk's ATTR_CMNEXT_PRIVATESIZE to go to 0 when it updates itself).

- For simplicity's sake, I'm currently not looking at any files that we don't have write access to, or any files that have extended attributes or more than one hard link.

- Also, this tool takes forever to run, and chews through a lot of RAM (it topped out around 3 GB on my system). There are a few ways I could probably make it faster, but it's probably always going to be slow by nature of what it has to do.

Anyway, I'd be curious to see what results you guys get. On my machine, I'm getting over 50 GB of duped files (!), so unless I've got a bug in the size reporting, there might be some benefit to making a third-party dedup tool.

Ticking sound coming from a .pkg package? Don't let the .bom go off! Inspect it first with Pacifist. Macworld - five mice!
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Sep 30, 2017, 11:48 PM
 
@Charles, it won't run on Sierra. I'm holding off on updating until more bugs are fixed. Does the code actually require HS? edit - I modified the minimum OS in info.plist, but it's hardcoded somewhere else too. Still shows as needing HS.

The files deduping would be easy if file checksums were turned on in APFS. You'd only look at files highly likely to be identical. Along with keeping a log of comparisons that failed (checksum collisions) so they don't get rechecked until one of them is modified.

btw, does your utility allow for more than two copies of a file?
     
CharlesS
Posting Junkie
Join Date: Dec 2000
Status: Offline
Sep 30, 2017, 11:54 PM
 
Originally Posted by reader50 View Post
@Charles, it won't run on Sierra. I'm holding off on updating until more bugs are fixed. Does the code actually require HS?
Yeah, HS only. I figured there wouldn't be much of a point on OS versions that can't boot from APFS.

The files deduping would be easy if file checksums were turned on in APFS. You'd only look at files highly likely to be identical. Along with keeping a log of comparisons that failed (checksum collisions) so they don't get rechecked until one of them is modified.
AFAIK APFS doesn't have checksums for files, only for metadata. It's a shame (and it's why my tool takes so long to run; it's checksumming every file on the disk).

btw, does your utility allow for more than two copies of a file?
Yeah, it groups files that match each other, and adds the size of each of them (except for the first one) to the running total. You can click a file and see all the files that are identical to it.
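That accounting is easy to sketch (a hypothetical helper, not CharlesS's actual code): given match groups, every copy after the first in each group counts as reclaimable.

```python
import os

def reclaimable_bytes(groups):
    """Sum the sizes of every file in each match group except the first,
    mirroring how a dedup pre-scan would total reclaimable space."""
    total = 0
    for paths in groups:
        for p in paths[1:]:
            total += os.path.getsize(p)
    return total
```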

     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Oct 1, 2017, 12:01 AM
 
When APFS was first announced, plenty of people asked about file checksums. It's only for metadata today, but I got the distinct impression it may become an option.

On the Mac, it could be an out-of-the-way option for people who want it. Probably only used for system files on iPhone. And the Watch wouldn't use it at all. I remain hopeful it will become available in a future update. Or that someone will create an extension to add it.

Apple would be the best party to add it, as it should be tied into Time Machine. ie - if a checksum fails, and TM has a copy that matches the sum, there should be a silent replacement. And a user dialog only if there isn't a clean copy, and it isn't a temp/cache file.
     
CharlesS
Posting Junkie
Join Date: Dec 2000
Status: Offline
Oct 1, 2017, 12:08 AM
 
Originally Posted by reader50 View Post
When APFS was first announced, plenty of people asked about file checksums. It's only for metadata today, but I got the distinct impression it may become an option.

On the Mac, it could be an out-of-the-way option for people who want it. Probably only used for system files on iPhone. And the Watch wouldn't use it at all. I remain hopeful it will become available in a future update. Or that someone will create an extension to add it.

Apple would be the best party to add it, as it should be tied into Time Machine. ie - if a checksum fails, and TM has a copy that matches the sum, there should be a silent replacement. And a user dialog only if there isn't a clean copy, and it isn't a temp/cache file.
I do hope that that happens eventually. They do seem to be rolling things out piece by piece (there still doesn't seem to be any API that I can see for snapshots, and fast directory sizing doesn't seem to be used anywhere thus far). Unfortunately as far as I know there hasn't been any official word on it, and apparently they've made the claim in the past that they don't need it, so who knows.

     
CharlesS
Posting Junkie
Join Date: Dec 2000
Status: Offline
Oct 1, 2017, 12:11 AM
 
One thing that is pretty cool dedup-wise is that I can confirm that the file clones are block-based rather than file based. As in, if you clone a file, then add some stuff to the end of the clone, it doesn't make a new copy of the whole file, but rather retains the reference to the original blocks and just stores the new stuff you added. You can see this by checking ATTR_CMNEXT_PRIVATESIZE of a newly created clone file (as I noted earlier, it always seems to be zero on older files) before and after making a change to it. Quite nice.

     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Oct 1, 2017, 12:24 AM
 
Originally Posted by CharlesS View Post
Unfortunately as far as I know there hasn't been any official word on it, and apparently they've made the claim in the past that they don't need it, so who knows.
Your link is most of what I read at the time. The Apple engineers were interested in talking about it, and carefully ignored the use of 3rd party storage devices. What gave me hope is that no one ruled out file checksums. They just didn't commit to anything, and kept asking questions.

Any chance you could recompile CloneClub to run on earlier OSes? So those of us waiting to upgrade can check our dupe situation too. When I asked if it required HS, I meant if it had a HS-specific code dependency.
     
CharlesS
Posting Junkie
Join Date: Dec 2000
Status: Offline
Oct 1, 2017, 12:33 AM
 
Done—it's now recompiled for Sierra. I think I didn't use any APIs that will break it there, but since I haven't tested it, no guarantees.

The other disclaimer is that if I ever do make this into a full deduper, it'll have to go back to HS-only since you can't do that without APFS. That's still an "if" at this stage, though. Tell me what you think.

     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Oct 1, 2017, 01:44 AM
 
Run completed, took less than an hour. 500 GB partition, 315 GB used, 8.3 GiB duplicated. edit - RAM usage peaked at 655 MB.

It's finding mostly game files (especially Overgrowth), along with GarageBand files. Oh, and 6.2 MB of identical .DS_Store files. There's a scattering of other game and utility dupes too.

My results look plausible.
( Last edited by reader50; Oct 1, 2017 at 01:02 PM. )
     
CharlesS
Posting Junkie
Join Date: Dec 2000
Status: Offline
Oct 1, 2017, 02:00 AM
 
Thanks!

     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Oct 1, 2017, 12:58 PM
 
Originally Posted by reader50 View Post
Your link is most of what I read at the time. The Apple engineers were interested in talking about it, and carefully ignored the use of 3rd party storage devices. What gave me hope is that no one ruled out file checksums. They just didn't commit to anything, and kept asking questions.
What 3rd party storage devices? Internal 3rd-party storage devices are all but gone: The MBPs have storage soldered in now. The iMac requires you to remove a glued-on display to get at the drive. The Mac mini's last (or maybe latest) update made it significantly harder to access as well, and the trashcan Mac Pro has removable but extremely proprietary storage. Apple probably has stats on how many users have third-party external storage that isn't strictly backup, and probably knows that the number is small. If Apple implements an online backup service for Macs - they are clearly moving in that direction - the need for third-party storage will be gone from Apple's perspective.

CloneClub is running as I write this. Status bar looks full, but it is still running. Charles, what are you using to checksum? Because my MBP is at max all-core turbo all throughout the scan, and that implies that there is no AVX2 code at least (AVX will mostly disable turbo, and AVX2 will even reduce clocks below base clock as the chip power throttles. This is documented by Intel, but mostly not acknowledged unless you really dig down into the depths of the documentation).

EDIT: done. Didn't time it, but Activity Monitor says that it used 41 minutes of processor time. 500 GB drive, 120 GB free, 690 MB of duplicates.
( Last edited by P; Oct 1, 2017 at 01:11 PM. )
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Reply With Quote
Oct 1, 2017, 01:14 PM
 
Originally Posted by P View Post
What 3rd party storage devices? Internal 3rd-party storage devices are all but gone: {summary of current models} Apple probably has stats on how many users have third-party external storage that isn't strictly backup, and probably knows that the number is small. If Apple implements an online backup service for Macs - they are clearly moving in that direction - the need for third-party storage will be gone from Apple's perspective.
Recall that Macs are expected to last a long time. I'm not the only cheese-grater Mac Pro owner skipping the trashcan for lack of internal expansion. And while Apple may offer online backup, today they recommend everyone add an external drive for TM backups. Even if they do offer online backup later, everyone should still keep a local backup against internet outages, and for fast recovery. Internet speeds and prices remain a black eye in the US.

So I'm talking external storage, potentially more external drives than the number of Macs in use. And internal drives in legacy Macs, perhaps half of which remain supported. But even if we stick to Apple's internal OEM components, Apple cannot guarantee no bitrot will happen. Claiming that relies on statistics, but stats cannot take into account unexpected conditions. Like unplanned component failures elsewhere in the Mac (power supply voltage spikes), or if X-ray machines became routine in airports - or traffic stops at state borders in the southern US. Or if cosmic ray incidents at the surface picked up.

Apple should let us guarantee data integrity, regardless of how comfortable they feel with their hardware.
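The bitrot case is simple to demonstrate: flip a single bit and a checksum recorded at write time catches it, which is exactly the guarantee a checksumming filesystem would provide. A toy sketch:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex digest of `data`, standing in for a filesystem's stored checksum."""
    return hashlib.sha256(data).hexdigest()

original = b"irreplaceable family photo bytes"
stored_checksum = sha256_of(original)  # recorded when the file was written

rotted = bytearray(original)
rotted[3] ^= 0x01                      # a single flipped bit, as from bitrot

# On read, the mismatch flags the corruption before bad data propagates.
assert sha256_of(bytes(rotted)) != stored_checksum
```

Without the stored checksum, that flipped bit would be returned to the application as perfectly valid data, and eventually backed up over the good copy.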
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Oct 1, 2017, 02:46 PM
 
While I generally agree that Apple should support third-party storage, I don't think they will. If they feel that they can control their own storage units through checksumming in the SSD controller, I imagine that they won't bother doing anything in the filesystem.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Online
Reply With Quote
Oct 1, 2017, 07:56 PM
 
Originally Posted by P View Post
While I generally agree that Apple should support third-party storage, I don't think they will. If they feel that they can control their own storage units through checksumming in the SSD controller, I imagine that they won't bother doing anything in the filesystem.
I mostly agree with your take. Two things, though. First, even if Apple does the checksumming on the SSD controller (presumably in hardware), there need to be hooks in the filesystem to tell the OS when stored files have become corrupted and, if at all possible, need to be replaced from a backup or an older copy. So there needs to be some filesystem/OS integration one way or another. Second, given that Apple builds its own SoCs for all devices other than Macs, and the SoC houses the storage controller, checksumming does not have to live in the controller: they could put the hardware-acceleration units on the CPU side and have the OS take care of it. That would expose more of the inner workings of checksumming to the OS, which I think might allow for more end-user functionality. With current Intel-based Macs, though, Apple would have no choice but to put it in the controller. On the other hand, there is the perennial rumor that Apple is pondering a move of its Macs to ARM.*

However, at the end of the day, I don't care how data checksumming is implemented, as long as it is implemented. Ditto for ECC RAM; that should be standard.


* Personally, I am quite sure that Apple has had a "reverse"-Marklar project in the works for aeons (there was a rumored Apple A5-based MacBook), so that if it ever needs to or decides to switch, it can do so quickly. With the exception of the high-end MacBook Pro, Apple could conceivably switch most of its mobile Macs to ARM today if it decided to. Slowly but surely, Apple's chip team is creeping up on Intel by any performance measure.
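The filesystem hook described above (verify on read, fall back to a good replica on mismatch) can be sketched in a few lines. All the names here are mine for illustration; this is not any real macOS or APFS interface.

```python
import hashlib

def read_verified(data, stored_digest, replica=None):
    """Return `data` if its SHA-256 matches the digest recorded at write
    time; otherwise try to repair from `replica`, or surface the
    corruption to the caller. Purely illustrative, not a real
    filesystem API."""
    if hashlib.sha256(data).hexdigest() == stored_digest:
        return data
    if replica is not None and hashlib.sha256(replica).hexdigest() == stored_digest:
        return replica  # silent self-heal from the intact copy
    raise IOError("checksum mismatch and no intact replica")
```

The point is that the repair decision has to happen somewhere the OS can see it; a checksum that lives entirely inside the SSD controller can reject a block, but it cannot know that Time Machine holds a good copy.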
I don't suffer from insanity, I enjoy every minute of it.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Oct 2, 2017, 05:16 AM
 
I think Apple wants to keep the checksumming in the SSD controller for power reasons. Sending all that data to the SoC would cost a lot of power. They do need hooks into the OS, though; we agree on that.

Apple recently made a new dump of Darwin (the open-source macOS kernel) to GitHub, and this time included all the iOS parts for the first time. It truly is one development tree.

As for Mac on ARM: being as good as Intel isn't enough. Moving CPU architectures again is hard and costs money, and I don't think they will unless their own ARM hardware is noticeably better. They do have that option in their back pocket, though.
     