MacNN Forums (http://forums.macnn.com/)
-   MacNN Lounge (http://forums.macnn.com/macnn-lounge/)
-   -   Apple Loses the Encryption Game (http://forums.macnn.com/89/macnn-lounge/502132/apple-loses-the-encryption-game/)

 
subego Jul 10, 2013 04:52 PM
Apple Loses the Encryption Game
Yet another subego troll thread?

At least you're not going to say they lost it to Google.

Oh shi...



Crypto has become a little more mainstream nowadays. I don't think it's going to be something everyone and their mother cares about, but enough people care now that there's a market for usable (i.e. low-friction) crypto tools. Let's say, for your text messages.

If you actually care about your crypto protecting you, there's always going to be the initial friction of finding a trusted source.

That ultimately means sideloading. Not an Apple strength.

Discuss.
 
Spheric Harlot Jul 10, 2013 06:18 PM
The example you mention is actually fully encrypted, end-to-end: iMessage.

The idea being that Apple IS your trusted source. You may think of that what you will, but enabling installs the way Android does it (where keyloggers and other spyware are a definite reality) is certainly not the solution.

What use case do you propose that could not be adequately served by third-party applications from trusted developers, made available on the App Store as it exists today?
 
reader50 Jul 10, 2013 06:51 PM
The PRISM program supposedly can get iMessage content. If you back up to iCloud, then Apple is already able to retrieve / reset your password. It follows they can access all backed-up content, including iMessages. If iMessage were open source, experts could do a security audit on the client and backend servers, checking for vulnerabilities. But it's closed source.

News has broken of a 3rd party developing Heml.is ("secret" in Swedish), an open-source messaging app for iOS and Android. End-to-end encryption, so even the developers cannot access the content. I'm not sure how much metadata can be hidden, though. Apps for Mac and Windows desktops / notebooks are likely to come later.
 
Spheric Harlot Jul 10, 2013 07:12 PM
Apple themselves have explicitly stated that they CANNOT decrypt iMessages.

http://www.apple.com/apples-commitme...tomer-privacy/

Quote
For example, conversations which take place over iMessage and FaceTime are protected by end-to-end encryption so no one but the sender and receiver can see or read them. Apple cannot decrypt that data.
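To make "end-to-end" concrete: the property being claimed is that only the two endpoints can derive the message key. Here's a toy Diffie-Hellman sketch in Python -- a deliberately tiny prime and made-up names, purely to show where the keys live, not how iMessage actually implements any of this:

```python
import hashlib
import secrets

# Toy parameters: 2**127 - 1 is prime but FAR too small for real use.
# Real systems use 2048-bit-plus groups or elliptic curves.
P = 2**127 - 1
G = 3

def keypair():
    # Private half stays on the device; public half crosses the wire.
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Each end combines its own secret with the other's public value
# and lands on the same key...
alice_key = hashlib.sha256(str(pow(bob_pub, alice_priv, P)).encode()).digest()
bob_key = hashlib.sha256(str(pow(alice_pub, bob_priv, P)).encode()).digest()
assert alice_key == bob_key

# ...while a relay in the middle, holding only alice_pub and bob_pub,
# cannot derive it. That's the whole "end-to-end" claim.
```

Whether Apple's actual deployment keeps the private halves strictly on-device (or escrows them via iCloud backups, as discussed below) is exactly the open question.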
 
shifuimam Jul 10, 2013 07:26 PM
iMessage is also proprietary and designed to work with Apple's hardware and software alone. If you have even one person in your network of users, friends, family, etc. who wants to use something OTHER than shiny aluminum shit, you're out of luck.

I think that the real answer is to not use mainstream service providers for extremely confidential communications. Don't use Gmail. Don't use iMessage. Don't use Facebook. If you're talking about needing military-grade security, don't trust third parties.

Subego: what kind of "crypto" are you thinking of? Just securing the data on your own mobile device? That's easy enough, but keep in mind that the weakest endpoint in your communication sets the ceiling on how secure you can ever hope to be. YOU might be worried about security, but if your friend from New Zealand isn't, anything you send to him is going to be inherently insecure.
 
Spheric Harlot Jul 10, 2013 07:49 PM
Quote, Originally Posted by shifuimam (Post 4238051)
iMessage is also proprietary and designed to work with Apple's hardware and software alone. If you have even one person in your network of users, friends, family, etc. who wants to use something OTHER than shiny aluminum shit, you're out of luck.
In other words: Apple has this thing figured out, provided you're willing and able to use their options, and there is actually no other working alternative?
That's pretty much exactly the opposite of subego's implication, and precisely what I was getting at.

What about the potential of third-party solutions made available via verified developers on the App Store, though?

And while recent developments seem to imply that verification and even code-signing don't mean shit on the Android side of things, do you see a solution for that, and if there is, wouldn't this path be able to provide viable alternatives?
 
shifuimam Jul 10, 2013 08:13 PM
Verification and code-signing don't mean shit on any side of anything.

Apple's verification process has proven itself to not focus much on the little guys. They use it as an excuse to block apps that compete with their own (e.g. Opera or Chrome for iOS), but there have been apps pushed to the App Store that are designed to scam (like the "I Am Rich" app, trollolol).

Ultimately, by putting all your trust in Apple's "verification" process, you are still trusting someone else with your data and personal information. Apple is not above making mistakes. I realize that what you want is for people to bow before the mighty Apple and swear allegiance until death, but let me make this crystal clear for you:

The Apple universe is not the end-all, be-all solution for all users and all use case scenarios.

I prefer Windows over OS X. I want my laptops and desktops to be upgradeable, not glued-shut, sealed-together aluminum boxes that I have to throw in the garbage when it's time for a hardware refresh. I want my phone to be unlocked and under my control, not some corporation's. These are all reasons why I am not a Mac fanatic. These are also reasons why for me - and many others like me, because as much as you hate it, I am not alone in my views - an Apple-exclusive "solution" is no solution at all.

I realize that you are a die-hard Apple fanatic. I realize that a lot of people here are. But please try to think outside your own little existence for just one moment and try to understand that the rest of the world is not necessarily willing to submit to the unilateral authority of Apple.

A cross-platform solution that doesn't require one to blindly trust an American corporation is a significantly better idea than locking yourself into something that is Apple-exclusive and will very likely remain so.

What reader50 posted is much more compelling. Not only is it cross-platform, but it's being run by guys who have made it the very center of their existence to protect anonymity and security - which is one of the reasons why, as of now, their new encrypted messaging service won't be running on servers located in the United States.
 
Spheric Harlot Jul 10, 2013 08:27 PM
Shif, can the bullshit, and argue the ****ing point, will you? It's getting really, really tiresome.

Both Opera and Google Chrome ARE available on the iOS App Store. Were they blocked at some point for arbitrary reasons, or were they actually in violation of guidelines (using private APIs or so)?

Yes, Apple has let through malware. Where have I EVER claimed that they hadn't? Thing is, though, that they had in place a very effective means of immediately killing the software once its potential became known. The term I had in mind when I posted the above was "reasonably secure", which may not necessarily be good enough, I realize. Do you see a way around this?

The heml thing looks interesting, especially in light of it being open-source, but how reliable is the possibility of getting trusted software and a locked-down device? These things need to be guaranteeable for the solution to even begin to make sense.

I suppose corporate IT could pre-install a vetted (or custom in-house) version on their company devices and then completely lock down the device after that, but that doesn't seem like a practical solution for "normal" use.
 
subego Jul 10, 2013 09:16 PM
@Spheric,

I don't fully trust open source crypto which I've compiled myself. I sure as shit don't have the chops to go through it and find some zero day exploit no one has caught yet. I'm really not going to trust a single developer or a corporate overlord.

@shif,

The type of crypto which brought this to mind is for text messages, though the point applies to other types of data.

I agree with your point about your crypto only being as strong as its weakest link, and if the person you're texting is used as an attack vector, there isn't much you can do about it. OTOH, if the attack vector is the message itself, merely because it exists as electronic data (as various three-letter-acronym government organizations seem to be doing), I think you can get pretty good protection from that, even if the individual vectors are shaky.
 
Waragainstsleep Jul 10, 2013 09:39 PM
 
subego Jul 10, 2013 10:44 PM
That's partially what prompted the thread.
 
OreoCookie Jul 10, 2013 10:56 PM
Why does Apple lose the encryption game? You're right that it's amazing we don't have a way to easily encrypt and securely send (checksum) e-mails. But that's hardly Apple's area of expertise or fault. And given the focus of Google, for instance, on analytics, can you imagine how motivated they are to let the user encrypt their mails so that they no longer have access to the clear text (which allows them to analyze the content)? Not very much, I thought so ;)

If anyone, I guess the big webmail providers Hotmail (now outlook.com, I think), Yahoo and Google are the ones to be able to effect change. But Apple? Other than that, I don't think Apple is far behind with cryptography: they encrypt volumes if you want in a modern way (no encrypted disk images), they use encryption in iMessage and other services of theirs. According to the talks on how FileVault 2 works, for instance, they use state-of-the-art encryption methods and algorithms.

Also, I think whether a solution is cross-platform or not has very little to do with its security: look at Skype, for instance. It's certainly cross-platform, but no more open than iMessage. Plus, it seems that since Microsoft stores more stuff server-side, the authorities have access to more data.
 
shifuimam Jul 11, 2013 01:37 AM
I'd also like to know what the possible legal implications of this new Hemlis jazz are, particularly in the US (since that's where I am, whatevs). I mentioned this to BF, who's in IT security as a career, and he recalled a particular regulation in the United States that prohibits encrypting communications using a method that is not known to the federal government.

That might not be true. I have no idea. If it is, it's more than a little problematic given the NSA's, you know...skeletons in the closet, so to speak.

Quote, Originally Posted by subego (Post 4238085)
I agree with your point about your crypto only being as strong as its weakest link, and if the person you're texting is used as an attack vector, there isn't much you can do about it. OTOH, if the attack vector is the message itself, merely because it exists as electronic data (as various three-letter-acronym government organizations seem to be doing), I think you can get pretty good protection from that, even if the individual vectors are shaky.
Well, the other thing to keep in mind, at least here in the states, is that the federal government has - and uses - tools much more powerful than your average Russian or Chinese hacker to bypass encryption. The NSA was instrumental in developing some of the encryption algorithms in use today, and they make it their job to brute-force anything that comes across their collective desk.

Of course, until recently, the assumption was that Joe Patriot's encrypted data was of no interest to the NSA, so their ability to decrypt it was a nonissue. Now that it's come to light that this isn't the case, I don't really know what the next step is. I guess if it's really private, talk to someone in person in an abandoned nuclear fallout shelter so nobody can hear you? :D

ETA:

So here's the other thing about encrypting email, text messaging, and IM conversations - a lot of people also like keeping all that stuff for historical purposes, reference, because they're data hoarders, etc. That becomes a lot harder to manage.

If you use an encrypted email service like Hushmail but also use Outlook or Apple Mail to use that service, you just broke the whole reason for using the service, because now your email is also being stored locally on your computer's hard drive. Unless you also encrypt your drive, your data is no longer as secure as you want it to be.

At what point does a concern for communications privacy start creating somewhat ridiculous measures to alleviate that concern? The NSA thing pisses me off as much as anyone else, but I haven't stopped using the Internet or SMS because of it...
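For what it's worth, the local-copy problem is solvable without full-disk encryption if the mail client encrypted its archive at rest with a passphrase-derived key. A rough stdlib-only Python sketch of the idea -- the XOR keystream is a throwaway stand-in for a real cipher like AES, and the passphrase and message are made up:

```python
import hashlib
import os

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # PBKDF2 makes guessing the passphrase expensive.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

def toy_stream_cipher(key: bytes, data: bytes) -> bytes:
    # XOR against a hash-derived keystream; illustration only.
    stream = b"".join(
        hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        for i in range(len(data) // 32 + 1)
    )
    return bytes(a ^ b for a, b in zip(data, stream))

salt = os.urandom(16)
key = derive_key(b"long local passphrase", salt)
on_disk = toy_stream_cipher(key, b"Subject: secret plans ...")  # what gets stored
# Applying the same keystream again recovers what the client shows you.
assert toy_stream_cipher(key, on_disk) == b"Subject: secret plans ..."
```

Of course that just moves the problem to protecting the passphrase, which is the usual trade-off.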
 
OreoCookie Jul 11, 2013 02:03 AM
But Hushmail doesn't really do anything to improve security: the weak point here is still the traffic -- if not on your end, it's on the end of the person you're communicating with. The sad news is that none of the players, Google et al. and governments alike, has any interest in uncrackable end-to-end encryption.
 
subego Jul 11, 2013 02:34 AM
@shif,

You're absolutely right on multiple counts. We're just approaching it from different angles.

I'm not trying to protect any actual secrets. Unless the NSA wants pictures of my dog, there's nothing I have of interest. I likewise haven't stopped sending them unencrypted.


OTOH, **** the NSA. Those are my ****ing dog pictures, and you can't touch them unless I say that's okay, or you have actual probable cause to think she's in on some puppy bank heist.

I'm not convinced there's the political will to stop the NSA, and frankly I wouldn't believe a bunch of politicians telling me "oh no, we stopped that, honest injun". However, there may be enough individuals willing to use a low-friction option that it will give the NSA et al. a really hard time.

Of course, you're also right the NSA has ridiculous resources. The estimate I've heard is $5MM worth of custom gear can crack a 1024-bit key in under three days. That means:

1) 4096-bit keys are probably safe for awhile.
2) Even though the NSA can chew through a 1024-bit key, I doubt they want to be forced to crack 100,000 of them, or 1MM.
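The second point is easy to put numbers on. Taking the quoted figure at face value ($5MM of hardware per 1024-bit key every three days -- an estimate, not an established fact), mass decryption gets expensive fast:

```python
# Back-of-the-envelope scaling of the $5MM / 3-days-per-key estimate above.
rig_cost = 5_000_000                           # dollars per cracking rig (estimate)
days_per_key = 3
keys_per_rig_per_year = 365 // days_per_key    # ~121 keys/year per rig

def hardware_cost(keys_per_year: int) -> int:
    # Rigs needed (ceiling division), times cost per rig.
    rigs = -(-keys_per_year // keys_per_rig_per_year)
    return rigs * rig_cost

for n in (1, 100_000, 1_000_000):
    print(n, "keys/year:", f"${hardware_cost(n):,}")
```

Under those assumptions, a million keys a year runs to tens of billions of dollars in hardware alone, which is the "force them to crack everything" argument in a nutshell.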
 
subego Jul 11, 2013 02:45 AM
Quote, Originally Posted by OreoCookie (Post 4238099)
Why does Apple lose the encryption game? You're right that it's amazing we don't have a way to easily encrypt and securely send (checksum) e-mails. But that's hardly Apple's area of expertise or fault. And given the focus of Google, for instance, on analytics, can you imagine how motivated they are to let the user encrypt their mails so that they no longer have access to the clear text (which allows them to analyze the content)? Not very much, I thought so ;)

If anyone, I guess the big webmail providers Hotmail (now outlook.com, I think), Yahoo and Google are the ones to be able to effect change. But Apple? Other than that, I don't think Apple is far behind with cryptography: they encrypt volumes if you want in a modern way (no encrypted disk images), they use encryption in iMessage and other services of theirs. According to the talks on how FileVault 2 works, for instance, they use state-of-the-art encryption methods and algorithms.

Also, I think whether a solution is cross-platform or not has very little to do with its security: look at Skype, for instance. It's certainly cross-platform, but no more open than iMessage. Plus, it seems that since Microsoft stores more stuff server-side, the authorities have access to more data.
My argument is even if Apple provided encryption, you couldn't trust it.

At the moment, the only halfway trustable crypto is stuff you take a direct hand in (i.e. compiling your own open source). That's hardly low friction, but I don't think you can ever completely eliminate some form of personal involvement if you want it to be solid.

The current Apple system makes personal involvement extremely difficult. Apple doesn't want you personally involved. They want everything to go through them.
 
OreoCookie Jul 11, 2013 09:26 AM
Quote, Originally Posted by subego (Post 4238123)
My argument is even if Apple provided encryption, you couldn't trust it.
Apple does provide encryption for a multitude of its products. All encryption algorithms are based on open encryption standards (e. g. AES)
Quote, Originally Posted by subego (Post 4238123)
At the moment, the only halfway trustable crypto is stuff you take a direct hand in (i.e. compiling your own open source). That's hardly low friction, but I don't think you can ever completely eliminate some form of personal involvement if you want it to be solid.
Modern crypto algorithms are usually based on open crypto standards. That's because it's damn hard to make a crypto algorithm which is secure (secure meaning that the only way in is a brute-force attack), so people don't just create their own. Of course, you have to trust someone in the end. But these days, the biggest problem with backdoors is that they are eventually discovered and exploited by people. I remember when a while back people found an NSAKEY in Windows (obviously, it was not meant as a back door for the NSA ;)).
Quote, Originally Posted by subego (Post 4238123)
The current Apple system makes personal involvement extremely difficult. Apple doesn't want you personally involved. They want everything to go through them.
What prevents you from compiling the source code of the OSS software of your choice on your OS X box? It's as difficult/easy as on FreeBSD and Linux systems -- especially if you use one of the package managers or brew.
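To put a number on "secure meaning that the only way is a brute-force attack": even with absurdly generous made-up assumptions (a million machines, each testing 10^12 keys per second), symmetric keyspaces are out of reach:

```python
# Expected brute-force time for a symmetric key; all attacker numbers are
# deliberately generous made-up assumptions, not real NSA capabilities.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_search(key_bits: int,
                    guesses_per_second: int = 10**12,
                    machines: int = 1_000_000) -> float:
    # On average you hit the key after searching half the keyspace.
    keyspace = 2 ** key_bits
    return (keyspace / 2) / (guesses_per_second * machines * SECONDS_PER_YEAR)

print(f"AES-128: {years_to_search(128):.3g} years")
print(f"AES-256: {years_to_search(256):.3g} years")
```

Even AES-128 comes out to trillions of years under those assumptions, which is why attacks go after implementations, endpoints and key handling rather than the cipher itself.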
 
P Jul 11, 2013 09:31 AM
Quote, Originally Posted by Spheric Harlot (Post 4238047)
Apple themselves have explicitly stated that they CANNOT decrypt iMessages.

Apple - Apple’s Commitment to Customer Privacy
Yes, but that doesn't include iCloud backups, if any. Apple can reset your iCloud password and restore your backup to another iPhone or iPad and read your messages that way. Anyone who can convince Apple support to reset your iCloud password can do it, actually. Can be fixed by not backing up to iCloud, obviously.

Quote, Originally Posted by shifuimam (Post 4238062)
Verification and code-signing don't mean shit on any side of anything.

Apple's verification process has proven itself to not focus much on the little guys. They use it as an excuse to block apps that compete with their own (e.g. Opera or Chrome for iOS), but there have been apps pushed to the App Store that are designed to scam (like the "I Am Rich" app, trollolol).

Ultimately, by putting all your trust in Apple's "verification" process, you are still trusting someone else with your data and personal information. Apple is not above making mistakes.
That's not the point about the code-signing - Google doesn't check the apps they publish in the same way as Apple (they have an automated malware scanner that apparently catches less than 50% of malware), but the codesigning is still important to verify who made it. If you trust company A to deliver your messages securely, you can at least still trust that version 2.0 is as secure as version 1.0, if they are signed with the same key. More troubling is what happens if the NSA comes knocking with a court order for Apple telling them to mess up that secure application from company A so it can be tapped - not sure how you'd detect that in the App Store (and probably you couldn't).
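The version-linking property of code signing can be sketched with textbook RSA. The numbers below are toy values (p=61, q=53) purely for illustration, and the app names are made up; real signing uses 2048-bit-plus keys with padded signature schemes:

```python
import hashlib

# Textbook-RSA toy key: n = 61 * 53, public exponent e, private exponent d.
n, e, d = 3233, 17, 2753

def sign(blob: bytes) -> int:
    # Only the developer holds d.
    h = int.from_bytes(hashlib.sha256(blob).digest(), "big") % n
    return pow(h, d, n)

def verify(blob: bytes, sig: int) -> bool:
    # Anyone can check with the public pair (n, e).
    h = int.from_bytes(hashlib.sha256(blob).digest(), "big") % n
    return pow(sig, e, n) == h

v1 = b"SecureMessenger 1.0 binary"
v2 = b"SecureMessenger 2.0 binary"

# Both versions verify under the same public key, so the same party shipped
# both; flipping bits of a blob changes its hash, so a forged or tampered
# build fails the check (with overwhelming probability in real schemes).
assert verify(v1, sign(v1)) and verify(v2, sign(v2))
```

Which is also why the court-order scenario is nasty: a coerced developer signs the backdoored v2.1 with the very key you trusted.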

Quote, Originally Posted by shifuimam (Post 4238062)
A cross-platform solution that doesn't require one to blindly trust an American corporation is a significantly better idea than locking yourself into something that is Apple-exclusive and will very likely remain so.
Granted. Until I have a better idea, I spread my data over multiple services and don't link them. Should at least frustrate anyone wanting to track my habits.

Quote, Originally Posted by shifuimam (Post 4238117)
I'd also like to know what the possible legal implications of this new Hemlis jazz are, particularly in the US (since that's where I am, whatevs). I mentioned this to BF, who's in IT security as a career, and he recalled a particular regulation in the United States that prohibits encrypting communications using a method that is not known to the federal government.
I've never heard of that. It would be hard to prove, because if the method is not known, it only looks like noise. There was a rule about encrypted phones having a bypass feature to let the NSA listen in, but that was long ago.

Quote, Originally Posted by shifuimam (Post 4238117)
Well, the other thing to keep in mind, at least here in the states, is that the federal government has - and uses - tools much more powerful than your average Russian or Chinese hacker to bypass encryption. The NSA was instrumental in developing some of the encryption algorithms in use today, and they make it their job to brute-force anything that comes across their collective desk.
They have been caught red-handed trying to sneak in a vulnerability once. Wonder how many times they weren't caught. Still, many vulnerabilities are found before the standard is set; it would probably be hard to get something bad into the standard.

Quote, Originally Posted by shifuimam (Post 4238117)
So here's the other thing about encrypting email, text messaging, and IM conversations - a lot of people also like keeping all that stuff for historical purposes, reference, because they're data hoarders, etc. That becomes a lot harder to manage.

If you use an encrypted email service like Hushmail but also use Outlook or Apple Mail to use that service, you just broke the whole reason for using the service, because now your email is also being stored locally on your computer's hard drive. Unless you also encrypt your drive, your data is no longer as secure as you want it to be.
Obviously, but both Mac OS X and Windows have had encryption built-in for several versions now - or you can use something like GPG.

Also, there is a difference in degree. If someone wants to wiretap you by asking Google or whoever to give them access to your mail account, you will likely never notice, and you can be spied on for years. If someone sends a SWAT team to raid your home and grab your hard drive, you will notice - and not even the NSA can argue that they can do that without a court order.
 
P Jul 11, 2013 09:39 AM
Quote, Originally Posted by subego (Post 4238123)
My argument is even if Apple provided encryption, you couldn't trust it.

At the moment, the only halfway trustable crypto is stuff you take a direct hand in (i.e. compiling your own open source). That's hardly low friction, but I don't think you can ever completely eliminate some form of personal involvement if you want it to be solid.

The current Apple system makes personal involvement extremely difficult. Apple doesn't want you personally involved. They want everything to go through them.
There are two ways to put your own code on an iOS device without Apple knowing or approving it:

1) An Apple Developer ID. Download source, compile in Xcode, load onto your own iOS device.

2) A webapp. If you can emulate old Nintendo consoles in Javascript, you can certainly encrypt text messages.

More troublesome is the fact that you basically have to trust your OS vendor in any case. If Apple or MS or Google wanted to, or were forced to do so by court order, they could install any number of backdoors that send all your encryption keys to them just in case they ever want to read anything you write.
 
P Jul 11, 2013 10:13 AM
 
OreoCookie Jul 11, 2013 10:30 AM
Quote, Originally Posted by P (Post 4238156)
More troublesome is the fact that you basically have to trust your OS vendor in any case. If Apple or MS or Google wanted to, or were forced to do so by court order, they could install any number of backdoors that send all your encryption keys to them just in case they ever want to read anything you write.
I don't think the easiest way into a system is a backdoor, but something like a rootkit which exploits a bug (e. g. a buffer overflow). Once you have control over the machine, all cryptographically sound encryption algorithms will be worth zilch.

I think if people are scared of backdoors in cryptography algorithms, they're looking at the least obvious target. As you mention, P, getting to the data via the iCloud backups is a much easier way which also circumvents any encryption scheme.
 
shifuimam Jul 11, 2013 10:43 AM
Incidentally, with regards to the security of iMessage, I found something of interest:

What the NSA doesn’t have: iMessages and FaceTime chats | Ars Technica

Quote
"While Apple boasts of 'end-to-end encryption' it's pretty clear that Apple itself holds the key—because if you boot up a brand new iOS device, you automatically get access to your old messages," wrote Techdirt's Mike Masnick. "That means that Apple is storing these messages in the cloud, and it can decrypt them if it needs to."
While Apple isn't decrypting your iMessage content for their own use (advertising, etc.), it doesn't mean they are incapable of decrypting it at all. If you're backing up your phone to Apple's cloud service, your private key is indeed being stored on their servers - otherwise they'd have no way of decrypting your messages on your shiny new iPhone.

Just something to think about.
 
P Jul 11, 2013 11:17 AM
I saw that, but I wonder... If you restore your phone from a backup, then yes, you get all your stored messages, but if you add a new device, do you see all your old messages? I didn't pay too much attention, but as I remember it, I didn't get past messages this way.

Even if it is as they describe in the article above, it can be done if the key is generated from known information about you, including your password. That then makes it much easier to guess than brute forcing AES or whatever they're using, but still not trivial.
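That trade-off is easy to demonstrate: if the key is derived from knowable information like a password, an attacker searches the password list, not the 2^256 key space. A stdlib Python sketch with a made-up password and wordlist:

```python
import hashlib

def derive_key(password: str, salt: bytes) -> bytes:
    # PBKDF2 slows each guess down, but can't add entropy the password lacks.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = b"per-user-salt"                    # salts are typically stored alongside
target = derive_key("letmein2013", salt)   # the key the attacker wants to recover

# The attack iterates candidate passwords, not the 2**256 key space.
wordlist = ["password", "hunter2", "letmein2013", "qwerty"]
cracked = next((p for p in wordlist if derive_key(p, salt) == target), None)
print(cracked)   # letmein2013
```

So the derived key is exactly as strong as the password behind it, never mind what cipher it feeds.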
 
P Jul 11, 2013 11:32 AM
Quote, Originally Posted by OreoCookie (Post 4238168)
I don't think the easiest way into a system is a backdoor, but something like a rootkit which exploits a bug (e. g. a buffer overflow). Once you have control over the machine, all cryptographically sound encryption algorithms will be worth zilch.
Sure - I think that was how the Chinese allegedly hacked Google - but what I'm saying is that if Apple got a court order that all future Macs must ship with a certain NSASecretBackdoor.kext that is provided to them, there is very little you or I or Apple can do about it. If you want to take reasonable precautions but not cut yourself off from society and live in a Faraday cage eating canned food bought with cash, you're going to have to pick an OS that you can live with and trust at least to some extent. If you don't trust Apple or MS (or what they can be coerced to do), then go compile your own Linux kernel or FreeBSD or something. If you've made the decision to use OS X, you're going to have to trust that Apple does their best in other things as well.

Looking at the options currently in the market, I think I'm better off with Apple than with the competition. Even if I trusted Google more than Apple (and I don't), I wouldn't get an Android phone until they get the patching process working. My distrust of MS is mostly based on their history, but for all that they're not the same company, Ballmer is still CEO and not looking to leave just yet (they just reorganized, btw, in what looks like an emulation of the structure Apple set up after Forstall was kicked out).
 
reader50 Jul 11, 2013 02:45 PM
Quote, Originally Posted by P (Post 4238156)
There are two ways to put your own code on an iOS device without Apple knowing or approving it:

1) An Apple Developer ID. Download source, compile in Xcode, load onto your own iOS device.

2) A webapp. If you can emulate old Nintendo consoles in Javascript, you can certainly encrypt text messages.
3. Jailbreak your iOS device and load whatever program you wish.
 
subego Jul 11, 2013 04:17 PM
I know it's possible to sideload, I'm saying it's made purposely difficult because Apple don't want to play that. Needing to sign up as a developer is extra-high friction.
 
subego Jul 11, 2013 04:52 PM
Quote, Originally Posted by OreoCookie (Post 4238153)
Apple does provide encryption for a multitude of its products. All encryption algorithms are based on open encryption standards (e. g. AES)

Modern crypto algorithms are usually based on open crypto standards. That's because it's damn hard to make a crypto algorithm which is secure (secure meaning that the only way in is a brute-force attack), so people don't just create their own. Of course, you have to trust someone in the end. But these days, the biggest problem with backdoors is that they are eventually discovered and exploited by people. I remember when a while back people found an NSAKEY in Windows (obviously, it was not meant as a back door for the NSA ;)).

What prevents you from compiling the source code of the OSS software of your choice on your OS X box? It's as difficult/easy as on FreeBSD and Linux systems -- especially if you use one of the package managers or brew.
"Based" doesn't help me. The code needs to be 100% open.

I'm not saying you're prevented. I'm saying it's difficult. Apple intentionally makes it difficult. It's part of their ecosystem model.
 
shifuimam Jul 11, 2013 04:57 PM
Quote, Originally Posted by reader50 (Post 4238221)
3. Jailbreak your iOS device and load whatever program you wish.
And don't update, because that will break your JB and potentially prevent you from JBing for awhile, or at least until whoever does it these days finds a new exploit.

And if you're under warranty, don't take it to the Apple Store if you have a hardware problem, since they're technically required to install the latest iOS on any device that is jailbroken.
 
P Jul 11, 2013 05:08 PM
Quote, Originally Posted by subego (Post 4238231)
I know it's possible to sideload, I'm saying it's made purposely difficult because Apple don't want to play that. Needing to sign up as a developer is extra-high friction.
Installing a webapp isn't.
 
subego Jul 11, 2013 05:18 PM
Can't trust webapp crypto.
 
OreoCookie Jul 11, 2013 09:22 PM
Quote, Originally Posted by subego (Post 4238236)
"Based" doesn't help me. The code needs to be 100% open.
Referring to your title: how does that single out Apple? It equally applies to Microsoft, Google and any other software vendor out there who does not open source its products but uses some kind of encryption somewhere.
Quote, Originally Posted by subego (Post 4238236)
I'm not saying you're prevented. I'm saying it's difficult. Apple intentionally makes it difficult. It's part of their ecosystem model.
Again, I don't get it: you can compile and upload anything you want on your Mac and iOS device. How is that more difficult than on a Linux box? And to just load your personal project on an iOS device, I don't think you need a paid dev account.
 
P Jul 12, 2013 04:13 AM
Quote, Originally Posted by subego (Post 4238241)
Can't trust webapp crypto.
Why? Is the crypto less reliable because it has been crosscompiled into Javascript?
 
Waragainstsleep Jul 12, 2013 04:37 AM
Quote, Originally Posted by OreoCookie (Post 4238277)
And to just load your personal project on an iOS device, I don't think you need a paid dev account.
You do, but it isn't going to break the bank at $99.
 
OreoCookie Jul 12, 2013 07:44 AM
Quote, Originally Posted by Waragainstsleep (Post 4238309)
You do, but it isn't going to break the bank at $99.
I didn't know, I stand corrected.
 
subego Jul 12, 2013 07:04 PM
Quote, Originally Posted by OreoCookie (Post 4238277)
Referring to your title: how does that single out Apple? It equally applies to Microsoft, Google and any other software vendor out there who does not open source its products but uses some kind of encryption somewhere.
As I said. What singles out Apple is the walled garden ecosystem.

Google does not have a walled garden.


The issue with people using encryption is friction. Working outside the Apple walls adds friction. Apple doesn't want you to do it, and it's a model which almost no iPhone users are familiar with on their iPhone.

Google is fine with you working outside their walls. They encourage it.
 
SSharon Jul 16, 2013 01:52 AM
Quote, Originally Posted by subego (Post 4238236)
"Based" doesn't help me. The code needs to be 100% open.
So you want something like Cryptocat? Oh wait . . . Bad kitty! “Rookie mistake” in Cryptocat chat app makes cracking a snap | Ars Technica
 
subego Jul 16, 2013 02:22 AM
That's open source working. The only reason the exploit was discovered is because someone had the full and complete code to comb through.

If that mistake was in a proprietary fork, you'd need to trust the developers had a big enough team combing through their code to catch it.
 
SSharon Jul 16, 2013 11:57 AM
Quote, Originally Posted by subego (Post 4238810)
That's open source working. The only reason the exploit was discovered is because someone had the full and complete code to comb through.

If that mistake was in a proprietary fork, you'd need to trust the developers had a big enough team combing through their code to catch it.
The problem is that real people were really using the service. How long does one wait for all the bugs to be found?

Whether open or closed source we will always continue to find bugs to exploit. That being said, I also prefer open source software for security/encryption but I do so knowing that tomorrow everything I've encrypted could be cracked wide open.
 
subego Jul 16, 2013 03:00 PM
The only software I know of which is bug-free is TeX, and achieving that feat required an impractical development model.

With everything else, there are always going to be bugs. Finding them all is an impossibility. The question is which model is the most resilient.

It's a Catch-22. If you're a company with a big enough team to compete with open-source bug tracking, you have moles. You have moles in open source too, but it's more difficult for them to operate because they have to act as if anyone can see their malicious code at any time because, well, anyone can.

As I've said before though, at this point it isn't just bugs. The NSA is throwing so much money at this I think you need to start thinking of all your encryption having a "lifetime" before it gets cracked. They're probably on the cusp of being able to crack a single 2048-bit key if they devote a majority of their resources to it for several months. In five years they'll be using a fraction of those resources, and cracking it in a few days.

As I said upthread, the current estimate is $5MM worth of hardware to crack a 1024-bit key in under three days.
 