
Where do you see Apple in 5 years? (Page 3)
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 30, 2013, 01:36 PM
 
Originally Posted by Spheric Harlot View Post
This distinction is arbitrary and one of degree, IMO.

Interface is not merely a pretty skin on an app. Something like the smart instruments in GarageBand is NOT a matter of just slapping a new interface on the same core functionality, because that particular functionality does not even exist on the Mac.

However, using CoreAudio APIs and the basic Logic audio engine, albeit limited to 8 tracks, for recording and playback: Does that count as "sharing code"?

So: No, Apple is most definitely not taking the same core app and simply slapping another interface on it. That would result in a badly compromised mess like the Windows RT Office touch version.

As for separate development teams: That, too, is a pretty arbitrary distinction.
It is well-known that Apple has in the past had small teams that work on whatever is prioritized, switching from, say, the Finder to iWork, and then over to iPhoto.

I do not know whether Garageband for iOS is made by the same people who make Garageband for the Mac, but I do know that GarageBand and Logic are made by the same people. Does it make a difference to you?

Thank you for finally addressing this.

It does make a difference. I predict that there will be a time when the version of Garageband you install on a desktop or laptop has the same checksum as the version you would install on a mobile device (i.e. there is only one app). I have no clue why you feel that it would be an impossibility for the mobile interface bits (skin and the functionality managed by this element) to see the light of day on non-mobile hardware, or whether you still stand by the notion that at some point this will be a reality. It seems like you've changed your mind, or else misunderstood me much earlier while insisting that you did understand.

At any rate, I see no logical reason whatsoever to suggest that what I'm describing here isn't exactly what Apple is working towards.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 30, 2013, 01:41 PM
 
It's also kind of interesting that my simply repeating exactly what I said before somehow led to your understanding my point.
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Jun 30, 2013, 01:41 PM
 
That has to do with your jumping into a discussion sounding like you were disagreeing, when your argument has absolutely nothing to do with what was being discussed at the time.

Have you used GarageBand on iOS?

How do you suggest smart instruments or guitar picking, or the drums, work on a regular Mac?

These are not trivial elements of the app. They are basic to its functionality.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 30, 2013, 01:46 PM
 
Originally Posted by Spheric Harlot View Post
That has to do with your jumping into a discussion sounding like you were disagreeing, when your argument has absolutely nothing to do with what was being discussed at the time.

Have you used GarageBand on iOS?

How do you suggest smart instruments or guitar picking, or the drums, work on a regular Mac?

These are not trivial elements of the app. They are basic to its functionality.

They don't have to, that code can lie dormant while the app is being used on a Mac.

I've been a part of this thread for several pages now, so don't give me this nonsense about how I'm interrupting a conversation and that that's why you're playing dumb.

Let's face it, what is really going on here is that your religious fervor cannot bear the thought of what I'm describing, for some reason, because you feel that development efforts need to result in two separate products with two separate checksums (with some code sharing between projects) in perpetuity. Why, I have no clue.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 30, 2013, 01:52 PM
 
I will add that I'm open to the possibility of the occasional app having separate mobile and desktop versions where and when that makes sense, but 95%+ of the time the goal and the default will be a single app that drives multiple interfaces/input methods and ships with the same checksum no matter where it is installed.
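
For what it's worth, the "same checksum" criterion just means the shipped binaries are byte-for-byte identical. A minimal Python sketch of the idea (the build strings are made up for illustration):

```python
import hashlib

def checksum(data: bytes) -> str:
    """SHA-256 digest of an app's bytes (stand-in for an app bundle)."""
    return hashlib.sha256(data).hexdigest()

# Byte-identical builds produce identical checksums...
desktop_build = b"universal app binary v1.0"
mobile_build = b"universal app binary v1.0"
print(checksum(desktop_build) == checksum(mobile_build))  # True

# ...while separately built (even slightly different) binaries do not.
separate_build = b"universal app binary v1.1"
print(checksum(desktop_build) == checksum(separate_build))  # False
```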
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Jun 30, 2013, 04:03 PM
 
Originally Posted by Spheric Harlot View Post
Shif and Skeleton, I've now understood: they simply don't understand why interface rules cannot be broken without destroying usability. And I'm fine with that, now that it's clear that there's no point in discussing it.
Stop being a jerk. We're saying there's a way to address the same problem that DOESN'T break interface rules or destroy usability. You're acting as if (A) the Windows (and/or worst) way to answer the problem is the only way to do it and (B) that we support that worst way.

It's as if we were debating before the 2-button (Apple) mouse days and I was saying "there's a way to provide a 2-button mouse without discarding the simplicity and intuitiveness of the 1-button mouse" and you're calling me a retard who needs to be taken out back and shot just because the 2-button mice of the day are geek-toys. Then you neatly avoid thinking you were wrong when the Apple mouse is released, by declaring that it's completely different (oh yeah and don't forget to shout that part, and tell/shout to everyone who disagrees that they don't have a clue what they're talking about) because Apple's solution (to the same problem) is not the same as others' solution (to the same problem).

We're not focused on the details of the solution, we're focused on the details of the problem. The problem in this case is you have to switch machines just because you switched between sitting and standing. We think that problem is solvable, and you're claiming it's impossible to solve. We're not claiming it is solvED.

Can you possibly address that without being a total asshole about it, please?
     
Addicted to MacNN
Join Date: Aug 2006
Location: Somewhere in the Pacific Northwest
Status: Offline
Jun 30, 2013, 04:10 PM
 
Originally Posted by Uncle Skeleton View Post
Stop being a jerk. We're saying there's a way to address the same problem that DOESN'T break interface rules or destroy usability. You're acting as if (A) the Windows (and/or worst) way to answer the problem is the only way to do it and (B) that we support that worst way.

It's as if we were debating before the 2-button (Apple) mouse days and I was saying "there's a way to provide a 2-button mouse without discarding the simplicity and intuitiveness of the 1-button mouse" and you're calling me a retard who needs to be taken out back and shot just because the 2-button mice of the day are geek-toys. Then you neatly avoid thinking you were wrong when the Apple mouse is released, by declaring that it's completely different (oh yeah and don't forget to shout that part, and tell/shout to everyone who disagrees that they don't have a clue what they're talking about) because Apple's solution (to the same problem) is not the same as others' solution (to the same problem).

We're not focused on the details of the solution, we're focused on the details of the problem. The problem in this case is you have to switch machines just because you switched between sitting and standing. We think that problem is solvable, and you're claiming it's impossible to solve. We're not claiming it is solvED.

Can you possibly address that without being a total asshole about it, please?
You can't take him personally. People as a rule tend to resort to insults against the disagreeing party when they've run out of logical arguments for why that party is wrong for disagreeing with them.

It's like trying to explain why absolute truth doesn't exist to someone who is fervently fundamentalist Christian. There's nothing you can do to change their mind or make them see any reasonable middle ground, and in the end all they'll do is run out of arguments and go back to calling you names (or, in the religious world, using passive-aggressive subtle jabs that reference how much more God likes them than you, lol).

Spheric isn't the only die-hard can-do-no-wrong Apple fanatic on 'NN, but he's been the most vocal as of late. Unless Apple implements a particular feature or option, he will continue to see it as "broken", "bad technology", "stupid", or "fundamentally flawed".

However, like copy-and-paste, a command-line interface, notification bar toggles, and right-click, once Apple goes that route, it will suddenly be "magical and revolutionary".

No point in arguing with a zealot. Save your breath for people who are interested in reasonable discussion rather than tiresome evangelizing.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 30, 2013, 04:16 PM
 
Shif and Skeleton: do you see the one OS/application + multiple interface/input methods thing in the future as I do?
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Jun 30, 2013, 04:18 PM
 
Originally Posted by besson3c View Post
They don't have to, that code can lie dormant while the app is being used on a Mac.
But who on earth cares?

Why is this in any way relevant? It makes nothing easier for anyone involved, and it does nothing but waste space on (currently) expensive Flash storage.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 30, 2013, 04:32 PM
 
Originally Posted by Spheric Harlot View Post
But who on earth cares?

Why is this in any way relevant? It makes nothing easier for anyone involved, and it does nothing but waste space on (currently) expensive Flash storage.
1) It allows for one development team rather than expending resources on two separate applications
2) It means a single product to support, release updates for, and patch
3) It allows customers to buy a single license and have what they purchase work on all of their devices
4) It can eventually allow consolidation of the app stores
5) It spares customers from having to update/test two separate applications (for those who may not want apps auto-updating)

I'm sure there are other points here too... If this is not obvious to you, maybe changing your tone to come across as a little more open-minded would be smart?
     
Addicted to MacNN
Join Date: Aug 2006
Location: Somewhere in the Pacific Northwest
Status: Offline
Jun 30, 2013, 06:59 PM
 
Originally Posted by besson3c View Post
I have no clue why you feel that it would be an impossibility for the mobile interface bits (skin and the functionality managed by this element) to see the light of day on non-mobile hardware...
What's particularly entertaining about this point is that OS X is already doing this, starting with Launchpad in Lion.

Instead of a dock with a billion icons that get tinier and tinier as you use more software, it's a home screen with evenly spaced app icons on a grid.

Like, you know....the home screen (springboard) in iOS.

................

Yet the Mac doesn't have a touchscreen (yet - I still say it's inevitable, because it's going to be an expected feature as more and more OEMs make it standard). So that particular environment seems to adapt just fine between finger and mouse pointer.

Originally Posted by besson3c View Post
I will add that I'm open to the possibility of the occasional app having separate mobile and desktop versions where and when that makes sense, but 95%+ of the time the goal and the default will be a single app that drives multiple interfaces/input methods and ships with the same checksum no matter where it is installed.
And, to clarify for the sake of the zealots - nobody has once indicated that it will be one or the other with no middle ground. It's not particularly surprising that any longtime Mac fanatic would have a hard time understanding this. Apple has, since at least 1984, designed both its hardware and software with the "mommy knows best" mentality - they give you one option for how to do something (two if you're really lucky), and it's the only way you do it unless you want to use third-party software or, even worse, system-level hacks. When you grow up with technology that only gives you one way of doing things, it inevitably instills in the user a very black-and-white sense of how technology works and how people use it.

A multi-button mouse was blasphemous and overly complex and impossible for users to comprehend until Apple introduced it, and since the original Magic Mouse didn't default to a two-button configuration, the way Apple did it was the only way it could be done properly for all users, objectively and universally.

When you're raised to believe that there's only one way of doing any given task, it's difficult to shake that. Some people never do.

Originally Posted by besson3c View Post
Shif and Skeleton: do you see the one OS/application + multiple interface/input methods thing in the future as I do?
Of course.

It's not either/or. It's AND.

Java gets a bad rap because, face it - it's an insecure platform, and even before security was a problem (e.g. before many users had always-on Internet), most of the applications written in Java looked like shit, because they used a completely separate GUI from the operating system's native interface. However, undoubtedly one of the biggest reasons why it's still used so much is that it's platform-agnostic. This is a boon for developers who need to make an application for god-only-knows how many different clients. It's why web-based applications were written in Java for so long; there wasn't any other option until Flash and, later, HTML5 caught up.

The basic theory behind Java is very sound. Hell, the whole point behind MVC development is to separate the code from the UI, so that there's a single application base supporting multiple user interfaces. If you think that major websites use completely separate code for their mobile sites, you're nuts. That would make a TON of unnecessary, redundant code, which is a catastrophe when the site has to be updated or overhauled or analyzed for security vulnerabilities.
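
The MVC point can be sketched in a few lines: one shared model, two interchangeable views. This is a toy illustration, not any real framework's API:

```python
class SongModel:
    """Shared core: application state and logic, no UI assumptions."""
    def __init__(self):
        self.tracks = []

    def add_track(self, name):
        self.tracks.append(name)


class DesktopView:
    """Pointer/keyboard presentation of the same model."""
    def render(self, model):
        return "Tracks: " + ", ".join(model.tracks)


class TouchView:
    """Touch presentation of the same model: one large target per track."""
    def render(self, model):
        return "\n".join("[ " + t + " ]" for t in model.tracks)


model = SongModel()
model.add_track("Drums")
model.add_track("Guitar")
print(DesktopView().render(model))  # Tracks: Drums, Guitar
print(TouchView().render(model))    # same data, one bracketed line per track
```

The UI work is still real - each view has to be designed for its input method - but the model code exists exactly once.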

We're already doing this with web-based applications and have been for years now. It was inevitable that client-side application development was going to go the same route, because it makes software development and - more importantly - management of existing code much easier.

I realize SH (and at this point, I'm not seeing anyone who agrees with him, so let's call him an island for now) will disagree with all of this, but it doesn't change the fact that it's already happening in the commercial software development industry and has been for a while.

This conversation has been going in circles for more than a week now. Nobody's going to win...
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 30, 2013, 07:13 PM
 
Unlike Java, though, Apple wouldn't need to build a new runtime environment layer à la the JRE and have their apps run in it; there would simply be binary compatibility between apps intended for what is iOS and OS X today.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 30, 2013, 07:17 PM
 
OreoCookie: sorry if I was a little short-fused earlier. It's a little annoying when people don't read posts, but I realize I was more annoyed at SH doing that to me, so I didn't mean to take that out on you.

What I propose would be different than what Microsoft is doing with Windows 8. To reiterate, it seems like Microsoft is trying to bring the mobile world to the existing Windows world by amalgamating interfaces, and in the process making too many assumptions about what people want in a desktop OS as far as its mobile influence. What I'm suggesting is not to amalgamate the interfaces at all any differently than what Apple is doing today. Keep them separate - consolidate where possible like Apple has been doing, but don't force them together. However, eventually ship one OS with one set of applications for all devices... One OS/app + support for multiple interfaces/multiple input methods.
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Jun 30, 2013, 10:51 PM
 
Originally Posted by besson3c View Post
OreoCookie: sorry if I was a little short-fused earlier. It's a little annoying when people don't read posts, but I realize I was more annoyed at SH doing that to me, so I didn't mean to take that out on you.
Thank you for the apology, I appreciate it.
Originally Posted by besson3c View Post
What I propose would be different than what Microsoft is doing with Windows 8. [...] What I'm suggesting is not to amalgamate the interfaces at all any differently than what Apple is doing today. Keep them separate - consolidate where possible like Apple has been doing, but don't force them together. However, eventually ship one OS with one set of applications for all devices... One OS/app + support for multiple interfaces/multiple input methods.
Much of what you suggest already exists today (shared APIs, etc.); the only thing that is missing (if I understand »one OS/app« properly) is that you should be able to use a single project for, say, OmniFocus or Reeder on both iOS and OS X, one that perhaps even compiles into a single »fat binary«.

I guess what you have in mind at the end is the promise of Windows 8 (before we knew what Windows 8 was): you take an iPad, work a little on your spreadsheet or Keynote presentation. Then you »dock« your iPad and the last-used app switches to »OS X mode«, and you continue your work uninterrupted. Is that it?
I don't suffer from insanity, I enjoy every minute of it.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 30, 2013, 11:06 PM
 
Originally Posted by OreoCookie View Post
Much of what you suggest already exists today (shared APIs, etc.); the only thing that is missing (if I understand »one OS/app« properly) is that you should be able to use a single project for, say, OmniFocus or Reeder on both iOS and OS X, one that perhaps even compiles into a single »fat binary«.

I guess what you have in mind at the end is the promise of Windows 8 (before we knew what Windows 8 was): you take an iPad, work a little on your spreadsheet or Keynote presentation. Then you »dock« your iPad and the last-used app switches to »OS X mode«, and you continue your work uninterrupted. Is that it?

Maybe, I think that's kind of a cool premise (assuming you mean that you'd continue your work uninterrupted on a laptop/desktop with an interface appropriate for it). I'm just coming at this from a development effort perspective. The day that we can take a checksum of an app on our laptops and a checksum of the same app version on our mobile devices and have those checksums match is the day this mobile thing has really reached full maturation - the day when there are no barriers whatsoever to jumping between any family of device under the sun (without a specific mobile/desktop version of the app having to be written).

Like you, I don't think we are that far off, but I think this is going to amount to far more than simply being able to share some code. I think what we need is the ability to create an Xcode project where we build the mobile and desktop applications simultaneously and compile them into a single app. Maybe for some projects the developer won't care about a mobile interface, but if that part of the project is left untouched, it allows the author to go back later and build that interface and its various dependent bits and supporting code without having to create a new project from scratch and copy and paste pieces over.
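
A toy version of the "one app, interface chosen per device, unused code lying dormant" idea - the device names and interface functions here are hypothetical, just to show the shape of it:

```python
def touch_interface():
    # On a Mac this code ships in the binary but lies dormant.
    return "touch UI with smart instruments"

def pointer_interface():
    # On an iPad this branch is the dormant one.
    return "mouse/keyboard UI"

def launch(device):
    """One binary; the interface layer is selected at launch time."""
    interfaces = {"ipad": touch_interface, "mac": pointer_interface}
    return interfaces[device]()

print(launch("mac"))   # mouse/keyboard UI
print(launch("ipad"))  # touch UI with smart instruments
```

The cost of the dormant branch is storage, not behavior - which is exactly the trade being argued about in this thread.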

I think one of the big challenges here is organizing the development of something like this, and refining tools to do it. Right now, when you go to create a new project in Xcode, the iOS and OS X templates are split up.

Maybe this is sort of what Microsoft ultimately envisioned, but to be clear, I do not want what Windows 8 is now.
     
Addicted to MacNN
Join Date: Aug 2006
Location: Somewhere in the Pacific Northwest
Status: Offline
Jun 30, 2013, 11:15 PM
 
Originally Posted by OreoCookie View Post
...you take an iPad, work a little on your spreadsheet or Keynote presentation. Then you »dock« your iPad and the last-used app switches to »OS X mode«, and you continue your work uninterrupted. Is that it?
This is where technology is headed, I think - I'm excited to see what happens.

Have you looked at what Canonical is doing in this regard? It's pretty interesting. I can't find the video right now, but their ultimate goal is to basically have a phone that converts into a tablet OR a full-fledged desktop computer, via docking solutions.

I think it'd be pretty awesome if the iPhone or iPad could do that - dock it to have a computer interface, undock it to take it with you.

Originally Posted by besson3c View Post
Maybe this is sort of what Microsoft ultimately envisioned, but to be clear, I do not want what Windows 8 is now.
I think that ultimately that is what Microsoft is going for. While Windows 8 isn't really for everyone (I'm not planning on upgrading my main desktop anytime soon, although I might try it on my laptop), there's one benefit to its existence that's easy to forget - Microsoft is, at least for the time being, big enough to make mistakes in their products and not lose swaths of customers over it. They can venture a bit more into the unknown, try some stuff that doesn't necessarily work the way they hoped it did, and everyone else can hugely benefit from the lessons learned there.
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Jun 30, 2013, 11:39 PM
 
Originally Posted by besson3c View Post
Maybe, I think that's kind of a cool premise (assuming you mean that you'd continue your work uninterrupted on a laptop/desktop with an interface appropriate for it). I'm just coming at this from a development effort perspective.
Originally Posted by shifuimam View Post
I think it'd be pretty awesome if the iPhone or iPad could do that - dock it to have a computer interface, undock it to take it with you.
Two things:
(1) From a user perspective, I don't think the »docking solution« is the best solution. The problem is being solved much better by the cloud: lay down your tablet (without having to dock it with your computer), open the app on the desktop, and continue working where you left off. All of this is possible today without having to fudge OSes. And what is more important, I think the cloud sync solution is actually better: you don't even need to bring your tablet along, or you can leave it in your backpack.

Instead of focussing on merging OSes, companies should integrate the clouds better into their OSes (which is what they're doing); that's the correct strategy.
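
The cloud handoff described in (1) is essentially device-agnostic document state keyed per document. A toy sketch, with a plain dict standing in for the sync service (all names made up):

```python
# Toy stand-in for a sync service: device-agnostic document state.
cloud = {}

def save(doc_id, state):
    """Any device pushes the latest state of a document."""
    cloud[doc_id] = dict(state)

def resume(doc_id):
    """Any other device pulls it and picks up where you left off."""
    return dict(cloud.get(doc_id, {}))

# Work on the iPad, put it down...
save("keynote-deck", {"slide": 7})

# ...open the same document on the Mac and continue uninterrupted.
print(resume("keynote-deck"))  # {'slide': 7}
```

Note that nothing here requires the two devices to run the same OS or even the same app binary - only to agree on the document format.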

(2) Also from a developer's point of view, I don't think the effort will decrease substantially from where it is now. Look no further than quality iPhone and iPad apps such as OmniFocus. Both use the same APIs and the same OS, right? The hardware is also mostly identical. And yet it took a substantial effort to make a good iPad version of OmniFocus. It's not a problem of code, but a problem of app concept: you have to completely rethink some apps in order to make them shine on a device. Unlike on desktop OSes, you need to focus a lot on how an app works on such a small screen. The contrast between iOS apps and OS X apps is even more stark. This fundamental difference between the development targets will not disappear even if you could completely re-use the code.
Originally Posted by shifuimam View Post
This is where technology is headed, I think - I'm excited to see what happens.

Have you looked at what Canonical is doing in this regard? It's pretty interesting. I can't find the video right now, but their ultimate goal is to basically have a phone that converts into a tablet OR a full-fledged desktop computer, via docking solutions.
Yes, I'm aware of their efforts, and I think that they'll have zero impact on the market. Just to be clear, if it weren't for the *nix-ness of OS X, I'd probably use FreeBSD or some flavor of Linux now - but despite all this time, Linux has had zero impact on the mass market (outside of server rooms, of course). All innovations that happen there will not become mainstream.*

As I said, I don't think creating the analog of »responsive (web) design« for the GUI is the way to go; I am convinced that it simply can't work if your goal is to create good apps. Otherwise it's like programming an iPhone app and just scaling it up for use on the iPad. It technically works, but does not yield a good user experience.


* Just to re-iterate: I like Linux, I think it's crucial as a backbone to the internet, as a purveyor of open standards, etc. But despite the fact that it's free and for most people more than good enough (there are office and e-mail apps, browsers, etc.), it hasn't made a dent in Windows' market share.
I don't suffer from insanity, I enjoy every minute of it.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jul 1, 2013, 12:11 AM
 
Originally Posted by shifuimam View Post
This is where technology is headed, I think - I'm excited to see what happens.

Have you looked at what Canonical is doing in this regard? It's pretty interesting. I can't find the video right now, but their ultimate goal is to basically have a phone that converts into a tablet OR a full-fledged desktop computer, via docking solutions.

I think it'd be pretty awesome if the iPhone or iPad could do that - dock it to have a computer interface, undock it to take it with you.
It would, although it seems like what Canonical is doing is *way* too ambitious given the resources they have. I don't have high hopes that they can deliver a good product.

I think what Apple is doing is smart - a slow, methodical sort of transition with some experimentation and patience along the way. Canonical might make some splashes and draw attention with ideas that ultimately get copied, but it's hard to see them becoming a relevant player.

Originally Posted by shifuimam View Post
I think that ultimately that is what Microsoft is going for. While Windows 8 isn't really for everyone (I'm not planning on upgrading my main desktop anytime soon, although I might try it on my laptop), there's one benefit to its existence that's easy to forget - Microsoft is, at least for the time being, big enough to make mistakes in their products and not lose swaths of customers over it. They can venture a bit more into the unknown, try some stuff that doesn't necessarily work the way they hoped it did, and everyone else can hugely benefit from the lessons learned there.
As long as Windows 7 is available and supported, and there's an easy way to install and keep using it, sure - but this pattern of one OS that people like followed by one they don't doesn't seem sustainable or in any way a good idea. They really should fix the disconnect they have with their customers, so that what they release is in tune with what customers actually want and/or would be willing to use.

They also face a very tough road picking up traction in the mobile space, since they seem pretty clueless about how to go about all of this.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jul 1, 2013, 12:26 AM
 
Originally Posted by OreoCookie View Post
Two things:
(1) From a user perspective, I don't think the »docking solution« is the best solution. The problem is being solved much better by the cloud: lay down your tablet (without having to dock it with your computer), open the app on the desktop, and continue working where you left off. All of this is possible today without having to fudge OSes. And what is more important, I think the cloud sync solution is actually better: you don't even need to bring your tablet along, or you can leave it in your backpack.

Instead of focussing on merging OSes, companies should integrate the clouds better into their OSes (which is what they're doing); that's the correct strategy.
I agree with this as a means of data transport, but the docking premise would also ensure that the apps you use and your environments/configs are similar. I suppose this too can be managed in the cloud, but when it comes to commercial apps like Photoshop, I'm not sure whether storing them in, or making them downloadable from, the cloud will be an easy sell. What do you think? It's of course technically possible, but it will require a lot of cooperation and buy-in between companies. When it comes to media files there is also the question of storage, ownership, etc.

Originally Posted by OreoCookie View Post
(2) Also from a developer's point of view, I don't think the effort will decrease substantially from where it is now. Look no further than quality iPhone and iPad apps such as OmniFocus. Both use the same APIs and the same OS, right? The hardware is also mostly identical. And yet it took a substantial effort to make a good iPad version of OmniFocus. It's not a problem of code, but a problem of app concept: you have to completely rethink some apps in order to make them shine on a device. Unlike on desktop OSes, you need to focus a lot on how an app works on such a small screen. The contrast between iOS apps and OS X apps is even more stark. This fundamental difference between the development targets will not disappear even if you could completely re-use the code.
Well, for one, we are all still learning how to develop for mobile devices. Secondly, it depends on the nature of the app, but like I said to SH: there may be hours upon hours of work on the UI to make it work optimally on mobile devices - to the point that a very substantial amount of development time is devoted to it - but at the end of all of this, the sheer number of lines of code coming out is probably going to be relatively small. That means that going forward, the project is going to be relatively easy to maintain. I realize that fewer lines of code doesn't necessarily guarantee simplicity, but it does make the code easier to isolate and identify within a larger project.

Originally Posted by OreoCookie View Post
As I said, I don't think creating the analog of »responsive (web) design« for the GUI is the way to go; I am convinced that it simply can't work if your goal is to create good apps. Otherwise it's like programming an iPhone app and just scaling it up for use on the iPad. It technically works, but does not yield a good user experience.
It would require more than simple miniaturization - that is definitely not what good responsive web design entails, though miniaturization often seems to be its focal point. The fact that there are few really freaking awesome responsive web designs doesn't mean the problem is with the premise of a single app with multiple interfaces, though; it could just mean that bad decisions were made about the UI and/or not enough time was spent polishing it.

* Just to re-iterate: I like Linux, I think it's crucial as a backbone to the internet, as a purveyor of open standards, etc. But despite the fact that it's free and for most people more than good enough (there are office and e-mail apps, browsers, etc.), it hasn't made a dent in Windows' market share.
I think the biggest thing that has come out of this whole world is some nice open standards. When we can't see results in the form of shipping products we tend to think the influence and relevance are minimal, but look at the entire culture behind the web and the open standards that drive it. I don't think this sort of thing exists without people working to propel open standards. Linux has been a great playground for developing these sorts of open standards, even if there aren't great apps that use them. It's kind of more of a social thing, if you know what I mean?
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jul 1, 2013, 12:28 AM
 
Shifuimam: why is it, do you think, that whenever Apple releases something there is often some grumbling and complaining (e.g. Apple Maps), but not the same sort of vehement boycotts that we see with some people avoiding products like Win8?
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jul 1, 2013, 01:32 AM
 
Originally Posted by besson3c View Post
I agree with this as a means of data transport, but the docking premise would also ensure that the apps that you use and your environments/configs are similar. I suppose this too can be managed in the cloud, but when it comes to commercial apps like a Photoshop or something I'm not sure about whether storing those or making those downloadable in some way from the cloud will be an easy sell.
I am not sure whether you chose Adobe on purpose, since they have just released their Creative Cloud strategy. I don't think it'll be an easy sell initially, but Adobe is using the same crowbar strategy as Microsoft: if you want a professional image manipulation tool, there is no substitute for Photoshop (at the professional level).

You're right that we need to solve quite a few problems here as well, but ultimately I think a solution via software + cloud is easier than a hardware docking + OS approach. Not only is it more versatile, it's also more interoperable.
Originally Posted by besson3c View Post
What do you think? It's of course technically possible, but it will require a lot of cooperation and buy-in between companies. When it comes to media files there is also the question of storage, ownership, etc.
Actually, I think you bring up a good point: interoperability is really the Achilles heel here, and one that is very difficult to address: most product cycles have become so short that it's just very difficult for standardization bodies to go through the motions fast enough (e. g. have a look at the various 802.11 pre-n devices which were released before the standard was finalized). But eventually, we'll have to tackle the problem. Google is no better than anyone else in this regard: they propose a video standard nobody uses (also because it has not been standardized properly so that others can rely on it), they have switched away from open standards for calendaring, for instance, etc. (I currently solve this problem in part using Dropbox.)
Originally Posted by besson3c View Post
Well, for one, we are all still learning how to develop for mobile devices. Secondly, it depends on the nature of the app, but like I said to SH, there may be hours upon hours of work on the UI and on making it work optimally on mobile devices, to the point that a very substantial amount of development time is devoted to it, but at the end of all of this the sheer number of lines of code coming out of it is probably going to be relatively small.
But then you wouldn't really save anything in terms of effort: whether you have one, two or three Xcode projects that share a lot of code is irrelevant if a lot of the additional development time is spent on the UI.

There is a more important difference, though: responsive design shows you the same content in a different layout, but the good iPad apps show you more/different content compared to the iPhone version. So you also have to implement this additional functionality.

I feel like what you propose has long been done with many cross-platform apps à la Photoshop: you have a common core that implements all the under-the-hood stuff and then put an OS-dependent UI on top. What's worse, you can very often easily tell when an app is not a »completely native« app. Adobe is forced to make certain compromises, e. g. it cannot adopt too many OS X-specific features in the under-the-hood code, because it'd have to fork the code. That'd probably be better for OS X Adobe customers, but perhaps too expensive to be financially viable.

Moreover, there is some under-the-hood stuff that works very differently in iOS and OS X, and will continue to work very differently for the next 5 years at least. For instance, on OS X you often want an app to do CPU-intensive stuff in the background (e. g. encoding a video), but that's something you definitely don't want on iOS. Ditto for network access. So there is a part of the code that you need to keep platform-specific.
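A minimal sketch of what keeping that bit platform-specific might look like: the shared core asks a per-platform policy whether heavy background work is allowed, instead of baking one platform's behavior into the engine. The platform names and policy table below are entirely made up for illustration.

```python
# Hypothetical policy table: which platforms tolerate CPU-intensive
# background work. On mobile, battery and thermal limits say no.
BACKGROUND_POLICY = {
    "desktop": True,   # e.g. encode a video while the user does other things
    "mobile":  False,  # defer heavy work instead of draining the battery
}


def schedule_encode(platform, job):
    """Shared-core entry point; only the policy lookup is platform-aware."""
    if BACKGROUND_POLICY.get(platform, False):
        return f"encoding {job} in background"
    return f"queued {job} until the app is foregrounded"


print(schedule_encode("desktop", "video.mov"))
print(schedule_encode("mobile", "video.mov"))
```

The same call site works everywhere; only the one-line policy differs per platform, which is the part you'd never want to unify.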
Originally Posted by besson3c View Post
I think the biggest thing that has come out of this whole world is some nice open standards.
In the realm of networking, you're absolutely right. But on the end-user side, not so much.
Originally Posted by besson3c View Post
When we can't see results in the form of shipping products we tend to think the influence and relevance are minimal, but look at the entire culture behind the web and the open standards that drive it. I don't think this sort of thing exists without people working to propel open standards. Linux has been a great playground for developing these sorts of open standards, even if there aren't great apps that use them. It's kind of more of a social thing, if you know what I mean?
Many of the open standards (e. g. H.264) that have become important have nothing to do with Linux, but I agree with your point that Linux has made open source culture popular, along with the idea that companies can benefit from progress in such OSS.

I think that eventually, when progress slows down, we should think of standardization. I don't think it'll happen (at least not in the way that is useful to the end user), but after some parts of digital life have settled down, we should make it mandatory for certain services to be accessible via certain fixed APIs. Right now, none of the players really want that.
I don't suffer from insanity, I enjoy every minute of it.
     
Addicted to MacNN
Join Date: Aug 2006
Location: Somewhere in the Pacific Northwest
Status: Offline
Reply With Quote
Jul 1, 2013, 02:58 PM
 
Originally Posted by OreoCookie View Post
(1) From a user perspective, I don't think the »docking solution« is the best solution. The problem is being solved much better by the cloud: lay down your tablet (without having to dock it with your computer), open the app in the desktop and continue working where you left off. All of this is possible today without having to fudge OSes. And what is more important, I think the cloud sync solution is actually better: you don't even need to bring your tablet along or you can leave it in your backpack.
That's all well and good when you're talking about simple tasks like working with documents or browsing the Internet (although I don't know if iOS can do that - in Android, Chrome and Firefox sync what you're doing between desktop and mobile device).

However, one device that handles it all is appealing to me because it means I don't have to install software multiple times, I don't have to set up my preferences on multiple devices, and my LOCAL stuff is as accessible as my CLOUD stuff.

Not everyone wants to put all their personal files on the Internet. I sure don't. I have two file servers on my network, but your average home user isn't going to be remotely that tech-savvy. Right now, if you don't want to trust your private information with a third-party (Apple or anyone else), your only option is to keep your files with you on a thumb drive to work between machines - and that's a non-option with iOS devices, which don't support USB host or even MicroSD.

Originally Posted by besson3c View Post
Shifuimam: why is it, do you think, that whenever Apple releases something there is often some grumbling and complaining (e.g. Apple Maps), but not the same sort of vehement boycotts that we see with some people avoiding products like Win8?
You already know the answer, bro. It's the same reason why Mormons and JWs change their fundamental beliefs every time a prophecy doesn't happen on the prophesied date. Instead of admitting they were idiots, they just say "oh, we have new revelation from god".

The only vocal Mac users are the zealots. People who buy Macs because their favorite celebrities use them generally aren't posting on the Internet about how much they love Apple and how depressed and tragically sad they were when Jobs finally kicked. The zealots are the ones who will be more than happy to swallow whatever pill Apple shoves down their throats, and if it gets stuck halfway down, they just say "oh, I'll get used to it - this is still a better way of doing things" or "this must just be a problem with the user, not with Apple".

Non-Mac users are a bit more sane with technology. Since it's not a religion to the death, there's no motivation to force oneself to continue using hardware or software that is too rough around the edges for real everyday use. Instead, we just move on to whatever works for us, because we have that option.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jul 1, 2013, 03:55 PM
 
Originally Posted by shifuimam View Post
That's all well and good when you're talking about simple tasks like working with documents or browsing the Internet (although I don't know if iOS can do that - in Android, Chrome and Firefox sync what you're doing between desktop and mobile device).

However, one device that handles it all is appealing to me because it means I don't have to install software multiple times, I don't have to set up my preferences on multiple devices, and my LOCAL stuff is as accessible as my CLOUD stuff.

Not everyone wants to put all their personal files on the Internet. I sure don't. I have two file servers on my network, but your average home user isn't going to be remotely that tech-savvy. Right now, if you don't want to trust your private information with a third-party (Apple or anyone else), your only option is to keep your files with you on a thumb drive to work between machines - and that's a non-option with iOS devices, which don't support USB host or even MicroSD.
I think this might be one of those situations where the best technical solution (the cloud) does not win out because of political/non-technical issues. The cloud could be used to do all of what you've described, but in a perfect world for the user we would have a little more control over our own cloud storage (which would of course include encryption, posing challenges for data mining), such that most people would feel comfortable-ish with it the way most seem comfortable-ish with a service like Dropbox. The catch is that nothing is free, so somewhere in this picture some entity will mess things up by trying to cash out at the expense of the user, such that things never materialize, or they suck.

The docking thing is at least a little more control-your-destiny.

You already know the answer, bro. It's the same reason why Mormons and JWs change their fundamental beliefs every time a prophecy doesn't happen on the prophesied date. Instead of admitting they were idiots, they just say "oh, we have new revelation from god".

The only vocal Mac users are the zealots. People who buy Macs because their favorite celebrities use them generally aren't posting on the Internet about how much they love Apple and how depressed and tragically sad they were when Jobs finally kicked. The zealots are the ones who will be more than happy to swallow whatever pill Apple shoves down their throats, and if it gets stuck halfway down, they just say "oh, I'll get used to it - this is still a better way of doing things" or "this must just be a problem with the user, not with Apple".

Non-Mac users are a bit more sane with technology. Since it's not a religion to the death, there's no motivation to force oneself to continue using hardware or software that is too rough around the edges for real everyday use. Instead, we just move on to whatever works for us, because we have that option.

I don't think I agree with this. People are religious and irrational about arbitrary things. There might be *more* of this on the Mac side, but there are plenty of rabid Microsoft fanboys out there too. Still, I haven't heard from too many people (including journalists) really in defend-MS-until-the-end mode lately.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jul 1, 2013, 03:58 PM
 
It would be neat if some network disk standard à la Dropbox really emerged, such that in the future we are free to choose which cloud storage solution we use and all OSes support this standard for syncing. I know this has sort of been attempted with WebDAV, but I'm not sure if that is what Dropbox uses?
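The core of any such sync standard is deciding whether two copies of a file differ. A toy illustration of the usual approach, comparing content digests rather than timestamps; real services like Dropbox use more elaborate block-level schemes, so this only shows the idea:

```python
# Decide whether a sync is needed by comparing content digests.
# The file contents below are made-up sample data.
import hashlib


def digest(data: bytes) -> str:
    """Content fingerprint: identical bytes always yield identical digests."""
    return hashlib.sha256(data).hexdigest()


local = b"quarterly report, v2"
remote = b"quarterly report, v1"

needs_upload = digest(local) != digest(remote)
print(needs_upload)  # -> True: the contents differ, so the client syncs
```

Because the digest depends only on content, any client on any OS that implements the same hash agrees on whether a transfer is needed, which is what would make such a standard OS-neutral.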
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jul 1, 2013, 10:10 PM
 
Originally Posted by shifuimam View Post
That's all well and good when you're talking about simple tasks like working with documents or browsing the Internet (although I don't know if iOS can do that - in Android, Chrome and Firefox sync what you're doing between desktop and mobile device).
Sure, iOS can do that; it goes by the name iCloud Tabs. You can also use Chrome for iOS and sync tabs between different instances of Chrome (although I haven't tried that, to be honest).
Originally Posted by shifuimam View Post
However, one device that handles it all is appealing to me because it means I don't have to install software multiple times, I don't have to set up my preferences on multiple devices, and my LOCAL stuff is as accessible as my CLOUD stuff.
You can sync prefs and such if you want to. And honestly, I don't want my iPhone or potential iPad to be a clone of my Mac.
Originally Posted by shifuimam View Post
Not everyone wants to put all their personal files on the Internet. I sure don't. I have two file servers on my network, but your average home user isn't going to be remotely that tech-savvy.
Then you could use a solution like Transporter to sync local data, the actual data need not be stored in the cloud. This is where the future is going and I, for one, like that. This also puts less pressure on the internal memory: thanks to iTunes Match, I don't have to have my whole music library on my iPhone, all I need is an internet connection and I can listen to any song in my library.

You're right that certain data have different privacy requirements, but I think since there is a demand, solutions start to pop up. Once fast connections (also upstream) become more ubiquitous, I expect that peer-to-peer solutions such as Transporter will also become more popular. Other companies such as OmniGroup have released a data sync tool which relies entirely on open standards, meaning that instead of using their free service, you could set up your own server.
Originally Posted by shifuimam View Post
and that's a non-option with iOS devices, which don't support USB host or even MicroSD.
Who wants that? I carry a USB stick on my keychain, and yet I prefer to use Dropbox to share files even with someone sitting next to me (unless the file is obscenely large). MicroSD slots for expansion are pointless in my opinion; most people are much better served by something like Dropbox.
Originally Posted by shifuimam View Post
Non-Mac users are a bit more sane with technology. Since it's not a religion to the death, there's no motivation to force oneself to continue using hardware or software that is too rough around the edges for real everyday use. Instead, we just move on to whatever works for us, because we have that option.
I think that's a little naive and insulting.
Apple products became mainstream a long, long time ago. I think it's idiotic when some people (not you, shif) call Apple users iLemmings. I don't think non-Mac users are more sane with technology, nor do I think they're less sane. What you see on message boards is a vocal minority for whom computers are a hobby and a joy to use. You don't see my mom here, who now writes me e-mails on her iPad. She doesn't care. But I don't think Windows, Linux or Android zealots are any better. Some of them still think of Apple as the niche computer company on the brink of implosion. Apple computers have become less boutique and more mainstream due to Apple's success.

Personally, I like Apple because they have foreseen many trends in technology correctly and because their products have had a positive impact on my life. The digital hub, smartphones with touch, tablet computers and the various stores: the fact that they're ultimately copied is a validation (I think copying good ideas is good for users), even after they're initially mocked (e. g. »the iPad is just a giant iPod touch«). There are aspects where other companies are much further ahead, e. g. some of Google's services (search, maps and books), and I think it's good that other companies try different approaches. E. g. Microsoft is trying to unify all OSes -- and is failing on all fronts by most people's accounts.
I don't suffer from insanity, I enjoy every minute of it.
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jul 1, 2013, 10:14 PM
 
Originally Posted by besson3c View Post
It would be neat if some network disk standard à la Dropbox really emerged, such that in the future we are free to choose which cloud storage solution we use and all OSes support this standard for syncing. I know this has sort of been attempted with WebDAV, but I'm not sure if that is what Dropbox uses?
There are some attempts like OmniPresence or Transporter, but in the end, being able to just sync files is nowhere near enough. We're still waiting for the paradigm that makes traditional file handling obsolete, because that's what you need if you want to use tablets full time for things like writing letters and doing your taxes.

There are also plenty of complications: Dropbox cannot use end-to-end encryption, because otherwise you would not be able to share data. Some alternatives to Dropbox that encrypt data before transmission do not have the sharing features I happen to rely on. I think peer-to-peer storage could be a good solution, but we need more bandwidth (especially upstream).
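To illustrate why sharing and end-to-end encryption pull against each other: the usual workaround is to encrypt the file once with a random per-file key, then wrap that key separately for each recipient, which means the service must coordinate key material it ideally shouldn't see. In this toy sketch XOR stands in for a real cipher (never use it for actual encryption), and all the names are made up:

```python
# Per-file key, wrapped per recipient. XOR is a stand-in for a real
# cipher: xor(xor(x, k), k) == x, which is all the demo needs.
import os


def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


file_key = os.urandom(16)                       # random per-file key
ciphertext = xor(b"tax return 2013", file_key)  # file encrypted once

# Each recipient gets the file key wrapped under their own key.
recipients = {"alice": os.urandom(16), "bob": os.urandom(16)}
wrapped = {name: xor(file_key, k) for name, k in recipients.items()}

# Bob unwraps the file key with his own key, then decrypts the file.
bobs_file_key = xor(wrapped["bob"], recipients["bob"])
print(xor(ciphertext, bobs_file_key).decode())  # -> tax return 2013
```

The catch is exactly the one described above: either the recipients exchange keys out of band (awkward), or the provider handles the wrapping and the encryption is no longer truly end-to-end.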
I don't suffer from insanity, I enjoy every minute of it.
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Reply With Quote
Jul 3, 2013, 12:37 PM
 
Originally Posted by shifuimam View Post
You can't take him personally. ... No point in arguing with a zealot. Save your breath for people who are interested in reasonable discussion rather than tiresome evangelizing.
But the world is full of them, and I don't want to get out of practice wrangling them.

Originally Posted by besson3c View Post
Shif and Skeleton: do you see the one OS/application + multiple interface/input methods thing in the future as I do?
Yes of course I do, but maybe not for the same reasons you do...

Originally Posted by besson3c View Post
1) It allows for one development team rather than having to expend resources developing two separate applications
2) It allows for one product to have to support, release updates to, patch, etc.
3) It allows customers the opportunity to buy a single license and have what they purchase work on all of their devices
4) It eventually can allow consolidation of app stores
5) It allows customers to not have to worry about updating/testing two separate applications (for those who may not want the apps auto-updating)

I'm sure there are other points here too... If this is not obvious to you, maybe changing your tone a little to come across as a little more open-minded would be smart?
I don't think it's a sound judgement to make such a big change that affects the user so much (as merging iOS and OS X would) based on decisions about how it would benefit the developer (like it would be cheaper to develop or you wouldn't have to deploy as much tech support). These decisions should be entirely focused on how it benefits the consumer (all the more so because often these go hand in hand). In that light, I really only agree with your #4 because I could frame it as, it's easier for the user to find a needle in a haystack than to find 3 needles in 3 haystacks.

My list would have more ways that the convergence benefits the user. Just like with the invention of the smartphone: it was the merging per se of the cell phone and the PDA that made for a better synergy. Not even the interaction of functions, but simply the fact that the user only needs 1 battery-powered cyber-leash instead of 2, is what made carrying a PDA at all a reasonable proposition for many, many more people than before the smartphone existed.

Basically, IMO the advantage of combining gadgets, in the abstract, falls under two categories: physical simplifying and mental simplifying. For the physical, it's easier to own 1 computer instead of 2. Charging batteries, doing maintenance, doing upgrades, and keeping it accessible (there is limited physical space in your workspace and on your person) are all easier the fewer physical items you own (IOW, the fewer items you need in order to meet your computing needs). And for the mental side, it's easier to learn/understand the system and its abilities/limitations, remember your organization and workflow, keep track of your physical devices (and not lose them or have them stolen), and seek tech support when you have fewer devices, too. Anything that falls outside those two umbrellas, I either don't agree with or haven't thought of.

On the other hand, I dislike those changes that have already been made which indicate a gradual convergence of iOS and OS X. So... I don't know where that leaves me. I think my big-picture answer is that convergence should only add, not replace, and so far they haven't done a good job of allowing the user to choose which behaviors to change and which to keep as before, which is what they should be doing. If we adopt the premise that they will heavy-handedly try to change behavior as an integral part of the convergence (as opposed to merely facilitating changes that the user might actively seek), then I would agree with Spheric. I just don't think it's accurate to attach that premise to the idea of convergence; the two are independent (just look at Photoshop: they force you to learn a new interface every year, even without convergence of anything).
     
The Mighty
Join Date: Feb 2004
Location: Excellent, the sports issue is within arm's reach, I'll be here all day.
Status: Offline
Reply With Quote
Jul 3, 2013, 08:44 PM
 
Originally Posted by shifuimam View Post
I realize SH (and at this point, I'm not seeing anyone who agrees with him, so let's call him an island for now) will disagree with all of this
So do you think he'll be spending his time on the 4th here dwelling on all the new posts?
This one time, at Boot Camp, I stuck a flute up my PC.
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Jul 3, 2013, 11:08 PM
 
Originally Posted by And.reg View Post
So do you think he'll be spending his time on the 4th here dwelling on all the new posts?
The 4th is a workday where I am.

I decided I'd said all that I could, and wanted to watch it pan out. I was subsequently amazed to see Oreo agree with basically everything I'd written, while managing to put it in an almost non-confrontational way, by simply working at a different level, even in spite of shif's outrageously insulting tone.

Of course, the thread's died since he took everything apart, so...

FWIW, I think that there's a distinction between Mac zealots and being able to recognize bad interface.
I was pretty passionate about why I believe APPLE ****ed up with full-screen mode, too: it's all over these forums. And they've had to break one of the fundamental tenets of their interface — the single menu bar concept — after thirty years, just to fix it.
It's way easier to just write me off as a religious Mac nut if you've run out of arguments, though. Just pretend the 2000s never happened; it's not like the people still resorting to that tactic are old enough to remember the 90s, anyway.
     
Addicted to MacNN
Join Date: Aug 2006
Location: Somewhere in the Pacific Northwest
Status: Offline
Reply With Quote
Jul 8, 2013, 09:21 AM
 
Originally Posted by Spheric Harlot View Post
...even in spite of shif's outrageously insulting tone.
Pot, meet kettle.
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Jul 8, 2013, 09:50 AM
 
What did I call you?

You've made up your mind to be offended by WHATEVER I write (even when I'm genuinely trying to be helpful — like in the freelancer thread, where you had to pre-emptively strike against some bizarre imagined threat of people attacking your country or something).

I figured Oreo would be back, and he's a lot more subtle at arguing the same things as I.

And nobody ever bothered replying to his deftly made points.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jul 8, 2013, 08:55 PM
 
Sorry for my late response here guys... Life distractions!

Originally Posted by Uncle Skeleton View Post
I don't think it's a sound judgement to make such a big change that affects the user so much (as merging iOS and OS X would) based on decisions about how it would benefit the developer (like it would be cheaper to develop or you wouldn't have to deploy as much tech support). These decisions should be entirely focused on how it benefits the consumer (all the more so because often these go hand in hand). In that light, I really only agree with your #4 because I could frame it as, it's easier for the user to find a needle in a haystack than to find 3 needles in 3 haystacks.
I definitely agree that the deciding factor is not the developers, but I would argue that what I'm describing here will benefit end users, whether they realize it or not. For starters, lower cost barriers mean that more resources can be allocated elsewhere.

What items in my list did you disagree with, and why?

My list would have more ways that the convergence benefits the user. Just like with the invention of the smartphone: it was the merging per se of the cell phone and the PDA that made for a better synergy. Not even the interaction of functions, but simply the fact that the user only needs 1 battery-powered cyber-leash instead of 2, is what made carrying a PDA at all a reasonable proposition for many, many more people than before the smartphone existed.

Basically, IMO the advantage of combining gadgets, in the abstract, falls under two categories: physical simplifying and mental simplifying. For the physical, it's easier to own 1 computer instead of 2. Charging batteries, doing maintenance, doing upgrades, and keeping it accessible (there is limited physical space in your workspace and on your person) are all easier the fewer physical items you own (IOW, the fewer items you need in order to meet your computing needs). And for the mental side, it's easier to learn/understand the system and its abilities/limitations, remember your organization and workflow, keep track of your physical devices (and not lose them or have them stolen), and seek tech support when you have fewer devices, too. Anything that falls outside those two umbrellas, I either don't agree with or haven't thought of.
There is a whole other set of arguments for and against convergence, I think. I honestly haven't thought too much about convergence in this context, but generally speaking I like convergence when it makes stuff more convenient without making your product Microsoft Outlook or Office, and when any compromises and sacrifices can be properly justified.

On the other hand, I dislike those changes that have already been made which indicate a gradual convergence of iOS and OS X. So... I don't know where that leaves me. I think my big-picture answer is that convergence should only add, not replace, and so far they haven't done a good job of allowing the user to choose which behaviors to change and which to keep as before, which is what they should be doing. If we adopt the premise that they will heavy-handedly try to change behavior as an integral part of the convergence (as opposed to merely facilitating changes that the user might actively seek), then I would agree with Spheric. I just don't think it's accurate to attach that premise to the idea of convergence; the two are independent (just look at Photoshop: they force you to learn a new interface every year, even without convergence of anything).
How do you envision the operating systems best converging?
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jul 8, 2013, 08:58 PM
 
Originally Posted by OreoCookie View Post
There are some attempts like OmniPresence or Transporter, but in the end, being able to just sync files is nowhere near enough. We're still waiting for the paradigm that makes traditional file handling obsolete, because that's what you need if you want to use tablets full time for things like writing letters and doing your taxes.

There are also plenty of complications: Dropbox cannot use end-to-end encryption, because otherwise you would not be able to share data. Some alternatives to Dropbox that encrypt data before transmission do not have the sharing features I happen to rely on. I think peer-to-peer storage could be a good solution, but we need more bandwidth (especially upstream).

It kind of sounds like what would be cool to make syncable is data structures rather than files and file formats. The data structures can of course be wrapped in some sort of file format, but something like JSON or XML cloud syncing would even make *storage* of this data in the cloud a possibility. This of course won't provide any utility for binary formats, but... just thinking out loud here.

Maybe shareable encryption keys would solve the problem you are describing in your second paragraph? Again, more thinking out loud here...
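A toy sketch of what syncing data structures instead of files could look like: each field carries a timestamp, and a merge keeps the newer value per field (last-writer-wins). The field names, timestamps, and `(value, ts)` layout below are all made up for illustration; real sync engines use more careful conflict resolution.

```python
# Per-field last-writer-wins merge of two {field: (value, ts)} dicts.
# Timestamps are simple integers here; ties keep the first argument.
import json


def merge(a, b):
    """Return a new dict keeping the newer (value, ts) per field."""
    out = dict(a)
    for field, (value, ts) in b.items():
        if field not in out or ts > out[field][1]:
            out[field] = (value, ts)
    return out


phone = {"title": ("Groceries", 5), "done": (False, 5)}
laptop = {"title": ("Groceries", 5), "done": (True, 9)}  # edited later

merged = merge(phone, laptop)
print(json.dumps({k: v[0] for k, v in merged.items()}))
```

Because the merged result is plain structured data, it can be serialized to JSON and stored or relayed by any cloud service, which is exactly the appeal of syncing structures rather than opaque files.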
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jul 8, 2013, 08:59 PM
 
Originally Posted by Spheric Harlot View Post
What did I call you?

You've made up your mind to be offended by WHATEVER I write (even when I'm genuinely trying to be helpful — like in the freelancer thread, where you had to pre-emptively strike against some bizarre imagined threat of people attacking your country or something).

I figured Oreo would be back, and he's a lot more subtle at arguing the same things as I.

And nobody ever bothered replying to his deftly made points.

Your participation in these threads often becomes aggressive. You don't have to call people names for them to feel attacked.

It might take some time to shake the Mac-defender attack dog reputation you've sort of developed here. What Oreo has that you often lack is the ability to talk about this stuff calmly, without losing his shit or getting emotional.
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Reply With Quote
Jul 9, 2013, 10:15 AM
 
Originally Posted by besson3c View Post
I definitely agree that the deciding factor is not the developers, but I would make the argument that what I'm addressing here will benefit the end users, whether they realize it or not.
Like a trickle-down benefit? I reject that. If a thing is better for both customer and developer, then it being better for the customer should be priority one, lead with that. If it happens to be better for the developer and worse for the customer, then you're basically saying that the customer should sacrifice so the developer can gain. F that. So basically all that matters is whether it's better for the customer, and whether it's better for developers doesn't really matter (if it's terribly bad for the developer and good for the customer, it wouldn't fit our starting criteria to begin with). Making an indirect argument that what's better for developers generally might be better for customers too, is just obfuscating this fact for no purpose. IMO.

For starters, lower cost barriers mean that more resources can be allocated.
Apple is arguably the most successful company in any industry right now. Whatever the limitations on their users are, dev resources isn't one of them. Or to put it another way, if given an example of a developer failing or shortcoming at Apple, they would not be able to get away with the excuse of "we just don't have the resources for that." They have plenty of resources; there are certainly other valid reasons why their products have certain limitations, and finding a way to reduce their cost barriers will not affect those limitations at all.

What items in my list did you disagree with, and why?
1) It allows for one development team rather than having to expend resources developing two separate applications
2) It allows for one product to have to support, release updates to, patch, etc.
3) It allows customers the opportunity to buy a single license and have what they purchase work on all of their devices
I disagree with these for the reason described above. Apple is flush with resources, and if that was the only thing stopping them from meeting a customer need, they could do it regardless of merging two OS teams into one. If they wanted the user to be able to get two products for one price, there is nothing stopping them, and merging OS's wouldn't help it.
4) It eventually can allow consolidation of app stores
As I said earlier, I suspect your implication is that this would make it easier on the app store (rather than the customer). I don't disagree with this point, because it would make things easier on the customer. But I do disagree with the idea that this is to make things easier on the app store and its support team. I'm not even sure that it would (is managing a walmart easier than managing two teams that each manage a store half as big as a walmart?), and I don't think it's a worthwhile argument to make, because it's kind of like saying that the customer should cater to developer convenience, which is backwards. If it's good for the customer, then do it. If it's good for the developer, then either it's good for the customer too (in which case see item 1), or it's bad for the customer in which case you're saying that the customer should shoulder a burden in order to make things easier on the developer, which is a bass-ackwards philosophy, and certainly counter to what Apple has said and done for its entire existence. I don't agree with all their choices, but at least I can say that they were trying to improve the UX, rather than making the user behave more like a programmer.
5) It allows customers to not have to worry about updating/testing two separate applications (for those who may not want the apps auto-updating)
This one I disagree with not because it doesn't focus on the customer, but because I don't think it's true. The customer would still have to worry about testing the usability of the software in both contexts (and if s/he doesn't use it in both contexts then s/he wouldn't have had to test both either way), and would also have to worry about which updates matter to them given their usage context. The user does feature testing, not bug testing, and that will still matter to the various form-factors even with just one binary.



There is a whole other set of arguments for and against convergence, I think. I honestly haven't thought too much about convergence in this context, but generally speaking I like convergence when it makes stuff more convenient without making your product Microsoft Outlook or Office, and when any compromises and sacrifices can be properly justified.

How do you envision the operating systems best converging?
Yeah it's a good question, that my brain has been quietly mulling over the past week while I'm not watching. I think it can be answered pretty simply (though not always simple to implement): it's good if it preserves (the option of) both the old methods, and it's bad if it takes away the old to make way for the new. As a developer, you should never have to take anything away from what the user already has, you should be able to only ever add, never take away. I know it doesn't always work out that way, partly because of Apple's philosophy that keeping things simple is a virtue, and sometimes taking things away serves that virtue. I say there is always a way to get both, simplicity AND not taking anything away. I'm not saying it's easy, but it's possible.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jul 9, 2013, 12:16 PM
 
Originally Posted by Uncle Skeleton View Post
Like a trickle-down benefit? I reject that. If a thing is better for both customer and developer, then it being better for the customer should be priority one, lead with that. If it happens to be better for the developer and worse for the customer, then you're basically saying that the customer should sacrifice so the developer can gain. F that. So basically all that matters is whether it's better for the customer, and whether it's better for developers doesn't really matter (if it's terribly bad for the developer and good for the customer, it wouldn't fit our starting criteria to begin with). Making an indirect argument that what's better for developers generally might be better for customers too, is just obfuscating this fact for no purpose. IMO.


Apple is arguably the most successful company in any industry right now. Whatever the limitations on their users are, dev resources isn't one of them. Or to put it another way, if given an example of a developer failing or shortcoming at Apple, they would not be able to get away with the excuse of "we just don't have the resources for that." They have plenty of resources; there are certainly other valid reasons why their products have certain limitations, and finding a way to reduce their cost barriers will not affect those limitations at all.
I didn't mean Apple developer resources, I meant third party developer resources.

You are making things overly complicated here, I think. My argument is that these efforts indirectly help the consumer, yes, but they weren't that the efforts should be designed for developers and that the consumers just benefit as a welcome side effect, but that the whole point of this is to benefit the consumers.

The whole point of making the development process easier in any sort of way is to make better products that consumers ultimately benefit from. Whether the effects are direct or indirect doesn't really matter much, the point is that the consumers benefit.

The same is true of building better machines to build car parts or better machines for testing cars - this is ultimately to make better cars that provide better experiences to the consumer. My argument was not for some weird developer toys for tinkering that yield no improved product.

1) It allows for one development team rather than having to expend resources developing two separate applications
2) It allows for one product to have to support, release updates to, patch, etc.
3) It allows customers the opportunity to buy a single license and have what they purchase work on all of their devices
I disagree with these for the reason described above. Apple is flush with resources, and if that was the only thing stopping them from meeting a customer need, they could do it regardless of merging two OS teams into one. If they wanted the user to be able to get two products for one price, there is nothing stopping them, and merging OS's wouldn't help it.
Again, this is for third party developers.

5) It allows customers to not have to worry about updating/testing two separate applications (for those who may not want the apps auto-updating)
This one I disagree with not because it doesn't focus on the customer, but because I don't think it's true. The customer would still have to worry about testing the usability of the software in both contexts (and if s/he doesn't use it in both contexts then s/he wouldn't have had to test both either way), and would also have to worry about which updates matter to them given their usage context. The user does feature testing, not bug testing, and that will still matter to the various form-factors even with just one binary.
What about IT staff for a large business that decides what apps go on the devices the company uses? They have the same usability testing to do, but they also need to make sure that updates do not break the company workflow just as they have to do the same with desktop applications today.
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Reply With Quote
Jul 9, 2013, 01:58 PM
 
Originally Posted by besson3c View Post
I didn't mean Apple developer resources, I meant third party developer resources.

You are making things overly complicated here, I think. My argument is that these efforts indirectly help the consumer, yes, but they weren't that the efforts should be designed for developers and that the consumers just benefit as a welcome side effect, but that the whole point of this is to benefit the consumers.

The whole point of making the development process easier in any sort of way is to make better products that consumers ultimately benefit from. Whether the effects are direct or indirect doesn't really matter much, the point is that the consumers benefit.
Thanks for clearing that up, but I don't agree that one can assume that making the development process easier is equivalent to making better products available to consumers. As a counter-example, Apple's approval process for iOS apps makes the development process harder for developers, not easier. By comparison, the Google Play app store is a much easier process for developers, but the result is not better products for consumers. The Google store has more crap apps, more ad-ware, and more outright fraud-ware (apps that don't do what they claim to do).

I still think "trickle down" is an accurate description of your logic, and I still don't find it to be sound.

The same is true of building better machines to build car parts or better machines for testing cars - this is ultimately to make better cars that provide better experiences to the consumer. My argument was not for some weird developer toys for tinkering that yield no improved product.
But do better car testing machines require the user to participate in the transition? This wouldn't be an issue if the transition was invisible to the users.


Again, this is for third party developers.
Ok then, what's stopping 3rd party developers from offering both versions of their app for 1 price now?


What about IT staff for a large business that decides what apps go on the devices the company uses? They have the same usability testing to do, but they also need to make sure that updates do not break the company workflow just as they have to do the same with desktop applications today.
I'm not getting it, please walk me through it. I assume that if this issue even came up, then the company is using both mouse/keyboard and touch devices in some important capacity (like for example it's a POS software that they run from a register and from tablets). Now that there is 1 app for both instead of 1 app for each, they don't have to test the new all-in-one app on both systems? I would think they still have to test it in both environments (because that's what I do at home when I get a new does-2-things gadget). In addition, they might need to do additional tests for the transition from mobile to desk-bound and back, to make sure the app doesn't crash or lose state info when activating or deactivating the mouse/keyboard interface elements or whatever. Can you describe an example of where less testing would be appropriate, or which specific test scenario could be dropped, just because the same app is doing double-duty?
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jul 9, 2013, 03:03 PM
 
Originally Posted by Uncle Skeleton View Post
Thanks for clearing that up, but I don't agree that one can assume that making the development process easier is equivalent to making better products available to consumers. As a counter-example, Apple's approval process for iOS apps makes the development process harder for developers, not easier. By comparison, the Google Play app store is a much easier process for developers, but the result is not better products for consumers. The Google store has more crap apps, more ad-ware, and more outright fraud-ware (apps that don't do what they claim to do).
I agree that easier development process doesn't necessarily make for better end-products, but in this case, it most definitely is easier to consolidate development resources, attention, and efforts into a single app rather than having to build and support multiple applications, at least in most cases.

As far as your app store example, this is more of a procedural and policy issue than something that requires extensive developmental resources. I'd characterize this as Apple's guidelines being stricter and enforced in a stricter manner. Yes, this does impact development, but the differences between a policy like this and the issue of developing one vs. two apps is quite significant.

I still think "trickle down" is an accurate description of your logic, and I still don't find it to be sound.

But do better car testing machines require the user to participate in the transition? This wouldn't be an issue if the transition was invisible to the users.
Why would users have to participate in or be aware of this transition?

Ok then, what's stopping 3rd party developers from offering both versions of their app for 1 price now?
Nothing, but what is your point here?

I'm not getting it, please walk me through it. I assume that if this issue even came up, then the company is using both mouse/keyboard and touch devices in some important capacity (like for example it's a POS software that they run from a register and from tablets). Now that there is 1 app for both instead of 1 app for each, they don't have to test the new all-in-one app on both systems? I would think they still have to test it in both environments (because that's what I do at home when I get a new does-2-things gadget). In addition, they might need to do additional tests for the transition from mobile to desk-bound and back, to make sure the app doesn't crash or lose state info when activating or deactivating the mouse/keyboard interface elements or whatever. Can you describe an example of where less testing would be appropriate, or which specific test scenario could be dropped, just because the same app is doing double-duty?
I don't understand why you seem to misunderstand everything I say.

I didn't say that they wouldn't have to test the one app. I said that having one app to test and support can require less effort than having to test two separate apps and discovering their quirks, differences, disparity, unique bugs, whatever.
     
Clinically Insane
Join Date: Apr 2003
Location: 46 & 2
Status: Offline
Reply With Quote
Jul 9, 2013, 04:54 PM
 
In 5 years I see Apple as a substantially smaller, yet still quite profitable, niche company (again). This could change, there are faint rumblings of some groundbreaking biotech research going on, but that will likely be 7-10 years into the future.
"Those who expect to reap the blessings of freedom must, like men, undergo the fatigue of supporting it."
- Thomas Paine
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Reply With Quote
Jul 9, 2013, 05:00 PM
 
Originally Posted by besson3c View Post
I agree that easier development process doesn't necessarily make for better end-products, but in this case, it most definitely is easier to consolidate development resources, attention, and efforts into a single app rather than having to build and support multiple applications, at least in most cases.
"Easier is not always better, but in this case it is easier?" Did you mean to say in this case it is better? If so, I'm not convinced of that. (and if not then I don't know what you're talking about )

As far as your app store example, this is more of a procedural and policy issue than something that requires extensive developmental resources. I'd characterize this as Apple's guidelines being stricter and enforced in a stricter manner. Yes, this does impact development, but the differences between a policy like this and the issue of developing one vs. two apps is quite significant.
It doesn't take more developer resources in order to output apps that meet Apple's higher bar?


Why would users have to participate in or be aware of this transition?
The same reason they (users) had to participate in the transition away from Classic. There will now be apps that don't behave like the others (orphaned non-universal apps). Don't you think the whole Classic thing was a burden on users?


Nothing, but what is your point here?
You brought this up as a consequence of merging OSes, but since they can choose to do it with or without merging OSes, then there's no reason to consider it as a factor.



I don't understand why you seem to misunderstand everything I say.
In this case it is you who is misunderstanding what I say

I didn't say that they wouldn't have to test the one app.
...twice. What I'm asking is why wouldn't they have to test the one app twice, while before (now) they had to test two apps twice (total), yielding an overall testing requirement of exactly the same as before (or more, if they have to test the combination/transition separately).

My point is that I don't think it would produce less testing, I think what it would produce is more use cases. Those additional use cases (like switching between walking and sitting without interruption) are a win for the user, because it's something they can do that they couldn't do before. But in general, more use cases mean more initial testing and proving, not less. So that's why I'm surprised/disagreeing when you say that less testing is something to be gained from this proposal.

I said that having one app to test and support can require less effort than having to test two separate apps and discovering their quirks, differences, disparity, unique bugs, whatever.
If there is any quirk, difference, disparity or bug in the mobile version of a particular app, then having that mobile version turned into a mobile "mode" of the desktop version won't magically fix the quirk/bug/disparity. Either the developer will have to find and fix that problem (which they can do already), or the problem will still be there. The only freebie you'd get is that if you really need it to work the "desktop way," then you can use the desktop "mode" on your mobile device (if that's feasible given the physical constraints of the device's interface). It doesn't stop there from being differences between the different interfaces developed by the same developer.

Basically, most apps would be like the iTunes mini-me mode. There would be a separate interface designed for small (mobile) mode, and just like if it was a standalone mobile version of the app, it would be vulnerable to having its own independent bugs, quirks and disparities. What do we gain from iTunes mini-mode being inside the same app as the big-boy mode? Easier testing with your workflow? Fewer disparities between modes? Fewer bugs? I say no, no and no. I say what we gain is workflow contiguity. If they were separate apps, we wouldn't be able to seamlessly switch from one to the other at will, whereas now we can. The rest of that stuff, I think it takes just as much effort as if they were two apps.
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Reply With Quote
Jul 9, 2013, 05:02 PM
 
Originally Posted by Shaddim View Post
...there are faint rumblings of some groundbreaking biotech research going on, but that will likely be 7-10 years into the future.
eyePhone? The one that goes inside your eye?
     
Clinically Insane
Join Date: Apr 2003
Location: 46 & 2
Status: Offline
Reply With Quote
Jul 9, 2013, 08:07 PM
 
Originally Posted by Uncle Skeleton View Post
eyePhone? The one that goes inside your eye?
What little I know I can't talk about; about all I can say is it brings new meaning to the phrase "plug & play".
"Those who expect to reap the blessings of freedom must, like men, undergo the fatigue of supporting it."
- Thomas Paine
     
Addicted to MacNN
Join Date: Aug 2006
Location: Somewhere in the Pacific Northwest
Status: Offline
Reply With Quote
Jul 10, 2013, 06:32 PM
 
Originally Posted by besson3c View Post
What Oreo has that you often lack is the ability to talk about this stuff calmly without losing your shit and getting emotional.
He also doesn't rehash the same shit fourteen times in a row while ignoring the opposing side's points (e.g. "point and click and touch WILL NEVER WORK TOGETHER EVER EVER EVER EVER!!!!!one!!!!1!!!!!" even when that point is refuted several times with real-world examples and use case scenarios).

I will always have more respect for people in the IT industry who don't just respond to every question, statement, or argument with "but Apple is better because ______" or "just use Apple's _____ product". There is life without Apple, shockingly enough.

Originally Posted by Shaddim View Post
What little I know I can't talk about, about all I can say is it brings new meaning to the phrase "plug & play".
Dirty.
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Jul 10, 2013, 07:01 PM
 
Originally Posted by shifuimam View Post
He also doesn't rehash the same shit fourteen times in a row while ignoring the opposing side's points (e.g. "point and click and touch WILL NEVER WORK TOGETHER EVER EVER EVER EVER!!!!!one!!!!1!!!!!" even when that point is refuted several times with real-world examples and use case scenarios).
I only repeat points that are obviously getting lost in the desperation of needing to paint the other side as an idiot (I am as guilty of this as you are).

In fact, your hyperbolic all-caps bit there is exactly what I went out of my way NOT to claim. You missed it.

I specifically noted that there are a bunch of situations where it can work.

My point (15th time now) is that having an interface NOT work once out of ten times is enough to consider it a broken, and thus unpredictable, interface.


Originally Posted by shifuimam View Post
I will always have more respect for people in the IT industry who don't just respond to every question, statement, or argument with "but Apple is better because ______" or "just use Apple's _____ product". There is life without Apple, shockingly enough.
That, shockingly enough, starts with actually reading what people write, rather than deciding you already know what they're saying because you've got them all figured out.

Dissing me by completely misrepresenting my point doesn't *really* reflect all that nicely upon you.

It's also rather amusing that the thread you replied to five minutes before you posted that, and that you are obviously referring to (as nothing of the sort has been posted in this thread), is about encrypted communications, and you actually agreed with me, except going on to point out that there is no real alternative to using Apple's own proprietary protocols on their shiny aluminum shit.
     
Clinically Insane
Join Date: Apr 2003
Location: 46 & 2
Status: Offline
Reply With Quote
Jul 10, 2013, 07:27 PM
 
Originally Posted by shifuimam View Post
Dirty.
Just let your mind wander on that phrase for a little while, then think about other tech they've released in the last 3-4 years.
"Those who expect to reap the blessings of freedom must, like men, undergo the fatigue of supporting it."
- Thomas Paine
     
Addicted to MacNN
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Jul 11, 2013, 10:37 AM
 
I don't think I've weighed in on this thread yet.

It looks to me like wearable sensors and more advanced accessories for the iPhone and iPad are the immediate future. There is already a raft of health-based add-ons, and I don't think the iWatch rumours are completely unfounded. I think they are simply a little premature.

Smartphones are already starting to run short of truly compelling new hardware features that you can cram into the handset, but killer features still seem to be a powerful factor in driving sales. I'm not sure Google Glass will properly take off until it can be done using a contact lens, but that will happen sooner or later.

There was a lot of computer technology that people said would never reach a plateau, but it has happened. With HD video and gaming on phones, tablets and consoles, and various internet content directly on your TV, the biggest reasons to keep a desktop PC at home have vanished. The tablet market has murdered netbooks already, and laptops will be next for consumers and office workers.
The most interesting question this poses for Apple is whether they will be prepared to take a bigger plunge than ever before when it comes to diversifying their products. Online services were something they were reluctant to go all-in on until relatively recently, which is why it's finally working better for them than it ever did before. All the extra talent they had to hire to build a phone was clearly a big step 6 years ago, but what about when they have to start making medical-grade, sterile implantable tech? This tech will kill off the mobile phone and the iPod sooner or later, because we'll be able to build it all into our heads, but Apple is going to have to raise its game considerably. The whole "don't be an early adopter" philosophy takes on new meaning when it comes to surgery. Will we see operating theatres in Apple Stores?
I have plenty of more important things to do, if only I could bring myself to do them....
     
Forum Regular
Join Date: Jul 2013
Status: Offline
Reply With Quote
Jul 14, 2013, 09:50 PM
 
After using iOS 7 since beta day 1 on my iPhone and iPad... OS X looks old and tired. The days of crayola UIs and cheesy icons are numbered. I predict the next release of OS X (10.10, or rather OS 11) will come out next year, in 2014, and be totally redesigned: clean and minimal, very much the same design language as iOS 7.

Right now iOS and OS X are worlds apart. There's little common design language between them. Fire up the Messages app in OS X and compare it to the SMS app in iOS 7. It's absurd: iOS 7's has a totally different look, while OS X's Messages app looks just like iOS 6 and before. It's pretty hideous.

Apple has no choice but to redesign OS X to bring it in line with their new design direction and to deliver a seamless, consistent multi-device experience.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jul 14, 2013, 10:16 PM
 
Originally Posted by theothersteve View Post
After using iOS 7 since beta day 1 on my iPhone and iPad... OS X looks old and tired. The days of crayola UIs and cheesy icons are numbered. I predict the next release of OS X (10.10, or rather OS 11) will come out next year, in 2014, and be totally redesigned: clean and minimal, very much the same design language as iOS 7.

Right now iOS and OS X are worlds apart. There's little common design language between them. Fire up the Messages app in OS X and compare it to the SMS app in iOS 7. It's absurd: iOS 7's has a totally different look, while OS X's Messages app looks just like iOS 6 and before. It's pretty hideous.

Apple has no choice but to redesign OS X to bring it in line with their new design direction and to deliver a seamless, consistent multi-device experience.

Other than aesthetics can you please provide us with some specific examples so that I can better understand what you are referring to here?
     
Forum Regular
Join Date: Jul 2013
Status: Offline
Reply With Quote
Jul 14, 2013, 10:35 PM
 
Originally Posted by besson3c View Post
Other than aesthetics can you please provide us with some specific examples so that I can better understand what you are referring to here?
The design language. Apple's new design language is minimal and flat. OS X uses a ten-year-old design language full of heavy gradients and texture. Prior to iOS 7, OS X and iOS shared a very similar design language. The new design direction has been all over the media. It's a big deal. Having read through their new UI transition guide and attended WWDC, it's clear and now a matter of record that Apple has a new design philosophy, with all the rules and policies that go along with it.

In their new design docs, they lambaste the kind of UI and UX that OS X has. You're going to see OS X move over to a flatter, thinner look with more use of white space. Dock icons will be clean and minimal. Gradients will be understated. Stock UI icons will look similar to iOS 7, and multitouch will feel more integrated into Apps.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jul 14, 2013, 10:50 PM
 
Originally Posted by theothersteve View Post
The design language. Apple's new design language is minimal and flat. OS X uses a ten-year-old design language full of heavy gradients and texture. Prior to iOS 7, OS X and iOS shared a very similar design language. The new design direction has been all over the media. It's a big deal. Having read through their new UI transition guide and attended WWDC, it's clear and now a matter of record that Apple has a new design philosophy, with all the rules and policies that go along with it.

In their new design docs, they lambaste the kind of UI and UX that OS X has. You're going to see OS X move over to a flatter, thinner look with more use of white space. Dock icons will be clean and minimal. Gradients will be understated. Stock UI icons will look similar to iOS 7, and multitouch will feel more integrated into Apps.

What's the difference between "design language" and aesthetics, or user experience?

It sounds like you disagree with the direction Apple is going as far as user experience goes? I took what you wrote to mean that you don't like the direction they are going in as far as interface design and usability.
     
 
All contents of these forums © 1995-2015 MacNN. All rights reserved.