

Where do you see Apple in 5 years?
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Online
Jun 24, 2013, 09:36 AM
 
So after seeing iOS 7 and the new Mac Pro, a few thoughts formed in my mind about the future of Apple within the next 3-5 years, and I was wondering what you guys are thinking.

Here's my take: To me, the redesign and re-architecting of iOS 7 is all about apps, offering developers simpler ways to implement UIs like the one you find in, say, Letterpress. I think it's an underappreciated aspect of iOS 7, because much of the focus lies on the color palette and the icons. Many new APIs are dedicated to making it easier for programmers to animate objects, e.g. by giving them physical properties such as mass and a coefficient of restitution, and then using a physics engine to enable interactions that go beyond mere transitions. In my mind, that's probably also the reason why springboard is still just an app launcher (that's fine with me). Apple allows you to send tweets or mere status updates directly within the OS, but for anything else, they want you to use the Facebook app or some Twitter client. This way they're fully leveraging and furthering their app ecosystem advantage.
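The restitution idea can be illustrated with a toy simulation. This is just a sketch of the physics concept, not Apple's actual API; the function name and numbers are made up:

```python
# Toy 1-D bounce model illustrating the "coefficient of restitution":
# each bounce keeps only a fraction e of the impact speed, so rebound
# height shrinks by e^2 per bounce and the motion settles naturally
# instead of looping forever.

def bounce_heights(initial_height, restitution, bounces):
    """Peak height after each bounce, ignoring air resistance."""
    heights = []
    h = initial_height
    for _ in range(bounces):
        h *= restitution ** 2  # rebound height scales with speed squared
        heights.append(h)
    return heights

# e = 0.5: every bounce reaches a quarter of the previous height.
print(bounce_heights(1.0, 0.5, 3))  # [0.25, 0.0625, 0.015625]
```

A physics engine does this for you across many objects at once; the point is that one parameter per item buys a whole class of natural-feeling interactions.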

iOS 7 is clearly about the iPhone (and to a lesser degree, its phoneless brother, the iPod touch). I believe the next version (2014), iOS 8, will be all about the iPad, and Apple will introduce more differentiation between iOS for the iPhone and iPad. Hopefully they'll emphasize the collaborative aspect and address »file management« (I use quotation marks because it won't be traditional file management) as well as cooperation between different apps.

Last, but not least, in 2015 it's OS X's turn to receive its UI overhaul. I expect something less radical, but we should still be able to tell that iOS and OS X 10.11 are from the same family. The focus should lie on making it more of a platform for pros »who need better trucks«. Under the hood, the new Mac Pro shows clearly where things are going: utilizing the GPU for computation. (One of the new Mac Pro's GPUs is not connected to any display port (!), so it will just idle -- unless you put it to good use.) A first step could be a rewritten Aperture X that, like Final Cut Pro X, uses the GPU for much of its computation. Many of these advances will probably trickle down to iOS eventually, once iOS devices have acquired sufficient GPU oomph to use the die area for compute tasks as well as driving the UI.

You'll notice that I haven't mentioned services so far. That's not because I believe these will not play an important role, but only because this is where I have the least idea about Apple's plans. I think services will be integral to the solution to the »file management« problem, and they are iterating on iCloud's file management.

In short, the focus of iOS development will be to make it a better platform for app developers while the focus of OS X and Mac development is to build damn good trucks for fewer people with more specialized interests.
I don't suffer from insanity, I enjoy every minute of it.
     
Addicted to MacNN
Join Date: Aug 2006
Location: Somewhere in the Pacific Northwest
Status: Offline
Jun 24, 2013, 10:49 AM
 
Preface: I'm not trolling, I promise.

New APIs notwithstanding, iOS 7 looks extremely underwhelming. I seriously think that it's an indicator that Apple has fallen behind in the mobile market, big time. Users want cool features; they don't care about the same things developers care about. When you compare iOS's extremely limited springboard to both Android's home screen widgets and Windows 8 RT/Windows Phone 7's live tiles, there's just nothing compelling there anymore - and that's just the beginning of where iOS is lacking.

We're moving away from traditional desktop computing in the consumer market. This is what matters to Apple, because they are absolutely not an enterprise-oriented technology company. It's only been in the last five years that Apple has made any real efforts to get into the small business market at their retail stores, and what they're doing just doesn't work for businesses with tight schedules and even tighter budgets. With that in mind, I don't know that anything Apple does to OS X is going to be that spectacular to your average consumer, simply because they're going to be more interested in what Apple's mobile products can do.

I posted a while back about the new project Canonical is working on with Ubuntu - a single device runs the OS, and that device can be connected to a tablet dock for a larger mobile experience, or a desktop dock for a full-on Ubuntu desktop experience. This is where technology is going and needs to go. It's just not good enough anymore to have four different devices constantly having to sync over wifi or cellular networks in order to keep updated. Microsoft is sort of moving in this direction with Windows 8, in that the OS has unified the mobile and desktop experiences into one product.

I should add that Microsoft is way ahead of the curve on the developer side there, too - Windows 8 introduces an entirely new way of developing software with Metro. It's further abstracted hardware and architecture from the software, so developers can write a single application that will work on both traditional desktops and laptops and new ARM-based tablets. This is the direction Apple needs to go, too. They sort of tried to make OS X more mobile-like with Launchpad, but what they really need to do is create an environment where software developers don't have to work on two different applications for two different architectures.

I definitely disagree that Apple's desktop hardware is going to move toward focusing on people with specialized interests. If anything, the new Mac Pro is clearly moving in the opposite direction. It's tiny, its internal hardware can barely be upgraded (video cards use a proprietary form factor, I'd bet dollars to donuts that the CPUs are soldered down in the name of saving space, the SSD isn't a standard 3.5" or 2.5" drive), and worst of all, the only way it can really be expanded is with external peripherals. That might be okay for consumers, but professionals don't want to have eight different boxes stacked on their desk just to get their work done.

Apple's turned the Mac Pro from the last expandable desktop computer in their lineup into a glorified overpowered Mac Mini. That's not going to cut it for specialized applications - and since Thunderbolt is more or less proprietary at this point (e.g. non-Apple OEMs don't use it), it's not like there's a glut of external peripherals ready to go for early adopters.

Quite frankly, I think Apple needs to drop desktop hardware entirely and focus on mobile hardware and unifying OS X and iOS. It would allow them to turn OS X into a commodity product that can actually compete with Windows in the business and consumer markets, and the R&D that would have been spent on always-slightly-thinner machined aluminum enclosures can go toward actually making iOS "the most advanced mobile operating system ever". At this point they've just spread themselves too thinly across too many markets, and it's showing.
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Online
Jun 24, 2013, 11:16 AM
 
Originally Posted by shifuimam View Post
New APIs notwithstanding, iOS 7 looks extremely underwhelming. I seriously think that it's an indicator that Apple has fallen behind in the mobile market, big time. Users want cool features; they don't care about the same things developers care about.
From Apple's standpoint, users want apps; that's where they spend the majority of their time. So anything that makes developing for iOS more powerful and simpler is a win for consumers in the end. They introduced Sprite Kit to make development of retro-looking games easier, for instance. Also, the move to this new interface is in part due to the fact that most modern, successful apps use a similar design language already (i.e. developers are currently forced to design a custom UI if they want a good-looking app).
Originally Posted by shifuimam View Post
We're moving away from traditional desktop computing in the consumer market. This is what matters to Apple, because they are absolutely not an enterprise-oriented technology company. It's only been in the last five years that Apple has made any real efforts to get into the small business market at their retail stores, and what they're doing just doesn't work for businesses with tight schedules and even tighter budgets.
I forgot about business in my post. While you're right that Apple is not a business company (simply put, their mission is not to make businesses happy, but customers), they're making huge inroads with iOS devices. Just to give you two examples: I have a lot of friends who are Mac and iOS developers, and finding a job couldn't be easier for them. There are a lot of programming-for-hire companies writing iOS software for things like inventory management on iPads and iPhones. Here in Japan, in many regular clothes shops (also chains) and such, iPhones are used to find out whether the shirt in the size you want is in stock. Some smaller shops have iPad POSes in lieu of your usual computer-based register.
Originally Posted by shifuimam View Post
Apple's turned the Mac Pro from the last expandable desktop computer in their lineup into a glorified overpowered Mac Mini. That's not going to cut it for specialized applications - and since Thunderbolt is more or less proprietary at this point (e.g. non-Apple OEMs don't use it), it's not like there's a glut of external peripherals ready to go for early adopters.
I don't think the new Mac Pro is a glorified Mac mini at all; it's a machine with very interesting specs: Apple has cut the maximum core count in half (because very few pieces of software can utilize 2 x 12 = 24 cores) and instead chose to go with two very, very powerful GPGPUs. One of these GPUs is not even connected to any Thunderbolt port. And their decision to make the Mac Pro a custom machine will make it a lot cheaper: if you buy the closest-matching graphics card separately (a 6 GB FirePro W9000), you pay ~$3,500 -- a piece. And since I think the Mac Pro will be cheaper than $7k, they've put all that computing power in an affordable package. To me, their bet with the Mac Pro is that the massive increase in GPGPU performance will much more than offset the other limitations. Also, their use of proprietary PCIe SSDs rather than 2.5" or 3.5" drives gives the user a significant boost in performance. The new MacBook Air's SSD is already faster than any SATA SSD on the market (because the Air's SSD has a PCIe interface, while SATA SSDs are limited by the throughput of the SATA interface).
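The interface arithmetic behind that last point, as a rough sketch (nominal line rates; real drives come in below these ceilings, and the Mac Pro's actual PCIe link width is my assumption, not announced):

```python
# Why a PCIe SSD can outrun any SATA SSD: both SATA III and PCIe 2.0
# use 8b/10b encoding (8 payload bits per 10 wire bits), so the
# payload ceiling is 80% of the line rate. These are theoretical
# maxima; actual drive throughput is lower still.

def payload_mb_per_s(line_rate_gbit, encoding_efficiency=0.8):
    """Nominal line rate (Gbit/s) -> payload ceiling (MB/s)."""
    return line_rate_gbit * 1000 / 8 * encoding_efficiency

sata3 = payload_mb_per_s(6)         # SATA III: 6 Gbit/s line rate
pcie2_x2 = 2 * payload_mb_per_s(5)  # PCIe 2.0 x2: 5 GT/s per lane

print(f"SATA III ceiling:    {sata3:.0f} MB/s")     # 600 MB/s
print(f"PCIe 2.0 x2 ceiling: {pcie2_x2:.0f} MB/s")  # 1000 MB/s
```

Two PCIe 2.0 lanes already leave the entire SATA bus behind, which is consistent with the throughput reported for the 2013 MacBook Air's SSD.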

I think the success of the Mac Pro really hinges on whether apps can make use of the GPUs for more ordinary compute tasks (something quite common in scientific computing these days). If that pans out, the price-performance of a Mac Pro will be really hard to beat (despite its limitations). If Apple gets this wrong or is too optimistic about the time scale, the machine will bomb in the market.
I don't suffer from insanity, I enjoy every minute of it.
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Jun 24, 2013, 01:03 PM
 
Wow. Absolutely nothing personal shif, but it's amazing to read such thorough and detailed assertions and completely disagree with every single one.

(except for the one about the consumer market moving away from traditional computers. That's dead-on. But I'll get to that.)

All the more fascinating because I actually didn't respond to the original post, since everything he wrote seemed so obvious (to me) as to be almost banal. Perspectives can be SO different.

Originally Posted by shifuimam View Post
Preface: I'm not trolling, I promise.
Neither am I.

I see Apple's products from the perspective of a long-time user who's working in the creative professional market.

Originally Posted by shifuimam View Post
New APIs notwithstanding, iOS 7 looks extremely underwhelming. I seriously think that it's an indicator that Apple has fallen behind in the mobile market, big time. Users want cool features; they don't care about the same things developers care about. When you compare iOS's extremely limited springboard to both Android's home screen widgets and Windows 8 RT/Windows Phone 7's live tiles, there's just nothing compelling there anymore - and that's just the beginning of where iOS is lacking.
Every time you generalize "users wanting" something, you're very, very likely to be wrong.

While you're right that they don't care about the same things developers care about, they also care even less about the things geeks care about. If developers go apeshit about some new framework, it will at least usually mean great new software. If geeks go apeshit over some new function, it will usually mean an added layer of complexity and inconsistency with almost no actual use to anybody who doesn't fiddle with technology for a living or a hobby.

Most of "us" are probably happiest when we DON'T deal with technology. The springboard is reduced to the absolute minimum functionality. It's a basic interaction layer that does nothing beyond providing the simplest possible interface to allow me to turn my device into whatever I want it to be and get completely out of my way the millisecond I launch the app. There is no other layer to it, no mixed modes of interaction or visual interface.

I'd even argue that having the live clock and calendar icons is done merely for reasons of consistency, rather than actual utility. Apple doesn't give two shits about widgets, but they DO get that it's mildly irritating to have two clocks on screen whose times don't correspond. Nobody will be using the home screen clock icon to actually tell time (that's prominent on the lock screen in 99 out of a hundred use cases).

Windows Phone comes at that from the opposite direction: Live tiles are supposed to be primarily widgets, but the system can also be used as an app launcher.

Originally Posted by shifuimam View Post
We're moving away from traditional desktop computing in the consumer market. This is what matters to Apple, because they are absolutely not an enterprise-oriented technology company. It's only been in the last five years that Apple has made any real efforts to get into the small business market at their retail stores, and what they're doing just doesn't work for businesses with tight schedules and even tighter budgets. With that in mind, I don't know that anything Apple does to OS X is going to be that spectacular to your average consumer, simply because they're going to be more interested in what Apple's mobile products can do.
OS X is not built for "business" businesses. OS X is built for me, and it will be further and further refined to my particular needs. And those of my video-making partners. And those working on the images to go with the sound and video we make.

It is currently still a general-purpose consumer operating system as well, but it will be moving away from that as iOS devices and their OS become more potent over the next decade.

Originally Posted by shifuimam View Post
I posted a while back about the new project Canonical is working on with Ubuntu - a single device runs the OS, and that device can be connected to a tablet dock for a larger mobile experience, or a desktop dock for a full-on Ubuntu desktop experience. This is where technology is going and needs to go. It's just not good enough anymore to have four different devices constantly having to sync over wifi or cellular networks in order to keep updated. Microsoft is sort of moving in this direction with Windows 8, in that the OS has unified the mobile and desktop experiences into one product.

I should add that Microsoft is way ahead of the curve on the developer side there, too - Windows 8 introduces an entirely new way of developing software with Metro. It's further abstracted hardware and architecture from the software, so developers can write a single application that will work on both traditional desktops and laptops and new ARM-based tablets. This is the direction Apple needs to go, too. They sort of tried to make OS X more mobile-like with Launchpad, but what they really need to do is create an environment where software developers don't have to work on two different applications for two different architectures.
As should be abundantly clear from my comments above, I could not disagree more with this. OS X is moving towards specialization, not a one-size-fits-all munging of technical compromises.

Originally Posted by shifuimam View Post
I definitely disagree that Apple's desktop hardware is going to move toward focusing on people with specialized interests. If anything, the new Mac Pro is clearly moving in the opposite direction. It's tiny, its internal hardware can barely be upgraded (video cards use a proprietary form factor, I'd bet dollars to donuts that the CPUs are soldered down in the name of saving space, the SSD isn't a standard 3.5" or 2.5" drive), and worst of all, the only way it can really be expanded is with external peripherals. That might be okay for consumers, but professionals don't want to have eight different boxes stacked on their desk just to get their work done.

Apple's turned the Mac Pro from the last expandable desktop computer in their lineup into a glorified overpowered Mac Mini. That's not going to cut it for specialized applications - and since Thunderbolt is more or less proprietary at this point (e.g. non-Apple OEMs don't use it), it's not like there's a glut of external peripherals ready to go for early adopters.
I think you completely misrepresent "professionals" there.

Professionals will deal with whatever fits their needs best. Working with external boxes is the absolute norm, and having Thunderbolt 2 means that whatever setup you build instantly becomes ridiculously flexible, as you can simply plug in your client's laptop, and that becomes the studio brain, or you can take selected components (or in fact the Mac Pro, which is now tiny tiny) on the road for tour use with the laptops. Internal expansion has become rare, and Thunderbolt 2 is actually faster than any internal storage expansion available to date (except 16x PCIe Flash expansion).

There are actually a number of Thunderbolt PCI expansion options on the market already, and for the rest, it will take some time to shift everything from FireWire to Thunderbolt for interfacing and storage. But thanks to six ports and cheap Thunderbolt-FireWire dongles, there's no rush, and the Mac Pro is actually more expandable than it's ever been before (with the sole exception of 16x PCIe slots).
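For a rough sense of why external expansion over Thunderbolt 2 is viable, compare nominal link rates. These are raw wire speeds before protocol overhead, and the figures are my own back-of-the-envelope numbers, not from the post:

```python
# Nominal link rates of the buses discussed above, in Gbit/s.
# Raw wire speed only -- protocol overhead lowers all of them.
nominal_gbit = {
    "SATA III":      6,    # typical internal drive bay
    "FireWire 800":  0.8,  # the legacy interface being phased out
    "Thunderbolt 2": 20,   # two 10 Gbit/s channels bonded
    "PCIe 2.0 x16":  80,   # the "16x" slot exception noted above
}

for name, gbit in nominal_gbit.items():
    print(f"{name:14s} ~{gbit / 8:5.2f} GB/s raw")
```

On these numbers an external Thunderbolt 2 chassis has more than three times the headroom of an internal SATA bay; only a full 16x PCIe slot clearly beats it.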

"Professionals" also rarely upgrade graphics cards — even video professionals (it's not like previous-generation Mac Pros had copious graphics upgrade options available, either, let alone officially supported ones). Most Mac Pros are bought, set up with whatever components are necessary for the job, and then left alone at least until they're written off, except for storage and memory upgrades occasionally. Then, they're either sold off to be replaced, or moved to secondary studios with a new machine purchased for the main workspace, or just kept in use until they become inadequate for daily use, and then replaced. I know ZERO pros who have upgraded processors. I'm not sure I know of even one who has upgraded his graphics card.

The Mac Pro is so OBVIOUSLY a production machine, geared to a very specific market, and geared towards heavily OpenCL-boosted processing, with silly flexible expansion options (what's the verdict on RAM? 128 GB max in this first generation iBin Pro?), that calling it a "glorified overpowered Mac mini" is a bizarre assessment.

What "specialized applications" is it "not going to cut it for", in your opinion?

Final Cut Pro X, Aperture X, and Logic X (Logic! You forgot Logic, Oreo!) are going to scream on this beast, once they've been OpenCL'ed.

Originally Posted by shifuimam View Post
Quite frankly, I think Apple needs to drop desktop hardware entirely and focus on mobile hardware and unifying OS X and iOS. It would allow them to turn OS X into a commodity product that can actually compete with Windows in the business and consumer markets, and the R&D that would have been spent on always-slightly-thinner machined aluminum enclosures can go toward actually making iOS "the most advanced mobile operating system ever". At this point they've just spread themselves too thinly across too many markets, and it's showing.
At present, Apple is spreading themselves across exactly four markets, AFAICS:

1.) consumer handhelds.
2.) business handhelds.
3.) consumer PCs.
4.) non-business professional PCs.

and the corresponding software.

The traditional-computing OS X does not need to compete with Windows. Apple tried that in the 90s, and failed spectacularly. They didn't actually start making money until they realized that there is no money to be made in the commodity PC market, and just turned their back on it entirely. And in hindsight, they could not have made a better decision, because Apple is making money hand over fist from their Mac lines, while commodity PC manufacturers are dying along with their market.

Apple's commodity-market product lines run iOS. Windows does not currently compete in the commodity market in any meaningful way. And Apple has no interest whatsoever in competing in the business PC market.

I don't understand how you can claim that Apple is spreading themselves too thinly, and in the same breath suggest that they address a shrinking market that they deliberately exited fifteen years ago?
( Last edited by Spheric Harlot; Jun 24, 2013 at 01:14 PM. )
     
Posting Junkie
Join Date: Apr 2007
Location: Iowa, how long can this be? Does it really ruin the left column spacing?
Status: Offline
Jun 24, 2013, 01:14 PM
 
Originally Posted by shifuimam View Post
Thunderbolt is more or less proprietary at this point (e.g. non-Apple OEMs don't use it),
List of Thunderbolt-compatible devices - Wikipedia, the free encyclopedia

Acer, Asus, Dell, HP, and Lenovo all have Thunderbolt ports.
     
Mac Enthusiast
Join Date: Dec 2007
Status: Offline
Jun 24, 2013, 02:03 PM
 
OMG OMG Apple is doomed!!
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Jun 24, 2013, 02:24 PM
 
Originally Posted by shifuimam View Post
I posted awhile back about the new project Canonical is working on with Ubuntu - a single device runs the OS, and that device can be connected to a tablet dock for a larger mobile experience, or a desktop dock for a full-on Ubuntu desktop experience. This is where technology is going and needs to go.
Strongly disagree. The fundamental difference between an iPad and a Mac laptop is the input mode. In the same way, the fundamental difference between the first Mac and the terminal-based interfaces that came before it was the input mode. By removing the terminal entirely, Apple forced developers to make Mac apps that used the new input mode - iOS did the same thing again. Unifying them again wouldn't have been hard, but it was only once MS did the Apple thing and hid the terminal deep in the OS that Windows became a real GUI competitor.

Originally Posted by shifuimam View Post
It's just not good enough anymore to have four different devices constantly having to sync over wifi or cellular networks in order to keep updated. Microsoft is sort of moving in this direction with Windows 8, in that the OS has unified the mobile and desktop experiences into one product.
The applications should not be the same across phone-tablet-laptop-desktop - the data that the applications work on should be the same, and each application should be optimized for its environment. Give them all room to soar, and let the best paradigm win. That it will likely be the one I enjoy less is unfortunate for me.

Originally Posted by shifuimam View Post
I should add that Microsoft is way ahead of the curve on the developer side there, too - Windows 8 introduces an entirely new way of developing software with Metro.
You have written this before, and it was wrong then and it is wrong now. "Metro" - a term which MS won't use anymore - is a design language. A style, if you will. Windows 8 has a redesigned user interface that uses this design language, and so does Windows Phone 8. Windows 8 has also added support for the ARM instruction set. Windows 7 and its predecessors already supported x86, x64, DEC Alpha, MIPS, PowerPC, Clipper, SPARC (although not publicly available) and Itanium. Being the 9th supported architecture is not "entirely new". The NT kernel is apparently amazingly portable, supporting so many architectures over its lifetime, but we knew that already.

You can make an application that runs on both. That's not hard to do (and also not new, because you could with Windows CE and 98 back in the mists of time) - the reason no one does it on any other platform is that it is silly. Why burden your single executable with that big a range of hardware to support? Well, unless you have a near monopoly on the dying platform and desperately want to expand it to the new platform, of course.

Originally Posted by shifuimam View Post
It's further abstracted hardware and architecture from the software, so developers can write a single application that will work on both traditional desktops and laptops and new ARM-based tablets.
You can write a Java application that runs on anything from a mobile phone from the nineties to a massive server, by way of a Blu-ray player and a regular desktop. You can do the same thing in Flash, BTW. That doesn't mean it's a good idea.

Originally Posted by shifuimam View Post
This is the direction Apple needs to go, too. They sort of tried to make OS X more mobile-like with Launchpad, but what they really need to do is create an environment where software developers don't have to work on two different applications for two different architectures.
Why? They are by far the biggest on the one platform that's growing, and they're actually gaining on the one that is shrinking as well. What they're doing is working out well. When they tried to do what you're proposing - with the web apps on the iPhone 1.0 before the App Store opened - no one wanted to develop for it. When they relented and switched to the current model, the App Store exploded.

If you're worried about working on more than one app somehow being more taxing on the developers, let me assure you that it is much harder to make the same application work on anything from a 3.5" touch screen to a 27" mouse&keyboard setup than to make multiple applications that do the same thing. Just look at games, for instance - even when the platform owner makes games, they don't make them for both the portable and home theater consoles at once. They frequently make different but similar games for both, but they don't share code.

Originally Posted by shifuimam View Post
I definitely disagree that Apple's desktop hardware is going to move toward focusing on people with specialized interests. If anything, the new Mac Pro is clearly moving in the opposite direction. It's tiny, its internal hardware can barely be upgraded (video cards use a proprietary form factor, I'd bet dollars to donuts that the CPUs are soldered down in the name of saving space, the SSD isn't a standard 3.5" or 2.5" drive), and worst of all, the only way it can really be expanded is with external peripherals. That might be okay for consumers, but professionals don't want to have eight different boxes stacked on their desk just to get their work done.
Let me reassure you about the CPU: Intel doesn't make Xeon E5 processors that are soldered on. They're all FCLGA (soldered CPUs are BGA), and there isn't even a BGA version of socket 2011 (which you need for all those PCIe lanes). The SSD is not a 2.5" drive (there are no 3.5" SSDs, and there is no reason for them) because 2.5" drives exist to replace old SATA drives, and the old SATA bus is getting too slow. The newest MacBook Air has a significantly faster interface than any SATA drive - you can't seriously expect the MP to be slower. Exactly what format the Mac Pro uses is not known - it may be an M.2 drive, an emerging standard that does exactly what Apple seems to want - but even if it's not, it should be easy enough for third parties to make replacement drives. They already make drives for the MacBooks with non-standard SSDs, and given that the drive in the new MP should be much easier to replace, that market should be a given.

Why is it terrible for professionals to have external expansion? If you want to put a number of drives in it, just buy one big external chassis to put them in and you have two boxes instead of one - and two that are smaller. Putting things internally is good when 1) it's required for speed, 2) you can save a PSU, or 3) for ease of transport. 1) is no longer relevant with Thunderbolt. 2) is a factor, but putting a massive PSU in a box costs money that can be better spent on other things. 3) should be significantly helped by the fact that the new MP is so much smaller than the old one.

Originally Posted by shifuimam View Post
Apple's turned the Mac Pro from the last expandable desktop computer in their lineup into a glorified overpowered Mac Mini. That's not going to cut it for specialized applications - and since Thunderbolt is more or less proprietary at this point (e.g. non-Apple OEMs don't use it), it's not like there's a glut of external peripherals ready to go for early adopters.
What are those specialized applications that can't be served by the new MP?
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
Addicted to MacNN
Join Date: Aug 2006
Location: Somewhere in the Pacific Northwest
Status: Offline
Jun 24, 2013, 02:28 PM
 
Originally Posted by OreoCookie View Post
From Apple's standpoint, users want apps; that's where they spend the majority of their time. So anything that makes developing for iOS more powerful and simpler is a win for consumers in the end. They introduced Sprite Kit to make development of retro-looking games easier, for instance. Also, the move to this new interface is in part due to the fact that most modern, successful apps use a similar design language already (i.e. developers are currently forced to design a custom UI if they want a good-looking app).
I guess what I'm getting at is that with a single unified interface and API for both desktop/laptop and mobile devices, developers have the opportunity to make better software even more efficiently, so you can use the same app across phone, tablet, and laptop, only pay for it once, and know that your files and whatnot will be compatible across the board.

That's pretty compelling IMO.

I mean, think about this - what if the 11" MBA were, instead of a relatively low-powered Intel laptop at a relatively high price, a high-end tablet/keyboard combo that ran iOS, had a touchscreen, but offered what it has now (USB ports, SD slot, whatever else it has)? That'd be pretty damn awesome for someone who might already own an iPhone or iPod Touch (and therefore has already bought into the iTunes App Store universe).

I forgot about business in my post. While you're right that Apple is not a business company (simply put, their mission is not to make businesses happy, but customers), they're making huge inroads with iOS devices. Just to give you two examples: I have a lot of friends who are Mac and iOS developers, and finding a job couldn't be easier for them. There are a lot of programming-for-hire companies writing iOS software for things like inventory management. Here in Japan, in many ordinary clothing shops (including chains), iPhones are used to find out whether the shirt in the size you want is in stock. Some smaller shops have iPad POSes in lieu of the usual computer-based register.
FWIW, I think that the Square credit card system is freaking AWESOME for small businesses. I've talked to the managers/owners at some local coffee shops about using it, and it's pretty great. The fees are lower, support is better, and it uses commodity hardware - which means that replacing or upgrading is significantly easier and cheaper than using a proprietary POS system.

I always, always want the option of standardized commodity hardware over proprietary closed systems. I think it's pretty great that a business just starting out can go to the Apple Store, buy the cheapest iPad model, and be up and running with a credit card system immediately.

The problem Apple still has with business customers is the lack of support. I realize that enterprise is a different beast - if you spend enough money on the Apple ecosystem, you can negotiate whatever support contract you want. When I worked at Purdue, on the other hand, they just paid to have their own techs maintain Apple certifications so they could order replacement parts through GSX and still maintain their warranties.

Small businesses don't have either of these options. Office-based small businesses that rely on their computers to make money don't have the time or resources to take a computer to an Apple Store for, say, an LCD replacement on a laptop or iMac. Apple Stores don't do while-you-wait repairs for the most part anymore. Not only that, but if you're a business that has iMacs or Mac Pros, lugging those things to an Apple Store just to have a repair done is probably not an option for you.

I should also add that, at least with Dell, if you buy machines through their small business division, you can even have them ship parts to you (as a normal person without an official Dell certification), do the repair yourself, ship the bum part back via a prepaid shipping label, and your warranty is still intact. Macs are so difficult to disassemble now that it's not even something you'd want to do in most cases.

It's not difficult - as a hardware OEM, if you want to market yourself to small businesses, you've absolutely got to offer an onsite option on your warranties. It's practically the only thing that keeps more businesses from switching to the Mac, aside from the possibility of proprietary software that requires Windows. If Apple offered (a) accidental damage options for desktops and laptops and (b) onsite support options for extended warranties, even I could legitimately recommend that a business switch to the Mac if it met their technology needs and was within their budget.

I don't think the new Mac Pro is a glorified Mac mini at all; it's a machine with very interesting specs. Apple has cut the core count in half (because very few pieces of software can utilize 2 x 12 = 24 cores) and instead chose to go with two very, very powerful GPGPUs. One of these GPUs is not even connected to any Thunderbolt port. And their decision to make the Mac Pro a custom machine will make it a lot cheaper: if you buy the closest-matching graphics card on its own (the 6 GB FirePro W9000), you pay ~$3,500 apiece. Since I think the Mac Pro will come in under $7k, they're putting all that computing power in an affordable package. To me, their bet with the Mac Pro is that the massive increase in GPGPU performance will more than offset the other limitations. Their use of proprietary PCIe SSDs rather than 2.5" or 3.5" drives also gives the user a significant boost in performance: the new MacBook Air's SSD is already faster than any SATA SSD on the market, because the Air's SSD has a PCIe interface while SATA SSDs are limited by the throughput of the SATA interface.

I think the success of the Mac Pro really hinges on whether apps can make use of the GPUs for more ordinary compute tasks (something quite common in scientific computing these days). If that bet pays off, the price-performance of a Mac Pro will be really hard to beat (despite its limitations). If Apple gets this wrong or is too optimistic about the time scale, the machine will bomb in the market.
Well, if I'm wrong about the Mac Pro in the long term, then I'm wrong. Personally, for specialized computing work, I'm still going to prefer a real desktop. You're right on the GPU bit, though - if you can leverage both GPUs for complex math functions that would normally only hit the CPU, the Mac Pro could be compelling to certain markets (well, that and the price point, obviously).

But I also prefer building all my own desktops, because it's really super fun. So I get that I'm not the consumer the Mac Pro is marketed toward. Frankly I've never understood why anyone would buy one (whereas the Power Macs were pretty awesome and I love my G4).
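As an aside, the SATA-vs-PCIe ceiling mentioned above is easy to check with back-of-envelope arithmetic. A quick sketch - the 6 Gbit/s SATA III and 5 GT/s PCIe 2.0 line rates and the 8b/10b encoding overhead are published figures, but the assumption that the Air's SSD sits on a two-lane PCIe 2.0 link is mine:

```python
# Interface bandwidth ceilings: why a PCIe SSD can outrun any SATA SSD
# no matter how fast the flash behind the interface is.

def effective_mb_per_s(line_rate_gbit):
    """Payload bandwidth in MB/s of an 8b/10b-encoded serial link."""
    # 8b/10b encoding carries 8 payload bits in every 10 line bits.
    return line_rate_gbit * 1e9 * 8 / 10 / 8 / 1e6

sata3 = effective_mb_per_s(6)        # SATA III: 6 Gbit/s line rate
pcie2_lane = effective_mb_per_s(5)   # PCIe 2.0: 5 GT/s per lane
pcie2_x2 = 2 * pcie2_lane            # assumed two-lane link in the Air

print(f"SATA III ceiling:    {sata3:.0f} MB/s")     # 600 MB/s
print(f"PCIe 2.0 x2 ceiling: {pcie2_x2:.0f} MB/s")  # 1000 MB/s
```

So even a perfect SATA SSD tops out around 600 MB/s, while the PCIe link has headroom well beyond that.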
     
Professional Poster
Join Date: Jul 2005
Location: Winnipeg, MB
Status: Offline
Reply With Quote
Jun 24, 2013, 02:30 PM
 
I think in five years Apple will have finally tweaked the icons in iOS enough that they make sense again. Hopefully by then the pendulum will have swung back closer to the middle, once they realize that a microphone is still more easily identifiable to normal people than a sound wave.

I'm hoping by then iCloud will be a more robust option, including universal file types with versioning for graphics and audio (and video while they're at it?).

Hopefully 16 gigs will stop being entry level.

The Apple TV will have 802.11ac support, which, combined with the MFi game pads, will allow iOS developers to leverage iOS devices as home consoles - and that will really screw over Microsoft and Sony.

I hope Apple will not go in the direction of overly active icons. What they're doing with the clock in iOS 7 is fine, but frequently updating visual noise is not something I look for in a platform.

As far as the Mac Pro goes, I love the direction they're moving in, and hope that I'll have the cash to buy one someday. I think the inclusion of a second GPU may mean that we'll start seeing beefier GPUs in the MacBook Pro - probably after the MacBook Air goes retina.

I think over the next year and a half all of Apple's major OSes will be moved over to something that more closely resembles iOS 7 ... which is sad because OS X looks soooo much better.

I also hope that Apple is going to start focusing more on the pro market software wise and will continue to frequently upgrade the Mac Pro, now that they have a computer they like there again.
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Jun 24, 2013, 02:48 PM
 
Originally Posted by shifuimam View Post
I guess what I'm getting at is that with a single unified interface and API for desktop/laptop and mobile devices, developers have the opportunity to make better software even more efficiently, so you can use the same app across phone, tablet, and laptop, only pay for it once, and know that your files and whatnot will be compatible across the board.

That's pretty compelling IMO.
For somebody who only has a single finger to work with, yes.

For everybody else, it will be quite a drag either to be limited to a single finger when operating their app, or to have no access to any multi-touch-operated functionality in desktop mode, where all they have is a single pointing device.

This difference is not arbitrary, and not easily coded for. It is the most fundamental consideration when building an interface.

Originally Posted by shifuimam View Post
The problem Apple still has with business customers is the lack of support. I realize that enterprise is a different beast - if you spend enough money on the Apple ecosystem, you can negotiate whatever support contract you want. When I worked at Purdue, on the other hand, they just paid to have their own techs to maintain Apple certifications so they could order replacement parts through GSX and still maintain their warranties.

Small businesses don't have either of these options. Office-based small businesses that rely on their computers to make money don't have the time or resources to take a computer to an Apple Store for, say, an LCD replacement on a laptop or iMac. Apple Stores don't do while-you-wait repairs for the most part anymore. Not only that, but if you're a business that has iMacs or Mac Pros, lugging those things to an Apple Store just to have a repair done is probably not an option for you.
As I said above:

Apple does not make business computers. They do not market their Macs towards businesses, and any business that uses them knows what they're getting into. They know that they can't compete with Dell, or Compaq, or HP, or IBM, or Gateway, or Packard-Bell, and they don't want to. Interestingly, the business PC market hasn't exactly worked out well for most of those, either.

Apple DOES make business handhelds, and absolutely zero of your arguments apply there. Zero. Because, at least right now, if you have a defect in one of your devices, you walk into the Apple Store and walk out with a functioning device, IT runs it through their iOS management and setup software, and that's it.
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Reply With Quote
Jun 24, 2013, 05:56 PM
 
Originally Posted by Spheric Harlot View Post
For somebody who only has a single finger to work with, yes.

For everybody else, it will be quite a drag to either only be able to use a single finger to operate their app, or to not have access to any multi-touch-operated functionality on the desktop mode, when all they have is a single pointing device.

This difference is not arbitrary, and not easily coded for. It is the most fundamental consideration when building an interface.
This is an intriguing view of interface design. I view multi-touch much like a multi-button mouse. They are definitely nice to have, but god help you if a user needs to know what they all do in order to use your app (because many users will ignore them). But then again, I'm just a novice at touch-interface apps. Are there some good counter-examples* that I'm not thinking of?

*examples of apps where multitouch is a necessity instead of a luxury? I mean, I have 5 fingers on my hand, and I have 5 buttons on my mouse. Are they really such an incompatible comparison as you make it sound?
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Jun 24, 2013, 06:21 PM
 
For starters, figure out how to zoom in and out when you try to control the iOS interface from a single-pointer device. Ctrl-scroll on a Mac does something very different: that zooms the entire interface, blowing up every pixel.

Next, discard all apps that are actually built around multi-touch interfacing — any of the thousands of apps that have sprung up that allow you to manipulate pixels, or act as controllers, often replacing expensive dedicated hardware.

Conversely, borrow someone's iPad sometime and try to control your Mac via a VNC app from the iPad's touch interface. It's quite enlightening as to some of the basic assumptions that are completely thwarted (context-"click"? Touch and hold? Errr... how do I drag anything?). These issues can be overcome if need be, by neat hacks, but they remain kludges that make it work, and are in no way actual permanent interface solutions. If you want to make something work equally well on both platforms, you're either building two completely separate designs, or building a massive kludge of compromise.
     
Addicted to MacNN
Join Date: Aug 2006
Location: Somewhere in the Pacific Northwest
Status: Offline
Reply With Quote
Jun 24, 2013, 07:06 PM
 
The combination of a detachable touch device for lighter work and a dockable mouse and keyboard is, well, amazing. Left/right click, multitouch trackpad, everything you could ask for.

I use my TF300 to RDP into my Windows servers all the time. It's flawless and I have a real mouse to do everything I might need to do. Once I'm done, I can disconnect, undock, and go right back to the fanciful multitouch tablet-only form factor whenever I want.

A docking iPad+keyboard/mouse combo would be pretty rad. Something tells me it would sell, too.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jun 24, 2013, 08:26 PM
 
Originally Posted by Salty View Post
I think in five years Apple will have finally tweaked the icons in iOS enough that they make sense again. Hopefully by then the pendulum will have swung back closer to the middle once they realize that a microphone is still more easily identifiable than a sound wave to normal people.


I chuckled that your first thought in this thread was what the icons will look like.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jun 24, 2013, 08:30 PM
 
I still think that Apple has demonstrated over the years, and with some of their current behavior, that as far as their mobile market goes, Android will become the de facto business platform, while iOS and OS X will be for creative professionals and upper-middle-class home users/consumers.

I.e. Apple will eventually move closer to where they were before this mobile explosion.
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Online
Reply With Quote
Jun 25, 2013, 12:06 AM
 
Originally Posted by besson3c View Post
I still think that Apple has demonstrated over the years, and with some of their current behavior, that as far as their mobile market goes, Android will become the de facto business platform, while iOS and OS X will be for creative professionals and upper-middle-class home users/consumers.
You predict a recurrence of the old status quo of Apple vs. Windows from the 1990s and early 2000s. I see no evidence of that happening. Amongst the programmers-for-hire I know, the only Android development going on is for consumer-facing apps. E.g., if you're a big newspaper, you want an Android app as well as an iOS app. But within the industry, I haven't heard of a single example of a big deployment of Android devices and apps. I'm sure there are some, but the momentum behind iOS here is huge. Just the problem of fragmentation makes it very hard for developers to deploy to various Android devices.

And I think the biggest reason for that not happening is that a lot of the innovation involves a confluence of hardware and software. If, say, Apple decides the best way to switch between users on iOS 8 is a built-in fingerprint scanner, then it simply equips all its devices with one.
I don't suffer from insanity, I enjoy every minute of it.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jun 25, 2013, 12:27 AM
 
Originally Posted by OreoCookie View Post
You predict a recurrence of the old status quo of Apple vs. Windows from the 1990s and early 2000s. I see no evidence of that happening. Amongst the programmers-for-hire I know, the only Android development going on is for consumer-facing apps. E.g., if you're a big newspaper, you want an Android app as well as an iOS app. But within the industry, I haven't heard of a single example of a big deployment of Android devices and apps. I'm sure there are some, but the momentum behind iOS here is huge. Just the problem of fragmentation makes it very hard for developers to deploy to various Android devices.

And I think the biggest reason for that not happening is that a lot of the innovation involves a confluence of hardware and software. If, say, Apple decides the best way to switch between users on iOS 8 is a built-in fingerprint scanner, then it simply equips all its devices with one.


It is definitely that way now, but I think this is mostly because Apple has a big head start on Google in providing the tools to create rich, high-quality mobile apps. Once developers can create apps that are at least "good enough" for business, I see no reason in the world for mass hardware purchases not to be on the cheaper platform.

I was at a meeting not too long ago where we heard from a taxi service with a really neat approach to the taxi business. Instead of rate clocks, fares were based on the actual distance to the destination as reported by Google Maps, the fleet was entirely Toyota Prius based, and the drivers were equipped with Android tablets. If this kind of thing becomes common and these tablets can be built to be good enough to hold up, why wouldn't the owner of this company buy, say, 1000 of these tablets if they were $50-100 cheaper and the software worked just fine on Android?

At this point you could make the argument that the software would run better on an iPad somehow, but this is going to be a tough sell. Windows XP was what it was because it was cheap and basically worked, most of the time. The cheaper platform that basically works will be the next Windows XP for mobile.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jun 25, 2013, 12:31 AM
 
The Android fragmentation issue is huge - I'm actually struggling with it right now, dealing with video playback issues on Android 2.3. It sucks, and I'm not an Android fan right now, but you have to think that eventually these problems will be solved. 5 years is a long time.
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Online
Reply With Quote
Jun 25, 2013, 01:36 AM
 
Originally Posted by besson3c View Post
It is definitely that way now, but I think this is mostly because Apple has a big head start on Google in providing the tools to create rich, high-quality mobile apps. Once developers can create apps that are at least "good enough" for business, I see no reason in the world for mass hardware purchases not to be on the cheaper platform.
Android is not cheaper to develop for in this environment - that's exactly the point.
The problem is really fragmentation: with a Windows PC, it doesn't matter whether the beige box has a Dell or HP badge on it, but the same cannot be said for Android handsets. You cannot mix and match handsets, because different handsets from different manufacturers have different skins and upgrade schedules. What's worse, there are multiple variations of the »same« model for different markets (e.g. the Samsung Galaxy S 4 has different SoCs in different markets). Hence, unlike PCs, different Android handsets are not interchangeable. Even if you stick to a particular line, say Samsung's Galaxy series, it's not clear that the Samsung Galaxy S III you received last year will ever get the same software as your colleague's new S 4.

You don't have these problems with iPhones and iPads. Software in the business world has a really long shelf-life, and I think it's much more likely that you'll be able to migrate different iOS versions than different versions of Android + skin + customization.
Originally Posted by besson3c View Post
The Android fragmentation issue is huge - I'm actually struggling with it right now, dealing with video playback issues on Android 2.3. It sucks, and I'm not an Android fan right now, but you have to think that eventually these problems will be solved. 5 years is a long time.
I think it's not out of the question that Android's market share will implode in a few years' time, depending on whether Samsung switches to its own fork of Android or to Bada. Samsung is the only Android handset manufacturer that makes a profit. The rising Chinese manufacturers don't use Google services in their Android forks, so Google does not profit from Huawei's rise in popularity. HTC is struggling to survive. Motorola is kept at arm's length by Google (Pichai reiterated this point during his interview at All Things D).
I don't suffer from insanity, I enjoy every minute of it.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jun 25, 2013, 01:52 AM
 
Originally Posted by OreoCookie View Post
Android is not cheaper to develop for in this environment, that's exactly the point.
The problem is really fragmentation: with a Windows PC, it doesn't matter whether the beige box has a Dell or HP badge on it, but the same cannot be said for Android handsets. You cannot mix and match handsets, because different handsets from different manufacturers have different skins and upgrade schedules. What's worse, there are multiple variations of the »same« model for different markets (e.g. the Samsung Galaxy S 4 has different SoCs in different markets). Hence, unlike PCs, different Android handsets are not interchangeable. Even if you stick to a particular line, say Samsung's Galaxy series, it's not clear that the Samsung Galaxy S III you received last year will ever get the same software as your colleague's new S 4.

So your position is that all of this is unchangeable?
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Jun 25, 2013, 02:10 AM
 
Originally Posted by shifuimam View Post
The combination of a detachable touch device for lighter work and a dockable mouse and keyboard is, well, amazing. Left/right click, multitouch trackpad, everything you could ask for.

I use my TF300 to RDP into my Windows servers all the time. It's flawless and I have a real mouse to do everything I might need to do. Once I'm done, I can disconnect, undock, and go right back to the fanciful multitouch tablet-only form factor whenever I want.

A docking iPad+keyboard/mouse combo would be pretty rad. Something tells me it would sell, too.
It's necessary for operating the completely different interface, though.

That's the exact opposite of software that will run on both platforms equally usably.
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Jun 25, 2013, 02:15 AM
 
Originally Posted by besson3c View Post
So your position is that all of this is unchangeable?
Do you see any indication at all that it is changing?

What is happening on the Android side to rein in fragmentation issues among the myriad manufacturers and devices?

Currently, it seems to me that the only way these issues might get simplified is when all Android manufacturers except Samsung actually go out of business from hemorrhaging money year after year. Perhaps Samsung will then have migrated its sixty available models to Bada, and Google will finally have actual control over the platform, because Motorola will be the only remaining manufacturer that uses it.
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Jun 25, 2013, 02:24 AM
 
Originally Posted by besson3c View Post
I was at a meeting not too long ago where we heard from a taxi service with a really neat approach to the taxi business. Instead of rate clocks, fares were based on the actual distance to the destination as reported by Google Maps, the fleet was entirely Toyota Prius based, and the drivers were equipped with Android tablets. If this kind of thing becomes common and these tablets can be built to be good enough to hold up, why wouldn't the owner of this company buy, say, 1000 of these tablets if they were $50-100 cheaper and the software worked just fine on Android?

At this point you could make the argument that the software would run better on an iPad somehow, but this is going to be a tough sell. Windows XP was what it was because it was cheap and basically worked, most of the time. The cheaper platform that basically works will be the next Windows XP for mobile.
What happens when one of those tablets breaks down in a year and a half? Will the next one conform to hardware specifications the software has been developed for? Will you still get tablets that run on the 2.3 version you've written for, or have all the tablets actually been updated continually to newer OS versions as you updated your software to run on them (the way it works on iOS), so you can just go out and buy an off-the-shelf replacement and install your app?

What Windows XP had going for it was a HUGE installed user base, and businesses could rely on Microsoft making sure that whatever they built next was more or less backwards-compatible with their software investment. (People also tend to overlook that having to support XP for ten years was the LAST thing Microsoft wanted; it was only the result of a completely botched platform introduction with Vista.)
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Online
Reply With Quote
Jun 25, 2013, 03:21 AM
 
Originally Posted by besson3c View Post
So your position is that all of this is unchangeable?
No, not at all. As far as I understand it, you have argued that Android, being the »Windows of the mobile world«, will eventually gain the upper hand. I don't think this is true, and I have argued why.

Many of these problems (fragmentation, and the lack of revenue flowing from Android handsets to Google as a way to finance Android development) are consequences of a birth »defect« of Android: it's open source, so anyone can fork it. I don't see how this can ever be undone, and thus I see no indication that this will change. If I were on Google's Android team, fragmentation would be incredibly frustrating: they put all their energy and effort into improving the OS, but given that the largest share of users is still on some flavor of Android 2.3 (~39%), developers cannot reap the benefits of all the advances in the APIs and the system. Instead, you have to program for the lowest common denominator -- which is an OS from 2010. So in many cases, you're stuck with the capabilities of an OS that is only roughly comparable to iOS 4!
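To put the lowest-common-denominator problem in concrete terms, here's a toy calculation. Only the ~39% Gingerbread share comes from the distribution figure above; the way the remainder is split across newer API levels is invented purely for illustration:

```python
# Toy sketch: the installed-base share an Android app can reach as a
# function of the minimum API level it targets. Only the Gingerbread
# share (~39%) is from the real distribution numbers; the rest of the
# split is a placeholder.

distribution = {   # API level -> approximate share of active devices
    10: 0.39,      # Android 2.3 "Gingerbread"
    15: 0.28,      # Android 4.0 (placeholder)
    16: 0.33,      # Android 4.1 and later (placeholder)
}

def reachable_share(min_api):
    """Fraction of devices able to run an app requiring min_api."""
    return sum(share for api, share in distribution.items() if api >= min_api)

print(round(reachable_share(10), 2))  # target Gingerbread: 1.0
print(round(reachable_share(15), 2))  # require 4.0+: only 0.61
```

Requiring anything newer than the 2010-era APIs immediately writes off more than a third of the installed base, which is exactly why everyone targets the lowest common denominator.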

If you are working for a large company, the lack of updates has a second important consequence: a lack of security fixes! Or rather, security fixes exist, but you don't necessarily receive them at all, let alone in a timely fashion.
I don't suffer from insanity, I enjoy every minute of it.
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Reply With Quote
Jun 25, 2013, 07:12 AM
 
Originally Posted by Spheric Harlot View Post
For starters, figure out how to zoom in and out when you try to control the iOS interface from a single-pointer device.
That has already been figured out many times over. In Google Maps, for instance, there are two redundant solutions: the scroll wheel and floating "+" and "-" buttons. There's also the Photoshop-esque magnifier tool, which is established in the computer metaphor lexicon, as are the cmd/ctrl + and - keyboard shortcuts.

Don't forget that the entire need for constant pinching and zooming is itself a kludge of compromise for the benefit of the mobile form factor. Most apps could simply not offer anything special for zooming (in desktop mode) and no user would even notice (a web browser, for instance). The only time it would be necessary is when you're using your tablet as a laptop-alike, in which case you could simply use the tablet's multitouch.

As I said, there are as many button combos as finger combos, and you could reinvent this wheel to map a button combo to the multitouch combo that had previously reinvented the various zoom conventions from the desktop computing arena - but that's not even necessary. People were zooming in and out inside applications before the iPhone was an apple in Apple's eye.
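For what it's worth, those desktop zoom conventions really are interchangeable front-ends to one piece of state. A minimal sketch - the class name, step factor, and limits are all arbitrary, not taken from any real toolkit:

```python
# One zoom state, three single-pointer front-ends: scroll wheel,
# on-screen +/- buttons, and keyboard shortcuts. No second finger needed.

ZOOM_STEP = 1.25  # multiplicative step per click/keypress (arbitrary)

class ZoomState:
    def __init__(self, level=1.0, lo=0.25, hi=8.0):
        self.level, self.lo, self.hi = level, lo, hi

    def wheel(self, clicks):
        """clicks > 0 zooms in, clicks < 0 zooms out (ctrl+scroll style)."""
        for _ in range(abs(clicks)):
            self.level = (self.level * ZOOM_STEP if clicks > 0
                          else self.level / ZOOM_STEP)
        self.level = max(self.lo, min(self.hi, self.level))  # clamp

    def zoom_in(self):   self.wheel(+1)  # floating '+' button or cmd-+
    def zoom_out(self):  self.wheel(-1)  # floating '-' button or cmd--

z = ZoomState()
z.wheel(+2)     # two wheel clicks in
z.zoom_out()    # one '-' press back out
print(z.level)  # 1.25
```

Whether any of these mappings is *pleasant* compared to a pinch is a separate question, but they're hardly unsolved problems.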


Next, discard all apps that are actually built around multi-touch interfacing — any of the thousands of apps that have sprung up that allow you to manipulate pixels, or act as controllers, often replacing expensive dedicated hardware.
I already told you I'm not familiar enough with mobile apps to have ever used, seen or heard of any of those. Can you please name an example so I can know what you're talking about?

Contrarily, borrow someone's iPad sometime and try to control your Mac via a VNC app from the iPad's touch interface.
Sorry I won't be installing apps on someone's iPad that is nice enough to let me borrow it. You'll have to actually make your point using your words

It's quite enlightening as to some of the basic assumptions that are completely thwarted (context-"click"? Touch and hold? Errr...how do I drag anything?).
How does the magic trackpad not make this a solved problem? IOW, what can you not do without using the hardware button of a trackpad? Or are you just insisting that we decline to use solutions that are already established?

These issues can be overcome if need be, by neat hacks, but they remain kludges that make it work, and in no way actual permanent interface solutions.
I think it's a cop-out to pretend that kludges can't become permanent interface solutions. The cursor was a kludge at one point, and so were the right-click and the scroll wheel. All it takes is for users to get used to them, and then they're as permanent as anything else.
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Reply With Quote
Jun 25, 2013, 07:26 AM
 
Originally Posted by OreoCookie View Post
No, not at all. As far as I understand it, you have argued that Android, being the »Windows of the mobile world«, will eventually gain the upper hand. I don't think this is true, and I have argued why.
But Android hardware is cheaper for consumers. I agree with besson3c, though I expect it will take longer than 5 years to make the shift. Android is still very immature, which is the cause of a lot of the problems you described, but it will stop being so volatile in time, and then the cheapness of the gear will gradually win market share.
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Online
Reply With Quote
Jun 25, 2013, 08:26 AM
 
Originally Posted by Uncle Skeleton View Post
But android hardware is cheaper for consumers.
Sure, that's why Android has by far the largest market share. But its profit share is much, much smaller, and Android isn't developed for free. No one except Google seems to be capable of developing Android (have you heard much about Amazon's fork lately?), so if Google cannot sustain its development, the fact that devices are cheaper means nothing, because we're almost back at square one.
Originally Posted by Uncle Skeleton View Post
I agree with besson3c, though I expect it will take longer than 5 years to make the shift. Android is still very immature which is the cause of a lot of the problems you described, but it will stop being so volatile in time, and then the cheapness of the gear will gradually win marketshare.
You don't see the danger of Android development no longer being worth it for Google in case Samsung switches to another OS?
I don't suffer from insanity, I enjoy every minute of it.
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Reply With Quote
Jun 25, 2013, 08:36 AM
 
Sure, that's a risk, but the same risk exists for PCs. Samsung makes PCs too, and Windows is even more dependent on MS than Android is on Google. (So why is it more of a problem for mobiles?)
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Jun 25, 2013, 08:44 AM
 
Originally Posted by Uncle Skeleton View Post
That has already been figured out many times over. In Google Maps, for instance, there are two redundant solutions: a scroll wheel and floating "+" and "-" buttons. There's also the Photoshop-esque magnifier tool, which is established in the computer-metaphor lexicon, as are the cmd/ctrl + and - keyboard shortcuts.

Don't forget that the entire need for constant pinching and zooming is itself a kludge of compromise for the benefit of the mobile form factor, and most apps could simply not offer anything special for zooming (in desktop mode) and no user would even notice (a web browser for instance). Any time it would be necessary would be when you're using your tablet as a laptop-alike, in which case you could simply use the tablet's multitouch.
"Most" isn't good enough. "Most" graphics apps, for example live and die by constant zooming-in and -out.

But yes, so you solve this particular instance by cluttering the interface with on-screen widgets that are there solely for the benefit of using a mouse, even though there is a far simpler and more effective interface when running on a touchscreen.

And that is exactly what you would do in every single other case.

Or you'd do it like Office for Windows RT, and simply take the mouse interface and space everything far enough apart to be pushable with a finger. Yay. Zero advantage for enabling touch. Zilch.

Originally Posted by Uncle Skeleton View Post
As I said, there are as many button combos as finger combos, and you could reinvent this wheel to map a button combo to the multitouch combo that had previously reinvented the various zoom conventions from the desktop computing arena, but that's not even necessary. People were zooming in and out inside applications before the iPhone was an apple in Apple's eye.
Have you never used an iPad? The point of multi-touch is not to have several control buttons at once, it is to be able to touch several points on the screen at the same time. If I have an interface that allows me to, say, move four sliders at the same time, having five buttons on a single mouse that controls one pointer is as useful as having fourteen kinds of nails when I'm trying to whitewash a house.

Put another way:
One of the really cool and fun things about finger painting is not that it does similar things to what you can do with a single brush. It's that you have ten fingers to work with.
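To make the multi-touch point concrete: pinch-to-zoom is nothing more than the ratio between how far apart two fingers are now and how far apart they were when the gesture began. Here is a minimal sketch in plain Python (the touch coordinates are hypothetical, and this is not any platform's actual gesture API):

```python
import math

def distance(p, q):
    """Euclidean distance between two touch points (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pinch_scale(start_touches, current_touches):
    """Zoom factor implied by two moving fingers: the ratio of the
    current finger spread to the spread when the gesture began."""
    return distance(*current_touches) / distance(*start_touches)

# Two fingers start 100 px apart...
start = ((100, 100), (200, 100))
# ...and spread to 200 px apart: the content should scale 2x.
now = ((50, 100), (250, 100))
print(pinch_scale(start, now))  # → 2.0
```

A mouse offers exactly one point, so this ratio simply does not exist for it, which is why mouse-driven UIs fall back on scroll wheels and "+"/"-" buttons.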

Originally Posted by Uncle Skeleton View Post
I already told you I'm not familiar enough with mobile apps to have ever used, seen or heard of any of those. Can you please name an example so I can know what you're talking about?
Well, iPhoto for one.
Paper.
MIDI Designer
Lemur
GarageBand.

Originally Posted by Uncle Skeleton View Post
Sorry I won't be installing apps on someone's iPad that is nice enough to let me borrow it. You'll have to actually make your point using your words
Twenty seconds with a VNC client makes obvious what I could spend weeks trying to explain to you.

Originally Posted by Uncle Skeleton View Post
How does the magic trackpad not make this a solved problem? IOW, what can you not do without using the hardware button of a trackpad? Or are you just insisting that we decline to use solutions that are already established?
Read carefully: you're controlling a Mac FROM A MULTI-TOUCH INTERFACE.

You've got it backwards.

Originally Posted by Uncle Skeleton View Post
I think it's a cop-out to pretend that kludges can't become permanent interface solutions. The cursor was a kludge at one point, and so were the right-click and the scroll wheel. All it takes is for users to get used to them, and then they're as permanent as anything else.
The cursor was a REVOLUTION, not a kludge. It led to the creation of far superior interfaces, but ONLY if the legacy crap was completely removed (as Oreo or P wrote above: Windows didn't become a viable Mac competitor until the command line was buried far enough away that developers couldn't assume it would be used anyway).

How was the scroll wheel a "kludge"?

I define a "kludge" as something that worsens the product for being there, but is necessary to work around a problem.

You can implement extra controls for "mouse button down" or "context click" when attempting to control a mouse-based interface from a pointerless multi-touch OS, but that's still going to be a pain in the ass. The only way to get around that is to remove the problem and not allow cursor-based software in the first place.
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Online
Jun 25, 2013, 08:46 AM
 
Originally Posted by Uncle Skeleton View Post
Sure, that's a risk, but the same risk exists for PCs. Samsung makes PCs too, and Windows is even more dependent on MS than Android is on Google. (So why is it more of a problem for mobiles?)
I should have been more careful: I'm not talking about the risk to Samsung. Samsung already has its own operating system, and you can buy Samsung smartphones with at least three different OSes these days. No, the risk does not lie with Samsung; the risk lies with Google, in my opinion: they have to be able to pay the development costs up front, and if the profit share drops to zero because Samsung goes its own way, there is very little money to go around.

Do you understand my point now?

PS Samsung just announced it's getting out of the desktop PC market (they'll still sell laptops, though).
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Jun 25, 2013, 08:55 AM
 
Originally Posted by OreoCookie View Post
I should have been more careful: I'm not talking about the risk to Samsung. Samsung already has its own operating system, and you can buy Samsung smartphones with at least three different OSes these days. No, the risk does not lie with Samsung; the risk lies with Google, in my opinion: they have to be able to pay the development costs up front, and if the profit share drops to zero because Samsung goes its own way, there is very little money to go around.

Do you understand my point now?
I got it the first time. How is it any different a risk than if MS stops developing/profiting from Windows because Samsung goes its own way (like to Linux)?
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Online
Jun 25, 2013, 08:59 AM
 
Originally Posted by Uncle Skeleton View Post
I got it the first time. How is it any different a risk than if MS stops developing/profiting from Windows because Samsung goes its own way (like to Linux)?
The difference is that Microsoft makes billions off the sale of Windows licenses; Google does not. The only two handset manufacturers that are profitable are Apple and Samsung, so if Samsung stops selling Android phones, the profit share of Android handset makers drops to zero.

So Google depends strongly on a single handset maker, but Microsoft does not depend on a single PC manufacturer.
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Jun 25, 2013, 10:28 AM
 
Originally Posted by Spheric Harlot View Post
"Most" isn't good enough. "Most" graphics apps, for example live and die by constant zooming-in and -out.
Right, they "live and die" by this kludge in response to the screen being too small to even begin to work without constant zooming. That's why I'm saying kludges aren't as big a problem as you seem to be implying.


But yes, so you solve this particular instance by cluttering the interface with on-screen widgets that are there solely for the benefit of using a mouse, even though there is a far simpler and more effective interface when running on a touchscreen interface.

And that is exactly what you would do in every single other case.
In many cases (most, maybe?), yes, you do need to provide the user with multiple ways to get where they need to go, because not every user knows about the "far simpler interface," or for some reason they can't or don't want to use it. For example, I'm looking at Google Maps on Android right now, and it has both the clutter and the multitouch, and the double-tap, which is another one I forgot about before. This many-ways-to-skin-a-cat situation is ubiquitous in both mobile and desktop computing.


Have you never used an iPad? The point of multi-touch is not to have several control buttons at once, it is to be able to touch several points on the screen at the same time. If I have an interface that allows me to, say, move four sliders at the same time, having five buttons on a single mouse that controls one pointer is as useful as having fourteen kinds of nails when I'm trying to whitewash a house.
It's not quite that useless; you could allow the user to map certain sliders to certain buttons or combos of buttons (in any given user's workflow there is a finite number of controls they need to operate). You could also shift-lock different widgets as needed, or toggle buttons, or whatever. I did notice the video of MIDI Designer shows a user making two sliders move using only one finger.

I think the real issue here is that you're basically saying "it wouldn't benefit anyone because it wouldn't benefit me." You're a power user, in a particular market, and for non-power users or even for power-users in other markets, a lot of these problems are irrelevant. Adding the capability to do this would not force you to use it, but it has a good chance of being a big improvement for other users.


Put another way:
One of the really cool and fun things about finger painting is not that it does similar things to what you can do with a single brush. It's that you have ten fingers to work with.
Right, you can finger paint with just 1 finger, but often it's way better to use multiple at once. Likewise, you can compute using 1 paradigm (touch vs mouse/keyboard) but often it's way better to be able to use multiple at once, or to switch back and forth at will.


The cursor was a REVOLUTION, not a kludge. It led to the creation of far superior interfaces, but ONLY if the legacy crap was completely removed (as Oreo or P wrote above: Windows didn't become a viable Mac competitor until the command line was buried far enough away that developers couldn't assume it would be used anyway).

How was the scroll wheel a "kludge"?

I define a "kludge" as something that worsens the product for being there, but is necessary to work around a problem.
Obviously the modern mouse worsened the product in some way, otherwise the Mac wouldn't have had to be dragged kicking and screaming towards it by 3rd party peripherals.

The cursor is a kludge the same way +/- palettes are a kludge; it's a blemish or clutter on the screen/content that shouldn't be necessary. All it does is indicate your point of attention. If there were another way to know the user's attention, the cursor would be horrific. The only difference between "kludge" and "revolution" is perception. If users had perceived the cursor as clutter, it would be as dead as those little nipples that used to be on laptops in place of trackpads.

You can implement extra controls for "mouse button down" or "context click" when attempting to control a mouse-based interface from a pointerless multi-touch OS, but that's still going to be a pain in the ass. The only way to get around that is to remove the problem and not allow cursor-based software in the first place.
What's wrong with 2-finger tap or long tap? I've seen these implemented already before, they can't be that bad.
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Jun 25, 2013, 10:37 AM
 
Originally Posted by OreoCookie View Post
The difference is that Microsoft makes billions off the sale of Windows licenses; Google does not. The only two handset manufacturers that are profitable are Apple and Samsung, so if Samsung stops selling Android phones, the profit share of Android handset makers drops to zero.

So Google depends strongly on a single handset maker, but Microsoft does not depend on a single PC manufacturer.
You expect this to remain the status quo? Were there no times when only 1 PC maker ruled the market? IBM? Dell?
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Online
Jun 25, 2013, 11:51 AM
 
Originally Posted by Uncle Skeleton View Post
You expect this to remain the status quo? Were there no times when only 1 PC maker ruled the market? IBM? Dell?
As far as I remember, there was no time when a single personal computer manufacturer dominated the whole market. IBM had plenty of competition from non-Windows/non-DOS computers back when it introduced its first PC (e. g. Amiga, Atari and Apple). And Dell has never held a position similar to Samsung's; there were always other big companies roughly comparable in size to Dell (e. g. HP or IBM/Lenovo).
     
Addicted to MacNN
Join Date: Aug 2006
Location: Somewhere in the Pacific Northwest
Status: Offline
Jun 25, 2013, 12:53 PM
 
Originally Posted by Uncle Skeleton View Post
I think the real issue here is that you're basically saying "it wouldn't benefit anyone because it wouldn't benefit me." You're a power user, in a particular market, and for non-power users or even for power-users in other markets, a lot of these problems are irrelevant. Adding the capability to do this would not force you to use it, but it has a good chance of being a big improvement for other users.

Right, you can finger paint with just 1 finger, but often it's way better to use multiple at once. Likewise, you can compute using 1 paradigm (touch vs mouse/keyboard) but often it's way better to be able to use multiple at once, or to switch back and forth at will.
This pretty much sums up why products like the Transformer and a number of Windows 8 and Windows RT tablet/keyboard combos are so awesome. You have all the power of the multitouch display, but you also have a real QWERTY keyboard for heavy-duty textual stuff (like coding, remote server management over SSH, and working on documents and email) and a multitouch trackpad with real left/right mouse buttons for cursor-heavy stuff (particularly VNC, RDP, and TeamViewer for remote desktop management and access).

It's not "or", it's "and". You know, like the new Ford commercials. Nobody wants to stay at a bed OR breakfast or eat sweet OR sour chicken. It's not multitouch screen OR keyboard+mouse. It's multitouch AND traditional input methods.

If people didn't want this, there wouldn't be a market for keyboard docks for the iPad. The only thing missing there is the real laptop form factor (e.g. a hinge rather than a stationary desktop-only product that the iPad docks into, or a product like Logitech's keyboard that snaps onto the iPad and has to be removed, turned on, and paired via Bluetooth to work), which is more comfortable to use when traveling and safely protects the display and keyboard when not in use.

As I've already pointed out, there's also the fact that a real keyboard dock + tablet combo enables expansion. My transformer dock has a full size SD slot and a USB host port, plus a second battery that sacrifices itself to keep the tablet's battery charged, so even when I'm out somewhere without access to a charger, the tablet stays up and running for hours.

Other things could be added, too, like an integrated smart card reader for business users, even bigger batteries, and other data connections (Thunderbolt, for instance, or wired LAN for applications where wifi is prohibited or not feasible).

What's wrong with 2-finger tap or long tap? I've seen these implemented already before, they can't be that bad.
It's not bad at all. I've used various apps in Android and iOS for remote access, and even without a mouse, it's relatively painless to use. These apps generally have a startup overlay screen (that can be turned off, of course) to remind you of the basic shortcuts for right click, click and drag, scrolling, etc.
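The kind of translation these remote-access apps perform can be sketched as a small lookup table. This is a hypothetical illustration in Python; the gesture names and bindings are made up for the example, and real apps differ in their exact mappings:

```python
# Hypothetical gesture-to-mouse-event table, in the spirit of the
# VNC/RDP-style apps described above; actual bindings vary by app.
GESTURE_TO_MOUSE = {
    "tap":             "left_click",
    "two_finger_tap":  "right_click",
    "long_press":      "left_button_down",  # begin click-and-drag
    "two_finger_drag": "scroll",
}

def translate(gesture):
    """Translate a touch gesture into the mouse event sent to the
    remote desktop; unknown gestures are ignored (returns None)."""
    return GESTURE_TO_MOUSE.get(gesture)

print(translate("two_finger_tap"))  # → right_click
```

The startup overlay screens these apps show are essentially this table printed for the user; once the handful of bindings is memorized, driving a cursor-based desktop from a touchscreen is workable, if not elegant.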

When I worked at the USDA, the IT department started a pilot to use Transformer Infinity tablets in place of iPads and expensive ultrabooks (this was for the executive staff who demanded fancy gadgets and, being the government, they generally got what they wanted). It was a significantly better experience for end users. Applications were available through Citrix, and the device itself was configured to sync the user's Exchange data (mail, contacts, calendars, and notes). Users who frequently traveled between multiple locations around the DC area found that they could sit at their desk and work in Citrix, go elsewhere, and pick up right where they left off with minimal pain. The tablet was essentially turned into a completely portable thin client, with the added advantage of giving access to email and documents (stored locally on the device) without having to log in to Citrix.

I'm not sure why anyone would want to insist that the idea of a tablet with additional functionality is a BAD thing.

Originally Posted by OreoCookie View Post
As far as I remember, there was no time when a single personal computer manufacturer dominated the whole market. IBM had plenty of competition from non-Windows/non-DOS computers back when it introduced its first PC (e. g. Amiga, Atari and Apple). And Dell has never held a position similar to Samsung's; there were always other big companies roughly comparable in size to Dell (e. g. HP or IBM/Lenovo).
I don't really know any numbers, but wasn't Dell much bigger than the competition in the consumer computing market for a while in the late 90s and early 2000s?
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Jun 25, 2013, 03:13 PM
 
The maximum worldwide market share of any single manufacturer since 1996 was 19%, by HP in 2007.

Market share of personal computer vendors - Wikipedia, the free encyclopedia
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 25, 2013, 05:00 PM
 
Originally Posted by Spheric Harlot View Post
What happens when one of those tablets breaks down in a year and a half? Will the next one conform to hardware specifications the software has been developed for? Will you still get tablets that run on the 2.3 version you've written for, or have all the tablets actually been updated continually to newer OS versions as you updated your software to run on them (the way it works on iOS), so you can just go out and buy an off-the-shelf replacement and install your app?

Windows XP had going for it that it had a HUGE installed user base, and that businesses could rely upon Microsoft making sure that whatever they built next was more or less backwards-compatible with their software investment. (People also tend to overlook that having to support XP for ten years was the LAST thing Microsoft wanted, and was only the result of a completely botched platform introduction with Vista.)


You are preaching to the choir in terms of the current mess; I don't disagree. You and Oreo might be right that the odds are against things getting much better.

I haven't used the Bada OS, but it just seems logical to me that, no matter what the technology is, there is an opportunity Apple is not snatching up: making the cheapest devices for businesses, similar to what Dell/Microsoft and the other players did. If there are business reasons why this will be difficult to pull off, similar or identical to the reasons that have been cited, so be it; I'm just coming at this from more of a bird's-eye view in identifying the market opportunity.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 25, 2013, 05:05 PM
 
Originally Posted by Uncle Skeleton View Post
But Android hardware is cheaper for consumers. I agree with besson3c, though I expect it will take longer than 5 years to make the shift. Android is still very immature, which is the cause of a lot of the problems you described, but it will stop being so volatile in time, and then the cheapness of the gear will gradually win marketshare.
Exactly, although one hole in my logic is that Apple won't want to cater to those who want the cheapest gear. Maybe Apple will cut deals with companies that want large quantities of devices, or will make some sort of ultra-cheap device just for greater market penetration, even if they have to sell them at a loss. It's hard to see them doing this today, but again, my arguments are based on the premise that 5 years is a long time, so...
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 25, 2013, 05:08 PM
 
Also, to reiterate: as far as Spheric's points about the hidden costs of having to support all of the legacy hardware out there and software that was (as per his example) written for Gingerbread and not tested in Jelly Bean, these arguments about hidden costs are not going to matter to Joe Sixpack business guy, I guarantee it.

Mac users have been going on for years about the costs of having an insecure Windows OS dependent on anti-virus protection, but people don't give a shit; they just apply the duct tape that is Norton AntiVirus (or whatever the anti-virus software du jour is) and consider the problem solved.

The lowest price always wins.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 25, 2013, 05:12 PM
 
Originally Posted by OreoCookie View Post
Sure, that's why Android has by far the largest marketshare. But its profit share is much, much smaller, and Android isn't developed for free. No one except Google seems to be capable of developing Android (have you heard much about Amazon's fork lately?), so if Google cannot sustain its development, the fact that devices are cheaper means nothing, because we're almost back at square one.

You don't see the danger of Android development no longer being worth it for Google if Samsung switches to another OS?
I don't. I think people overvalue Samsung; Samsung's real trump card is Android. I think they are just creating Bada as an insurance policy, and a bit of an experiment. There are lots of people who can create handsets; the real trick is making the software to run on them. If it's not Samsung, it could just as easily be any other handset maker.
     
The Mighty
Join Date: Feb 2004
Location: Well the sports issue was within arm's reach but they closed up shop and kicked me out. And I'm out of toilet paper.
Status: Offline
Jun 25, 2013, 06:13 PM
 
Well here are my predictions:

[1] the iMac will no longer be a desktop computer. It'll still be a full-featured Mac, running Mac OS, but it will be about the size (and thickness) of an iPad and can be used either as a desktop machine (i.e. plug in to electrical outlet and use a wireless keyboard/mouse), or as a touch portable like the iPad, and in either case, you can connect a larger monitor. The only full desktop computer left will be the Pro. Everything else will be portable or portable-desktop. (Computer labs will still use the iMac, or an educational version of it.)

[2] the need to have a working internet connection to sync between Mac and iPhone is stupid. Apple will implement their own wireless communication technology, something like Bluetooth (maybe Airtooth), that will phase out Bluetooth on wireless devices. (Issues with Bluetooth: constant energy hog, not useful for persistent networking or sharing big files, etc.)

[3] I'm really not sure what's going to happen between iOS and Mac OS, but I expect the two OSs will remain separate.

[4] iPod classic and shuffle will be phased out completely.
This one time, at Boot Camp, I stuck a flute up my PC.
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Jun 25, 2013, 06:33 PM
 
Originally Posted by And.reg View Post
[2] the need to have a working internet connection to sync between Mac and iPhone is stupid. Apple will implement their own wireless communication technology, something like Bluetooth (maybe Airtooth), that will phase out Bluetooth on wireless devices. (Issues with Bluetooth: constant energy hog, not useful for persistent networking or sharing big files, etc.)

Why is it stupid? The fact that files (other than music) are uploaded to iCloud constantly, not just at sync time, makes it quite practical. It would be less practical and slower to have to do big data comparisons at sync time, which would be the case if the cloud didn't exist.
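To put the continuous-upload argument in concrete terms, here is a toy sketch (hypothetical file-version data, not Apple's actual sync protocol) contrasting a full comparison at sync time with applying each change the moment it happens:

```python
# Toy illustration: full diff at sync time vs. continuous push.
# Libraries are hypothetical dicts mapping file name -> version number.

def full_diff(device, computer):
    """Without continuous sync: compare both whole libraries at sync
    time to find what changed; cost grows with library size."""
    changed = {name for name in device
               if device[name] != computer.get(name)}
    new = set(computer) - set(device)
    return changed | new

def continuous_push(device, change):
    """With continuous sync: apply each change as it happens, so at
    'sync time' there is nothing left to compare."""
    name, version = change
    device[name] = version
    return device

computer = {"song.m4a": 2, "doc.pages": 1, "new.key": 1}
device = {"song.m4a": 1, "doc.pages": 1}
print(sorted(full_diff(device, computer)))  # → ['new.key', 'song.m4a']
```

The trade-off besson3c describes is exactly this: the cloud spreads the comparison cost over time, at the price of needing a working internet connection.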
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Jun 26, 2013, 07:23 AM
 
Originally Posted by Uncle Skeleton View Post
Right, they "live and die" by this kludge in response to the screen being too small to even begin to work without constant zooming. That's why I'm saying kludges aren't as big a problem as you seem to be implying.
That's not why you constantly zoom in and out of an image. You zoom in because you can more easily work on details when you can see them, and better work on other things when you have a broader overview. Unless you're working with a twelve-foot screen, you're probably always going to be zooming in, same way a painter will move closer to the canvas or farther away, depending upon what he's doing.


Originally Posted by Uncle Skeleton View Post
In many cases (most, maybe?), yes, you do need to provide the user with multiple ways to get where they need to go, because not every user knows about the "far simpler interface," or for some reason they can't or don't want to use it. For example, I'm looking at Google Maps on Android right now, and it has both the clutter and the multitouch, and the double-tap, which is another one I forgot about before. This many-ways-to-skin-a-cat situation is ubiquitous in both mobile and desktop computing.
I assume you're talking about the mobile web view. Ugh.

It's ugly, redundant, ungainly, cluttery, and annoying. The Google Maps app on iOS has none of that clutter.

But that's because it doesn't need to compromise.

Originally Posted by Uncle Skeleton View Post
It's not quite that useless; you could allow the user to map certain sliders to certain buttons or combos of buttons (in any given user's workflow there is a finite number of controls they need to operate). You could also shift-lock different widgets as needed, or toggle buttons, or whatever. I did notice the video of MIDI Designer shows a user making two sliders move using only one finger.
Hey, of course I can program any interface tailor-made to whatever input device I have lying around. But if I'm going to completely redesign the interface around whatever input device I have lying around, I'm basically rewriting the entire user layer, which is exactly why one-app-fits-all is not going to happen (or at least, not going to work).

Originally Posted by Uncle Skeleton View Post
I think the real issue here is that you're basically saying "it wouldn't benefit anyone because it wouldn't benefit me." You're a power user, in a particular market, and for non-power users or even for power-users in other markets, a lot of these problems are irrelevant. Adding the capability to do this would not force you to use it, but it has a good chance of being a big improvement for other users.
I think you completely misjudge the market.

iPad is as different from traditional computers as Macintosh was from text-based computers. Most people don't notice because it's a completely natural progression, and because it looks largely the same.
But the differences start with things like doing away with pull-down menus and submenus.

The consumer market will stone you if you force them all to use six-button mice with custom scrollwheels and programmable interface widgets. The consumer market, when something doesn't work, just goes "huh. that sucks" and puts it aside.

I may be a power user, but that term has thankfully become quite meaningless over the past decade. Power user used to mean that one would delve deeply into the technology and spend weeks configuring every set-up and programming every detail of every environment.

Nowadays, it just means "willing to deal with whatever complexity is necessary to get the job done".

I spend my money on stuff that works the way I do and reduces complexity, and the market has moved towards me in a big way.


Originally Posted by Uncle Skeleton View Post
Right, you can finger paint with just 1 finger, but often it's way better to use multiple at once. Likewise, you can compute using 1 paradigm (touch vs mouse/keyboard) but often it's way better to be able to use multiple at once, or to switch back and forth at will.
Way to twist my point.

If you have a medium that requires ten fingers to make any sense, it is completely pointless to make it available for a single brush.

If I have a console with twenty-four sliders on it, making a version available with a single slider and twenty-four clickable buttons next to that makes zero sense, because it does nothing whatsoever that isn't already available.

Tell me what would be the point of this app, for example, reduced to a single-pointer interface:

Adobe Eazel for Photoshop

The only reason that even EXISTS is because of multi-touch.



Originally Posted by Uncle Skeleton View Post
Obviously the modern mouse worsened the product in some way, otherwise the Mac wouldn't have had to be dragged kicking and screaming towards it by 3rd party peripherals.

The cursor is a kludge the same way +/- palettes are a kludge; it's a blemish or clutter on the screen/content that shouldn't be necessary. All it does is indicate your point of attention. If there were another way to know the user's attention, the cursor would be horrific.
And guess what? When you're using an interface built entirely for direct manipulation through touch, that's exactly what it is.

Originally Posted by Uncle Skeleton View Post
The only difference between "kludge" and "revolution" is perception. If users had perceived the cursor as clutter, it would be as dead as those little nipples that used to be on laptops in place of trackpads.
The difference between revolution and kludge is improvement vs. detriment.

The mouse pointer is not just an "indicator of your point of attention". It is an entire proxy layer placed between you and the machine you are trying to operate. The mouse pointer isn't you, or your attention. It is a remotely operated tool that you can use to select things, manipulate things, and even open lists of commands to select from without having to type them.
It is the single focus of EVERYTHING you can do with a computer at a time.
It's the difference between you at a workbench, surrounded by tools, and you at a glass box, using a joystick to operate a robotic arm that can pick up various tools.

Eliminating that entire layer of abstraction is a HUGE thing. Putting it back, but suggesting that there is no difference because it can still be operated by touch, is thoroughly confused and a profound misunderstanding of what's actually happened.

I'm not exaggerating, either: I've been working with this stuff for 25 years, and I spent a decade of those years selling it and supporting customers. The pros ("power users") will deal with whatever they have to in order to get their job done, and always have. But to the Regular Joes, a computer is an alien thing, and it's largely the interface-by-mouse-proxy that makes it so. It's a stunning contrast to see those same people first encounter the iPad: that's just a natural extension of the world around them.

And application interfaces need to be completely re-thought for that.
     
The Mighty
Join Date: Feb 2004
Location: Well the sports issue was within arm's reach but they closed up shop and kicked me out. And I'm out of toilet paper.
Status: Offline
Reply With Quote
Jun 26, 2013, 08:47 AM
 
Originally Posted by besson3c View Post
Why is it stupid?
Let's say there's a thunderstorm, and the power goes out, or work is being done on the line and I can't connect to the internet... or some other nonsense, or say I'm visiting at a cabin and don't have internet, or only dial-up. Because of these circumstances, I can't sync wirelessly between my computer and iPod? That's bulls***. The devices are 2 FEET APART, I shouldn't have to use the whole entire internet to sync.
This one time, at Boot Camp, I stuck a flute up my PC.
     
Addicted to MacNN
Join Date: Nov 2002
Location: Rockville, MD
Status: Offline
Reply With Quote
Jun 26, 2013, 10:31 AM
 
Originally Posted by Spheric Harlot View Post
That's not why you constantly zoom in and out of an image. You zoom in because you can more easily work on details when you can see them, and better work on other things when you have a broader overview. Unless you're working with a twelve-foot screen, you're probably always going to be zooming in, same way a painter will move closer to the canvas or farther away, depending upon what he's doing.
I have a twelve-foot screen, I use it every day (for watching videos). Does that mean you agree with me?

(I'm not using it while writing this post, but more to the point, all the same apps run on the twelve-footer as run on this two-footer. That doesn't make any of the apps break; all it means is that the user has the flexibility to choose which apps work better on which screen.)



I assume you're talking about the mobile web view. Ugh.
Nope, it's the native app that came with my phone. It works better than the iOS maps apps in my experience. For one thing (that's on topic), zooming out to get an idea of what you're looking at is a real chore with multitouch, but pecking at the "-" button 5-6 times in succession is natural and easy. (or is there a secret multitouch trick I don't know about for quick zoom-outs?)



Hey, of course I can program any interface tailor-made to whatever input device I have lying around. But if I'm going to completely redesign the interface around whatever input device I have lying around, I'm basically rewriting the entire user layer, which is exactly why one-app-fits-all is not going to happen (or at least, not going to work).
You are being single-minded. All your reasoning relies on there only being one way. You imply that allowing a mouse means you can't use your multitouch anymore. That is not it at all. They are not mutually exclusive. The app developer can support both ways without detracting from either, in the best case. Even in the worst case the developer only cares about multitouch and the mouse-version is limited to single-finger, but the app still runs. Some users will only ever use single-finger anyway so this is not a problem. Others like you will only use the app on multitouch hardware so this is not a problem. One day you might even wish to open the app on your desktop for its non-primary function like accessing/converting data or to take a quick screenshot for a user guide you're writing (because the writing itself is far easier on a desktop).
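To make the "both ways can coexist" point concrete, here's a minimal sketch of how an app could keep one shared model behind two independent input paths. Everything here is invented for illustration (the `ZoomModel` class name, the 1.5x step factor, the clamp range); the point is only that discrete input (a "-"/"+" button, a scroll-wheel click) and continuous input (a pinch gesture) can drive the same state without either path knowing the other exists.

```typescript
// Hypothetical sketch: one zoom model, two input paths.
// A mouse user pecks buttons; a multitouch user pinches.
// Both mutate the same state, so neither detracts from the other.
class ZoomModel {
  private level = 1.0;
  private readonly min = 0.25;
  private readonly max = 8.0;
  private readonly factor = 1.5; // per button press

  // Discrete path: "+" / "-" buttons, wheel clicks, keyboard shortcuts.
  step(direction: 1 | -1): number {
    const next =
      direction > 0 ? this.level * this.factor : this.level / this.factor;
    return this.setLevel(next);
  }

  // Continuous path: a pinch gesture reports a scale factor per update.
  pinch(scale: number): number {
    return this.setLevel(this.level * scale);
  }

  private setLevel(next: number): number {
    // Clamp so both paths respect the same limits.
    this.level = Math.min(this.max, Math.max(this.min, next));
    return this.level;
  }

  get current(): number {
    return this.level;
  }
}
```

Pecking "-" five times and one long pinch-out both land in the same clamped state, which is the whole argument: supporting the mouse path costs the multitouch path nothing.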



I think you completely misjudge the market.

iPad is as different from traditional computers as Macintosh was from text-based computers. Most people don't notice because it's a completely natural progression, and because it looks largely the same.
But it starts off with things like pull-down menus and submenus.
I'm not saying they're the same, I'm saying that keeping them separated is a kludge. Some tasks are far better suited to one, and some tasks are far better suited to the other, but sometimes you need to do both tasks in the same workflow. Do you get what I'm saying?


The consumer market will stone you if you force them all to use six-button mice with custom scrollwheels and programmable interface widgets. The consumer market, when something doesn't work, just goes "huh. that sucks" and puts it aside.
But that's not true if you force them to use 6 fingers at the same time?

These advanced gestures are just as obscure as advanced button mapping or advanced keyboard shortcuts. Their availability must be strictly optional, or else the consumer market would stone you.


Nowadays, it just means "willing to deal with whatever complexity is necessary to get the job done".

I spend my money on stuff that works the way I do and reduces complexity, and the market has moved towards me in a big way.
Keeping the tablet and PC two separate things instead of one is more complex. Picking up your work to take with you when you go to the water cooler (or water closet) is "stuff that works the way I do." Your description here supports (the option of!) merging the two, not stubbornly keeping them separate.


Way to twist my point.

If you have a medium that requires ten fingers to make any sense, it is completely pointless to make it available for a single brush.
Way to presuppose your conclusion.


If I have a console with twenty-four sliders on it, making a version available with a single slider and twenty-four clickable buttons next to that makes zero sense, because it does nothing whatsoever that isn't already available.
Ok, and if I have a text-based app with no sliders on it, then making a version available with no sliders and a cursor makes plenty of sense. Query: if there are useful applications for a technology alongside useless ones, does that combo make the entire technology (A) useless or (B) useful?

Tell me what would be the point of this app, for example, reduced to a single-pointer interface:

Adobe Eazel for Photoshop

The only reason that even EXISTS is because of multi-touch.
Tell me what would be the point of a battery monitor app, or a trackpad driver, on a desktop with neither battery nor trackpad? The only reason they EXIST is because of the laptop. By your logic, laptops and desktops should run different incompatible operating systems. If any app doesn't make sense on both, then NO app makes sense on both. Amirite?


And guess what? When you're using an interface built entirely for direct manipulation through touch, that's exactly what it is.
That's exactly what I said.


The difference between revolution and kludge is improvement vs. detriment.
That's another way of saying what I said; those are measures of perception.


I'm not exaggerating, either: I've been working with this stuff for 25 years, and spent a decade of those years selling this stuff and supporting customers. The pros ("power users") will deal with whatever they have to to get their job done, and always have. But the Regular Joes are people to whom a computer is an alien thing, and it's largely the interface-by-mouse-proxy that does this.
Bingo, improvement is a measure of perception. The power users see complexity as improvement, while J6Ps see the very same complexity as detriment, and they're both right.


And application interfaces need to be completely re-thought for that.
Allowing it to run in 1 finger mode too will take nothing away from your precious new frontier of interface enlightenment.
     
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Online
Reply With Quote
Jun 26, 2013, 12:39 PM
 
Originally Posted by besson3c View Post
The lowest price always wins.
I think that's flat-out wrong, especially since the profit share is usually in the upper market segments. Lowest price only works for commodities like toilet paper, stuff that anyone can make. There are few companies in the world who can write operating systems or produce cars.
I don't suffer from insanity, I enjoy every minute of it.
     
Grizzled Veteran
Join Date: Mar 2002
Location: NY
Status: Offline
Reply With Quote
Jun 26, 2013, 12:42 PM
 
Exactly what it is today. A Consumer Electronics Co. The only big difference IMO, is that all devices will run a Hybrid OS.
Somewhat like OSX Desktop & OSX Mobile, but not OSX. Maybe Apple OS Liger. Desktop, and Mobile? Maybe not. IMO, we will have a better sense of direction when pricing on the new MP is released. I could be way off base, but there are rumors of an "all in one" OS in development in Cupertino.
"To know your Enemy, you must become your Enemy."
Sun Tzu
     
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jun 26, 2013, 01:31 PM
 
Originally Posted by And.reg View Post
Let's say there's a thunderstorm, and the power goes out, or work is being done on the line and I can't connect to the internet... or some other nonsense, or say I'm visiting at a cabin and don't have internet, or only dial-up. Because of these circumstances, I can't sync wirelessly between my computer and iPod? That's bulls***. The devices are 2 FEET APART, I shouldn't have to use the whole entire internet to sync.
Can't you sync over WiFi using iTunes?

I'm okay with secondary local sync mechanisms as a supplement to iCloud, but not with replacing iCloud.
     
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Jun 26, 2013, 07:46 PM
 
Originally Posted by glideslope View Post
Exactly what it is today. A Consumer Electronics Co. The only big difference IMO, is that all devices will run a Hybrid OS.
Somewhat like OSX Desktop & OSX Mobile, but not OSX. Maybe Apple OS Liger. Desktop, and Mobile? Maybe not. IMO, we will have a better sense of direction when pricing on the new MP is released. I could be way off base, but there are rumors of an "all in one" OS in development in Cupertino.
Could you point me to those rumors?

The only things I've read are musings such as those presented in this thread, or by the kids on the Verge forums. Nothing approaching actual "rumors".
     
 
All contents of these forums © 1995-2015 MacNN. All rights reserved.