Apple is pushing computer users as fast as it can toward a centrally controlled computing ecosystem where it makes all the decisions about what native applications may be used on the devices it sells — and takes a cut of every dollar that is spent inside that ecosystem. This is a direct repudiation of its own history, and more broadly that of the larger personal-computing ecosystem, where no one can stop anyone else from writing and distributing software that other people might want to use.
Steve Jobs says Apple is a curator, nothing more. This grossly understates Apple's control. Jobs says Apple has “made mistakes” in being the police, judge, jury and executioner in its Disney-style world, and is working hard to perfect the system.
But this is disconnected from reality. Central control, no matter how well-intentioned, is itself the problem, not the solution. The “enlightened dictator” is a fiction, and a dangerous one.
I realize that I won’t persuade the many people who prefer to live in gated communities, believing they can leave any time they wish. But switching costs will only get higher over time for those who choose to live in the Apple ecosystem.
As noted, I’ve been happy in the relatively free Mac world. But given the slowing pace of Mac OS development, there’s reason to believe Apple is mostly milking its Mac OS users. Will it phase out serious PC development? Or will it eventually move its command-and-control methods up the value chain to the Mac? Apple says it’s committed to the Mac’s future. I’m not so sure, especially after Jobs, speaking at the Wall Street Journal’s All Things Digital conference earlier this month, made it clear that he believes the iPhone/iPad ecosystem is the real future of personal computing, with PCs becoming a much smaller player. (I’m a believer in tablets, and am planning to put my money on the Android OS when manufacturers adopt it in tablet-sized formats.)