Utter rot.
This used to happen all the time in the 1980s and 1990s, before the DoJ anti-trust lawsuit really got rolling.
It was most obvious in office apps (ever wonder where the third-party spelling checkers and grammar checkers went? Or the standalone mailmerge applications? Microsoft added their functionality to Word and killed an entire add-on market at a stroke each time they did so), but a load of that stuff happened in Windows too (the graphical shell that became an OS in its own right). The most flagrant late example was web browsing; the most recent one I can think of (not being a Windows user) was their antivirus/malware add-in.
(Honestly ... young 'uns these days ... wanders away mumbling into beard and waving walking stick in the air.)
The combination of dominating two or more levels is what made it lethal. There were OS, GUI, and application companies aplenty back then, but only one company did them all, and it used its internal knowledge to make life very hard for the competition. And the final key in the lock was the application level: nobody who made it big in the systems sphere ever got there besides Microsoft.
Microsoft may have lost their anti-trust lawsuit, but the damage was done, and it won them the war until the web came along.
If Sir Tim should be remembered for anything at all it was for breaking the stranglehold Microsoft had on personal computing, freeing us from the Application level headlock.
Oh, and in my time we didn't have walking sticks.
Proprietary file formats - what happens when you don't set out to make a shared format, or an evil plot for lock-in and dominance? You decide.
Yes, and the reason for the DOJ anti-trust lawsuit? Developers were screaming bloody murder.
"Angry developers" enter into this in no meaningful sense. To the players actually involved in that fight, on all sides, developers were simply cogs.
Apple doesn't hold a monopoly on mobile operating systems and, even if it did, it would take more than simply putting someone out of business with a new offering to put it on the wrong side of the law. It's easy to see that once a company gets as big as Apple, any new feature it builds is going to put someone out of business somewhere.
(Instapaper will be fine, by the way).
Microsoft has practically walked on eggshells when adding solutions/features since Windows 95, for fear of anti-trust action and of alienating developers.
How long did it take them to get a s/w firewall and anti-virus solution? 10+ years after the fact. It wasn't because they couldn't figure it out.
How about simple features in VS.NET that current 3rd-party add-ons provide? It just does not happen.
Recently on HN there was a post saying that you shouldn't chase anything obvious and generic without expecting it to get 'filled in'. A cloud service for audio? Yep, when the big players get around to it, you're out. An imaging collation app for dentists? The big players are not going to do that.
> authority on taste and acceptable content
I don't know about that. If I want something that Apple doesn't approve for their store - say Grooveshark, for plausible legal complications in the future - I get the Grooveshark app anyway, from some other store that's legal (hint: jailbreak).
Personally speaking: I would rather have someone make sure (for free) that the app I download isn't buggy, crashy, or otherwise broken. In practice, I've never missed anything on the App Store (on an iPad 2 I bought a few months ago), although I miss a lot of stuff on the Android Market (on a Nexus One I've owned for a few years now). I got an Android before I got an iOS device going by such armchair arguments. But I'm glad I can choose my phone, and it's going to be an iPhone 4S when it launches here. I'm glad you can choose your phone; I presume that'll be an Android. That way, both of us can be happy.
Marco needs to get over his fears[1] and store the offline data in Documents. It is user-generated and absolutely neither transient nor re-downloadable, given that a core feature of Instapaper is offline reading.
Apple will get a deluge of bug reports and questions from all app devs that make apps that need to cache content for offline use - but not back it up or store it in iCloud - and will rectify the situation in some way. (I don't think Instapaper is in that camp but that's kind of beside the point.)
I know that most people hold Apple to a higher standard than many other companies but let's not forget Hanlon's Razor: "Never attribute to malice that which is adequately explained by stupidity." This is merely an oversight. Apple is never this hostile to the user experience, and the current guidelines make for a positively horrid user experience. It will be rectified. Is there a short-bets website? I'll make that bet any day.
[1] I think he was correct not to take chances in getting the first iOS 5 version out, but I hope that the minute it was "Processing for App Store" he had a build ready for submission that stores content in Documents to feel out the review team's reaction to it.
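For what it's worth, a minimal sketch of the "store in Documents" approach might look like the following. The function and folder names are mine, not Instapaper's; the backup-exclusion flag is the `NSURLIsExcludedFromBackupKey` attribute Apple shipped in iOS 5.0.1 (exposed in Swift as `URLResourceValues.isExcludedFromBackup`), which keeps data in Documents out of iCloud backups while still protecting it from the system's cache cleanup:

```swift
import Foundation

// Hypothetical sketch: keep offline articles under Documents/ so the OS
// never purges them, and (on iOS) flag the folder so it stays out of
// iCloud/iTunes backups. `base` stands in for the app's sandbox root.
func offlineStoreURL(in base: URL) throws -> URL {
    let store = base
        .appendingPathComponent("Documents", isDirectory: true)
        .appendingPathComponent("OfflineArticles", isDirectory: true)
    try FileManager.default.createDirectory(
        at: store, withIntermediateDirectories: true)
    #if os(iOS)
    // Assumes iOS 5.0.1+: mark the folder as excluded from backup.
    var url = store
    var values = URLResourceValues()
    values.isExcludedFromBackup = true
    try url.setResourceValues(values)
    #endif
    return store
}
```

Whether the review team accepts backup-excluded data in Documents is exactly the open question; this only shows that the mechanism exists.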
I won't excuse Apple for acting like a King, but I think Jeff should find another poster boy for benevolent dictators. Microsoft is famous for steamrolling third-party developers, both from their applications group and their systems group.
I think this rant would read better if it complained about ALL proprietary platforms and used Apple as an example, rather than disingenuously implying that they are the rotten fruit in the barrel.
p.s. Joel Spolsky once said that companies always try to "Commoditize their complements." If you as a developer can create something that adds value to the platform in a broad way, it's inevitable that the platform owner is going to want to commoditize it, either by giving it away or making it easy for your competition to drive prices down to negligible levels.
Building it into the platform is the ultimate commoditization.
I suspect that Jeff is in a Microsoft-centric culture, so he simply doesn't spend as much time talking to Apple developers as he does to Microsoft developers, and of course he hears more from the latter. Likewise, there are way more Windows developers than OS X developers, so you'll always hear more about anything from them.
iOS is popular, so there ought to be plenty of developers out there in the long tail. Maybe that's the real problem: Most developers are trying to grab a little tiny piece of the pie with a niche side-project, so they don't have the same perspective as someone who has employees and a big marketing budget sunk into their business.
If this supposition is correct, you'll see the same dynamic with other walled gardens from Microsoft and Google and whomever else gets into the "curated app store" business.
I'm not trying to defend Microsoft. It's all too clear they have been very anti-competitive. But if Windows had the controls on it that iOS apps have, we'd be hearing folks call for criminal prosecutions.
I understand the gee whiz factor of Apple. I own a bunch of Apple stuff and I love their design. I also understand that if you don't control your garden, all kinds of weeds grow in there. But geesh, folks, Jeff is correct. Perhaps this is the best future we could hope for, but it is an extremely sub-optimal destination compared to where we thought we were going.
The problem is, as devs we are spoiled by the web, where you can just push out new code that says or does whatever you want it to without any consequence because the web is a "relatively" safe runtime, so nobody cares.
We are like children who grew up with a silver spoon in our mouth and we've been asked to endure plastic. Sure, it's still a spoon, but it's not silver and that pisses us off.
Well, yes, and that's why we had crappy phones for years before iPhone and Android arrived, as noted by Jeff Atwood. I think it shows again that, not unlike the first PCs and their open slots for third-party cards, openness is the right move.
Many people here seem to renounce this, mostly because Apple's walled garden is currently at a pinnacle, but in my opinion that is an accident. Apple's products capture most of the attention for emotional reasons, and some forget that the future is not built on closed formats, closed markets, etc. I am surprised that this is not more obvious, especially to US citizens. The Web builds on open protocols. The PC era was built on the fact that you could open the box and plug in things of your own making. Wikipedia, which everyone constantly uses and forgets, is an open community effort.
My two-year-old kid is learning a lot, but it is not linear; he sometimes goes backward a bit. I think we are in such a backward pulse, with everyone suddenly dismissing the ability to open, change, and modify software in exchange for a (temporary) slightly better user experience.
To the Apple lover crowd: please feel free to downvote, and check my past comments to downvote them too if you haven't already. I'm used to it now; I don't care about my karma. Anyway, I'll lose some more re-upvoting all those perfectly acceptable comments I find grayed out in some threads, apparently because they are not respectful enough to the King.
Oh, and I like Apple. I learnt computing on an Apple ][, I applauded the very smart Unix move for MacOS, and many of my colleagues and friends own an MBP or an Air, but none would be enough of a zealot to try to hide comments expressing concerns about walled gardens.
Apple is milking their success right now, because developers have iPhones and they want to support their favorite hardware. But every story like this drives off a handful of developers. And every week the simple economies look more and more favorable to the competing platform.
Really, the Apple ][ analogy is very good. That platform too ultimately killed itself not for technical reasons, but because Apple refused to give up its margins in the face of the C64 and PC markets and assumed that its stocked developer mindshare would save it. It didn't.
We are still in the early history of popular computing. It may be that openness tends to win in the end but I think this is far from established as fact at this point. I think you put this in unnecessarily black and white terms too. The future may favor hybrids of open data models and closed, native clients.
Back then, they had a huge lead over other platforms due to the then-revolutionary GUI environment on the Mac. Then Jobs was forced out of Apple, and their marketing and product design couldn't keep up with what was being developed in the more open and more chaotic corners of the industry. It's no coincidence that the most open hardware architecture won, and that the company that developed the most successful OS for that platform became the dominant software company for over a decade.
Android is to iOS in 2011 what Windows was to MacOS in 1990, and in 2025, Apple will probably look a lot like it did in 1995.
No, we're like adults used to being able to do what we want with our own hardware. Being able to do what you want with hardware is the historical computing norm (gaming consoles being the exception). Android-based devices carry on this tradition so I use them, rather than paying Apple $99/year for the privilege of putting applications I've written on my own hardware.
2. The cleanup feature doesn't really support his point. If I store data on my phone and the phone deletes it all without warning when it thinks I have too much, that's not protecting me at the expense of the app developer - that's just plain screwing me and the developer at the same time. Honestly, I find it incomprehensible that any professional could possibly have considered it a good idea, and I think it's indicative of Apple's manic secrecy that it wasn't headed off early instead of being ignored until release.
I know Apple's doing really well in the market lately - by innovating quicker than anybody else, which has been fantastic for everyone. But in the long run, this arrogance is not going to be good for them. It shot them in the foot for two decades with the Mac, and it's going to bite them now.
The "correct" place to put user data on iOS 5+ is any place that syncs with iCloud. That's the crux of this issue: iOS 5 is Apple's assertion that App Store users are their customers, not the app developers' customers, and they want to handle the backup and security around their customers' data.
And honestly, if /tmp were the only filesystem I were allowed to touch, then yeah, I'd try to do something with it. That's kinda screwed up.
Um? My understanding is that Apple has forbidden developers from storing in cloud-sync areas anything that is re-downloadable or temporary. Instapaper pages are almost by definition re-downloadable; and there's no need for permanence or even synchronisation with the cloud, just a need for the system not to silently delete the files.
Is my understanding of Apple's policy flawed?
Is it really so hard for the OS to notify me that I have very little free space, tell me I have x amount of temporary files stored, and give me the option to remove them? It could tell me that removing these files may have adverse effects on 3rd-party applications I have installed.
I understand the impulse to look at however many millions of iOS devices and immediately want into that market, but the long tail is not a comfortable place to be in a land of 99c standard prices. Having an arbitrary and capricious landlord makes it worse.
I don't like Apple's philosophy in the slightest and don't own any Apple gear (old iPods excepted), so I'm not a fan of theirs at all. But this 30% thing is just a non-issue. They're a retailer providing content, and 30% is a boringly normal number for the retailer's cut.
If Amazon wanted to share their profits with the retailer (like book publishers have to with brick-and-mortar stores), then those books would be available. If Apple allowed those purchases for free, they wouldn't get any benefit for providing the service.
At a basic level, it's because her mental model of How Things Work broke. She thinks that it's reasonable that using the Amazon Kindle application on any platform she ought to be able to buy books from Amazon as part of that app. After all, she's signed in, Amazon knows who she is, at most they ought to ask for a password confirmation to let her spend her money.
And that doesn't work.
So she was unhappy with Amazon. Reasonable. She was so unhappy, she posted on Facebook about it, where a dozen of her friends pointed out that Apple wanted a 30% cut of content sold on their device.
Their device. Not her device. She paid for it, but now she sees that it isn't hers.
She thought that an iPod was like a computer: you use it, you select software to put on it, you use the software.
Her mental model breaks AGAIN, because she thought she owned the iPod. Now she finds out that it's actually an Apple-owned store that doesn't like competition.
Apple got their benefit when they sold her a device. They got more benefit when they sold her software to go on the device. Why should they get still more for interfering in a transaction between her and Amazon?
She knows what rent-seeking means, and she doesn't appreciate it.
http://www.amazon.com/forum/kindle/ref=cm_cd_ttp_ef_tft_tp?_...
Even big apps will eventually include features initially provided by plugins. See Photoshop or Jira.
Not to mention the strategy of web giants like Google, who will purchase existing successful commercial companies and then offer their products for free, crippling entire markets. See Google Analytics, Earth, and SketchUp.
I know that the Android equivalent has problems (piracy, for instance), but I'd rather have that than something completely locked down.
And seeing as Windows is not and never was an OS for power users, they will lose the remaining desktop niche (which will be populated by exactly those power users) as well, once the tablet becomes our primary multimedia communication device.
Apple is gunning hard for a market worth one trillion dollars, not Marco Arment. He is inconsequential collateral damage in Apple's race to a 100% managed computing experience.
If you develop something that is just an 'add on' or a missing feature you are setting yourself up for eventual trouble.
Such products have a life cycle and you can't reasonably expect the situation to continue unchanging forever.
If so, did they actually ask Marco before they took that link out?
If so, did Marco ask them to take it out?
And, if he did - which, having heard Marco speak on this topic, I do not assume, but merely suppose - was that the right call?
My understanding was that an App Store developer might kill for that kind of free publicity. Could it be, for example, that Apple stopped linking Instapaper so as to avoid playing favorites? Might one of Instapaper's competitors have complained about that link?
But Apple is pushing the envelope: they are the first platform to break the "specific device limit". Android competes on phones, Windows on the desktop, Amazon with content, Samsung on hardware. But Apple is everywhere. And they are not the underdog anymore. This is Tim Cook's real challenge, and we wish him luck.
You know what I also don’t understand? What this has to do with the big open vs. closed debate. Apple implemented a new feature in their own browser. Google can just as well implement the exactly same feature in their browser. Open vs. closed doesn’t figure into this. At all.
That whole cleaning behavior of iOS debate is just stupid. Apple screwed up. So what.
http://www.tbray.org/ongoing/When/200x/2003/07/12/WebsThePla... http://weblog.raganwald.com/2004/11/sharecropping-in-orchard...
Yes, because operating systems have now been written that make you a true sharecropper: not only do you need permission from the liege to distribute your app, but that permission can be revoked at any time, whenever your lord feels like it.
This was not the case with the proprietary systems of the past. Even Microsoft, criticized as the evilest of evils, did not try to stop you from distributing your Windows apps.
Courtesy of Google search by image: http://www.google.com/insidesearch/searchbyimage.html
Google can arbitrarily and capriciously exclude them from its index. When Google excludes you, there is no appeal and no explanation, and, unlike Apple, Google will not publish a set of (reliable) rules. (It gives a lot of advice, but inconsistently.)
Also, as with Apple, if you are not able to get into the big leagues for distribution, you can distribute your product through other, less popular channels that are more of a hassle.
Unlike Apple, however, which gives you explicit feedback on the feature that was the problem (with screenshots if needed) and always cites chapter and verse from the handbook for the exclusion, Google will not tell you why, or give you any way to resolve it.
With Apple, you can resolve the issue and resubmit; your app will be back on the store in about 7 days. With Google, even if you figure out what the problem is and resolve it, you have no way of knowing whether you'll ever be let back into the index.
As a web developer on SEO forums, I hear of these cases all the time, but when you analyze these sites, usually 1 of 3 things is happening:
- Web developer(s) made a mistake, causing a (search engine) accessibility problem.
- The site is in violation of the Google webmaster guidelines.
- The site lacks (unique) content, or otherwise doesn't contribute at all to a healthy search engine index.
I never read of a case of Google arbitrarily or capriciously excluding sites from their index, offering them no way to appeal. In general, I also think the advice Google gives is far from inconsistent.
This could be a popular position for a web developer who got a site de-indexed, but maybe apply Occam's razor first. A mistake? A trend? An algo change? Worthless content? Blackhat SEO? Bad architecture? Got hacked?
Or do you want to jump immediately to Google arbitrarily removing well-intentioned sites from their index? I guess then you can blame bad luck of the draw.
Example: "Your website has been removed from our indexes because it displays too many X's.", when no other search engine has thought of filtering websites based on X.
And because the decision is probably based on many, many parameters and weighted, it would have to tell you the values (and descriptions!) of each one of them, making the message even more sensitive.
When was the last time you hacked something together for yourself on an iOS device?
Are there any alternative channels that don't require jailbreaking?
and obviously, there are other, extremely simple and effective ways to reach a website: typing in the address bar, links by email, facebook, twitter, etc., etc.
the app store is the one and only gateway to the iOS platform.
This is truly unprecedented. Microsoft could screw you by cloning your app, but they never blocked third-party applications, nor tried to be the commerce gateway to the internet.
If Apple succeeds in making webapps obsolete, and competition is not strong enough to force it to be fairer and more reasonable in its App Store policy, then to me an iPad/iPhone app world sounds like a regression from the webapp world.
And this is why I never understand why so many Apple users want Android and Windows to fail. As a customer, you should want other platforms to be successful, so that we don't end up again with a monopolistic platform that screws us all. Didn't we try this before?
Huh? I thought it was the other way around, no? (i.e.: it's not free, must die, etc.)
And really, webapps aren't going anywhere. The App Store was created because of user demand (not because Steve Jobs was a genius), and with WebKit and the newer mobile IE it's probably the best time ever to write a web app. It's up to developers to choose.
This whole 30% thing comes from a new demographic that has no previous experience with business and doesn't really understand the value of distribution channels and middlemen.
While in theory it's nice that people can run any application they wish, in practice, it sucks. People end up having to be experts on computer security. As a group of computer professionals we've pounded on this for twenty years and it simply isn't fucking working. If telling people to be careful what programs they run or what websites they visit worked, it would have worked long ago.
Instead, I give them iPads for casual browsing and they're finally secure. My parents don't need my help to get pictures off their camera. There finally is a way for non-experts to securely use the internet and applications: just buy stuff from the App Store. It won't spam you, it won't steal information, it won't install spyware, and it will most likely do what it claims to do. If not being able to run arbitrary apps is the price we pay... well, we tried doing it the other way for 20+ years and it didn't work.
While in theory it is nice that people enjoy essential liberties, in practice, they suck.
Franklin sends his regards.
> it won't steal information
So, it's okay if Apple collects your personal data instead of some criminal? Sure is hypocritical.
>well, we tried doing it the other way for 20+ years and it didn't work.
You know why it doesn't work? Because we let people use computers, but don't require them to learn how to use them. I still don't get why people think they are entitled to use a computer. It's the same as demanding to drive a car without knowing how to operate it, in addition to having no clue about traffic rules. We're not attacking the root of the problem, we're simply slapping the symptoms around.
Criminals: steal tens of thousands of dollars from your bank account. And you may well be stuck with the losses. Apple: knows a bit of your personal information. So yeah, those are comparable.
And computer training? Another stupid idea that demonstrably doesn't solve the problem. People have been trying it since at least the Windows 95 era. Strangely, there are still botnets, viruses, malware, spyware, electronic theft, etc. But I'm sure it's going to really work any day now.