Apple, on the other hand, is almost the opposite: they're obsessive about upgrading everyone and leaving the past behind. That's great, because they're free to make bold and innovative changes. The downside, of course, is that you and your files sometimes get left behind.
Both are valid strategies, in my opinion, and they appeal to different customers. If you feel that backwards compatibility is a really important feature, then, whether you want to admit it or not, you would probably be happy as a Microsoft customer.
The great thing about #1 is that if Apple really did leave money on the table here, any third-party developer can collect it. Just put a Mac with two versions of Keynote behind a REST endpoint and charge a buck or two per conversion. If you can turn a profit at that, not only do you get the smug satisfaction of clearly winning an internet argument, but there's also a cash prize.
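A minimal sketch of that endpoint using only Python's standard library; `convert()` here is a stub, since the real work (driving the two Keynote versions, e.g. via AppleScript) only makes sense on an actual Mac:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def convert(old_bytes):
    # Stub: on a real Mac this would open the upload in the old Keynote,
    # re-save it with the new one, and return the converted file bytes.
    return old_bytes

class ConvertHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the uploaded presentation and reply with the converted bytes.
        length = int(self.headers.get("Content-Length", 0))
        converted = convert(self.rfile.read(length))
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.end_headers()
        self.wfile.write(converted)

# To serve (blocks forever):
# HTTPServer(("localhost", 8080), ConvertHandler).serve_forever()
```

Billing the buck or two per conversion is left as an exercise.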
If, however, that sounds like a waste of a perfectly good weekend (an economic bet that is unlikely to pay off), then it would be an equal waste of a weekend for an Apple engineer. Apple is not a charity; they are a business that takes calculated risks, and they didn't like this one. Did they miscalculate? If you think so, there's no reason not to fill the gap yourself.
They leave the door open to losing customers to companies like Microsoft with these kinds of decisions. But that's the choice they made.
Software is hard. I think it's pragmatic for software vendors to have a strong, transparent philosophy about the trade-offs so that consumers can make the right choice. As the grandparent points out, Microsoft values backwards compatibility. If you value that too, buy Microsoft.
This is not a useful statement. If you convert the "should" to a "shall" and try to design a system around that requirement alone, it proves very hard. Even designing a clock around that requirement is very hard (http://longnow.org/clock/).
If you back off from the "shall", then you are in the standard world of engineering tradeoffs, which is where you started.
What features do you want to give up for "eternal"?
This can be said of all sorts of things, from the Commodore 64 to Keynote. If you want to access the data on an NES cartridge, you are going to need special hardware and software. Yes, the ROMs are available online now, but only because someone with the hardware and software made them available to others.
This is the nature of proprietary formats.
A better option is to stick to open standards. If your presentation is done in something like HTML, there is a 100% chance you will be able to view it even in 50 years, since it is standardized and there are many implementations available for viewing it.
Of course, if your presentation depends on some JavaScript calculations or a remotely hosted jQuery, those probably won't be working in 50 years' time either.
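As a sketch of how little is needed, here's a tiny Python function that emits a self-contained HTML "deck", one `<section>` per slide. The function name and page structure are my own invention, not any standard tool:

```python
def slides_to_html(slides, title="Talk"):
    """Render (heading, body) pairs as one self-contained HTML page."""
    sections = "\n".join(
        "<section><h2>%s</h2><p>%s</p></section>" % (head, body)
        for head, body in slides
    )
    return ("<!DOCTYPE html>\n<html><head><meta charset='utf-8'>"
            "<title>%s</title></head><body>\n%s\n</body></html>"
            % (title, sections))

# Example:
# open("deck.html", "w").write(slides_to_html([("Intro", "Hello")]))
```

No external dependencies, no remote scripts: any browser from any decade that speaks HTML can render it.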
It seems to me you can do pretty much anything whilst having a small set of conversion scripts to get the old data into your preferred new format.
It's not "big data", not streaming data, not rocket science.
Absolutely. One is for customers who don't want to lose any file or data. The other is for those who don't mind losing old stuff, no matter how important it may be. Now that's bold and innovative.
I'm not saying 'go full Stallman'. I'm just saying: whenever you hand your data over to a private company, think about whether they consider it as important as you do.
Proprietary software is fine, but if long-lasting hardware is dependent on it bad things happen when that software company decides it's no longer worth supporting.
It's easy to think, "Why don't they use [New Hotness Software]?", which on the surface seems like a good idea. Then you absolutely need sub-millisecond-precision I/O, and you kinda start to cry when you realize how hard precise timing in computers is.
If you run lab equipment on, say, Linux, BSD, OS X, or Windows, you're using a time-shared OS, not a real-time OS. So your I/O events aren't registered when the event happens, but when the scheduler is damn well ready to let you know the event happened.
The easiest example is some timing equipment I was using to count digital pulses from a quartz crystal. On a 'modern' secure OS I couldn't really get below a 0.1% margin of error, which wasn't low enough for our uses. I fell back to an older, insecure real-time platform and improved to 0.005%.
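The effect is easy to demonstrate on any time-shared OS. A quick sketch that asks for a 1 ms sleep and measures how far the scheduler overshoots the request (numbers will vary wildly with machine and load):

```python
import time
import statistics

def measure_sleep_jitter(interval=0.001, samples=200):
    """Return (mean, max) overshoot of time.sleep(interval), in seconds."""
    errors = []
    for _ in range(samples):
        start = time.perf_counter()
        time.sleep(interval)
        # How much later than requested did we actually wake up?
        errors.append(time.perf_counter() - start - interval)
    return statistics.mean(errors), max(errors)
```

On a desktop OS the overshoot is typically tens of microseconds to several milliseconds, and unbounded in the worst case; a real-time OS is what puts a hard bound on it.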
Security is great. I attend security conferences in my spare time and try to stay up to date on the topic. The main problem is that once you get into this kind of computing, most of it isn't running glibc and Win32. Hacking isn't very easy unless you know the system to start with.
Just to emphasize how prevalent this is: I work in a 5-story life-sciences research center with hundreds of personnel. All of our lab computers are airgapped, and many of them are stuck on OS versions more than 10 years old.
I used OS X 10.1 the other day. That was certainly an experience.
While I like non-proprietary software, there is nothing inherent about it that makes much of a difference for the lay person. As the complexity of the software goes up, so does the level of what counts as a 'lay person'. For any sufficiently complex software and non-pervasive problem, you're SOL in both cases. Standards help more than proprietary/non-proprietary does. Lower complexity also helps more.
(Standards wouldn't have helped either.)
In my opinion this is totally unacceptable, and fundamentally opposed to their philosophy of solving hard usability problems to make the user's life easier. If they were so committed to that, they would automatically read and update old files, and do so accurately.
More and more, Mac OS X is becoming a shell for other people's (preferably open source) software that I actually trust :\
You basically have to pirate it in order to do these kinds of shenanigans.
Apple has its pros and cons, and one of those is that you have to keep upgrading regularly.
Most proprietary software will offer upgrade paths from older versions.
In theory, open source is better because the old code is still out there and you can get it up and running to upgrade your data. In practice, it can be pretty tough to get open source code that hasn't been maintained in years to build and run properly on a current OS install.
With Libre software (rather than OSS), you the user stay in charge. It may require rolling up your sleeves (though things have changed, and you can now easily find companies that provide support for Libre software), but in the end you keep control.
MS is one of the best at supporting old formats. That's part of how and why they managed to keep their monopoly on the desktop. There aren't many OSes out there where you can start a piece of 20-year-old software and it almost always runs perfectly. (Try that on Linux or OS X.)
https://plus.google.com/115250422803614415116/posts/hMT5kW8L...
As for MS, my experience has been different. All too often, when trying to pull up old (mid-'90s) Word documents, MS Office chokes, whereas Libre/OpenOffice chugs along just fine.
On OS X that test would certainly fail (even much software that's 5-10 years old would fail it).
But Linux provides a very stable ABI.
20-year backwards compatibility might be stretching it for Linux, since it's only around 20 years old to begin with. But if you have (e.g.) a statically linked Debian binary from the late 1990s, I'd bet almost anything it would still work on your desktop today.
Pages '13 can't open templates created with Pages '09.
Pages '13 can't copy and paste lists with numbers into plain text.
There's a reason it has 2 stars on the App Store. Somebody at Apple seriously dropped the ball. Office never pulled shit like this. As soon as '14 is out for Mac, I'm removing iWork. Caring about design is one thing; caring about design more than about the existing work of your users is another.
Adobe’s marketing will tell you otherwise, of course. I used to share an office at Berkeley with Paulo Ney de Souza, who had a wonderful collection of “legacy” pdf files that could no longer be opened in Acrobat that he would trot out for the Adobe sales people when they came by (he was helping to get MSP off the ground at that point).
PDF is probably the best choice for preserving “design”, but I wouldn’t trust it for preserving content any more than any other format. Always keep a plain text copy.
I agree, but have a look at PDF/A (A is for Archiv{e|al}): http://en.wikipedia.org/wiki/PDF/A
> PDF/A is an ISO-standardized version of the Portable Document Format (PDF) specialized for the digital preservation of electronic documents.
> PDF/A differs from PDF by omitting features ill-suited to long-term archiving, such as font linking (as opposed to font embedding).
Edit: yep.
That's worked out fairly well so far. Stuff I wrote back in the 90s is still readable. Probably not that great if you're a designer, who needs something to look just so when printed. But for near enough everything I do, it's more than sufficient.
Easy to convert too, since it's just tagged text.
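For instance, here's a deliberately naive Python sketch that strips TeX markup down to plain text. Real tools like `detex` or `pandoc` handle far more cases; this only illustrates how shallow the tagging is:

```python
import re

def latex_to_text(src):
    """Very naive LaTeX-to-plain-text conversion, just to show the idea."""
    src = re.sub(r"%.*", "", src)                           # drop comments (naively)
    src = re.sub(r"\\[a-zA-Z]+\*?(\[[^\]]*\])?", "", src)   # drop command names
    src = src.replace("{", "").replace("}", "")             # drop grouping braces
    return re.sub(r"\s+", " ", src).strip()                 # collapse whitespace
```

Because the markup is plain, visible text, even a throwaway script like this recovers the content; try doing that with a binary .key bundle.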
The date came sooner than expected.
Patents would not discourage manual conversion, i.e., a human looking at the old presentation and recreating it in the new software. I'm just not sure there are any presentations worth that cost. God knows most presentations I have been subjected to are not.
I don't see why it's reasonable to expect to be given software from the CD/DVD era as a digital download. It would be nice, yes, but Adobe will not give you a digital download of CS5 (I tried.) I'd be surprised if Microsoft would give you a digital download of Office 2003.
The inability to read old documents is shitty, yes, but Apple made a solution available. If you need 5-year-old software, then it's not unreasonable that you need some now-obsolete hardware.
AFAIK it's not necessary to buy Apple-branded drives - there are cheaper alternatives. You can also "share" the CD drive of any other modern Mac on the same WiFi network - I've used the family iMac to load Creative Suite onto my MBA. I bet they'd also let you use an optical drive at the Genius Bar, even if you're out of warranty.
By comparison, my current version of Office easily allows saving (not just opening, but saving) in versions compatible with Office '97. That's ~17 years of backward compatibility for saving.
Though a lot of the reason for the complaints is that you generally don't need a copy of Office 2003.
I think we've reached the point where computers have become much more than powerful enough for a lot of the common tasks people use them for. The rest is just marketing with an aggressive "newer is better" campaign.
IMHO "forced deprecation" is nearly never a good thing. Change in software (and hardware) should be an evolution, not a revolution. Fix bugs and add features, don't take away what was there before. I think a lot more people value stability over "latest fashion" than what companies and the like would want you to think, so they can keep you consuming.
If one were to go Microsoft on the Mac, what are the choices? Office 2011? (It's 2014 now, after all.)
I know there's OpenOffice too
#Apple #iWork #fail #proprietary #OpenOffice
FFS, this is "HackerNews"... write a bash script to do it for you. Better yet, use some "super 1337 h4x0r google-fu" and search "convert keynote 08 to keynote 09" and you'll find a bash script to do it in 5 seconds[1].
[1]: https://www.google.com/search?q=convert+keynote+08+to+keynot...
But this makes it even less clear why Apple can't support their own old formats. Is it too tricky to be worth dedicating developer time to? Or is it a deliberate policy? This is a fundamental requirement of the software.
1) "Fresh Mavericks install." He should run Software Update and tell us he has done so, if he's going to write a blog post about whether stuff works.
2) He should file a radar (Apple's umbrella term for bug report / feature request) and share the radar number with us so we know he's at least going through the channels Apple has provided for concerns like this. http://radar.apple.com/
Or... he could just write a blog post, but I'm saying it would likely be more effective if he also did these two things, in addition to his blog post.
Wait, he shouldn't have to file a radar? True, in an ideal world, he shouldn't have to. But you know what they say about ideal worlds...
LaTeX has a fairly long lifetime. A friend found the 20-year-old source for his thesis and thought it would be fun to see if it would still compile in LaTeX. Sure enough, it did, after changing just a few header lines (sometimes package names change).
Source is plain ASCII. Output is PDF. Interpreter is TeX. All three are very, very robust for backward compatibility.
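Even the "few header lines" step can be scripted. A sketch in Python, where the rename table is purely illustrative (`epsf` really was superseded by `graphicx`, but a given thesis preamble will need its own map):

```python
# Illustrative package renames; extend this for your own documents.
RENAMES = {"epsf": "graphicx"}

def modernize_preamble(src, renames=RENAMES):
    """Swap outdated \\usepackage names for their modern equivalents."""
    for old, new in renames.items():
        src = src.replace(r"\usepackage{%s}" % old,
                          r"\usepackage{%s}" % new)
    return src
```

Run that over the old .tex source, recompile, and twenty-year-old content comes out as a fresh PDF.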
That Microsoft is being promoted for backwards (and forwards) compatibility strikes me as ... comical.
In 2013, documents are STILL being produced in a proprietary format that nothing open source can read (directly) a decade later.
This is why I love Python and Perl for data munging.
As far as proprietary formats in general go, MS Office has done a pretty good job, other than their Office 2003 XML stuff, which had to be broken when they lost the patent ruling to i4i. Anyone looking at document longevity now can easily use their XML formats.
I'm not saying it's a permanent problem; there is surely a solution. It's just unnecessarily cumbersome and mostly unexpected: I can imagine dropping support for files from 1995, but not for files from 2008.