I was the owner of many Acorn machines, including the BBC B, Master, A410 and RiscPC 600. The hardware, clearly designed or at least originated by Sophie Wilson, was remarkable. It was robust, well designed and incredibly expandable. To this day no computer has made as much sense to me as the machines Acorn kicked out. A human could learn everything about one in intimate detail without a problem.
However, the software was a source of constant pain. Firstly, nothing was finished when the Archimedes came out. The Arthur OS was apparently so named because it was "A Risc operating system by THURsday": their internal OS project, apparently Unix-like, went down the crapper during development and they had to hack something up quickly so they had a minimum viable product. What I ended up with for my £1400 investment (a hell of a lot back then, and even now) was a barely usable OS consisting of a quick port of Acorn MOS from the BBC Master series with a naff GUI chucked on top, and it wasn't fixed properly until RISC OS 2 came out in 1989, so I sat there with a lemon for a year. After that we were stuck with a cooperatively multitasked operating system with a worldview completely different to anything else at the time or since. A lot of progress was made, but it never had any prospects, despite a lot of us clinging onto the initial investment.
Now I certainly enjoyed the platform, but in retrospect I'd have invested my money in something else back then if I'd known what was going to happen.
I fully respect the achievements here and, more importantly, the legacy (I have 12 ARM processors still in various things in my house!), but for us footsoldiers who paid up back then, it wasn't all love and happiness.
But it was only after Acorn imploded in 1998, and a couple of years of working with Linux, that I thought "hmmm, you mean I can write shared libraries for my C code that aren't kernel modules?" and "what, you mean the computer can just switch away from my task even though I've not called Wimp_Poll? What if I'm not done?" and "what, you mean the OS will just kill my task if I address some memory I'm not supposed to? How does it know I didn't intend to patch the OS from my desktop application?" etc. etc.
Also the Archimedes (at least) was pretty much the most expensive computer on the planet at the time - something like £3000 in 1988 money - it's amazing they sold so many to people just on the strength of Zarch :) ( https://www.youtube.com/watch?v=ALfnZjCiuUQ )
As to the worldview, everyone had their own worldview before the internet, / vs \ vs : vs .
Mac OS Classic had filetypes, same as RISC OS. Bit odd that the web uses them really, considering it has mimetypes.
The RISC OS GUI is still miles ahead of everyone else's. Its application packaging was pretty good and easy to edit (although the !boot system should be restricted in what it can do).
But they never went through the transition that Windows did (twice), or Mac did to OS X.
The underlying OS was not amazing. There were some good parts: the relocatable module system was very elegant, if rather alien by modern standards; the ability to resize nearly all memory areas on the fly (by dragging bars in the GUI!) was great; proper pluggable filesystem modules in the 1980s were a revelation; the built-in BASIC was super fast and reasonably comfortable to program in, even though it didn't have structured types (some of the built-in ROM software was written in BASIC!)...
But the bad parts are bad. The underlying technology is really primitive, being a non-preemptive OS with no memory protection that's worth anything, and then it's suffered from many years of ad-hoc organic growth... e.g. a lot of the core APIs pass flags in the top 8 bits of addresses, because back then the ARM had a 24-bit address bus. Running on a modern machine with more than 16MB of RAM? Good luck with that. APIs are duplicated everywhere with slight changes. There are lots of undeclared dependencies between modules (including recursively, which isn't really supported). Platform independence isn't really a thing, except where it was crudely bolted on for some platforms.
Plus there are some... questionable... design decisions. My favourite is the big chunk of mysterious code in the main system allocator which gropes up the call stack every time you try to allocate memory. Why? It's looking for stack frames belonging to the system allocator, so that it can tell whether it's being called reentrantly. Why is it doing that? So it can detect whether it's being called from an interrupt, at which point it goes through an alternative call path and returns memory from a different pool!
If anyone's interested, a few years back I wrote a proof-of-concept RISC OS kernel reimplementation called R2: http://cowlark.com/r2/index.html
Unlike the real thing, it runs everything in user mode except for a tiny stub to handle SWIs. (There's a built-in ARM emulator.) It's complete enough to run BASIC. While reverse engineering the operating system I found out way, way too much about how RISC OS worked inside. *shudder*
9x was just so finicky that you didn't really want to push the multitasking.
The Acorn 'Unicorn' was excellent but too expensive, and this is roughly where the parent comment comes in. So Acorn did a lot of good stuff before they lost the plot, roughly around the time they released the Archimedes, which was an amazing machine for the time; but the Amiga had far surpassed anything Acorn could offer software-wise, and the ST and Commodore were gobbling up the lower end.
I had a 1040ST and an Amiga 500 at the time as well (spot the geek) and the software wasn't that great on those platforms either IMHO. Even PDS on DOS was nicer to program in with the 2kg pile of manuals.
The killer was the rise of the PC, and you know what: I'm glad it killed everything. Perhaps controversially, a couple of years down the line, as someone who wanted to get shit done back then, things like Windows 3, VB, Word, Excel and OLE appearing were clearly the future.
I take your point about Acorn as a business, but I used a 'lab' full of RISC OS 2-based A310s at college with 20MB Rodime hard drives. We did scanning, DTP, Genesis multimedia packages &c and various home-grown projects.
What else could we have bought at that time for similar use cases? Not trolling, my memory of the time is hazy and I recollect being extremely underwhelmed by DOS based PCs in another 'lab'.
EDIT: flashbacks to Aldus Framemaker on Apricot PCs, Amstrad PCW spreadsheet applications being used in a theatre box office, and an early 9" screen Mac being used with some form of DTP software.
RISC OS was roughly contemporary with Windows 3.0 and Apple's System 7, offering a cheaper and seemingly faster system (although rather idiosyncratic in its use of the mouse menu button and drag-and-drop instead of save dialogs). It booted from ROM in something like a second.
I still love my Beeb, great machines. Very well written manuals too!
If you took her work out of the world a surprising number of items would suddenly stop working.
Around the same time I entered high school, and they had a classroom full of networked BBC Micros!
*whoami
As gadders mentioned: she should be knighted.
Archimedes 310s(?) were in use for medical graphics applications (expanded RAM, and I think I recall they had floating point units of some kind). There was also a parametric CAD application used by Lucas(?) for a bit. The music software Sibelius was first released on this platform, having been coded in assembler by the Finn brothers for speed.
The subsequent financial and ownership history of the ARM processor design isn't as inspiring I'm afraid.
So yes, gongs for Ms Wilson!
Here's an interview from the Centre for Computing History: https://youtu.be/ZMEBj3FM2aw
[1] Vague recollections of an article in Acorn User. I'd really have to dig out those old magazines to be sure.
Wow!
With today's CPUs you probably wouldn't stand a chance, no matter how good you are.
At the time - and I still sort of do - I thought that, despite its idiosyncrasies, RiscOS was way ahead of Windows and the Mac. For example the Mac only got scaling scrollbar widgets with System 9, and the boot time was amazing. (Upgrading from RiscOS 3 to RiscOS 4 by carefully prising out and replacing a bunch of ROM chips was fun.)
In hindsight they made the right choice, but I was bitterly disappointed when the parents bought a Mac over a RiscPC. Teenaged me didn't let them forget how unhappy I was for quite a while...
Getting RiscOS ported to the Raspberry Pi was a great move, a real blast of nostalgia, but I'd love to see what a modern-day version would be like – 64 bit, great graphics, lightning fast, weird extra mouse button.
It makes me happy that their technical legacy lives on in ARM, and their educational spirit is continued with the Pi.
http://people.cs.clemson.edu/~mark/admired_designs.html#wils...
http://speleotrove.com/acorn/acornPictures.html
[Edit: That's not my Web site BTW]
Here's a pic of me testing the sophisticated visual output system: http://i.imgur.com/O8czwKo.jpg
[1] http://www.bunniestudios.com/blog/?p=3554
[2] http://techreport.com/news/27878/atom-x3-chips-target-cheap-...
Consider the following scenario. Let's say Sophie was born female and never needed to transition. Now, let's say that she got married after Acorn folded in the late '90s, and took her husband's name. In that case, we'd refer to her by the name she uses now, not the name she used then. The same goes in this case.
[1] https://en.wikipedia.org/wiki/Lynn_Conway
[2] http://ai.eecs.umich.edu/people/conway/conway.html
[2] https://en.wikipedia.org/wiki/Mead_%26_Conway_revolution
It also seems more likely (than in other fields) to find trans women in IT, but I worry that this is simply confirmation bias at work. I wonder if there is any research on the relative statistics of trans folk in the IT industry?
EDIT: Thinking about it, perhaps it is just that, as with the GP comment, people are more likely to bring the fact up in a field like IT, where 'facts' are held as more important, whereas in other areas it would simply be ignored or not mentioned.
How else are we going to keep track of one person's name changes over the years? It's just common sense, really.