Seeing this makes me think about how many modern applications could learn a few things from the old Mac OS 8/9 Human Interface Guidelines. [0]
[0] http://mirror.informatimago.com/next/developer.apple.com/doc...
But forget about Mac OS 9, we don’t even have native OS X or Windows interfaces anymore. Doesn’t help that Apple dropped the ball on following their own UI guidelines.
As a JavaScript developer, I'm sorry that I don't have the patience/smarts/skills/time to learn C++/Qt/GTK/WxWidgets/etc. HTML/CSS/JavaScript is all I know and it's probably all I will reasonably stay with for a while because of various circumstances. And Electron lets me use the knowledge I already have to make things.
That's not to say that Electron doesn't have all these issues, of course. But I feel that this policing of how people write their own software (especially when it's something purely done as a hobby and/or just to share something with people) is getting somewhat out of hand.
Here's a hypothetical/philosophical question for the community in general. Given no other alternatives to do X, what is preferable? An Electron app that allows people to do X, or no app at all?
Wait, you can't just call the respective APIs anymore? I thought the reason people don't use them is not that they aren't available, but that two different codebases would have to be maintained, hence things like Electron.
https://hellosystem.github.io/docs/developer/ux-guidelines.h...
The author has certainly done a great job of capturing the zeitgeist of the era (the garish bevels on the Spotify app are spot on!), but I would love to see how the OS 9 UI would hold up in modern times, on a retina screen with more than 256 colours, modern anti-aliased typography and much more screen real estate.
If you go into System Preferences > Accessibility > Display and turn "Increase contrast" on, it adds clear lines around almost everything, which is the closest I've found to that. I tried it for a while, but I found it too harsh in the end. It would be nice to have a setting which was inbetween the two extremes.
I mean, neat concept, but really lacking in the user hostility that today's users demand
Let's see, what is my first extant contribution to the Internet… oh yeah, I thought this was handsome. It was a theme for an explorer.exe shell replacement.
https://www.wincustomize.com/explore/litestep/154/
https://skins14.wincustomize.com/1/53/153855/6/154/preview-6...
In my defense, it was 2001-2002, I mostly had MS Paint at my disposal, and I was 14 (and not like a smart 14).
https://www.austingroupbugs.net/view.php?id=1044
https://developer.blackberry.com/native/reference/core/com.q...
The Zoom one reminded me of https://en.wikipedia.org/wiki/CU-SeeMe --- yes, videoconferencing was actually possible on the hardware of the time.
It genuinely feels like it could have fit into a couple of MB of RAM back in those days and been super snappy. I literally do not use Slack for anything that IRC wouldn't do back in the late 90s, with the exception of threads, embedded images, and emojis.
It is crazy how close and yet how far IRC was. Session persistence, notification support even when offline, and a better admin UI were really all it needed. And it would probably have had to be totally centralised.
You may enjoy this article: https://news.ycombinator.com/item?id=21831951
The only problem is that it's not free (with unlimited trial, though) and the development seems to have stopped. Still works okay for text chat, though.
And, wow, actual separate windows for things! I am constantly frustrated by Slack’s inability to show more than one conversation at a time.
Low-res bitmap icons, bitmap fonts, no antialiasing, no composition, etc.
On Linux you can still have a pretty old school desktop with some of these elements... but may not run well on HiDPI.
I don’t think that high-res icons, vector fonts, anti-aliasing and composition meaningfully affect today’s hardware; we can and do have GPUs barely sweating under much more demanding workloads, and these are mostly parallelisable, easily cacheable things. If anything, they take up some GPU memory.
What makes some of today’s apps resource hogs is the different abstraction level, which can be useful (accessibility was absolutely not a priority back then, and even today it is often lacking) and enables cross-platform development, but is often much leakier than it needs to be.
I was in middle school in 2001 when I saw a paint program (MacPaint?) on an oyster iBook. It was way more intuitive and engaging than its Windows counterpart. I think late Classic Mac OS (v8.6-9) to early OS X (~10.8) had a very good aesthetic balance between form & function.
They were optimized for the low-resolution screens of that time (hence 'pixel perfect' design was the norm), and there was also no expectation whatsoever of "touch friendly" controls so everything was a lot more tightly-spaced than today. Though the mockup does show how larger widgets could also be integrated quite well in that sort of design.
"Flat design" is a disaster and the latest redesigns are slowly inching away from it by adding some 3d-rendered shadows to try and restore some intuition for "depth". But that sort of fancy, almost photo-realistic rendering just adds more weirdness to the overall "flat" look.
The original flat designs (the Zune HD and Zune software, Windows Phone 7, Windows Media Center) were incredibly usable.
All of those were produced by small design teams at Microsoft, for projects that were small relative to an entire OS. (Setting aside Windows Phone 7 for a bit, which IMHO actually had very few distinct UI elements.)
Heck, Windows Phone 7, to this day, is unlike anything else on the market. It is still going to be more responsive, and look cleaner, than almost anything else out there.
I am not sure why someone decided "flat" means "no button border", that is where I think it all went wrong.
Oh and also people who think flat means getting rid of text! Windows Phone 7 loved text, text was everywhere!
Well if Apple’s Execs are to be believed, touch-screen Macs aren’t in the pipeline, which is aces with me because that’s what my iPad is for.
So given that the preeminent pointing devices on a Macintosh are still the mouse and trackpad, I could do with them tightening up the spacing again and walking back the last 10 years of nonsense.
We don’t have to go back to Snow Leopard, certainly not Platinum; but widgets and theming that are consistent with how a Macintosh is used and the hardware it actually runs on would be preferable.
My child seems to intuitively get modern UI design, and doesn't seem to need many affordances to understand what to interact with or not. I think people who say older designs are better for beginners may be applying some retrospective thinking.
Anecdotally, my 70-year-old father and several of my elderly uncles & aunts had a much easier time figuring out Classic Mac OS. Modern macOS and iOS are much more complicated. While they still use these systems, they do so in a much more superficial way and they tend to exhibit what (for lack of a better term) I would call a "fear response." That is, when attempting to do a novel task they refuse to experiment and instead resort to asking for help immediately. Classic Mac OS was much better designed to encourage experimentation and avoided surprising the user (in a negative way) as much as possible.
Too bad the actual stability of these systems was horrid (both OS 8/9 and Windows 95).
Sure, the proliferation of floating palettes in the 2000s was a bit much, but on the other hand a monolithic single-window app for everything is terrible. Slack, for example, would be much better if we could have different windows for calls, chats, and the channel list. As it stands now, we have either one window with conflicting functions, or a lot of repeated information taking up quite a lot of space. Ultimately, this is the result of cramming everything into one window because some OSes confuse windows with applications. It is grating to see this design pattern on macOS, which really does not work that way.
https://github.com/felixrieseberg/macintosh.js/
"This is Mac OS 8, running in an Electron app pretending to be a 1991 Macintosh Quadra. Yes, it's the full thing."
But the apps ring true. The splash screens are a nice touch.
Working in IT, I was given an MBP (the edition without an Esc key) and am now on my second iPhone (13PM). I say with confidence that the only reason the Mac / iPhone is more popular than a ThinkPad (be it with Windows or Linux) / Android is simply because it is more expensive and a status symbol. That's it. A status symbol. Congratulations to Apple for conning even IT experts, who are at the end of the day also just human beings with psychological weaknesses to be capitalized on.
There are lots of complaints one can make about modern OSes, but none of the major three are anything close to ugly; they are all easy on the eyes. So I ask you to explain why you believe UI design peaked in the 90s, when just using those OSes was unintuitive to anyone but a tech enthusiast, other than that you are looking through rose-colored glasses.
I started on Win95 and remember how I couldn’t wait to switch to XP, where there were actual colors and it didn’t feel so empty. I then wanted to switch to Vista so badly that when my PC couldn’t handle it I downloaded SUSE Community Edition (a Linux distro), because Vista copied a lot of its fancy new UI (like previews and 3D icons) from GNOME (at least it seemed that way to me; it may not be true, or it could be those features just happened to show up on GNOME first).
I get rose-colored glasses for sure, but romanticizing a UI that only looked as plain as it did due to technical limitations just seems weird to me.
Then again, I suppose there were people who thought black-and-white cartoons were objectively prettier than Technicolor. People have a way of convincing themselves that whatever they initially got used to is the best way and any change is a regression.
Well, allow me to be number two, then. In my opinion, macOS is bland, unclear and the general UX is peculiar to say the least. I dislike the concept and the design of the system bar at the top and the blur effect they add to some UI elements (which Microsoft copied for their latest Metro design language) just looks excessive to me. The iPhone-i-fied controls that have been added to macOS are a step back, in my opinion, because now there's a giant system status popup that looks like you're supposed to touch it but Apple doesn't want to introduce touch screens to macOS.
The thing the macOS ecosystem does well is integration, which is arguably much more important than design alone. I rarely use any tool on Windows that follows Microsoft's guidelines, whatever those are this month, but on macOS the UI designers seem focused on integrating well with the look of the rest of the system. This has the unfortunate side effect of some developers using the macOS design language on other platforms as well, fragmenting those systems even further, but for macOS users it is a great benefit. Even an awfully ugly system (like the BeOS look, which some people love, or the Gnome 2 "3D" look) is still much more usable than most "modern" designs, because you know what to expect from applications running on that system.
You can disagree with me, and that's alright. Any design is liked and disliked by different people. I personally enjoy the simplicity and elegance in designs like the SerenityOS UI, but I can definitely see why others hate it.
But if you truly have never heard anyone say that they didn't like macOS' design, then you're part of some very different social circles than I am.
- The UIs of the 90s were made with mice and keyboards in mind. The designers' minds weren't yet compromised by the existence of touchscreens, both on phones and Windows laptops.
- IT companies were building tools to empower users and actually competed with each other fiercely. It was important to make sure your UI doesn't suck, because otherwise someone else will. This competition required the companies to put users' needs before their own.
And tangential to that: "developer experience" wasn't a thing. Writing software was an engineering job done by people who knew what they were doing. The bar was set pretty high. Compare that to now, when it's almost encouraged to be a junior developer and pile libraries into your project without ever looking under the hood to assess the compromises you're making. And the way the code looks and builds is considered more important by many than the end result that ships.
That's very far from my experience. I know a fair number of people who had no trouble going from DOS -> Win 3.1 -> 95, 98, XP, but starting with Vista, using their computer became much harder and much more external assistance started to be required. I know no one who is learning how to use computers as easily with, say, Win 10 or the latest Macs as people did 15 years ago; when I give classes, most non-CS college students are more computer-illiterate than the people around me were when I was in junior high.
The problems I have with modern UI/UX design are as follows:
1. It is often tailored to the needs of mobile interfaces instead of the needs of desktop computing environments, often resulting in certain UI interactions being more complex with modern applications than with older applications that were designed for desktop users. For example, hamburger menus make sense in environments such as smartphones where room is scarce. However, I believe they are inappropriate in desktop environments, yet they are becoming more commonplace on websites (even when browsing on a desktop) and in applications. Another example is a trend in newer versions of GNOME and macOS where the title bar is fused with the toolbar. While this does save space, it makes it harder for me to rearrange windows on the desktop since I must look for empty space in the combined tool/title bar to click to drag (and sometimes not all empty space in this area drags the window), while this was never a problem for me with traditional title bars.
2. We've lost certain affordances that were present in the 1990s versions of Mac OS and Windows that aid in usability. It's harder to visually distinguish between clickable and non-clickable portions of a window in many modern applications. Scroll bars provide useful feedback while reading content that doesn't fit within the window, yet it's a common trend in modern UIs to hide the scrollbars, and when they do show up, they are often very skinny, making it harder to scroll with them (yes, trackpad gestures and scroll wheels make this less of an issue, but not everybody has nice trackpads or mice).
3. The rise of applications that refuse to adhere to platform guidelines, preferring to be "special snowflakes" for branding reasons, engagement metrics, developers' convenience, or cost reasons (it's cheaper to make an Electron app than to make separate UIs that conform to each platform's respective guidelines). The notion that applications should follow a platform's UI guidelines is increasingly fading away, and is being replaced with the attitude of, "You should be grateful that you are able to use this application." The Web, with its lack of UI guidelines and its emphasis on siloed applications instead of interoperability among applications, is taking over the desktop, with unfortunate consequences for the future of desktop computing.
There is nothing wrong with the idea of taking the substance of UIs from the 1990s and having updated color themes, icons, and fonts for them. I personally believe the pre-Yosemite Aqua interface of macOS and the Windows Vista/7 interface were great examples of modernized UIs that were desktop-tailored and retained or even enhanced affordances that were present in previous versions of these interfaces. I feel we lost a lot when the industry shifted to mobile computing and decided that desktops should look and feel more like smartphones and tablets instead of continuing to improve on the desktop computing experience.
Windows 2000 and Windows Vista on the other hand... Much better in my opinion. Also Mac OS Aqua design was great when it was skeuomorphic.
That is some bizarro world stuff. Even Windows 10 is actually fairly pretty in its interface even if it is super bloated.
Which one is more plausible?
a) Apple actually "conning" tens of thousands of highly skilled, [mostly] intelligent, [mostly] educated people into spending money on hardware they don't actually need.
b) You not understanding some important aspect of the situation and/or having different personal preferences.
edit: you know you're making terribly misguided UI decisions when people build software to revert them: https://github.com/MacEnhance/MEMiniMe
The absolute peak of interfaces was Mac OS 9.
- Applications not actually being terminated when clicking on the close icon
- laggy / unreliable context-menu opening with a middle+index-finger tap on apps in the Dock
- the concept of installing something by moving it from an icon on the left to an icon on the right
- or when you can't start apps due to connectivity issues
- app removal is totally opaque and sometimes requires downloading a custom uninstallation tool (Adobe Creative Cloud, for example)
Windows XP features were easy to discover. Scroll bars were not hidden. Buttons looked like buttons.
I had the first MacBook Retina and used it for years at a company where it made sense to do so; when I handed it in and left for my own startup life, I was open to either OS (I couldn't use Linux as the daily driver since my industry uses a lot of Windows-only programs), and Windows was just far more productive to use on a regular basis. The only thing I miss is Final Cut Pro, and Sublime Text to some degree (VS Code has been an adequate replacement).
Makes me wonder what sort of currently extremely common UI elements we'll look down on in the near future.
I’m guessing this was on purpose to try and transition the P2P crowd over to streaming.