Which makes sense, between the "if we change it we break it in some subtle way" and "we don't expose that in the UI anymore so the new panel doesn't have it".
My understanding is that Microsoft wants to move Windows toward "you can't configure much of anything, unless you use Group Policy, in which case you set everything through that", so they don't update the settings and don't include 90% of them in the new screens. But then they have this huge moat of non-Active-Directory users who still need to go into the settings, and my god are they bad.
As a long-time Windows user, I wish Linux copied this feature more.
ODBC Data Source Administrator (64-bit)
Configure > untick "Use Current Directory", Select Directory
Gotta love that the disk and directory picker survived 20-30 years.
Like I always say, the user mode of Windows is the easiest part to change; that's why it has been redone in almost every version.
I can't immediately see why explorer.exe wouldn't run and give you a start menu
It won't compile.
(Heck, recently I migrated a VM to its third hypervisor. It began as a physical machine a quarter century ago.)
This is how I think about music and Spotify. Pretty much all music exists, you 'just' have to remember everything that exists and what it's called so you can find it.
Recent HN link: Red Alert 2 in your web browser. A game from 25 years ago, whose original C++ version you can unofficially download from the Internet Archive and upload to a website that extracts the assets, so you can play a JavaScript-based reimplementation inside a web hypertext document browser.
The first release of git was in 2005, around a decade after Windows 95.
Diff3 is from 1979 (https://en.wikipedia.org/wiki/Diff3), so three-way merges (https://en.wikipedia.org/wiki/Merge_(version_control)#Three-...) predate git by decades.
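For the curious, that classic three-way merge is still available on most Unix-like systems as the `diff3` tool; a minimal sketch (file names and contents are made up for illustration):

```shell
# Three-way merge: "mine" and "yours" both diverged from a common ancestor.
printf 'a\nb\nc\n' > base.txt     # common ancestor
printf 'A\nb\nc\n' > mine.txt     # my change:   a -> A
printf 'a\nb\nC\n' > yours.txt    # your change: c -> C

# diff3 -m emits the merged result; non-overlapping changes merge cleanly,
# conflicting ones are bracketed with <<<<<<< / >>>>>>> markers.
diff3 -m mine.txt base.txt yours.txt
```

With the inputs above, both edits land in the output (A, b, C) with no conflict markers — exactly the behavior git later built its merge machinery on.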
Win 95 feels like it's from era 1; XP and git were already in era 2.
Once those two changes were done by 2010, though, there's been no game changer; if anything we've regressed through enshittification (we seem to have fewer social networks than in the original Facebook era, for example, as most of them turned into single-player feed consumption).
Maybe pre- and post-LLM will feel like an era change in a decade as well?
It just really highlights how much better BitKeeper and then Git's design was compared to what came before. You then pile on being free/OSS, and being "proven" by an extremely large, well known, and successful project on top, and you have yourself explosive growth.
There are developers around these days who never had the displeasure of using the pre-Git source control offerings; it was rough.
> ...and I immediately got flamed by several people because no one used patches any more.
How are these ideas connected? The intent of git is that you work with patches.
Priceless.
Edit: https://devblogs.microsoft.com/oldnewthing/20190830-00/?p=10... seems the justification was that UTF-8 didn't exist yet? Not totally accurate, but it wasn't fully standardized. Also, that other article seems to imply Windows 95 used UTF-16 (or UCS-2, but either way 16-bit chars), so I'm confused about porting code being a problem. Were the APIs in 95 still a kind of halfway point?
By the way, UTF-16 also didn't exist yet: Windows started with UCS-2. Though I think the name "UCS-2" also didn't exist yet -- AFAIK that name was only introduced in Unicode 2.0 together with UCS-4/UTF-32 and UTF-16; in Unicode 1.0, the 16-bit encoding was just called "Unicode", as there were no other encodings of Unicode.
That's not true: UTF-8 predates Windows NT. It's just that the jump from ASCII to UCS-2 (not even real UTF-16) was much easier and more natural, and at the time a lot of people really thought it would be enough. Java made the same mistake around the same time. I actually had the very same discussion with older die-hard Windows developers as late as 2015; for a lot of them, 2 bytes per symbol was still all you could possibly need.
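The "two bytes per symbol" assumption breaks for any character outside the Basic Multilingual Plane; a quick illustration (in Python, purely as a demonstration of the encodings involved):

```python
# U+20AC EURO SIGN is inside the BMP: one 16-bit code unit suffices,
# so it fits the old UCS-2 "fixed 2 bytes per character" model.
euro = "\u20ac"
print(len(euro.encode("utf-16-le")))   # 2 bytes

# U+1F600 GRINNING FACE is outside the BMP: real UTF-16 must use a
# surrogate pair, i.e. two 16-bit code units. UCS-2 simply can't encode it.
emoji = "\U0001F600"
print(len(emoji.encode("utf-16-le")))  # 4 bytes (two 16-bit code units)
print(len(emoji.encode("utf-8")))      # 4 bytes in UTF-8 as well
```

So once Unicode 2.0 grew past 65,536 code points, every "16 bits is a character" system had to retrofit surrogate pairs — which is exactly the variable-width property the fixed-size design was supposed to avoid.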
Unicode 1.0 was in 1991, UTF-8 happened a year later, and Unicode 2.0 (where more than 65,536 characters became “official”, and UTF-8 was the recommended choice) was in 1996.
That means if you were green-fielding a new bit of tech in 1991, you likely decided 16 bits per character was the correct approach. But in 1992 it started to become clear that a variable-width encoding (with 8 bits as the base character size) might be on the horizon. And by 1996 it was clear that fixed 16-bit characters were a mistake.
But that 5-year window was an extremely critical time in computing history: Windows NT was invented, as were Java, JavaScript, and a bunch of other things. By then it was too late; huge swaths of what would become today's technical landscape had set the problem in stone.
UNIXes only ended up with the "right" technical choice because it was already too hard to move from ASCII to 16-bit characters… but that laziness in moving off of ASCII ultimately paid off, as it became clear that 16 bits per character was the wrong choice in the first place. Otherwise UNIX would have met the same fate.
Edit: should be done now