First real job was at Southwest Research Institute in San Antonio, programming in a DEC VMS environment. My Fortran program read the raw output from strain gauges attached to a USAF jet trainer and processed the data, which made it possible to track the airframe's weakening due to cyclic fatigue. It was just a bunch of I/O calls and loops, delivered to the USAF as part of their program to predict airframe fatigue. This time there was a mouse, but it didn't do anything, because once I logged into the DEC VMS it was all keystrokes again. My program was one giant file, something like 5K lines.
And that really makes me appreciate the generation that was stuck with a few kilobytes of RAM and just a plain text editor, making things for the future. I don't even get why we are so ungrateful and unwilling to learn the fundamentals. Where is the humility? Do you really think that code an AI wrote, which you just copied over, proves that you have any sort of intelligence? Be humble and keep learning.
A version control system that supports multi-file commits, branches, automatic merging that generally works as long as devs don't touch the same lines, etc., is a fairly modern invention. ...there was still easily an order of magnitude speedup available from better version control tooling, test tooling and practices, machine speedups allowing faster testing, etc. ... For large projects, just having CI/CD alone and maintaining a clean build, rather than building weekly, should easily be a 2x productivity improvement.
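That "generally works as long as devs don't touch the same lines" behavior is easy to see for yourself with git. The sketch below (assuming git is installed; the repo, file, and branch names are made up for the demo) has two branches edit different lines of the same file, and the three-way merge completes without a conflict:

```shell
#!/bin/sh
# Demo: non-overlapping edits on two branches merge automatically.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q repo
cd repo
git config user.email demo@example.com
git config user.name demo

# A five-line file; the branches will edit lines far apart.
printf 'one\ntwo\nthree\nfour\nfive\n' > notes.txt
git add notes.txt
git commit -q -m "initial"

# Branch "feature" edits the last line.
git checkout -q -b feature
sed -i.bak 's/^five$/five (feature)/' notes.txt && rm notes.txt.bak
git commit -q -am "edit last line"

# Back on the original branch, edit the first line.
git checkout -q -
sed -i.bak 's/^one$/one (main)/' notes.txt && rm notes.txt.bak
git commit -q -am "edit first line"

# Different lines were touched, so git merges cleanly, no conflict markers.
git merge -q feature -m "merge non-overlapping edits"
cat notes.txt
```

If both branches had edited the same (or directly adjacent) lines, the same `git merge` would stop with a conflict instead, which is exactly the "don't touch the same lines" caveat above.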
Minicomputers and mainframes had advanced interactive facilities in the early 1970s. You could use BASIC on a PDP-11, and it was about the same as using BASIC on an Apple ][ (no graphics, though, unless you had a GIGI terminal); you could edit Assembly, FORTRAN, and COBOL programs with the TECO text editor and compile just like you would with C or Go on a Linux machine today. By the 1980s most mainframe development was done under VM/CMS, which was a lot like using virtual machines today: to do development you would spin up a VM that ran a single-user OS that was a lot like MS-DOS (or maybe it was the other way around).
One Saturday, I figured this out, and started pushing the first job in the queue to above interactive priority, and it would finish in a few seconds. I repeated... and repeated this manual tweaking.... about 2 hours later, everyone had gotten their work done faster than usual, and the room was practically empty.
Most academic environments suffered from similar issues... too many students all trying to get their work done at the last minute.
---
When I got my own PC/AT clone and ran Turbo Pascal, the compile times were effectively instant (on the order of a few seconds), and just kept getting faster with each new release.
https://68kmla.org/bb/index.php?threads/getting-started-with...
Period correct programming books https://vintageapple.org/macprogramming/
I wasn't able to do interesting graphics programming on the Apple because I had to learn assembly language and the Apple's weird graphics format (amusingly, somebody recently released a nice high performance assembly graphics library for the Apple II).
The stack limit on my first PC's C compiler was 6 function calls.
I had to upgrade my second PC from 4MB RAM to 32MB RAM before I could run X, emacs, and g++ all at the same time without swapping.
keying in hex codes.
coding with assembler mnemonics.
coding with macros and a compiler, whew, intimate knowledge of hardware no longer required.
coding with system calls, fingers don't cramp as much, a degree of universality existed across platforms. yay DOS.
coding with the WIN API, yay Microsoft, fingers don't hurt, brain fatigues from use of arcane API calls.
C comes to the rescue, no macro assembly required, but variable-type hell begins to burn.
proprietary lockouts and secret source, MASM makes freedom.
DLL hell awakes.
Linux! yay Linus.