I think about that card from time to time. The extra 2MB made me feel badass, though I'm not sure any games actually used it, did they?
The Pure3D and Pure3D II were the first and last times I ever owned best-in-class graphics hardware.
The Rendition-based card I owned previously wasn't half bad, but those early voodoo cards were really revolutionary.
What’s happened since 2015, by contrast? Many people still swear by their 2015 MBPs, the last model before the touchbar debacle. That Haswell-based machine was getting long in the tooth even back then.
Looking at GPUs, the Voodoo3 came out in 1999 and had a fill rate of around 150 MPixels/sec. Today's GPUs have a fill rate around 300 times that, which works out to an annualized increase of over 30%, sustained for 20 years. This doesn't even account for the fact that GPUs went from highly specialized parts to general-purpose compute chips.
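The growth-rate claim above is easy to sanity-check. A minimal sketch (the helper name `annualized_growth` is mine, not from the thread):

```python
# Sanity check on the fill-rate numbers: if today's GPUs are ~300x the
# Voodoo3's ~150 MPixels/sec, what compound annual growth rate does a
# 300x improvement over 20 years imply?
def annualized_growth(factor: float, years: float) -> float:
    """Compound annual growth rate implied by a total `factor`
    improvement spread over `years` years."""
    return factor ** (1.0 / years) - 1.0

rate = annualized_growth(300, 20)
print(f"{rate:.1%} per year")  # ~33.0% per year, i.e. "over 30%"
```

So the "over 30% per year for 20 years" figure holds up.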
Low-power CPUs are also impressive. 20 years ago the state of the art was the DEC/Intel StrongARM. A modern smartphone SoC is around 50x faster than that per watt. If you look at floating point it's more like 1000x faster.
In the same timeframe, we went from a 200 MHz Pentium MMX to a 1 GHz Pentium 3.
...and most if not all of your existing hardware and software would continue to work unchanged; on top of that, you got new features. I remember going from a 233MHz Pentium to a 2.4GHz P4, and pretty much everything was the same, just faster. The situation isn't so clear now.
To summarise: in the 90s to early 2000s, each upgrade really was an upgrade. Now, not so much.
It’s funny how backwards compatibility back then allowed us to have massive progress, by making gradual steps forward possible.
These days, where we are seeing less and less real progress, we now hear many programmers claiming we have to sacrifice and shed backwards compatibility if we want further progress. (I’m looking at you Apple and Google!)
Something is very very backwards. And yes, an upgrade may no longer be an actual upgrade.
In 5-year spans they all had a 10x increase in power, from toy single-core CPUs and GPUs that barely moved pixels on screen, to proper multi-core CPUs and GPUs that would rival your laptop if only they had more thermal headroom.
Also, I can't turn my stock mobile into a good desktop-like environment, even though I have a big screen. I work with LLVM and Haskell, with Emacs or VSCode [lately], with some Blender3D mixed in.
My phone was not built for this. It is a purely consumer device.
Honestly you could go all the way back to 2010 or so. Zen, Zen+, and especially Zen 2 are the most exciting things to happen since Sandy Bridge.
Intel fucked up big time: not only did they let Moore's law die, but they repeatedly suppressed one of the only companies capable of carrying the torch further.
Edit: before anyone even thinks of replying to defend Intel, I suggest you look into the increasingly common reports of internal hubris being the core cause of many of their failures. They failed to realize this themselves until they tried pitching customers features that competitors had already sold and were shipping!!
These Voodoo boards were a mere joke in comparison to what SGI could provide, so the PC market was catching up - hence it moved fast along a well-trodden path.
I mean, all these Voodoo and Nvidia guys had already worked for SGI or similar companies, and just brought to the PC market the technologies which had been developed over the previous decade.
I think Intel could have stepped up their core counts on low-thermal CPUs a lot sooner. The last radical step up from Intel was 2013 with the Haswell CPUs, and then after that 2018, with the lower-TDP 4-core CPU which we now see on the MacBook Pro 13".
Any Mac user who needed 'real power' moved to an iMac (or even an iMac Pro) when they outgrew their 15" MBP.
Moore's law broke. GHz and those low-hanging fruits are gone. I'm using a 2013 MBP. It's a core race now, but there's lots of legacy software that isn't coded for multicore.
Here's the progress over 120 years, and spoiler ... 17 orders of magnitude in computing power:
https://cdn-images-1.medium.com/max/1200/1*zWyqQ2EP83gBcLZHG...
A very interesting observation is that the switch to transistors and integrated electronics does not stand out. It is just part of the curve.
It was an absolutely wild ride! Each machine back then was dramatically, dramatically faster than the previous one. It wasn't like today, where you'd need a stopwatch to tell the difference. You went from being unable to do something to it being possible.
By 2000 we had most of what we have today, other than tons of video available everywhere. In 1990 we were still using DOS and character-cell graphics most places: no CD-ROM, 10MHz machines, no connectivity other than modems. Lots of people still had monochrome displays in 1990, memory was 640K/1MB/2MB in the early 1990s, and hard drives were < 100MB. By 2000 we had 1GHz machines, DVD drives, 1GB of memory, drives pushing past 100GB, etc. 3D graphics was a thing, along with good video capabilities and high-res desktops.
I'm prepared to believe that e.g. foveal rendering will give step-wise improvements without more GPU power, but it's not the same as the years of continual process improvements were for GPUs.
Hope I remembered this rightly but I really remember that Geforce 256 being so much better it was like night and day. I stopped playing PC games when I went to university around the Geforce 3 era, so that's when my knowledge of the topic drops off a cliff :D
edit: and now I reached the end of the article it seems Fabien has said exactly this! Note to self: read first then comment
Ironically, we've now looped back, and do pretty t-buffer-esque usage with modern DX11/DX12 pipelines.
One of the simplest functions of the T-buffer was to do temporal AA using a fixed-function supersampler, but also to do integration over several frames, which didn't appear in a modern AAA title until Doom 2016.
For technology invented in 1999, 3dfx was too far ahead of its time.
For a modern example, imagine if the RTX debacle of current gen Geforces destroyed the entire company. Nvidia backed down and released the 16xx series cards, 3DFx went bankrupt instead while everyone else was releasing the 16xx equivalent of that time period (early Nvidia and ATI cards).
And even then it's generally meh, with how much software is single-thread bottlenecked. Meltdown and Spectre didn't help things - after they were published I definitely intended to wait for architectures mitigating them before upgrading again.
Here's hoping AMD can keep pushing the envelope this year. A consumer 10+ core chip would definitely have my interest - not so much to necessarily buy myself, but to drop the floor of the high end even further.
Look at the PL progress in that Voodoo 1-5 timeframe. We started with C/C++, and ended up with C/C++ and a little bit of Java starting to creep in. That doesn't seem terribly impressive to me.
Today, if I told you I was writing a new program (and I am!), there's 10 different languages you might reasonably guess that I'd have picked. Life's way better than when everybody was constantly transistor-constrained.
Pesky universal laws! I want my faster-than-light spaceships and decreasing entropy!
Intel monopolized the market by breaking laws (paid billions in fines). The lack of progress is the consequence.
The progress only slowed down for x86. For the rest of the chips it has not: GPUs, mobile SoCs - they are all becoming significantly faster with each generation.
Fortunately, with AMD Ryzen the stagnation is ending. Core counts in mainstream chips have already doubled, after stagnating for a decade.
The 2015 models run on Broadwell, and even today the one I use runs everything quite zippily.
Maybe software efficiency might start to matter again at some point.
Since we were a tiny company without many resources, we decided to just manufacture the reference design. This decision enabled us to be the first one in Taiwan to ship the product. I cold-called 50 companies in Europe. I did not have much success at first, since distributors in Europe were not convinced that a 3D card could sell. The internet was slow back then, we were still using dialup modems, and sending a video capture was not an option. I finally got a break when I called the 42nd company on the list: Guillemot (France). Guillemot got their start in PC gaming sound cards, so they were already interested in the 3D card. Guillemot had been talking to Orchid Technology, but they needed a lower price than what Orchid could offer. Since all the other Taiwanese makers were still evaluating or in the development process, we got the business: our price was US$50 lower than Orchid's and we were able to ship right away.
One thing I'm very thankful for in life. It hasn't been the same since around the turn of the century, the magic and mystery is not like it was with PCs or game consoles in decades prior. The advancements were just leaps and bounds every few years.
It sparked your imagination more than things today do, because creativity somehow diminished once the technical limitations went away. Today, within any reasonable definition, an artist's vision can be fulfilled. There's really nothing left for the end user to imagine or fill in the gaps (think Zork).
Not to be crass but this is a relatable example for many I'm sure- it's no different than finding a Playboy magazine back then, as opposed to extreme, explicit hardcore websites today. There's no mystery at all there, and it's not really an upgrade from your imagination being used at least a little bit.
I've actually rediscovered books because most media today (as in movies and sitcoms, not THAT sort of media) is so poor quality. It's really all about the writing, and I struggle to find games and films that are at the 20th century or prior quality level. Books can be exquisite entertainment, and leave plenty for your imagination to run free with. Which for me, is what it's all about. That's the joyful part to entertainment, or at least a part of it that I find critical.
People who are in their formative years today will look back upon the current times with similar enthusiasm; just like your elders a few decades ago found that your Playboy magazines and 3D video game cards were pointless debauchery.
So it has been for the billions of humans who have lived before us, and so will it be for the billions who come after us.
The reasoning that immediately comes to mind is two-fold.
The first is the obvious point I made, where from the very start I place books as the supreme medium, and they are far before my time. The written word can be information-dense, puts your mind and imagination to work, and if in a book, never needs recharging. :) It's just underrated and underappreciated. Books are "inferior technology", which to me is the ideal abstraction layer for our species.

I think we'll continue to see back-to-basics as a movement as the corrosive societal effects of "social media" play out. We need real community, the kind humans evolved for, not that marketing nonsense chewing people's brains up and spitting them out with only the capacity for short-term attention spans left. Most people don't need help with that. Facebook is the new smoking.

Things don't always get better with every generation, nor do most generations witness great change. You're assuming that because the last 300 years have been relatively action-packed, mostly thanks to Europe's astonishing leap forward around 1600 AD, at home and out into the world. Things are objectively worse today than in the 20th century; everyone knows it, or can definitely feel it. The deck is stacked against a young person, with opportunities slowly dwindling. That trend may continue if we don't solve capitalism collapsing on itself in the western world, discover new antibiotics, among numerous other very serious challenges that aren't being effectively addressed. Our big, impressive move lately (speaking for the US) is simply cutting taxes when rates are already at historic lows. Brilliance.
The second thought I had is that the home PC space undeniably hit its stride from ~1980-2000. That's just when that market had its golden age. Combine it with the arcade experience of the day, and you had a sensory experience that really isn't even widely available today anymore. You can't really explain it to someone who didn't see it. It's not just generational placebo. It's like the circus. They hardly exist now, but they were worth the trip, and I regret that many kids may never go to one. Someone who did miss the way it was before would likely insist I'm just a fool, but Netflix and Youtube are not a fair replacement for these things. They're just not.
You have to go hunting for an arcade today, and I'm not even sure if there are any truly modern ones around. A kid just won't get that sensory experience, which isn't just technological but also the social element of all the kids being there too. That goes without mentioning the loss of comic book stores that kids rode their bikes to(!), toy stores, candy stores, Saturday morning cartoons, and all the waiting, anticipation and excitement attached. Just nostalgia? Or is it real? The examples sound real.
Back to addressing your point, certainly the cotton gin, the steam engine, home electricity (which my grandma told me about when they were the first house in town to have it installed because her dad was so enthusiastic about it), among others, were more monumental on a macro level than seeing the home PC space explode from 1978 to 2000. Yet I have my doubts that an old timer was passionately reminiscing about "seeing the cotton gin come to be", and how amazing it was. It harkened great change to society, but I'm not sure people were living their lives in a way at that point to personally experience rapid iteration as people witnessed in the home PC market. Being a market specifically targeting them/us. It really is different today than it was then, it has normalized, there's less excitement for sure, and it's far more difficult to be impressed with advancements.
We're at the point now where we need to go back to step one and bring back creativity. That's why I circled back to books being the ultimate medium. What good is an 8K TV if there's nothing good to watch? Zero. I'm back to hunting for good books instead. Choose Your Own Adventure books were better than what Hollywood is putting out today. Writers often do the best job at expressing thought-provoking creativity, and without that human spark of creativity injected into your technological medium, it would all be pointless.
From our human perspective at least, once they're unleashed, our AI overlords won't care.
Never mind the pictorials, the depressing thing about finding a Playboy issue from the 1960s era is the sheer literacy of the thing. You really could read it for the articles back then.
I got one of these cards - confirmed it was indeed hella fast (even for large meshes of small triangles), and then dropped into SoftICE a few times, winding up at this code:
https://github.com/sezero/glide/blob/glide-devel-sezero/glid...
My thoughts were - "Wow, somebody gets it!" - Very tight triangle setup, and a simple PCI register layout that means that the setup code just writes the gradients out to a block - the last write pushes the triangle into the huge on-card FIFO.
That performance, along with the simplicity of Glide, made it a no-brainer to drop all other card support and focus on that exclusively.
I bought thousands of dollars in stock in the company. I lost it all because, unfortunately, nVidia ate their lunch by marketing themselves better. nVidia had full 32-bit color, vs 3dfx, who had the superior technology but only 16 or 24-bit color. 3dfx spent a lot of time trying to explain why it didn't matter, but in the end it did. It mattered to the gamers of the time, and they basically died, and I lost a huge amount of money that took years to pay off. It took me a long time to move over to nVidia because of my hurt ego, but they were the superior technology in the long run.
I remember back in the day "serious gamers would never use a combined 2D/3D card". Then nVidia came out with their Riva chipset and suddenly 2D graphics cards weren't really a thing any more.
Edit: well, according to Wikipedia it was somewhat competitive although drivers and support for rendering stacks was erratic for a while. I guess that’s what soured me up, and I was probably too n00b to have a clue
Riva 128 (April 1997) to TNT (June 15, 1998) took 14 months, TNT2 (March 15, 1999) 8 months, GF256 (October 11, 1999) 7 months, GF2 (April 26, 2000) 6 months, | 3dfx dies here |, GF3 (February 27, 2001) 9 months, GF4 (February 6, 2002) 12 months, FX (March 2003) 13 months, etc ...
Just think about it: 6 months between GPU generations. Nvidia had an army working 24/7 while 3dfx had a couple of ASIC people. 3dfx was murdered in their sleep.
I came to write the exact same thing. It was gaming pre-Voodoo 1 and post. The difference was so striking - like going from SD TV to HD.
Most hardware I've purchased has been evolutionary. It's not often I bought something and verbally said 'holy shit' like I did with the Voodoo 1 card.
I remember buying my first 3D 'accelerator card' as they were called back then. It was a Voodoo Banshee card. The Banshee had an onboard 2D video chip, so it didn't have the VGA passthrough cable.
I bought the card at a trade show (the Dutch audience here will remember the 'HCC dagen'). That's where you could buy them cheap. Not sure if it was actually cheaper, internet wasn't very useful back then, so there was no easy way to compare prices.
I didn't have a computer of my own yet (I must have been 14 years old or so), so I bought it for our 'family' computer, an IBM P166. I remember getting up super early to put the card in before my dad would wake up. He would certainly have freaked out if he saw that I opened up the expensive computer to put in some gaming thing.
Good times.
GLQuake blew my mind, but I was just trying to get Daggerfall to run acceptably more than anything at that point, and run it did.
Good times.
I also had a Voodoo Banshee. I put it in my AMD K6-2 with 192MB of RAM. This was right around when Windows 98 came out.
My friends all had similar rigs. We were on the quest for max frame rate in Halflife, TF1, and original CS. It was a different time back then.
I picked it up in the early 2000s during a pre-closure clearout at my then-employer, a UK video games developer. The sheer size of the thing made me LOL, that and the number of fans and the additional power connector. Then I noticed it was a 3dfx - oh, hey! I could play that Glide-based motorbike game, that I remembered enjoying at a friend's house a few years previously.
The game wasn't as good as I remembered. I threw the card away.
I was young, but I immediately recognized it and traced it back to the computer magazines (from 97/98) of my older sister that I used to devour. No other student realized what it was. They were puzzled by the double VGA connector :)
I took it home, and installed it in my Athlon XP 2600+ system alongside the ATI 9800 Pro, to finally try the Glide games that I never had the opportunity to play years before... The first time I saw the full-screen 3dfx logo, it felt amazing.
Not only do I still have it, but I've collected more for free over the years, and I'm still amazed that something I would have drooled over a few years back became trash-worthy. It's either a reminder that good things happen to those who wait... or a memento mori.
I was curious, so I had a dig into the specs to relive the decision of the time, and I can see the Matrox did 32-bit colour whilst the 3500 was 24-bit. I've not seen any performance comparisons, but I certainly had no complaints and was happier with the G400Max on many levels (2nd monitor - no problem).
[EDIT ADD] This looks worth a watch for nostalgia circa 1999 graphics cards and compares the G400MAX, 3DFX 3500 and the TNT2 Ultra https://www.youtube.com/watch?v=-4LvoGQ2lgI
I saved up money to build my first computer that summer and chose a Creative 3D Blaster TNT2 Ultra, paired with an Abit BX6 motherboard, a Pentium 3 450 CPU, 256MB of RAM, a white box Sound Blaster Live, a 3Com Etherlink XL, a US Robotics 56K modem, generic 40x CDROM drive plus a CDRW drive, and I want to say an 8GB Western Digital hard drive, connected to a 19" monitor that was so big I had to pull my desk away from the wall to fit my Model M keyboard. It was pretty much perfect, except for the fact that technology was improving so quickly then that I started feeling the need to upgrade it in 6 months.
That fall I went to college and hooked it up to the university's broadband connection. I probably got my money's worth in all the hours I spent on that machine. It definitely gave me a sense of wonder and power to have something top of the line. I spent countless hours playing Half Life multiplayer, Team Fortress, and then Counter-Strike beta. I don't know if I'd do it again, but I had a great time back then.
https://www.youtube.com/watch?v=ooLO2xeyJZA
Btw - EDO DRAM already favours reading entire memory blocks/lines. My suspicion is that the trick to increasing memory bandwidth is based not only on interleaving, but also on block transfers with on-chip caching. Especially critical for texture reads.
The Voodoo is a nice example of how much execution matters. They were not the only ones to follow this path back then, but by only concentrating on the core functionality they managed to beat all others to the market without compromise. (Compare to S3 Virge, Matrox Mystique, Nvidia NV1, Tseng, NEC and many others)
>they started they own company
typo
>EOM'es only leverage on the cards they produced was the RAM they selected (EDO vs DRAM), the color of the resin and the physical layout of the chips. Pretty much everything else was standardized.
- EDO is DRAM; EDO vs FPM? I didn't know that - I always assumed every V1 shipped with EDO, just like every card in your pictures is EDO.
- video signal switching was also up to the vendor (relay/mux)
- and one of the cards has TV encoder section with TV out, pretty neat selling point
>It is not specified if the bus used address multiplexing or if the data and address lines were shared. Drawing it non-multiplex and non-shared makes things easier to understand.
You are addressing 512 kB, but the datapath is 16 bits (2 bytes) wide, so we only need an 18-bit address bus. As for multiplexing, that's not how DRAM addressing works IRL (an understandable misconception/simplification for non-EEs). Row/column address lines are multiplexed, meaning we are down to 9 address pins + OE/WE. That comes down to ~110 pins assuming full 4-way interleaving. Seems doable with >200-pin ASICs.
>21-bit address generates two 20-bit where the least significant bit is discarded to read/write two consecutive pixels.
Still too many bits: 2MB at 4-byte granularity is 19 bits overall, 18 per 1MB bank.
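For what it's worth, the bit-counting in these replies checks out mechanically. A small sketch (the helper name `address_bits` is mine, not from the thread):

```python
def address_bits(total_bytes: int, word_bytes: int) -> int:
    """Address bits needed to index `total_bytes` of memory in
    `word_bytes`-wide words: log2 of the word count."""
    return (total_bytes // word_bytes - 1).bit_length()

# 512 kB addressed as 16-bit (2-byte) words -> 18 address lines
print(address_bits(512 * 1024, 2))        # 18
# 2 MB at 4-byte granularity -> 19 bits, or 18 per 1 MB bank
print(address_bits(2 * 1024 * 1024, 4))   # 19
print(address_bits(1 * 1024 * 1024, 4))   # 18
# With DRAM row/column multiplexing, those 18 bits are presented in
# two halves over the same pins: 9 address pins, plus OE/WE strobes.
print(18 // 2)                            # 9
```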
>TMU was able to perform per-fragment w-divide
This was a HUGE deal at the time, and achieved with serious low-level optimizations/tricks (lookup tables/approximation, if I remember the oral history panel correctly). 3dfx engineers were big fans of good-enough hacks over slow-but-correct ways of doing things. Another one was color dithering - too bad you didn't mention the "24-bit color dithering to native 16-bit RGB buffer using 4x4 or 2x2 ordered dither matrix". This is the reason straight RAM-dump screenshots from a Voodoo1 don't really look the same as a directly connected monitor. 3dfx called it ~22-bit color, and it was noticeably better than Nvidia's pure 16-bit.
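The ordered-dithering trick described above is easy to sketch in software. This is a generic 4x4 Bayer-matrix version; the exact matrix and hardware pipeline 3dfx used aren't given in the thread, so treat it as illustrative only:

```python
# Classic 4x4 Bayer ordered-dither matrix (threshold ranks 0..15).
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_channel(value: int, x: int, y: int, out_bits: int) -> int:
    """Quantize an 8-bit channel `value` down to `out_bits`, using the
    pixel position (x, y) to pick a threshold from the Bayer matrix."""
    levels = (1 << out_bits) - 1
    step = 255 / levels
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16  # in [0, 1)
    scaled = value / step
    frac = scaled - int(scaled)
    quantized = int(scaled) + (1 if frac > threshold else 0)
    return min(quantized, levels)

# Neighbouring pixels with the same input land on different output
# levels, which hides the banding a straight 24->16-bit truncation
# would cause (and why RAM dumps look blockier than the DAC output).
row = [dither_channel(200, x, 0, 5) for x in range(4)]
print(row)  # [25, 24, 25, 24]
```

Averaged over a 4x4 tile, the output recovers roughly the discarded low bits, which is presumably what 3dfx's "~22-bit" marketing figure refers to.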
Btw, afaik Quake pushed somewhere between 500-1000 polygons per frame; earlier games like Actua Soccer rarely went up to 500, with the fatal consequence of single-digit framerates on the S3 Virge. You might enjoy "Profiling Of 3 Games Running On The S3 ViRGE Chip": http://www-graphics.stanford.edu/~bjohanso/index-virge-study...
To thank the support staff, the management/owner offered the choice of a Voodoo2 card or a DVD player (~$200 each IIRC) to every support rep that helped with the load that day. I ended up working for that company three separate times in different positions, leaving for college and coming back, and working for a relative's company for a while and returning again later. It's the wonderful people there and actions like that which keep that company in a special place in my heart. (For those wondering, it's Sonic.net, now Sonic.com, or just Sonic maybe. I'm not sure the official branding, and I have years of history with it being Sonic.net)
They are really special cards, and two of my vintage computers were built around a Voodoo 1 and Voodoo 2, meaning, I started with the cards, and knew I had to build a PC to run them optimally.
Perhaps i should try and find a couple of those before they disappear.
Just a few seconds of gameplay was enough to get someone to make the purchase.
Modern source ports have fixed both issues though (FWIW i recommend Quakespasm)
The Voodoo1 4MB which I had back then as well, was unprecedented. I agree that the Sound Blaster was great (in isolation against older x86 PCs), but 3dfx was the first standout example of the fruit in the IBM-compatible space, and a clear demonstration in the gaming sense as to why it took over the market.
PC Speaker vs. Sound Blaster is more like EGA vs. VGA - something much more profound. And it was beneficial to the gaming space, as opposed to early 3D, which basically ruined all the things for quite a while...
The readability part is just the CSS, based on monospace and justified paragraphs.
The site does seem to run some PHP scripts. http://fabiensanglard.net/appleTechTalk2009/index.php
For example, google "static site generator".
I bought one my junior year of college as a CS major. I'd had some good internships, and that year I had some money. Instead of a car I had built a Pentium MMX 200MHz box with a Diamond Stealth card & 16MB of RAM. Pretty hot machine among my friends at the time... when the Voodoo 1 became available I was able to get one, and its performance was mind-blowing at the time, even though I had access to SGI machines and such on campus that had way more impressive demos on them.
My senior year of college I took an OpenGL course and did a bunch of my projects on Linux with the Voodoo 3D drivers. Cool stuff. Played a lot of Quake too; I remember writing a program to render the Quake characters on my own as one of my projects. The data model formats were open IIRC, so it wasn't too hard to read in the data. Very cool, since we didn't have any good 3D tools to build our own models.
I remember playing AH-64 Longbow or something on it too.. some of the flight sims were amazing at that point right before flight sim popularity tanked at the same time the remaining programs got unbelievably complex.
Voodoo was kind of a pain in the neck in day-to-day usage. In 1999 I built a new machine and went to an NVIDIA Riva TNT, and then later that year got a GeForce 256 when those came out.
Kind of the end of my heyday of PC gaming.. the combination of working on computers all day + games at the time still requiring a lot of debugging to get them to work well wore me out.
Some things the article did seem to miss out on:
- You could have two Voodoos in your PC for extra throughput (I can't remember the numbers). I seem to recall there was a ribbon cable between the two boards...
- The reason 3dfx ultimately failed was due to hefty lawsuits ongoing with NVidia about IP theft and headhunting the 3dfx staff.
During this time there was a mailing list (I can't remember its name) where a lot of game devs operated, mainly around DirectX (v1 onwards), though it existed much before that. All the card manufacturers I recall were on it. One day John Carmack posted a comment (I'm paraphrasing somewhat) about how rubbish DirectX and Direct3D were. A month or so later glquake was available.
I think it was about 12-18 months later that Unreal (the game... before the engine) was announced as a demo on this list, and we all thought: Awesome -- who the * are these guys!?
I'd like to say 'Good times' were had, but seriously, I burnt out due to the insanely fast-changing pace of 3D dev during those times.
They were the first thing I ever sold on eBay, sometime around 1999.
What should I do with them?
Voodoo, Riva 128, tnt, voodoo2, lol don’t forget that power VR, what an amazing time to live
I then moved to NVidia predominantly (TNT2 Ultra), although I did pick up a cheap V5 5500 which I ran for a bit.
Like others have said, it was a fun time to be involved with PC gaming. Unfortunately life has got in the way since, although I do spend time on Vogons looking at old systems and wondering if I should build a couple of retro machines!
If you need justification you can tell yourself that these machines will only go up in value. Works for me ;)
Hell, Quake itself was written on miniGL which was a subset of an abstraction layer built (IIRC) on top of Glide.
And really, OpenGL at the time was simple too. [Check this OpenGL 1.1 reference](http://www.talisman.org/opengl-1.1/Reference.html) which also includes GLU and GLX function and even with those, the number of calls is very small.
For Direct3D games you must use a 32-bit color mode, otherwise Windows 8.x/10 will force software rendering, which is very slow. Even better, use dgVoodoo2, which is a reimplementation of DirectX 1 to 7 (with some bits of 8) on top of Direct3D 11 and provides much better compatibility (it also gets rid of the Direct3D 7 2048 surface width limitation, making it possible to play games at 2560x1440 and up).
For Glide, dgVoodoo2 is also very good and you can "cheat" the game to force higher resolutions than what the game thinks it is running at.
I have a lot of old games and everything that isn't DRM encumbered works fine in Windows 10 using dgVoodoo2 and/or some game-specific hacks (Tomb Raider 1 for example is normally a DOS game that you can play using a Glide-enabled build of DOSBox but there was also a Windows version made that used an ancient proprietary 3D API by ATI - someone reimplemented that API and placed extra hacks in there for high resolutions and widescreen support).
We both bought cards. I was convinced.
I really wish I'd gotten back into game programming then (I was doing mostly systems stuff, boring things like storage and operating systems). It would have been a lot of fun.
Then Nvidia came along, and I remember wishing I had money to invest in it early on. Wish I could go back and do it :D
I wonder if nvidia actually got much ip out of the acquisition.
On a side note, that blue cable was so incredibly stiff you could probably club somebody over the head with it and it still wouldn't bend.
There was no analog-digital-analog conversion; it was straight analog pass-through, either using mechanical relays or special video-switching muxes. The slight signal degradation was due to the additional cables and connectors.