> A lot of things have changed in the last quarter-century – in 1997 NVIDIA had yet to even coin the term “GPU”
[1] https://www.anandtech.com/show/21542/end-of-the-road-an-anan...
Edit: source https://www.computer.org/publications/tech-news/chasing-pixe...
Found these here: https://books.google.de/books?id=Jzo-qeUtauoC&pg=PT7&dq=%22g... — a 1976 issue of Computerworld magazine calls the VGI 3400 a graphics processing unit (GPU):
> 3400 is a direct-writing system capable of displaying 3-D graphics and alphanumerics with speeds up to 20.000....
It's not what I would call a GPU, but it's hard to draw lines when it comes to naming and defining things.
If anyone else wants to try to find the real first GPU:
https://www.google.com/search?q=%22gpu%22+graphics+processin...
> a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second
It’s kind of arbitrary, even when you take out the processing rate. But prior to that, a significant amount of work was still expected to be done on the CPU before feeding the GPU.
That said, the term GPU did definitely exist before NVIDIA, though not meaning the same thing we use it for today.
"The TMS34010, developed by Texas Instruments and released in 1986, was the first programmable graphics processor integrated circuit. While specialized graphics hardware existed earlier, such as blitters, the TMS34010 chip is a microprocessor which includes graphics-oriented instructions, making it a combination of a CPU and what would later be called a GPU."
https://en.m.wikipedia.org/wiki/TMS34010
And they weren't alone in the history of graphics hardware.
The "old way" was to engineer a bit of silicon for each one of those things, custom-like. The problem was deciding how much silicon to give to each feature; it almost had to be fine-tuned to each individual game, which was a problem.

So NVIDIA came up with the idea of having a pool of generic compute units, each of which can do T&L, shading, etc. Now the problem of fine-tuning to a game is solved, but you also end up with a mini compute array that can do math fast: a general-purpose unit of processing (GPU-OP), which was a nod from NVIDIA to the gaming community (OP - overpowered).
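The advantage of the pooled design can be shown with a toy model. This is purely illustrative, not real hardware: the stage names, unit counts, and workload numbers are all made up. With fixed-function silicon, each stage can only use the units wired to it at design time, so whichever stage is over-subscribed for a given game bottlenecks the frame; with a unified pool, the same total silicon serves whatever mix of work the game happens to generate.

```python
# Toy model (made-up numbers): fixed per-stage silicon vs. a unified pool.

def fixed_function_throughput(workload, budget):
    # Each stage is limited to its own hard-wired units, so the most
    # over-subscribed stage sets the overall frame rate.
    return min(budget[stage] / work for stage, work in workload.items())

def unified_pool_throughput(workload, total_units):
    # Generic units go wherever the work is, so only total capacity matters.
    return total_units / sum(workload.values())

# A transform-heavy frame that a shading-heavy chip handles poorly.
workload = {"transform": 8, "lighting": 4, "shading": 2}
budget = {"transform": 4, "lighting": 4, "shading": 6}  # 14 units total

fixed = fixed_function_throughput(workload, budget)      # 4/8 = 0.5
unified = unified_pool_throughput(workload, 14)          # 14/14 = 1.0
print(fixed, unified)
```

Same 14 units either way, but the fixed split strands idle shading units while transform starves — exactly the per-game tuning problem the pooled design removes.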
https://archive.org/details/byte-magazine-1985-02/1985_02_BY...
> Two years later, Nvidia introduced the GPU. He [Curtis Priem] recalls that
> Dan Vivoli, Nvidia's marketing person, came up with the term GPU, for
> graphics processing. "I thought that was very arrogant of him because how
> dare this little company take on Intel, which had the CPU," he said.

https://archive.org/details/byte-magazine-1985-02/1985_02_BY...