DF Retro: The Story of Nvidia GeForce 256 - The Original 'GPU'

MAN, does this take me back. This was the golden age of PC gaming, when there were true PC exclusives and games that couldn't be played on anything but a PC. This was the card that made Nvidia the undisputed GPU leader, and that lead wasn't really challenged until the Radeon 9700 Pro.



So many fun times when this GPU came out. Its successor, the GeForce 2 Ultra, was the first $500 GPU.


GOOD TIMES
 
The "original GPU"? Is this paid for by Nvidia? Ignoring the 3Dfx Voodoo 2 and 3, which in 1999 were much more popular than the 256.

At the time, these cards were just called graphics accelerators. We were still at a time when most games could run in software mode. But if you had a card like these, certain graphics functions would be accelerated. Hence the name.
nVidia was the first to market the term GPU. Mind you, this was not a technical term, just a marketing gimmick. But it stuck, and now all graphics cards are called GPUs.
And nVidia defined that only cards that had T&L could be called GPUs. Once again, this was just a marketing gimmick to differentiate the GeForce from the competition.
ATI tried to counter with the term VPU, Visual Processing Unit. But it never caught on.
 
The "original GPU"? Is this paid for by Nvidia? Ignoring the 3Dfx Voodoo 2 and 3, which in 1999 were much more popular than the 256.
It's what Nvidia called it at the time of its launch. "Nvidia hailed the GeForce 256 as "the world's first GPU," a claim made possible by being the first to integrate a geometry transform engine, a dynamic lighting engine, a 4-pixel rendering pipeline, and DirectX 7 features onto the graphics chip."

I assume the term "GPU" wasn't used before the release of the GeForce 256.
 
GF256 was called the first GPU by nVidia because it was the first board to have T&L (transform and lighting calculations) on board.
All previous cards were just rasterizers.
They simply took the geometry calculations already done by the CPU and projected them onto screen space, creating a 2D image from a 3D description while applying textures in the process.

In any case, the GF256 was a card with limited fillrate (480 Mpixels/s), and its T&L implementation was just the first experiment in that direction.
The 3dfx Voodoo 5 was a much better card, although it was delayed and ended up competing with the GF2 GTS.
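
To make the rasterizer vs. hardware T&L distinction above concrete, here is a minimal sketch in C of the per-vertex transform work that pre-GeForce cards left to the CPU: multiply each vertex by a matrix, do the perspective divide, and map the result to screen space. On the GeForce 256 this is the kind of loop that moved onto the chip; the matrix and vertex values below are made up purely for illustration.

#include <stdio.h>

/* Sketch of the per-vertex "transform" step that pre-T&L cards left to the CPU.
   Hardware T&L moved exactly this kind of loop onto the graphics chip.
   The matrix and vertex values are purely illustrative. */

typedef struct { float x, y, z, w; } Vec4;

/* Multiply a vertex by a 4x4 row-major transform matrix (e.g. model-view-projection). */
static Vec4 transform(const float m[16], Vec4 v) {
    Vec4 r;
    r.x = m[0]*v.x  + m[1]*v.y  + m[2]*v.z  + m[3]*v.w;
    r.y = m[4]*v.x  + m[5]*v.y  + m[6]*v.z  + m[7]*v.w;
    r.z = m[8]*v.x  + m[9]*v.y  + m[10]*v.z + m[11]*v.w;
    r.w = m[12]*v.x + m[13]*v.y + m[14]*v.z + m[15]*v.w;
    return r;
}

int main(void) {
    /* An identity matrix stands in for a real projection, purely for illustration. */
    const float mvp[16] = { 1,0,0,0,  0,1,0,0,  0,0,1,0,  0,0,0,1 };
    Vec4 v = { 0.25f, -0.5f, -5.0f, 1.0f };

    Vec4 clip = transform(mvp, v);
    /* Perspective divide, then map the result onto a 640x480 screen. */
    float sx = (clip.x / clip.w * 0.5f + 0.5f) * 640.0f;
    float sy = (1.0f - (clip.y / clip.w * 0.5f + 0.5f)) * 480.0f;
    printf("screen position: %.1f, %.1f\n", sx, sy);
    return 0;
}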
 
Dear Lord, I feel old looking at all the benchmark footage. I remember being blown away when I played Q3A on the 256. Times were a lot simpler back then. :)
 
I think I had the Matrox Mystique at this time, then went on to a Radeon 9200 if I remember right, then a GeForce 9600 GSO, etc.
Never had this card.
 
Good old times.

My first graphics card (back then it wasn't even called a GPU) was an S3 Virge. Then I had an ATI Rage Fury, GeForce 2 Ti, Radeon 9500, GeForce 4 Ti 4200 and GeForce 5900XT. The last one was a HUGE MISTAKE back then due to low performance in games requiring Shader Model 3.0 - games like Far Cry "1".
 
Good old times.

My first graphics card (back then it wasn't even called a GPU) was an S3 Virge. Then I had an ATI Rage Fury, GeForce 2 Ti, Radeon 9500, GeForce 4 Ti 4200 and GeForce 5900XT. The last one was a HUGE MISTAKE back then due to low performance in games requiring Shader Model 3.0 - games like Far Cry "1".

FC1 didn't require SM3.0
It released with SM2.0 support, and only later did it receive a patch for SM3.0
And even then it just ran marginally better than SM2.0
The issue was that the FX cards were really bad. Easily the worst generation in nVidia's history.

You should have just kept the 9500 and softmodded it to a 9700.
 
I got one back in 1999. The hype back then was ridiculous. I remember reading an article in PC Gamer about how these cards would be so powerful that your CPU wouldn't matter at all and you wouldn't have to upgrade it anymore :messenger_tears_of_joy:
 
But it didn't have T&L...

Sure but it was a complete graphics card that didn't need a 2D card already installed like previous "3D accelerators".

Edit: There were various cards that did the same; the Riva TNT was also a complete graphics card, and that was released in 1998. And that was by Nvidia. So I don't know why the 256 is referred to as the first GPU. Is T&L a requirement?
 
Sure but it was a complete graphics card that didn't need a 2D card already installed like previous "3D accelerators".

Let me repeat this again. GPU is a marketing term coined by nVidia. And nVidia decided that only cards that had T&L were GPUs.
 
Sure but it was a complete graphics card that didn't need a 2D card already installed like previous "3D accelerators".

Edit: There were various cards that did the same; the Riva TNT was also a complete graphics card, and that was released in 1998. And that was by Nvidia. So I don't know why the 256 is referred to as the first GPU. Is T&L a requirement?
Yeah, but there were a lot of cards like that; my 200MMX came with a shitty ATI Rage 2 card that was both 2D/3D.

GPU has always meant T&L ever since this card. It doesn't matter whether Nvidia invented that definition or not; it's what happened and it's what has been used ever since.
 
So the engineering samples of the Riva 128 we were using back in '97 weren't real... good to know! :(
 
At the time, these cards were just called graphics accelerators. We were still at a time when most games could run in software mode. But if you had a card like these, certain graphics functions would be accelerated. Hence the name.
nVidia was the first to market the term GPU. Mind you, this was not a technical term, just a marketing gimmick. But it stuck, and now all graphics cards are called GPUs.
And nVidia defined that only cards that had T&L could be called GPUs. Once again, this was just a marketing gimmick to differentiate the GeForce from the competition.
ATI tried to counter with the term VPU, Visual Processing Unit. But it never caught on.
Haha, I remember looking down my nose like a snooty teenage nerd at anyone who used the term 'GPU'; it was such a marketing gimmick at the time. I have since learned:

a) Not to be so uptight

and

b) That everyone (including me) calls it a GPU now anyway.
 
FC1 didn't require SM3.0
It released with SM2.0 support, and only later did it receive a patch for SM3.0
And even then it just ran marginally better than SM2.0
The issue was that the FX cards were really bad. Easily the worst generation in nVidia's history.

You should have just kept the 9500 and softmodded it to a 9700.
I know the game was released with SM2.0 and later they released a patch for SM3.0 support. I just skipped that part.
Nevertheless, back then, the FX series was the worst ever released.

P.S.: I got it as a present, so luckily I didn't pay for the card.
 
I know the game was released with SM2.0 and later they released a patch for SM3.0 support. I just skipped that part.
Nevertheless, back then, the FX series was the worst ever released.

P.S.: I got it as a present, so luckily I didn't pay for the card.

On the positive side, you got a 5900 series.
The 5800 was even worse. So bad, even nVidia's engineers made fun of it.

 
ati mach 64 (2d) + a pure 3d LX voodoo 1 (6mb!)
creative TNT 1 or 2 can't recall
elsa xrazor geforce 1 ddr (died on me)
?
ati 9100
ati AIW 9800 (this thing lasted forever before i upgraded)
xfx 7800 gtx
bfg 9800 gt
sapphire 5870
sapphire rx 290
powercolor rx 580
geforce 3070 fe

that's my "gpu" history, to the best I can remember

I recall being quite disappointed at how my GeForce 1 crapped out quite quickly; I had to go back to the TNT for some time.
 
Very interesting.
It's crazy to think that GPUs are relatively new.

I wonder if there will ever be a new type of chip. There are TPUs which are more suited to machine learning tasks.

I wonder if there will ever be dedicated TPUs for PCs to aid with AI image reconstruction.

........


I also remember back in the day when we got a family computer, it had a Riva TNT2 M64. It turns out the M64 version was gimped with a 64-bit memory bus, halving its memory bandwidth. However, at the time I was new to PC gaming and didn't know about any of that apart from things like the clock speed and memory, etc. The GPU did well though, paired with a 600MHz P3 and 384MB of RAM. It played Max Payne at medium settings @800x600 at reasonably good framerates (though I didn't really understand framerates at the time), and Unreal Tournament, Half-Life and Red Alert 2 also played very well.
Return to Castle Wolfenstein struggled through; there was one sniper level which ran at like sub-10fps lol.
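
As a rough illustration of why the M64's narrower bus hurt so much, here is a back-of-the-envelope peak-bandwidth calculation in C. The 143 MHz memory clock is an assumed, typical TNT2 figure rather than something stated above, so treat the exact numbers as approximate; only the halving matters.

#include <stdio.h>

/* Rough peak memory bandwidth for single-data-rate SDRAM:
   bus width (bits) / 8 * memory clock (MHz) = MB/s.
   The 143 MHz clock is an assumed, typical TNT2 value. */
static double bandwidth_mb_s(int bus_bits, double mem_clock_mhz) {
    return bus_bits / 8.0 * mem_clock_mhz;
}

int main(void) {
    double tnt2     = bandwidth_mb_s(128, 143.0); /* regular TNT2: 128-bit bus */
    double tnt2_m64 = bandwidth_mb_s(64,  143.0); /* M64: same clock, half the bus width */
    printf("TNT2:     %.0f MB/s\n", tnt2);        /* ~2288 MB/s */
    printf("TNT2 M64: %.0f MB/s\n", tnt2_m64);    /* ~1144 MB/s, i.e. half the bandwidth */
    return 0;
}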
 
It was the first GPU in popular terms, but what made the GeForce 256 the first GPU - Transform & Lighting - had been demonstrated many years prior:

  • The Intergraph Realizm line of 1994. These were professional OpenGL cards and they had Geometry modules along with Texture modules, thus establishing a T&L GPU. OpenGL was a far more advanced feature set than DirectX, so T&L had already existed for several years in that space. (There were Windows NT drivers for these things, and famously John Carmack used them in his development systems.) (Linked is a Realizm I, the card used to develop GLQuake in 1996, playing Quake 3 at 640x480 at 22 FPS. The Realizm II followed in 1997.)

  • SGI's IMPACT line and, in general, the Geometry Engine (1981) were the professional precursors. A Maximum IMPACT system held a Raster Manager and a Display Manager with 4 MB of highly expensive texture RAM and was used in the Indigo2 workstation. A fully equipped Indigo2 from 1995 could play Quake 3 at 20-30 FPS, and could run Hexen 2 (a Quake engine game) at 1280x1024 (above HD resolution) using the IRIX port.
  • Arcade hardware subsequently also pushed for T&L, but unlike the GeForce 256, it used multiple DSPs to achieve the feat - the Namco System 22 from 1992 uses the Evans & Sutherland TR3 chip along with a bank of Texas Instruments DSPs, while Sega's Model 1 (1992) and 2 (1993) utilize several dedicated ASICs by Fujitsu (like a Z-Sorter, etc.) to achieve T&L - but those weren't yet integrated like the GeForce 256. The Real3D/1000 used in the Model 3 from 1996 is closer to this, and provides similar performance to a GeForce 256, a few years before that card was released.

Nvidia can't even claim it had the first shader-based GPU: that goes to Sun and Michael Deering, whose efforts pioneered the concept in 1988.

PS: The term GPU can be applied very loosely here - basically, anything that accelerates a rendering process. Argonaut, the developers behind the Super FX chip used in the SNES, have called their accelerator a GPU. The Amiga (and later the Atari STE) have a Blitter that can copy contents quickly in RAM, thus accelerating the process. The Mindset PC from 1984 (which had a 186 processor and ran DOS!) had a blitter as well, and it was used in the only game known to use it - Vyper:



Even further down the line you had arcade systems utilizing a barrel shifter - essentially a quick copy device. Atari's I, Robot (1983) has a custom Mathbox unit comprised of four AMD Am2901 bit-slice processors, built up to 16-bit, responsible for the polygon generation and thus, by that metric, a GPU.
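
Since the last couple of paragraphs lean on the idea of a blitter or barrel shifter as a "quick copy device", here is a tiny C sketch of the operation such hardware performs: copying a rectangle of pixels from one buffer to another, row by row. A hardware blitter does this on its own and frees the CPU; buffer sizes and coordinates below are made up purely for illustration.

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Software version of the rectangle "blit" a hardware blitter performs:
   copy a w x h block of 8-bit pixels from one framebuffer to another. */
#define SCREEN_W 320
#define SCREEN_H 200

static void blit(const uint8_t *src, uint8_t *dst,
                 int src_x, int src_y, int dst_x, int dst_y, int w, int h) {
    for (int row = 0; row < h; row++) {
        memcpy(dst + (dst_y + row) * SCREEN_W + dst_x,
               src + (src_y + row) * SCREEN_W + src_x,
               (size_t)w);
    }
}

int main(void) {
    static uint8_t sprite_sheet[SCREEN_W * SCREEN_H];
    static uint8_t framebuffer[SCREEN_W * SCREEN_H];
    sprite_sheet[0] = 0xFF;                                  /* mark one pixel so the copy is visible */
    blit(sprite_sheet, framebuffer, 0, 0, 100, 50, 16, 16);  /* copy a 16x16 sprite to (100,50) */
    printf("pixel at (100,50): %u\n", (unsigned)framebuffer[50 * SCREEN_W + 100]);
    return 0;
}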

At the end of the day, the GeForce 256 brought hardware T&L to the PC and thus popularized the term GPU - an all-in-one term describing a device that can do transform, lighting and rasterization in real time on a single chip.
 
ati mach 64 (2d) + a pure 3d LX voodoo 1 (6mb!)
creative TNT 1 or 2 can't recall
elsa xrazor geforce 1 ddr (died on me)
?
ati 9100
ati AIW 9800 (this thing lasted forever before i upgraded)
xfx 7800 gtx
bfg 9800 gt
sapphire 5870
sapphire rx 290
powercolor rx 580
geforce 3070 fe

that's my "gpu" history, to the best I can remember

I recall being quite disappointed at how my GeForce 1 crapped out quite quickly; I had to go back to the TNT for some time.
Man, I rode my 9800 Pro until 2008, maybe '09 lol. Was such a great card. I was rocking a big-ass blue Dell XPS, had a P4 3.2GHz with HT and I think 2GB of RAM. I also got the 5.1 Altec Lansing sound system; that computer was such a beast and I had some of my favorite memories on that thing. I'll never forget firing up Doom 3 for the first time and hearing the 3D sound go all over. It was fucking remarkable. In 2009 I upgraded to a 4-core AMD Gateway with a GTX 260 and I was absolutely blown away by the change. I'm not seeing as big of leaps now compared to then, but I absolutely adore this technology and the fact we get to do whatever we like virtually.
 
What was the world's first 3D accelerator then?
This really depends on what you mean:
- In the PC space, you had PHIGS, and Intel i860-based boards like the SPEA Fire.
- In professional use, you had image generators. Basically a drawer full of graphics ICs. E&S used those.
 
My Lil mid range gpu history

Riva TNT2 M64 32MB
Nvidia XFX 7600gt 512mb
Xfx AMD 4870 1GB GDDR5


My current card - ASUS AMD R7 260 1gb gddr5

I mainly game on console now, I just play C&C remastered on PC.
 
The "original GPU"? Is this paid for by Nvidia? Ignoring the 3Dfx Voodoo 2 and 3, which in 1999 were much more popular than the 256.
If you cared to actually watch the video (literally two minutes in) it specifically says "sponsored by Nvidia" - so yes... it is paid for by Nvidia :D
 
I bought one at the time, and for the first time I switched away from my Voodoo 3s. The transform and lighting demo was impressive back then, and the card lasted me a good few years, until I had to buy a GeForce FX to play games. A great GPU for its time.
 
Dear Lord, I feel old looking at all the benchmark footage. I remember being blown away when I played Q3A on the 256. Times were a lot simpler back then. :)
This was a time when you could make a triple A game in 2 years with a studio of 50 developers.
That's why there was a lot of experimentation going on. Investors could handle the risk of losing 20 million dollars.
 
And nVidia defined that only cards that had T&L could be called GPUs. Once again, this was just a marketing gimmick to differentiate the Geforce from the competition.
The first time someone called their video chip a GPU was Sony with the first PlayStation. Then it was Nvidia with the T&L, but personally I wouldn't call something without programmable functions a GPU; otherwise even the SNES video chip was a GPU.

Is a CPU with fixed functions a CPU? Yes, because you can do anything with those limited functions. It's not the same with the PS1 GPU and the GF256, since the only things you could do were the built-in functions. But with modern GPUs, you can do everything, from GPGPU to graphics, to physics, etc.
 
My Lil mid range gpu history

Riva TNT2 M64 32MB
Nvidia XFX 7600gt 512mb
Xfx AMD 4870 1GB GDDR5


My current card - ASUS AMD R7 260 1gb gddr5

I mainly game on console now, I just play C&C remastered on PC.

I have a similar lame trajectory.

TNT 2
Geforce 5500
Geforce 9800GT (green edition, it was cheap)
Geforce GTX 960

Interesting video. I was primarily just playing on PS1 back then, maybe DC I don't remember.
 
The first time someone called their video chip a GPU was Sony with the first PlayStation. Then it was Nvidia with the T&L, but personally I wouldn't call something without programmable functions a GPU; otherwise even the SNES video chip was a GPU.

Is a CPU with fixed functions a CPU? Yes, because you can do anything with those limited functions. It's not the same with the PS1 GPU and the GF256, since the only things you could do were the built-in functions. But with modern GPUs, you can do everything, from GPGPU to graphics, to physics, etc.

Never saw anything from Sony at the time calling the graphics chip in the PS1 a GPU. In fact, that chip was a 2D chip, and the PS1 relied on the GTE for its 3D capabilities.
 
My original S3 Virge 4MB!!!
Shame I don't have my Voodoo :( I had the original 1997 Voodoo, and running Half-Life, Turok and Unreal was... unreal

 
ATI tried to counter with the term VPU, Visual processing Unit. But it never caught on.
ATI were just too far ahead of their time; VPU is actually used these days, but for Vision Processing Unit, for accelerated machine learning vision workloads.

Very interesting.
It's crazy to think that GPUs are relatively new.

I wonder if there will ever be a new type of chip. There are TPUs which are more suited to machine learning tasks.

I wonder if there will ever be dedicated TPUs for PCs to aid with AI image reconstruction.

Rather than dedicated accelerators (since there'd be a possible bandwidth bottleneck), we'll just keep getting integrated acceleration units for offloading those types of tasks. I think Apple already has this with small dedicated cores, the Neural Engine.

More stuff like that with integrated chiplet designs, most likely.
 
My original S3 Virge 4MB!!!
Shame I don't have my Voodoo :( I had the original 1997 Voodoo, and running Half-Life, Turok and Unreal was... unreal

I remember buying extra memory for S3 Virge with my father. Beautiful times xD
 
I was real hyped for the GeForce when it came out, and I got one a few months after launch, that holiday season. I had a 3Dfx Banshee (which, I guess, was probably only a year old), but I saw how GeForce was smoking the Voodoo 3 in benchmarks and I had to have one. I would have been a high school senior.

I bought the ASUS V6600 Deluxe model, which came with LCD-shutter 3D glasses that, at least in theory, worked with any Direct3D game (in practice it was hit or miss, but cool for the stuff it worked well with). This was the time when a lot of games started moving away from the Glide API and onto Direct3D, so it was a good time to hop on the new shit, and the GeForce was a beast at the time.

That didn't have 2D, thus it truly was just a 3D accelerator.
Yeah but there were plenty of cards that did both.

nVidia's claim to the "GPU" name was that theirs was the first to do hardware transform and lighting. Whether or not that's a defining feature of a GPU is pretty debatable, but that's what their claim was.
 