
The first GPU with programmable shaders was released 25 years ago

winjer

Gold Member


NVIDIA is marking the 25th anniversary of the GeForce3, which debuted on February 27, 2001. The company's own product timeline lists GeForce3 as a February 2001 launch, and the card is widely remembered as NVIDIA's first GeForce GPU to bring programmable shader support to consumers.
GeForce3 was a small family by today's standards. A lineup refresh later in 2001 added two Titanium variants, for three main retail models. GeForce3 series cards (retail models):

GeForce3 (original)
GeForce3 Ti 200 ($149 MSRP)
GeForce3 Ti 500 ($349 MSRP)

 
My first ever GPU was the GeForce 3 Ti200, which I got from a close friend when he upgraded to a Radeon 9800 Pro. Man, I loved that video card.

The first game I tried on it was Prince of Persia: The Sands of Time, which I borrowed from the same friend, since before that I couldn't even play it; it required a GPU with pixel shader 1.1 support. Good times!
 
The jump from my TNT2 32MB (1999) to GeForce3 128MB (2002) was massive. My framerate in Quake III Arena improved from 7-10 fps at 1600x1200x32 to 80-90 fps. On top of that, the GeForce 3 had amazing new graphics features that blew me away. The Nature demo in 3DMark 2001 featured pixel-shaded water and moving grass, and all my friends who saw that tech demo on my PC were impressed. As for games, the true showcases for my GeForce 3 were Splinter Cell, Enclave, and NOLF2. Experiencing these games on my new GeForce 3 was magical. I want to thank all the Nvidia engineers who built that revolutionary chip.
 

My first graphics card was a TNT2 Ultra. Amazing card for the time.
Eventually, I got a GeForce 2 MX and that was a nice bump in performance.
But at the time, I was studying and money was short, so I skipped the GeForce 3 and 4. I kept reading magazines and reviews on the internet, though, and really wanted one.
 
Did the Xbox GPU have them?
I think it did, but it released a few months later.
The first Xbox had an even better GPU than the GeForce 3. It was a GeForce 3/GeForce 4 hybrid with an additional vertex shader unit, higher clocks, and some features from the newer architecture. Nvidia built a true high-end GPU for Microsoft, but Microsoft didn't give that chip the memory bandwidth it needed in the Xbox console. My GeForce 3 didn't have to share memory bandwidth with the CPU and could run games much better, even at 1600x1200x32. For comparison, Xbox games ran at 640x480x32.
 
I had the GeForce 2 MX 32MB (128-bit) for a short time, but my motherboard had issues with it. If I recall correctly, the AGP speed was limited to 1x, so I only ever saw a fraction of the GeForce 2 MX's potential. That situation forced me to replace my entire PC.

The GeForce 2 had an interesting piece of new technology, the "shading rasterizer". I think the first game to use it was Evolva, which applied bump mapping to virtually all objects, and every player and monster cast a highly detailed shadow. Thanks to hardware T&L, Evolva could push up to 80,000 polygons per frame on a GeForce or GeForce2. It looked impressive back then, and the TNT2 certainly wasn't capable of running it.

But the GeForce 3 was in a class of its own compared to the GeForce 2, representing a true generational leap in every way. Old games were never the same once I saw DX8 graphics effects. Games built around GeForce 3 hardware look surprisingly good even today. The original Xbox had a similar GPU, and that's one of the reasons so many people fell in love with the console. Without that GeForce 3 hybrid, games such as Halo, Riddick, Doom 3, Splinter Cell, Project Gotham Racing, RalliSport Challenge, and Crimson Skies would never have looked as good on a console.
 
Technology moved so fast in those days!
 
I played the hell out of Warcraft 3. I'm not sure if this was the PC I played it on, but I remember our computer around then having a TNT2 Aladdin, NVIDIA graphics integrated into the ALi Aladdin motherboard chipset.
 
I didn't know that!

Damn, I remember playing Doom 3 on it at like 15 fps, stuttering like crazy haha

Yes. Even John Carmack called out Nvidia on their bullshit move.

"Nvidia has really made a mess of the naming conventions here. I always thought it was bad enough that GF2 was just a speed bumped GF1, while GF3 had significant architectural improvements over GF2. I expected GF4 to be the speed bumped GF3, but calling the NV17 GF4-MX really sucks. GF4-MX will still run Doom properly, but it will be using the NV10 codepath with only two texture units and no vertex shaders. A GF3 or 8500 will be much better performers. The GF4-MX may still be the card of choice for many people depending on pricing, especially considering that many games won't use four textures and vertex programs, but damn, I wish they had named it something else."
 
ngl, it took me a bit to get to grips with how you're supposed to write vertex and fragment shaders (in GLSL), in addition to geometry and tessellation shaders (and compute shaders). Luckily, mesh shaders will simplify that whole pipeline again to "just" mesh shaders and fragment shaders (and compute shaders, which are basically mesh shaders with more flexible input and output types). Also: Slang seems cool as hell.
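For anyone who hasn't touched that pipeline: the two stages the GeForce 3 era made programmable are still the core of it. A minimal modern GLSL pair might look something like this (a sketch; the attribute layout and uniform names are illustrative, and the app is assumed to supply `uMVP` and `uColor`):

```glsl
// Vertex shader: runs once per vertex, transforms positions into clip space.
#version 330 core
layout(location = 0) in vec3 aPos;  // per-vertex position (assumed attribute slot 0)
uniform mat4 uMVP;                  // model-view-projection matrix set by the app

void main() {
    gl_Position = uMVP * vec4(aPos, 1.0);
}
```

```glsl
// Fragment shader: runs once per covered pixel, outputs its color.
#version 330 core
out vec4 FragColor;
uniform vec3 uColor;                // flat color set by the app

void main() {
    FragColor = vec4(uColor, 1.0);
}
```

Geometry and tessellation shaders slot in between these two stages, which is exactly the accumulated complexity that mesh shaders collapse back down.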
 
My first GPU was the GeForce 256 (I used to play Max Payne 1 on that card), followed by a GeForce 2, GeForce 4 Ti, GeForce FX 5700 Ultra (basically required to run Max Payne 2, as it needed DX9.0c), GeForce 6600... etc... I also had a few ATi cards. I am currently using an AMD card.
 
My strongest memory of the GeForce 3 is how it was supposed to be the card to promote DOOM 3. I remember watching a promo video about it and reading magazine articles.

But DOOM 3 took so long to make that it was released alongside the Radeon X800 and Nvidia GeForce 6 series. The GeForce 3 was pretty much obsolete by then. Remember, back then hardware was improving at a rapid pace. I had a GeForce 4 Ti 4400 when DOOM 3 was released and I still struggled to play it at high enough settings and resolution. So playing the game on a GF3 would probably have been about as bad as the Xbox version, or worse.
 