
The HD revolution was a bit of a scam for gamers

soulbait

Member
You are confusing LED, plasma, and other display technologies of the time with HD. HD did not mean those technologies; it was just a resolution. You keep bringing up CRT, forgetting that there were HD CRT TVs (I know, because the first HD TV my family owned was a CRT). We bought an HD CRT due to the limitations of LED and the expense of plasma at the time.

Also, you ignore that there were many HD-resolution PC CRT monitors at the time. That's actually how I mostly played my 360: on a high-end PC CRT monitor via its VGA HD cable.
 

charles8771

Member
I didn't think I would, but I agree, especially early on. The first HD TVs were awful and looked so much worse than CRTs. The ghosting was terrible. We probably could have gone to enhanced-definition CRTs as the standard instead. A lot of games were below 720p, with many in the 500p to 600p region, like Halo 3. Many are commenting that the jump to 720p was worth it, but the funny thing is that half the games ran below 720p and at sub-30fps.
Halo 4 was 720p.

However, Halo 3 ran at 1152x640 because its HDR lighting used two frame buffers (a rough eDRAM check is sketched below).
Sub-HD resolutions existed for many reasons:
1) The PS3 was notorious for being very difficult to develop for:
https://n4g.com/news/569155/top-10-hd-games-on-xbox-360-that-are-only-sub-hd-on-ps3
2) The Xbox 360's disc capacity was limited to 6.8GB and it shipped without an HDD by default; that's how Final Fantasy XIII ran at 576p while the PS3 version ran at 720p: https://www.giantbomb.com/forums/final-fantasy-xiii-79/360-version-runs-in-576p-ps3-in-720p-392310/
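As a rough sanity check of that Halo 3 number (a sketch, assuming two 32-bit HDR color targets plus a 32-bit depth buffer that all had to fit in the 360's 10 MiB of eDRAM without tiling; the exact buffer formats are my assumption, not something confirmed in this thread):

EDRAM_BYTES = 10 * 1024 * 1024  # Xbox 360 eDRAM: 10 MiB

def framebuffer_bytes(width, height, num_buffers=3, bytes_per_pixel=4):
    # num_buffers: two HDR color targets plus one depth buffer (assumed)
    return width * height * num_buffers * bytes_per_pixel

for width, height in [(1152, 640), (1280, 720)]:
    size = framebuffer_bytes(width, height)
    verdict = "fits" if size <= EDRAM_BYTES else "does not fit"
    print(f"{width}x{height}: {size / 2**20:.2f} MiB -> {verdict}")

# 1152x640: 8.44 MiB -> fits
# 1280x720: 10.55 MiB -> does not fit

Under those assumptions, 1152x640 squeezes into eDRAM in one pass, while full 720p would have needed tiling.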

Don't forget there were PS2 games that ran at 512x448 for the sake of performance.
 

BlackTron

Gold Member
The upgrade to HD was legitimately a megaton, but it would be nice if they had prioritized other aspects of gaming more. We started sacrificing things for that resolution and, to this day, haven't gotten all of it back in the pursuit of higher and higher resolutions (edit: unless you're Nintendo and care too little).
 
You are confusing LED, plasma, and other display technologies of the time with HD. HD did not mean those technologies; it was just a resolution. You keep bringing up CRT, forgetting that there were HD CRT TVs (I know, because the first HD TV my family owned was a CRT). We bought an HD CRT due to the limitations of LED and the expense of plasma at the time.

Also, you ignore that there were many HD-resolution PC CRT monitors at the time. That's actually how I mostly played my 360: on a high-end PC CRT monitor via its VGA HD cable.

CRT was on its way out as soon as the PS3 launched in 2006-2007. Everyone was switching to flat panels for HD resolutions.

By 2009 it was completely gone...
 

SkylineRKR

Member
Yes, it was a huge scam on many levels. First of all, the consoles were never fit to run even remotely complex games at HD resolutions. They did anyway, and usually the framerate tanked into the 20s or lower while the resolution sat quite a bit below 720p. The PS3 especially, which was marketed as a 1080p device and carried a hefty price tag at first, was seen as a letdown.

I bought a 720p set, 26 inches, for like 700 bucks. Ridiculous, of course. But in truth you could barely play games like Dead Rising on an SD set. It wasn't all doom and gloom; DOA4, CoD2, and some other games did look amazing on that television. But in terms of viewing angle, ghosting, input lag, black levels, etc., CRT was probably still better, especially for regular broadcasts and DVD. All they had to do was lower the resolution to 480, which would have freed up the performance many games desperately needed. By 2010 I had a Panasonic plasma, which was stellar and better than CRT, but the poor PS3 couldn't really make good use of it. Gran Turismo 5 did look good at times, but also borderline awful, and if you loaded a full premium grid at Monza, the framerate tanked to stupid levels.

The PS4 was the first console that ran a good portion of games at 1080p with acceptable performance, but that was late 2013. Before that, HDTVs weren't really a necessity for console gaming, IMO. In hindsight I should've waited a few years until high-end plasma became affordable, but okay, admittedly I had some wow moments with GoW, GR:AW, Dead Rising, etc.
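The pixel arithmetic behind "lower the resolution to 480" is easy to sketch (a simplification, assuming per-frame rendering cost scales roughly linearly with pixel count):

modes = {
    "480p (640x480)": 640 * 480,
    "576p (1024x576)": 1024 * 576,
    "720p (1280x720)": 1280 * 720,
    "1080p (1920x1080)": 1920 * 1080,
}
baseline = modes["720p (1280x720)"]
for name, pixels in modes.items():
    # Pixel load relative to a native 720p target
    print(f"{name}: {pixels:,} pixels ({pixels / baseline:.0%} of 720p)")

480p works out to exactly one third of the 720p pixel load, which is the kind of headroom the post is describing.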
 

soulbait

Member
CRT was on its way out as soon as the PS3 launched in 2006-2007. Everyone was switching to flat panels for HD resolutions.

By 2009 it was completely gone...
Correct, they were on their way out because the newer technology allowed for larger screens that didn't weigh a ton, but CRT was still available at the time and was a better display for HD. The OP (which, I just realized, was posted a long time ago) was talking about HD compared to CRTs, ignoring that there were HD CRT TVs. It doesn't matter whether they were on their way out or not; their complaint about HD versus CRT doesn't make sense when HD was available on CRT.

Just because the LEDs of that day were not the best (plasmas were great but cost a ton of money) doesn't mean HD was a bad way to go. LEDs got better every year, making them less expensive to get into.

This would be like someone today saying that 4K Dolby Vision is shit because all they have done is view it on a crappy $400 TV that is edge-lit, has terrible peak brightness, and uses a terrible image processor. To truly take advantage of 4K Dolby Vision, you need a really bright panel, or an OLED in a darker room.

The PS3/360 consoles were built to last as more and more HD TVs came to market, not just for the initial ones, which cost a lot of money if you wanted good quality.
 

Gaiff

SBI’s Resident Gaslighter
You are confusing LED, plasma, and other display technologies of the time with HD. HD did not mean those technologies; it was just a resolution. You keep bringing up CRT, forgetting that there were HD CRT TVs (I know, because the first HD TV my family owned was a CRT). We bought an HD CRT due to the limitations of LED and the expense of plasma at the time.

Also, you ignore that there were many HD-resolution PC CRT monitors at the time. That's actually how I mostly played my 360: on a high-end PC CRT monitor via its VGA HD cable.
I'm not, and it's clear in the OP. The marketing was 100% geared towards flat panels, not CRTs. Hell, I even had an HD CRT for my computer, but for consoles it was always plasma or LCD.

Furthermore, the second point I'm making is that these consoles simply didn't have the horsepower to drive HD resolutions at acceptable frame rates. Sure, you had 720p, but at the cost of sub-30fps.

Resolution also matters a lot less for CRTs.
 

nkarafo

Member
The early 360/PS3 generation was the worst. Early HD screens were abhorrent; you were far better off with any CRT. It took until at least halfway through that generation (which lasted about 8 years total) before we started seeing decent LCD HD screens. And even now, 99% of them suck compared to CRTs when it comes to motion clarity and input lag.
 

SHA

Member
I remember the 480p experience being better than almost all 720p games. The SD experience actually wasn't bad; it was decent and not pixelated. I don't understand how 720p looked more pixelated than 480p. The HD experience was awful.
 

charles8771

Member
Yes, it was a huge scam on many levels. First of all, the consoles were never fit to run even remotely complex games at HD resolutions. They did anyway, and usually the framerate tanked into the 20s or lower while the resolution sat quite a bit below 720p. The PS3 especially, which was marketed as a 1080p device and carried a hefty price tag at first, was seen as a letdown.

I bought a 720p set, 26 inches, for like 700 bucks. Ridiculous, of course. But in truth you could barely play games like Dead Rising on an SD set. It wasn't all doom and gloom; DOA4, CoD2, and some other games did look amazing on that television. But in terms of viewing angle, ghosting, input lag, black levels, etc., CRT was probably still better, especially for regular broadcasts and DVD. All they had to do was lower the resolution to 480, which would have freed up the performance many games desperately needed. By 2010 I had a Panasonic plasma, which was stellar and better than CRT, but the poor PS3 couldn't really make good use of it. Gran Turismo 5 did look good at times, but also borderline awful, and if you loaded a full premium grid at Monza, the framerate tanked to stupid levels.

The PS4 was the first console that ran a good portion of games at 1080p with acceptable performance, but that was late 2013. Before that, HDTVs weren't really a necessity for console gaming, IMO. In hindsight I should've waited a few years until high-end plasma became affordable, but okay, admittedly I had some wow moments with GoW, GR:AW, Dead Rising, etc.
With that overly complex Cell CPU paired with that GPU, that's how you got awful frame rates on PS3.


Even some exclusives, like Uncharted 1 and MGS4, couldn't run decently.
 

TGO

Hype Train conductor. Works harder than it steams.
20-30fps is better than 50fps?
30fps in a game is different from a 50Hz refresh rate on a TV.
While FPS and Hz go hand in hand, they are not the same thing.
For example, a 50Hz TV will run a game slower than a 60Hz TV, regardless of its FPS.
 

Gaiff

SBI’s Resident Gaslighter
30fps in a game is different from a 50Hz refresh rate on a TV.
While FPS and Hz go hand in hand, they are not the same thing.
For example, a 50Hz TV will run a game slower than a 60Hz TV, regardless of its FPS.
Well, yeah, it's either 50fps or 25fps to avoid screen tearing with vsync. PAL regions finally moved to a 60Hz standard with the HD revolution, but then got games that ran at 20-30fps on a 60Hz display rather than 50fps on a 50Hz display.
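The 50-or-25 split follows directly from how vsync works: a finished frame is held for a whole number of refreshes, so the evenly paced frame rates are the integer divisors of the refresh rate. A minimal sketch:

def steady_vsync_rates(refresh_hz):
    # With vsync on, each frame is shown for n >= 1 full refreshes,
    # so a steady frame rate must be refresh_hz / n for some integer n.
    return [refresh_hz // n for n in range(1, refresh_hz + 1) if refresh_hz % n == 0]

print(steady_vsync_rates(50))  # [50, 25, 10, 5, 2, 1] -> 50 or 25 in practice
print(steady_vsync_rates(60))  # [60, 30, 20, 15, 12, 10, 6, 5, 4, 3, 2, 1]

Anything in between (say, 40fps on a 60Hz set) alternates between one- and two-refresh frames and judders.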
 

hussar16

Member
Not me. I had a PS2, and the switch from SD to HD, even 720p over HDMI cables, was significant even on my old CRT TV. Once it was real 720p on an LCD screen, it looked even better. CRTs, while great, don't have the sharpness.
 

charles8771

Member
Yes, it was a huge scam on many levels. First of all, the consoles were never fit to run even remotely complex games at HD resolutions. They did anyway, and usually the framerate tanked into the 20s or lower while the resolution sat quite a bit below 720p. The PS3 especially, which was marketed as a 1080p device and carried a hefty price tag at first, was seen as a letdown.

I bought a 720p set, 26 inches, for like 700 bucks. Ridiculous, of course. But in truth you could barely play games like Dead Rising on an SD set. It wasn't all doom and gloom; DOA4, CoD2, and some other games did look amazing on that television. But in terms of viewing angle, ghosting, input lag, black levels, etc., CRT was probably still better, especially for regular broadcasts and DVD. All they had to do was lower the resolution to 480, which would have freed up the performance many games desperately needed. By 2010 I had a Panasonic plasma, which was stellar and better than CRT, but the poor PS3 couldn't really make good use of it. Gran Turismo 5 did look good at times, but also borderline awful, and if you loaded a full premium grid at Monza, the framerate tanked to stupid levels.

The PS4 was the first console that ran a good portion of games at 1080p with acceptable performance, but that was late 2013. Before that, HDTVs weren't really a necessity for console gaming, IMO. In hindsight I should've waited a few years until high-end plasma became affordable, but okay, admittedly I had some wow moments with GoW, GR:AW, Dead Rising, etc.
There were games that didn't run at 1080p on PS4: https://www.ignboards.com/threads/list-of-sub-native-resolution-ps4-games.454948861/

However, 900p on PS4 is proportionally equivalent to 600p on PS3/Xbox 360: both are five-sixths of their generation's target resolution (900/1080 = 600/720).
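To check that proportion in pixel terms (a quick sketch; the widths 1600x900 and 1066x600 are the usual 16:9 assumptions for "900p" and "600p"):

def share_of_target(render, target):
    # Fraction of the target resolution's pixels actually rendered
    rw, rh = render
    tw, th = target
    return (rw * rh) / (tw * th)

# 900p against the PS4 era's 1080p target
print(f"{share_of_target((1600, 900), (1920, 1080)):.1%}")  # 69.4%
# 600p against the PS3/360 era's 720p target
print(f"{share_of_target((1066, 600), (1280, 720)):.1%}")   # 69.4%

Both render the same 69.4% share of their generation's target, which is the sense in which they are equivalent.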
 

Fafalada

Fafracer forever
Yeah, no. CRT meant 50Hz in PAL territories. The move to 720p was the right call.
There's nothing wrong with 50Hz, though; in fact, modern games could often benefit from using it.

It's ironic: we have people clamouring for 40Hz modes on modern consoles, which are only accessible on TVs made in the last 3 years, whereas literally every TV made in the last... 17 years or so can run native 50Hz.
But through the sheer stupidity of all the console makers, that's been kept inaccessible to everything but legacy SDTV PAL content. It's mind-boggling, really.
 

Portugeezer

Member
I was more upset that, unlike PC, if you played on an SD TV the console games would still render at whatever HD resolution the game targeted, so much performance was wasted on SD.

I remember GTA4 looking good on the nice CRT I had, but thinking it could have gotten more stable performance if the game could render at 480p.
 

Fafalada

Fafracer forever
I felt the X360 was at least comparable with last gen on that performance metric. The PS3 had a harder time with multiplat games, of course. Somewhere between the N64 and PS2 in terms of framerate.
That's revisionist history: the PS2 had the highest percentage of 60fps games of any 3D system until the current gen. The only system that came close was actually the NDS.
The PS3/360/Wii gen was a massive step backwards in framerates across the board.

I was more upset that, unlike PC, if you played on an SD TV the console games would still render at whatever HD resolution the game targeted, so much performance was wasted on SD.
Most games rendered native SD (and did in fact run better in SD). The ones that didn't weren't actually that common.
 

Portugeezer

Member
That's revisionist history: the PS2 had the highest percentage of 60fps games of any 3D system until the current gen. The only system that came close was actually the NDS.
The PS3/360/Wii gen was a massive step backwards in framerates across the board.


Most games rendered native SD (and did in fact run better in SD). The ones that didn't weren't actually that common.
Oh, I remember hearing otherwise.
 

charles8771

Member
I was more upset that, unlike PC, if you played on an SD TV the console games would still render at whatever HD resolution the game targeted, so much performance was wasted on SD.

I remember GTA4 looking good on the nice CRT I had, but thinking it could have gotten more stable performance if the game could render at 480p.
An open-world game? Look at how GTA 3 and San Andreas run on PS2, especially since they came out on PS2 first and were only later ported to PC and Xbox:




Not to mention that the PC version of GTA 4 was notorious for being unoptimized: https://gtaforums.com/topic/552103-gta-iv-is-the-worst-pc-port-of-all-time/
A $3,000 PC from 2008 struggled to run it.
 

simpatico

Member
Rumor has it a small team at Naughty Dog finally truly cracked the Cell. It's actually outperforming the PS5 dev kits with the new efficiencies. The PS5 Pro could be Cell powered. There's enough overhead to emulate a full PS5.
 

StreetsofBeige

Gold Member
The crazy thing is that the most popular game year in and year out became COD: a shooter that was sub-720p with muddy textures. Technically it should be at the bottom of the ladder compared to other shooters, but the reason everyone played it (at least, I'm pretty sure it was a giant factor for many like me) was that it was the only key shooter running at 60fps, which made it smooth and responsive. I think the COD games bottomed out at around 600p, and nobody cared or even noticed unless a game site told them.
 

SkylineRKR

Member
There were games that didn't run at 1080p on PS4: https://www.ignboards.com/threads/list-of-sub-native-resolution-ps4-games.454948861/

However, 900p on PS4 is proportionally equivalent to 600p on PS3/Xbox 360: both are five-sixths of their generation's target resolution (900/1080 = 600/720).

Depends how you see it. If you had a 1080p TV around 2009, which was quite possible, 600p was quite the step down; 900p wasn't as much of a downgrade. I know Battlefield 4 was 900p, for example, but it ran at 60fps and featured 64 players. The upgrade compared to PS3 was immense.

The biggest problem wasn't even the horrible resolution and AA; it was the sub-30fps performance, which was far more prevalent than on PS2.
 
Well, yeah, it's either 50fps or 25fps to avoid screen tearing with vsync. PAL regions finally moved to a 60Hz standard with the HD revolution, but then got games that ran at 20-30fps on a 60Hz display rather than 50fps on a 50Hz display.

The GameCube was 60Hz as standard from the get-go in PAL regions. I think maybe the Xbox was too.
 

SkylineRKR

Member
The GameCube was 60Hz as standard from the get-go in PAL regions. I think maybe the Xbox was too.

This isn't true. On GC you had to hold A or B on boot-up to enable 60Hz in games that supported it. IIRC, Twin Snakes didn't have it, and Tony Hawk 3 too, which was a bummer. On Xbox you could select PAL60 from the dashboard, but some games ignored it, like Halo 1.
 
This isn't true. On GC you had to hold A or B on boot-up to enable 60Hz in games that supported it. IIRC, Twin Snakes didn't have it, and Tony Hawk 3 too, which was a bummer. On Xbox you could select PAL60 from the dashboard, but some games ignored it, like Halo 1.

Almost all PAL GC games were 60Hz, but yes, it required user input on boot. I'm surprised Twin Snakes didn't have it. Maybe because of the European PlayStation source material being ported over, or Silicon Knights weirdness.
 

SkylineRKR

Member
There's nothing wrong with 50Hz, though; in fact, modern games could often benefit from using it.

It's ironic: we have people clamouring for 40Hz modes on modern consoles, which are only accessible on TVs made in the last 3 years, whereas literally every TV made in the last... 17 years or so can run native 50Hz.
But through the sheer stupidity of all the console makers, that's been kept inaccessible to everything but legacy SDTV PAL content. It's mind-boggling, really.

Because it's a multiple: 40Hz only works on 120Hz sets. If your TV caps at 50Hz, the ideal fallback is 25fps when 50fps is too demanding. That's also why they had to slow down PAL versions to avoid glitches, since NTSC ran at 60/30. The slowdown was rather large, at some 17%.
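Both numbers in that post check out; here's a small sketch (the 17% figure assumes game logic timed to 60Hz being played back at 50Hz, as PAL conversions of the era did):

def paces_evenly(fps, refresh_hz):
    # A steady cadence needs a whole number of refreshes per frame
    return refresh_hz % fps == 0

print(paces_evenly(40, 120))  # True: exactly 3 refreshes per frame
print(paces_evenly(40, 60))   # False: 40fps judders on a 60Hz panel
print(paces_evenly(25, 50))   # True: the 50Hz fallback described above

# PAL slowdown when 60Hz-timed logic is tied to a 50Hz refresh:
print(f"{1 - 50 / 60:.1%}")   # 16.7%, i.e. the "some 17%" above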
 

SkylineRKR

Member
Almost all PAL GC games were 60Hz, but yes, it required user input on boot. I'm surprised Twin Snakes didn't have it. Maybe because of the European PlayStation source material being ported over, or Silicon Knights weirdness.

Some were notorious for lacking 60Hz, like Ikaruga, Baten Kaitos, Luigi's Mansion, and Wave Race.

Actually, about half of the GC games didn't have it. It's more than you might think.

As always, Sega supported it (they pioneered it with the DC, though their first releases, like VF3, HotD2, and SR2, didn't have it). As for Metal Gear, I remember even Xbox Substance wasn't 60Hz. And the Xbox had the best 60Hz support. It was the only console that played Tony Hawk, SSX, etc. at 60Hz.
 

Fafalada

Fafracer forever
Because it's a multiple: 40Hz only works on 120Hz sets. If your TV caps at 50Hz, the ideal fallback is 25fps when 50fps is too demanding.
That's not relevant to what I said at all.
The point is that games that can't run at 60 are stuck at 30, or, for the 5% of TVs that can use it, 40.
The 50Hz option, which everyone could benefit from, has been deliberately kept inaccessible by the platform holders for decades.
 

Fafalada

Fafracer forever
Almost all PAL GC games were 60Hz, but yes, it required user input on boot. I'm surprised Twin Snakes didn't have it. Maybe because of the European PlayStation source material being ported over, or Silicon Knights weirdness.
I doubt it was tech-related.
Regional publishing was just weird back then; publishers would sometimes explicitly require removing certain video options depending on the region.
 

YeulEmeralda

Linux User
It's thanks to LCD that TVs are as cheap as they are today. Also, if you think the PS2 ran games at an acceptable frame rate...

Nostalgia is a hell of a drug and I do not partake. The technology of our childhood was shit and we should be embarrassed about it.
 

Bry0

Member
Rumor has it a small team at Naughty Dog finally truly cracked the Cell. It's actually outperforming the PS5 dev kits with the new efficiencies. The PS5 Pro could be Cell powered. There's enough overhead to emulate a full PS5.
I heard a rumor that the simulation we live in is actually powered by the mythical dual-Cell PS3.
 

charles8771

Member
They generally ran much better than PS360 games, which ran like turds a lot of the time.
Did Shadow of the Colossus, Metal Gear Solid 3, Red Faction, or Killzone on PS2 run better than PS3/Xbox 360 games?






Yeah, I bet you won't talk about the frame rates of these games, which were developed with the PS2 in mind.
 

Gaiff

SBI’s Resident Gaslighter
Did Shadow of the Colossus, Metal Gear Solid 3, Red Faction, or Killzone on PS2 run better than PS3/Xbox 360 games?


Yeah, I bet you won't talk about the frame rates of these games, which were developed with the PS2 in mind.
A handful of games means nothing. The PS2 had far more 60fps games than the PS3. Your post isn't addressing the point at all.

http://www.benoitren.be/60hz-palps2.html

Look at how many there are in this non-exhaustive list. You could practically count the PS360 ones on one hand.
 

charles8771

Member
A handful of games means nothing. The PS2 had far more 60fps games than the PS3. Your post isn't addressing the point at all.

http://www.benoitren.be/60hz-palps2.html

Look at how many there are in this non-exhaustive list. You could practically count the PS360 ones on one hand.
MGS 2 was 60fps, but 3 years later MGS 3 couldn't even stay at 30fps.

However, 60fps depends on the game; open-world games can be excused for dropping to 20fps, like GTA Vice City, because of everything going on-screen.
Shadow of the Colossus, God of War 2, and MGS 3 are some of the most impressive games on PS2, despite their performance.

As for the PS3, people would say the overly complex CPU is the main reason it got inferior multiplatform ports compared to the Xbox 360.



 

Fafalada

Fafracer forever
Oh, I remember hearing otherwise.
Yeah, just one of the hundreds of urban myths of that era of the internet.
SD support mandated 4:3, which, combined with the impact on readability of text and UI elements, meant you pretty much had to hand-author the SD experience unless you wanted users without an HD panel to suffer.
Performance potentially being better was just a bonus.

MGS 2 was 60fps, but 3 years later MGS 3 couldn't even stay at 30fps.
That's immaterial; by the end of the 360/PS3 gen there were some games running in single digits. Those consoles had more in common with the N64 in terms of framerates than with any other machine before them.

As for the PS3, people would say the overly complex CPU is the main reason it got inferior multiplatform ports compared to the Xbox 360.
The Madden/NCAA example was a CPU problem, but not PS3-CPU-specific; it was caused by the UI framework, and it was the same reason the 360 versions also shipped at 30fps the first year.

However, 60fps depends on the game; open-world games can be excused for dropping to 20fps, like GTA Vice City, because of everything going on-screen.
There were 60fps open-world games on PS2 also; some of the most impressive showcases, in fact. But in general terms, things that ran really poorly on PS2 (the 20fps range) were almost always CPU-constrained. It was the number one pain point of multi-platform games in particular, though it didn't spare exclusives either.

It was the only console that played Tony Hawk, SSX, etc. at 60Hz.
That had more to do with the fact that the Xbox was pretty much the "Pro" console of that generation.
Basically, anything with a 60fps target had a much higher probability of being stable on Xbox, and in some cases 30fps games could be frame-doubled as well (usually the CPU-limited ones, where the difference from the other machines was arguably the most drastic).
 

Trogdor1123

Gold Member
Were there many SDTVs for sale then? Wasn't the shift already happening? Don't get me wrong, FPS over resolution in most cases, but there are other considerations too.
 

MarkMe2525

Gold Member
So, I recently made a thread about a YouTuber who OC'd his PS3 for a better frame rate, and I found an interesting post:

And I wholeheartedly agree with this. Display manufacturers are some of the biggest pathological liars in the tech world, who will advertise muddy jargon to sucker people into buying subpar products. For instance, they advertise(d) 60Hz TVs as 120Hz when it's false; they're 60Hz TVs with motion interpolation. Another scam is HDR. Since there is no standard, you find all kinds of tags, such as HDR10, HDR1000, HDR10+, HDR400 (which isn't even HDR), etc.

Never in my opinion has this hurt gaming more than in the PS360 era, aka the era of the HD resolution. It was back then that a major push was made to sell LCD TVs with high-definition capabilities. Most at the time were 720p TVs, but the average consumer didn't know the difference. It was HD, and on top of that you had interlaced vs progressive scan, making things even more confusing. Back then, an HDMI cable was extremely expensive ($50+ for a 6ft one), but you HAD to have an HDTV, otherwise you were missing out on the full potential of your gaming console. While it was cool to watch your football games on an HD TV (and let's be honest, most networks were slow as hell to deliver HD content, with some taking years), the gaming experience was quite different.

What most people gaming on consoles (and I was one of them) didn't really talk about was how abysmal the performance was compared to the previous generation, which had far more 60fps games than its newer, more powerful successors. It wasn't just 60fps; it was also the stability of 30fps games. We were sent back to early 64-bit era performance, with many (dare I say most?) games running at sub-30fps and sub-HD resolutions. Furthermore, the early HDTVs in fact looked much worse than the CRTs we had, and I remember being thoroughly unimpressed with my brand spanking new 720p Sharp Aquos television compared to my trusty old Panasonic CRT. The same happened when I got my Samsung KS8000 4K TV; it was a step down from the Panasonic plasma I had before, and 4K, while sharp and crisp, wasn't worth tanking my fps. 1440p was just fine, and the 980 Ti I had at the time simply wasn't enough to drive that many pixels.

It was easy to sell big numbers: 1080>720>480. More pixels = a better and clearer image, which was a load of horseshit because pixel count doesn't matter nearly as much for CRT TVs. Never mind the loss of the perfect blacks and high contrast that CRT TVs provided by default. Plasmas were also quite a bit better than LCDs but suffered from burn-in, high power consumption, and heat.

In my opinion, the PS360 consoles should have stuck to SD resolutions and CRT devices but aimed for higher frame rates. 60fps at SD resolutions should have been doable. I played inFamous at a friend's home on a CRT and it looked great. Imagine if it was also running at 60fps. I've also been dusting off my old 360 and PS3, only to realize that most games I play run like shit compared to the standards of today.

Then the PS4/X1 could have moved to 720p/60fps, or 1080p/60fps for less demanding games (assuming a less shit CPU). 30fps would of course always be on the table. Current consoles could have been 1080p/60/ray-tracing devices with graphics comparable to or even exceeding 4K/30 modes, and then the PS6 would be the proper move to 4K, which in my opinion is a waste of GPU power without upscaling.

Thoughts on the HD revolution and how it impacted gaming? Would you change anything? Were you happy with the results?
I didn't read this... but I 100% agree. My 360 and PS3 look fantastic on my 27" Philips CRT. I'm fortunate that my Philips has component input, and I would never go back to hooking my 360 up to the 4K.
 

charles8771

Member
When developers make ambitious, complex games, good frame rates are impossible without compromises.

Red Faction from 2001, with its physics and destructible environments, is why the game struggled to run properly on PS2.
 