So, I recently made a thread about a YouTuber who has overclocked his PS3 for better frame rates, and I found an interesting post in it:
And I wholeheartedly agree with this. Display manufacturers are some of the biggest pathological liars in the tech world, pushing muddy jargon to sucker people into buying subpar products. For instance, they advertise(d) 60Hz TVs as 120Hz when they're really just 60Hz panels with motion interpolation. Another scam is HDR: since there's no single standard, you find all kinds of labels such as HDR10, HDR1000, HDR10+, HDR400 (which isn't even real HDR), etc.
In my opinion, this never hurt gaming more than it did in the PS360 era, a.k.a. the era of HD resolution. It was back then that a major push was made to sell LCD TVs with high-definition capabilities. Most sets at the time were 720p, but the average consumer didn't know the difference; it was HD, and on top of that you had interlaced vs. progressive scan, making things even more confusing. Back then, an HDMI cable was extremely expensive ($50+ for a 6ft one), but you HAD to have an HDTV, otherwise you were missing out on the full potential of your gaming console. While it was cool to watch your football games in HD (and let's be honest, most networks were slow as hell to deliver HD content, with some taking years), the gaming experience was quite different.
What most people gaming on consoles (and I was one of them) didn't really talk about was how abysmal the performance was compared to the previous generation, which had far more 60fps games than its newer, more powerful successors. It wasn't just 60fps; it was also the stability of 30fps games. Performance-wise, we were sent back to the early 64-bit era, with many (dare I say most?) games running at sub-30fps and sub-HD resolutions. On top of that, the early HDTVs actually looked much worse than the CRTs we had, and I remember being thoroughly unimpressed with my brand-spanking-new 720p Sharp Aquos compared to my trusty old Panasonic CRT. The same thing happened when I got my Samsung KS8000 4K TV: it was a step down from the Panasonic plasma I had before, and 4K, while sharp and crisp, wasn't worth tanking my fps. 1440p was just fine, and the 980 Ti I had at the time simply wasn't enough to drive that many pixels.
It was easy to sell big numbers: 1080 > 720 > 480. More pixels = a better, clearer image, which was a load of horseshit, because pixel count doesn't matter nearly as much on a CRT. Never mind the loss of the perfect blacks and high contrast that CRTs provided by default. Plasmas were also quite a bit better than LCDs, but they suffered from burn-in, high power consumption, and heat.
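Just to put rough numbers on the "more pixels" pitch, here's a quick back-of-the-envelope sketch. The resolutions are the standard ones (720x480 for NTSC SD), but treating render cost as roughly proportional to pixel count is my own simplification; it ignores shading, geometry, bandwidth, and everything else:

```python
# Rough pixel-count comparison of the resolutions being marketed.
# Assumption (mine): GPU cost scales roughly with pixel count.
resolutions = {
    "480p (SD)":   (720, 480),
    "720p (HD)":   (1280, 720),
    "1080p (FHD)": (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "2160p (4K)":  (3840, 2160),
}

baseline = 720 * 480  # SD as the reference point

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:13s} {pixels / 1e6:5.2f} MP  (~{pixels / baseline:.1f}x the pixels of SD)")
```

By that crude measure, 1080p is about 6x the pixels of SD, and 4K is about 2.25x the pixels of 1440p, which is exactly why my 980 Ti choked on it.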
In my opinion, the PS360 consoles should have stuck to SD resolutions and CRT displays but aimed for higher frame rates; 60fps at SD resolutions should have been doable. I played inFamous at a friend's place on a CRT and it looked great. Imagine if it had also been running at 60fps. I've also been dusting off my old 360 and PS3, only to realize that most of the games I play run like shit by today's standards.
Then the PS4/X1 could have moved to 720p/60fps, or 1080p/60fps for less demanding games (assuming a less shit CPU); 30fps would of course always be on the table. The current consoles could have been 1080p/60/ray-tracing machines with graphics comparable to, or even exceeding, the 4K/30 modes we got, and then the PS6 would be the proper move to 4K, which, without upscaling, is in my opinion a waste of GPU power.
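For anyone curious about the trade-off I'm describing, here's the same kind of rough math for pixels-per-second at those hypothetical targets. Again, this is a simplification that treats raw pixel throughput as the only cost:

```python
# Pixels rendered per second for a few resolution/fps targets,
# mirroring the hypothetical modes above.
modes = [
    ("480p @ 60",  720, 480, 60),
    ("720p @ 60",  1280, 720, 60),
    ("1080p @ 60", 1920, 1080, 60),
    ("4K @ 30",    3840, 2160, 30),
    ("4K @ 60",    3840, 2160, 60),
]

for name, w, h, fps in modes:
    throughput = w * h * fps  # pixels per second at that target
    print(f"{name:10s} -> {throughput / 1e6:7.1f} million pixels/sec")
```

By that crude yardstick, 1080p/60 pushes about half the raw pixels of 4K/30, which is why the resolution-first approach never made sense to me.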
Thoughts on the HD revolution and how it impacted gaming? Would you change anything? Were you happy with the results?