> I feel the same way when looking back at the color revolution. Just imagine how much power we could save our game engines if we were still in black and white!

My Atari 2600 performed exactly the same in black & white as it did in color.
> There were HD CRTs. Also, that's if you bought a shit LCD. I was selling TVs at that time in my life and there were DLP and LCD rear projection, plasma, and good LCDs. If you bought at Walmart, then that's on you! HD was the best thing for gaming!

I don't think so. Those older LCD models from the 360 gen, no matter what you paid or what brand you bought, all smeared during motion and had input latency far higher than what's available today. They sucked for gaming, but it's what was available, so we made do.
> I'm so happy gaming went HD, smart people figured out how to bounce super graphics off your screen also the frame ratez.

HD gaming lowered frame rates.
> I agree with OP and have been thinking that for a while now... If those consoles were designed for that big of a resolution bump, the first Xbox 360 model would have come with an HDMI port as default, but it didn't. I get what some say about the Wii looking bad on HDTVs, but remember that the Wii had an interlaced signal; if games were, say, 540p (progressive), the jump in resolution would be equally good and games would have performed way better.

Wii games looked bad mainly because of a lack of AF/anti-aliasing of any sort, as well as relying on anamorphic widescreen due to the measly 2MB framebuffer.
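For what it's worth, the framebuffer point above holds up to some back-of-the-envelope math. A rough sketch, assuming 24-bit color plus 24-bit depth per pixel and taking the ~2MB embedded framebuffer figure from the post at face value (the real GC/Wii eFB has its own tiling and format rules, so treat these numbers as approximate):

```python
# Rough check of why a ~2MB embedded framebuffer pushes you toward rendering
# 640 wide and stretching to 16:9 (anamorphic) instead of a true widescreen
# or HD render target. Assumes 24-bit color + 24-bit Z (6 bytes/pixel).
EFB_BYTES = 2 * 1024 * 1024  # ~2 MB embedded framebuffer, per the post
BYTES_PER_PIXEL = 3 + 3      # 24-bit color + 24-bit depth

for name, w, h in [
    ("640x480 (4:3 or anamorphic 16:9)", 640, 480),
    ("854x480 (true widescreen)",        854, 480),
    ("1280x720 (720p)",                 1280, 720),
]:
    size = w * h * BYTES_PER_PIXEL
    verdict = "fits" if size <= EFB_BYTES else "does NOT fit"
    print(f"{name}: {size:,} bytes -> {verdict} in ~2 MB")
```

Only the 640-wide buffer fits, which is roughly why the hardware stretched a 640x480 image to widescreen rather than rendering more horizontal pixels.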
> Wii games looked bad mainly because of a lack of AF/anti-aliasing of any sort, as well as relying on anamorphic widescreen due to the measly 2MB framebuffer.

And the interlaced signal made games not only look low resolution, but shaky/shimmery too.
> Where is this narrative of good sixth gen performance coming from?

DMC, Tekken, GT3/4, Jak & Daxter, Ratchet & Clank, MGS2, Burnout, Zone of the Enders, Ace Combat, WWE whatever, just to name a few.
> And the interlaced signal made games not only look low resolution, but shaky/shimmery too.

Could have sworn it supported progressive scan, since the GC did and it's the same hardware. Did most games run interlaced?
> DMC, Tekken, GT3/4, Jak & Daxter, Ratchet & Clank, MGS2, Burnout, Zone of the Enders, Ace Combat, WWE whatever, just to name a few.

Oh, it probably did through component cables; I just played it using the default RCA cables.
> Oh, it probably did through component cables; I just played it using the default RCA cables.

Most people probably did. Not that progressive output would help much with the Wii's horrid IQ.
Where is this narrative of good sixth gen performance coming from?
People have this false memory that every game back then ran at a perfect 60fps at 480p, when the reality is that performance was just as bad (if not worse) than the HD gen that followed. All of the above games struggle to maintain their target framerates. Developers also had to run their games at weird lower resolutions on PS2 like 512x448, and often interlaced / using field rendering to save on performance.
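To put rough numbers on the tricks mentioned above, here's a quick pixel-budget comparison using the resolutions named in the post; the percentages are illustrative only, since real fill-rate cost also depends on blending, overdraw and so on:

```python
# Pixel counts for common PS2-era compromises: a narrower frame (512x448)
# and interlaced field rendering (only half the lines drawn per field).
def pixels(w, h):
    return w * h

full_frame = pixels(640, 448)   # "full" frame target
narrow     = pixels(512, 448)   # common PS2 width reduction
field      = pixels(640, 224)   # field rendering: half the lines per field

print(f"640x448 frame: {full_frame:,} px")
print(f"512x448 frame: {narrow:,} px ({narrow / full_frame:.0%} of full)")
print(f"640x224 field: {field:,} px ({field / full_frame:.0%} of full)")
```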
Every game being progressive scan in HD to finally kill off interlacing flicker was a huge win by itself. Not to mention the death of PAL bullshit.
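For anyone who missed the PAL era, a quick sketch of the classic problem, assuming the common case of an NTSC-timed game shipped to PAL regions without speed correction:

```python
# PAL ran at 50 fields/s vs NTSC's 60, so unadjusted ports ran slower,
# and keeping the NTSC-sized picture on PAL's taller raster left borders.
NTSC_HZ, PAL_HZ = 60, 50
speed = PAL_HZ / NTSC_HZ
print(f"Unoptimized PAL port runs at {speed:.0%} speed ({1 - speed:.0%} slower)")
print(f"480 active lines on a 576-line raster leaves {576 - 480} lines of borders")
```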
> My Atari 2600 performed exactly the same in black & white as it did in color.

Are you sure?? We didn't have Digital Foundry to zoom in and monitor it back then.
> 3D was the real scam, and 4K is the real scam today; the costs you pay for the gains you get simply aren't worth it.

How is 4K a scam? It might have negative effects in gaming, but for every other piece of content it's a massive improvement.
PS3 and 360 would have been much better consoles from a technical standpoint if we went with CRT TVs at SD resolution. I played the 3 Uncharted games on a CRT and they looked beautiful. If they were to target 480p, 60fps would have been the standard for that generation.
> So... Nintendo's been right all along?

Imo, resolution-wise, yes. Hardware, no. PS360's hardware at 480p would've been better for the first few years.
> Most people probably did. Not that progressive output would help much with the Wii's horrid IQ.

I got a Wii U partially because my new (at the time) TV didn't have component inputs. Was looking forward to finally playing Metroid Prime Trilogy in all-digital glory.
> DMC, Tekken, GT3/4, Jak & Daxter, Ratchet & Clank, MGS2, Burnout, Zone of the Enders, Ace Combat, WWE whatever, just to name a few.

I'm not sure if any games didn't support progressive, but I can't remember hitting any.
> Never mind the loss of perfect blacks and high contrast that the CRT TVs provided by default.

CRTs never had perfect blacks. The best ones could get fairly close, but they never did blacks like OLEDs.
> I could write an essay about resolution, refresh rates, HDR, OLED and LCD technologies, but most of you fucks are brainwashed by TV manufacturers so it's pointless.

The craziest shit I saw years back was all the motion rate PR the TV makers would do. I like motion smoothing, especially for sports, and at that time it was a relatively new feature. I had to google what all this shit meant, and a guy made a big chart comparing every brand's claimed 120, 240, and 960 motion rates to real life. Most of them converted to 120Hz, and I think at that time either Sharp or Samsung was the honest one where their 120 = 120. The rest had bogus numbers, and some makers' 120 was really only 60.
Motion looks better on a CRT than it does on an LCD screen, especially older LCD screens. I suspect the feeling of smoother motion is why the older games felt smoother.
> I feel the same way when looking back at the color revolution. Just imagine how much power we could save our game engines if we were still in black and white!

That's racist.
> I mean, technically they should look less smooth, because games back then usually didn't have motion blur, and LCD blur can help make an image look smoother.

The natural flicker of the CRT is what made the motion perceptibly smoother, which is why BFI modes on modern televisions have become so popular.
Modern high-quality OLED screens usually have a motion blur option to add a bit of persistence blur to the image so that movies don't look too stuttery; many people complained about that when OLED screens started to get really impressive pixel response times.
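A rough way to see why low-persistence displays (CRT flicker, BFI) read as smoother: perceived smear is roughly how long each frame stays lit multiplied by how fast your eye is tracking the motion. The numbers below are illustrative assumptions, not measurements of any particular set:

```python
# Perceived motion blur ~ persistence (how long a frame stays lit) x tracking speed.
def blur_px(persistence_ms, speed_px_per_s):
    return speed_px_per_s * (persistence_ms / 1000.0)

speed = 960  # assumed eye-tracking speed, pixels per second

for label, persistence_ms in [
    ("60Hz sample-and-hold LCD/OLED", 1000 / 60),   # ~16.7 ms lit per frame
    ("60Hz + black frame insertion",  1000 / 120),  # ~8.3 ms lit per frame
    ("CRT-like short phosphor pulse", 2.0),         # roughly 1-2 ms
]:
    print(f"{label}: ~{blur_px(persistence_ms, speed):.1f} px of smear")
```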
> Are you sure?? We didn't have Digital Foundry to zoom in and monitor it back then.

I sat really close.
> CRTs never had perfect blacks. The best ones could get fairly close, but they never did blacks like OLEDs.

Fair enough, but still far, far better than LCDs of the time. Blacks didn't get good on LCDs until the past few years, when you started getting mainstream panels with full-array local dimming and 1000+ zones.
and there were a ton of crappy CRTs back then.
> Agreed about refresh rates. Selling native 60Hz displays as 120Hz due to interpolation was super slimy. I can kind of see your point about HDR... standards aren't super strict, and a lot of displays can't produce enough nits to pop big time, which leads to more slimy sales tactics. But HDR is good nevertheless.

HDR is good, but it's also a trap, not unlike the LCDs back then that didn't tell you a lot of stuff. Even within legitimate HDR displays, the differences can vary quite enormously.
> but... OCing a PS3? that's the real story here.

https://www.neogaf.com/threads/stock-vs-overclocked-ps3.1647559
> Most of them converted to 120Hz, and I think at that time either Sharp or Samsung was the honest one where their 120 = 120. The rest had bogus numbers, and some makers' 120 was really only 60.

Has to be Sharp, because Samsung weren't honest. The KS8000 is supposedly a 120Hz panel. It's not. It's a 60Hz display. It's "120Hz" with their crappy Auto Motion Plus tech, which is motion interpolation.
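For anyone wondering what that "motion rate" processing actually is: the panel is still fed 60 real frames per second and the TV synthesizes in-between frames itself. The sketch below uses plain frame blending as the simplest possible stand-in; real interpolation modes estimate motion vectors, so treat this only as an illustration of why interpolated "120Hz" is not the same thing as accepting a native 120Hz input:

```python
import numpy as np

def interpolate_to_double_rate(frames):
    """Given N source frames, return 2N-1 frames with blended in-betweens."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        # Naive in-between frame: average of the two neighbours.
        out.append(((a.astype(np.float32) + b.astype(np.float32)) / 2).astype(a.dtype))
    out.append(frames[-1])
    return out

# Toy 4x4 grayscale "frames" just to show the frame count doubling.
src = [np.full((4, 4), v, dtype=np.uint8) for v in (0, 100, 200)]
doubled = interpolate_to_double_rate(src)
print(len(src), "input frames ->", len(doubled), "output frames")
```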
> HDR a scam? The only non-HDR games I buy are Switch and 2D games. 120Hz plus VRR is great as well, allowing for full fidelity at 40fps or unlocked framerates in performance modes (GOW Ragnarok seems to average around 80fps). All this stuff on a nice big OLED, or an LCD with a lot of dimming zones... display tech is fucking great right now.

HDR isn't a scam. Good, honestly advertised HDR is great, but there are a lot of scummy tactics parading fake HDR and trying to tell you it's HDR. HDR400, looking at you.
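On the 40fps point in the quote above, the reason it works is simple arithmetic: frame pacing is only even when the display refresh divides cleanly by the frame rate. A minimal sketch of the fixed-refresh case (VRR sidesteps the problem entirely by matching the refresh to the frame rate):

```python
# 40fps is a mess on a 60Hz panel but paces perfectly on a 120Hz one.
def refreshes_per_frame(display_hz, fps):
    return display_hz / fps

for display_hz, fps in [(60, 30), (60, 40), (120, 40), (120, 60)]:
    r = refreshes_per_frame(display_hz, fps)
    verdict = "even pacing" if r == int(r) else "uneven pacing (judder)"
    print(f"{fps}fps on a {display_hz}Hz display: {r:g} refreshes/frame -> {verdict}")
```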
One thing that sucks about HD games and the industry in general at this point is how fucking long games take to make. Are we supposed to be OK with 2 games a generation from these triple-A devs? The fuck is this.

So, I recently made a thread about a youtuber who has OC'd his PS3 for better frame rates and found an interesting post:
And I wholeheartedly agree with this. Display manufacturers are some of the biggest pathological liars in the tech world who will advertise muddy jargon to sucker people into buying subpar products. For instance, they advertise(d) 60Hz TVs as 120Hz when it's false. They're 60Hz TVs with motion interpolation. Another scam is HDR. Since there is no standard, you find all kinds of tags such as: HDR10, HDR1000, HDR10+, HDR400 (which isn't even HDR), etc.
Never, in my opinion, has this hurt gaming more than in the PS360 era, aka the era of HD resolution. It was back then that a major push was made to sell LCD TVs with high-definition capabilities. Most at the time were 720p TVs, but the average consumer didn't know the difference. It was HD, and on top of that you had interlaced vs. progressive scan, making things even more confusing. Back then, an HDMI cable was extremely expensive ($50+ for a 6ft one), but you HAD to have an HDTV, otherwise you were missing out on the full potential of your gaming console. While it was cool to watch your football games on an HDTV (and let's be honest, most networks were slow as hell to deliver HD content, with some taking years), the gaming experience was quite different.
What most people gaming on consoles (and I was one of them) didn't really talk about was how abysmal the performance was compared to the previous generation, which had far more 60fps games than its newer, more powerful successors. It wasn't just 60fps, it was also the stability of 30fps games. We were sent back to early 64-bit era performance, with many (dare I say most?) games running at sub-30fps and sub-HD resolutions. Furthermore, the early HDTVs in fact looked much worse than the CRTs we had, and I remember being thoroughly unimpressed with my brand spanking new 720p Sharp Aquos television compared to my trusty old Panasonic CRT. The same thing happened when I got my Samsung KS8000 4K TV: it was a step down from the Panasonic plasma I had before, and 4K, while sharp and crisp, wasn't worth tanking my fps. 1440p was just fine, and the 980 Ti I had at the time simply wasn't enough to drive that many pixels.
It was easy to sell big numbers: 1080 > 720 > 480. More pixels = better and clearer image, which was a load of horseshit because pixel count doesn't matter nearly as much for CRT TVs. Never mind the loss of perfect blacks and high contrast that the CRT TVs provided by default. Plasmas were also quite a bit better than LCDs, but suffered from burn-in and high power consumption, and ran hot.
In my opinion, the PS360 consoles should have stuck to SD resolutions and CRT devices but aimed for higher frame rates. 60fps at SD resolutions should have been doable. I played inFamous at a friend's home on a CRT and it looked great. Imagine if it had also been running at 60fps. I've also been dusting off my old 360 and PS3, only to realize that most games I play run like shit compared to the standards of today.
Then PS4/X1 could have moved to 720p/60fps, or 1080p/60fps for less demanding games (assuming a less shit CPU). 30fps would of course always be on the table. Current consoles could have been 1080p/60/ray tracing devices with graphics comparable to or even exceeding 4K/30 modes, and then PS6 would be the proper move to 4K, which in my opinion is a waste of GPU power without upscaling.
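To put rough numbers on the resolution targets above: pixels shaded per frame scale close to linearly with GPU cost, all else being equal (a simplification that ignores bandwidth, geometry, and CPU limits):

```python
# Pixel counts per frame for the resolution targets discussed above.
RESOLUTIONS = {
    "480p (640x480)":    640 * 480,
    "720p (1280x720)":   1280 * 720,
    "1080p (1920x1080)": 1920 * 1080,
    "4K (3840x2160)":    3840 * 2160,
}
base = RESOLUTIONS["480p (640x480)"]
for name, px in RESOLUTIONS.items():
    print(f"{name}: {px:,} px ({px / base:.1f}x the pixels of 480p)")
```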
Thoughts on the HD revolution and how it impacted gaming? Would you change anything? Were you happy with the results?
> I knew something was really fucked up about these flat screen TVs when, back in 2006, I went to my friend's house after he'd just bought a 720p LCD screen and suddenly every PS2 game started to look like absolute dogshit. I was very puzzled by this, especially since both he and my brother were sitting next to me trying to justify why it looked like that, when all I could think about was how much I didn't want to play PS2 games on that TV. Ah well, at least Beowulf on Blu-ray looked decent.

For me, I was lucky, as I never had an LCD. All that ghosting shit back in 2006 or 2007 was junk. Electronics stores tried to hide it by showing looping demo videos that were slow-paced, but you didn't even have to read articles; you could still see the horrible comet trails at the store if they showed normal TV footage or sports. At the time, these TVs cost a lot, and despite the clearer pixels, they were trash.