I played the Xbox 360 on a CRT for the first few months. It felt so good when I switched to the new plasma HDTV that I felt like a king playing on that screen.
You had plasma, which was awesome (ignoring the insane power consumption and burn-in). LCD was trash.
Why, how? It's not like they'll get any more ambitious.
Same here. Switched from CRT to a 720p HD flat screen with my XB360 at that time. The image quality improvement was phenomenal.
But yes, the XB1 and PS4 should have stayed with 720p. Full HD was often just too much for the hardware. With the current gen, 1440p seems to be the sweet spot. UHD is just too much to be used generally. Even the next generation will struggle, and the improvement to image quality is no longer there; the steps feel too small. So other features are used to improve the overall quality instead. I'm not a fan of image reconstruction - still too many artifacts and unstable image quality for my taste, but I guess this trend won't go away.
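For a rough sense of why pushing resolution gets expensive so quickly, here's a quick back-of-the-envelope pixel-count comparison. This is just plain arithmetic, assuming standard 16:9 render targets and that cost scales roughly with pixels shaded:

```python
# Rough pixel-count comparison of common 16:9 render targets.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),   # UHD / "4K"
}

base = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x the 720p workload)")

# 720p:    921,600 pixels (1.00x the 720p workload)
# 1080p: 2,073,600 pixels (2.25x the 720p workload)
# 1440p: 3,686,400 pixels (4.00x the 720p workload)
# 2160p: 8,294,400 pixels (9.00x the 720p workload)
```

Going from 1080p to native UHD quadruples the pixel work, which is roughly the gap that reconstruction techniques are trying to paper over.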
HD development led to increased development time, budgets, and expanding teams. A single flop could sink a studio.
But developers ignored the Wii for being far behind the PS3 and Xbox 360 in terms of specs. Nintendo explained why the Wii wasn't HD: https://www.svg.com/289769/the-real-reason-why-the-nintendo-wii-wasnt-high-def/
Again, this doesn't explain how this is an HD problem rather than a generational one. You've got games to this day that are still cross-gen and their budgets aren't dropping. Using the HD example, a cross-gen game would be a lot more cost-effective to make since they've already hit the ceiling of 4K on the PS4 Pro.
Developers didn't ignore the Wii because of its power but because of the bad support, horrible SDK and really bad sales for non-Nintendo games.
The third-party support on the Wii was horrible.
Yep, 1080p is low res.
Aren't they doing just that? Jedi: Fallen Order and Final Fantasy 16 run sub-1080p in performance mode.
2K should be the new 4K. Trying to push higher res on console is redundant; the industry needs to focus on performance, not resolution.
Well, regarding Fallen Order, that is 100% poor optimisation on the devs' part, and the devs did focus on 4K, as that's what it runs at in quality mode.
I meant Jedi: Survivor.
Oops, I meant Survivor. I don't know why I said Fallen Order, but that's my point: quality mode was aiming for 4K and they couldn't even deliver on that. Officially, quality mode was advertised as 30fps at 4K resolution.
But this is a quote from Digital Foundry about quality mode.
"In the (default) 30fps resolution mode, resolution scales dynamically from 972p (45 percent of 4K) at worst to 1242p at best (looking directly up at the sky, 57.5 percent of 4K). Cutscenes offer more resolution, at up to 1440p."
Returnal only renders at 1080p and it's a gorgeous game with 4K presentation and steady 60FPS.
A gorgeous blurry fest.
Well, if we never went back to 480p and CRTs, it probably means you're wrong.
Nah, it just means the public at large is ignorant. Try a quality CRT with component. It shits on early and even late LCDs something awful. Most people at the time weren't aware and had shitty composite.
People say that nobody cared about frame rates in the 2000s.
I do think that far fewer people cared about frame rate back then, partially because it wasn't shoved in their faces all the time.
There are games that were very popular even though they ran poorly.
Folks just saw the leap to PS3/Xbox 360 as very big, that's all.
I've played on PC since the late 90s, when I was a young lad, and I was 1000% playing shit at like 15 fps and having tons of fun. Look at what fond memories people have of the N64, as another example.
Back then I just played games on whatever I had and had tons of fun while doing it. The instant availability of info on the internet has made things much better and, at the same time, much worse for gaming.
Gears of War in 2006 on a 1080p HDTV was mind-blowing!!!!!!
Gears 1 was straight-up epic when it released! Best trailer ever too!! The "Mad World" trailer, when I saw it, screamed next-gen for the time.
Years later, Uncharted 2 did that again, but it wasn't as impactful as GOW1.
I feel the same way when looking back at the color revolution. Just imagine how much power we could save in our game engines if we were still in black and white!
There's nothing wrong with 50Hz though - in fact, modern games could often benefit from using it.
I can tell you weren't there. It meant that Ocarina of Time played at 22 FPS in PAL territories. It meant that FFIX played 16% slower in the PAL version, or that FFX had two HUGE black bars above and below the image due to the higher resolution of PAL. It was a shit show. Anyone who played PAL in the 90s will tell you what a huge improvement moving to HD standards was.
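The "played 16% slower" figure falls straight out of the refresh rates: a game timed to the display refresh and not re-tuned for PAL runs at 50/60 of its intended speed. Just the arithmetic:

```python
# Speed of an NTSC-timed game tied to a 50 Hz PAL refresh,
# with no PAL-specific re-timing by the developer.
ntsc_hz, pal_hz = 60, 50
speed = pal_hz / ntsc_hz
print(f"PAL speed: {speed:.1%} of intended, i.e. {1 - speed:.1%} slower")
# -> PAL speed: 83.3% of intended, i.e. 16.7% slower
```

That is roughly where the "16% slower" claim comes from.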
It's ironic - we have people clamouring for 40Hz modes on modern consoles - and they are only accessible on TVs made in the last 3 years, whereas literally every TV made in the last... 17 years or so can run native 50Hz.
But through the sheer stupidity of all the console makers, that's been kept inaccessible to everything but legacy SDTV PAL content - it's mind-boggling, really.
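For what it's worth, the frame-pacing arithmetic backs that up: without VRR, a frame rate only paces evenly when the panel refresh is an exact multiple of it, which is why 40 fps needs a 120Hz panel while 50 fps would map 1:1 onto the native 50Hz mode those HDTVs already support. A small sketch of just that check:

```python
# Which panel refresh rates a given frame-rate target paces evenly on
# (even pacing = refresh rate is an exact multiple of the frame rate).
panels = [50, 60, 120]

for fps in (30, 40, 50, 60):
    frame_time_ms = 1000 / fps
    even_on = [hz for hz in panels if hz % fps == 0]
    print(f"{fps} fps = {frame_time_ms:.1f} ms/frame, even pacing on: {even_on} Hz")

# 30 fps = 33.3 ms/frame, even pacing on: [60, 120] Hz
# 40 fps = 25.0 ms/frame, even pacing on: [120] Hz
# 50 fps = 20.0 ms/frame, even pacing on: [50] Hz
# 60 fps = 16.7 ms/frame, even pacing on: [60, 120] Hz
```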
That had more to do with console hw and sw deficiencies of the era and very little with the video standards themselves (especially the games that literally ran 'slower', had black bars, or ran at a sub-25fps framerate when the NTSC counterpart didn't).
However - all of that is the '90s and has nothing to do with my original post.
I was specifically referring to HD standards when I said 50Hz is natively available on every HDTV made since 2005 or thereabouts. And other than refresh - nothing else changes (digital revolution and all) - basically it's the superior option to the 'faux 40 fps' stuff we've been getting that only select TVs can use, and it's explicitly been made unavailable by the platform holders (because 'reasons'), even though every console on the market is technically capable of supporting it.
So you want 50fps. But perhaps the games supporting 40fps on 120Hz panels can't do higher without sacrifices.
The point is - that's a question that no one ever attempted to answer - because the platform holders block the option altogether.
Perhaps 50fps is just too much to ask for, or it would cause too many drops for VRR to be useful.
VRR support is even more scarce than 120Hz. But as a mode it also makes the distinction irrelevant, since the target refresh is 120Hz anyway, and sub-48Hz is handled in software via LFC, if you need to allow the drops.
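Assuming LFC here means low framerate compensation: when the frame rate drops below the panel's VRR floor, frames get repeated so the effective refresh lands back inside the window. A rough sketch with an illustrative 48-120Hz window (actual windows vary by TV, so these numbers are assumptions for the example):

```python
# Illustrative low-framerate-compensation (LFC) behaviour.
# Window values are assumptions for the example, not any specific TV's spec.
VRR_MIN_HZ, VRR_MAX_HZ = 48, 120

def panel_refresh(fps: float) -> float:
    """Refresh rate the panel is driven at for a given game frame rate."""
    refresh = fps
    while refresh < VRR_MIN_HZ:
        refresh *= 2          # repeat each frame once more to stay in the VRR window
    return min(refresh, VRR_MAX_HZ)

for fps in (30, 40, 45, 50, 60):
    print(f"{fps} fps -> panel driven at {panel_refresh(fps):g} Hz")

# 30 fps -> panel driven at 60 Hz
# 40 fps -> panel driven at 80 Hz
# 45 fps -> panel driven at 90 Hz
# 50 fps -> panel driven at 50 Hz
# 60 fps -> panel driven at 60 Hz
```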
It's true that some, or a lot of, 50Hz TVs did 60Hz just fine, so you could play at 60Hz with an RGB cable on imported hardware (the PSX was software related: NTSC discs would run at 60Hz on a PAL machine; the Saturn etc. didn't without a mod). This meant 60fps was in sync on these TVs since they could do 60Hz too. I had a B-brand TV from 1990; it supported RGB SCART and 60Hz. I played on it for years.
Yes, PS hardware in SDTV modes just output NTSC as 60Hz even on PAL machines, which was compatible with TVs that supported 60. I guess you got slightly different colors compared to 'true' PAL60, but that's the SDTV era and all the mess of standards it had.
I didn't read this... but I 100% agree. My 360 and PS3 look fantastic on my 27" Phillips CRT. I'm fortunate enough for my Phillips to have component input, and I would never go back to hooking my 360 back up to the 4K.
Man, I used to have a 36" Sony Trinitron with flat glass in front. It was heavy as all hells (talking 200 lbs, plus another 100+ for the stand), but it had amazing picture quality and it did have component inputs.
There was no way I could keep that thing through all the moves I've had to do, but I don't think I got equivalent picture quality on a TV until much more recent LCDs (I never had a plasma).