
Confessions of a Grfx Whore

DaCocoBrova

I work in multimedia/television operations... I have access to all kinds of display devices that I could never afford (thanks, taxpayers!).

That said, why can I not tell the difference between interlaced and progressive?

I know how it works (a full frame versus two fields making one image; see the sketch below), but the results are negligible IMO. There's no night/day difference like marketers have led us to believe.
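For anyone who wants the field mechanics spelled out, here's a minimal sketch in Python; the arrays are just hypothetical stand-ins for real video data.

```python
import numpy as np

HEIGHT, WIDTH = 480, 640  # one SD frame

# An interlaced source sends alternating half-frames ("fields"),
# each carrying every other scan line:
top_field = np.random.rand(HEIGHT // 2, WIDTH)     # lines 0, 2, 4, ...
bottom_field = np.random.rand(HEIGHT // 2, WIDTH)  # lines 1, 3, 5, ...

# Weaving the two fields together rebuilds one full 480-line frame.
frame = np.empty((HEIGHT, WIDTH))
frame[0::2] = top_field
frame[1::2] = bottom_field

# A progressive source just sends all 480 lines at once, so there's
# nothing to weave (and no combing when things move between fields).
```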

In fact, the whole HD thing is such bullshit to me. I own several, and while the right source looks great on it, the same shit looked great on a nice analog set w/ component inputs. 720p is a wash for anyone w/ a CRT HDTV, but they don't tell you that either. 1080i makes most text unreadable and the image jitters a bit.

Knowing that the HD era is upon us yada yada, I can't say I'm too excited about Nex Gen since this seems to be the whole focus.

Thoughts?
 
I run my XBOX on my non-HDTV Sony in a 16:9 widescreen mode and it looks pretty damn amazing. And nothing I have seen is worth spending $1000+.
 
720p is a wash for anyone w/ a CRT HDTV, but they don't tell you that either.
Watching the TV show Lost in 720p versus 480p on a CRT HDTV makes quite a striking difference (and yeah, the same show in 480i looks quite bad comparably). I can say the same about various HD trailers I've seen. That's not even bringing games into the equation, where the difference between 480p and 720p is like night and day due to how aliased the graphics are.
 
DaCocoBrova said:
I work in multimedia/television operations... I have access to all kinds of display devices that I could never afford (thanks, taxpayers!).

That said, why can I not tell the difference between interlaced and progressive?

I know how it works (a full frame versus two fields making one image, etc.), but the results are negligible IMO. There's no night/day difference like marketers have led us to believe.

In fact, the whole HD thing is such bullshit to me. I own several, and while the right source looks great on it, the same shit looked great on a nice analog set w/ component inputs. 720p is a wash for anyone w/ a CRT HDTV, but they don't tell you that either. 1080i makes most text unreadable and the image jitters a bit.

Knowing that the HD era is upon us yada yada, I can't say I'm too excited about Nex Gen since this seems to be the whole focus.

Thoughts?

You need eyeglasses, badly.

The difference is very visible for 480p compared to 480i, though not that great.

True hi-def resolutions simply blow away 480p/i. Watching the Discovery Channel in 1080i, then changing the mode to 480i makes me pity those poor souls that can't enjoy the visual orgasmic experience of hi-def.
 
to say you can't tell the difference is hard to believe, but easier to believe than the people who claim all sorts of crazy shit about the supposed differences they supposedly see.

The truth is the difference between interlaced and progressive isn't nearly as big as many make it out to be. it cleans up artifacting from the interlacing and that's it. depending on the application the interlacing can be extremely annoying (extreme shimmering, moire patterns, stairstepping, color shifting), but in most cases it is barely noticeable to not noticeable at all.

the resolution increase of HDTV is much more important than the progressive nature of 480p and 720p. One only has to look at 1080i to understand that interlacing is unimportant in most cases and becomes even less of an issue at higher resolutions.

edit - and both marconelly and ijoel's examples are based on resolution, not i-to-p differences. as for just straight 480i-to-480p differences, it all depends on the material you are watching/playing. with some of it there is pretty significant artifacting, with most of it very little artifacting.
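to make the artifacting point concrete, here's a rough Python illustration (assuming a simple moving bar and the usual 1/60 s gap between fields) of why motion produces the combing/stairstepping people complain about:

```python
import numpy as np

HEIGHT, WIDTH = 480, 640

def scene_at(t):
    """Hypothetical scene: a white bar that slides 8 px between fields."""
    img = np.zeros((HEIGHT, WIDTH))
    x = 100 + 8 * t
    img[:, x:x + 40] = 1.0
    return img

# The two fields of one interlaced frame are sampled ~1/60 s apart:
top_field = scene_at(0)[0::2]     # even lines, first instant
bottom_field = scene_at(1)[1::2]  # odd lines, one field period later

woven = np.empty((HEIGHT, WIDTH))
woven[0::2] = top_field
woven[1::2] = bottom_field

# Adjacent lines now disagree by 8 px along the bar's edges; that
# sawtooth ("combing") is the shimmering/stairstepping seen on motion.
print(abs(woven[0] - woven[1]).sum())  # nonzero -> combing present
```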
 
Here's an interesting observation I've made and I'm gonna stick by...

480 progressive makes less of an impact on a PAL image than it does on an NTSC one.

I have yet to see proper HDTV broadcasts at 720p and above, but I'm ready for it.
 
DaCocoBrova said:
I work in multimedia/television operations... I have access to all kinds of display devices that I could never be able to afford (thanks Tax Payers!).

That said, why can I not tell the difference between interlaced and progressive?

I know how it works (full frame versus two frames making one image etc.), but the results are negligible IMO. There's no night/day difference like marketers have led us to believe.


My my, you're SUCH a graphic whore indeed.
 
Well, since I have 1080i on my TV, it'll make all the difference in the world. I was just showing my wife the other day the difference between ESPN and the HD ESPN. Crazy amount of difference
 
In fact, the whole HD thing is such bullshit to me. I own several, and while the right source looks great on it, the same shit looked great on a nice analog set w/ component inputs. 720p is a wash for anyone w/ a CRT HDTV, but they don't tell you that either. 1080i makes most text unreadable and the image jitters a bit.


Wha?

Guh?


Fuh?

On a (normal) CRT HD set, 720p and 1080i look almost identical. Which is to say, stable and crystal clear. I'm not being an AV snob, I am stating a fact. I can only imagine that you are on shrooms.
 
I think it just depends. Maybe it's difficult for some people to tell the difference.

My brother-in-law plays Halo 2 online quite a bit, and recently visited us. I have my Xbox hooked up to a 27" CRT HDTV, so it's not much bigger than what he currently plays on (a bit smaller, if anything). He was surprised at how many little things he noticed in the game, and how much cleaner and clearer it looked. In fact, he said he got some pretty bad headaches from it for the first two days or so, until he got used to the set. Also, he said that his game noticeably improved; he played much better due to the fact that he could see details better than he could previously.
 
Stinkles said:
On a (normal) CRT HD set, 720p and 1080i look almost identical. Which is to say, stable and crystal clear. I'm not being an AV snob, I am stating a fact. I can only imagine that you are on shrooms.
True, 720p gets converted to 1080i on CRTs (and it gets scaled down to a resolution less than 1080i). It doesn't change the fact that it looks really damn nice.
 
DaCocoBrova said:
That said, why can I not tell the difference between interlaced and progressive?

Maybe because the TV you're comparing them on deinterlaces 480i sources, so in effect there IS no difference?

DaCocoBrova said:
720p is a wash for anyone w/ a CRT HDTV, but they don't tell you that either.

I don't understand what you mean. Are you saying you can't tell the difference between 480i and 720p on a CRT TV? If so, that's retarded for two reasons: 1. Most CRT HDTVs are not 720p native, only a few are, so most of them downscale the image to fit their res. 2. CRTs have the best picture quality, so if you're watching 720p on a 720p-native CRT and can't see any difference between that and 480i, you need to visit an optometrist immediately.



My sister has a shitty Walmart HDTV and even on that, flipping between HBO and HBO HD shows massive differences. It's even more apparent on Discovery HD.
 
People here should post pictures...to see the difference. I just know that the only title that I saw running at 720p on a Samsung LCD (Outrun 2) looked like shit...
 
For some reason, I've heard a fair number of people say they can't tell the difference, and I find that utterly baffling. While it is still the same picture, the quality is much, MUCH crisper. I wonder why some people don't notice it.

Then again, 90% of American workplace desktop CRTs are set to 60Hz, so maybe people just don't like their eyes :P
 
A large part of the visible difference is based on screen size and distance from the display. If someone has a modest-sized HDTV (say 30" or smaller) they would need hawk eyes to see much if any difference from across a room. Up-close viewing would be another matter, of course. I wonder how many people compare and brag about the differences while viewing from inches away, then play 10+ feet back, removing (some of) the advantages?

Don't get me wrong though, I think HD is great, but I can't help but agree with most of the people here about the gains being much less than marketed.
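Here's the back-of-the-envelope math on that, in Python (assuming 20/20 vision resolves roughly one arcminute, and a hypothetical 30" 16:9 screen):

```python
import math

DIAGONAL_IN = 30.0                               # assumed 30" 16:9 set
ARCMIN = math.radians(1 / 60)                    # ~what a 20/20 eye resolves
height_in = DIAGONAL_IN * 9 / math.hypot(16, 9)  # screen height, ~14.7"

for lines in (480, 720, 1080):
    pitch = height_in / lines                    # scan-line spacing, inches
    max_ft = pitch / math.tan(ARCMIN) / 12       # farthest distance where lines resolve
    print(f"{lines} lines: resolvable within ~{max_ft:.1f} ft")

# -> roughly 9 ft for 480 lines, 6 ft for 720, 4 ft for 1080. Sit across
#    the room from a 30" set and most of the extra HD detail is invisible.
```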
 
half of the HD sets I see in restaurants, bars, or even stores that sell them are not even set up right to display HD signals. I'm surprised there's not more of a "fuck HD" sentiment from the general public.
 
radioheadrule83 said:
Here's an interesting observation I've made and I'm gonna stick by...

480 progressive makes less of an impact on a PAL image than it does on an NTSC one.

I have yet to see proper HDTV broadcasts at 720p and above, but I'm ready for it.

PAL has more scan lines than NTSC (576 visible versus NTSC's 480), so naturally interlacing is not as bad on PAL devices. PAL also has better colors (NTSC: never the same color). Both of these factors are big reasons why HDTV hasn't quite caught on in Europe yet. It's just not needed as badly.

PAL, however, sucks for games not made with PAL versions in mind. Unoptimized games run about 17% slower than intended (50Hz versus 60Hz fields) and have black bars that squish the picture. Square is still guilty of delivering its games like that in Europe. Games like MGS2 and DMC are too.

May even be one of the reasons PC gaming is not yet as obsolete as in the States.
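Quick numbers behind that, sketched in Python (standard broadcast figures, nothing measured off a particular set):

```python
PAL_LINES, PAL_HZ = 576, 50     # visible lines, field rate
NTSC_LINES, NTSC_HZ = 480, 60

print(f"PAL has {PAL_LINES / NTSC_LINES - 1:.0%} more visible lines")  # 20%
print(f"Unoptimized PAL ports run {1 - PAL_HZ / NTSC_HZ:.1%} slower")  # ~16.7%
```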
 
Stinkles said:
Wha?

Guh?


Fuh?

I think we need to keep in mind that certain people still say they can't tell the difference between VHS and DVD and in the same vein, complain about videos being "cropped" to widescreen instead of "Fullscreen."

HD makes a huge difference. I have a 36" Sony Trinitron Wega (pre-HD), grabbing up a digital signal from my local carrier, and I can still see the quality jump with HD.
 
DaCocoBrova said:
That said, why can I not tell the difference between interlaced and progressive?

Maybe you don't know what to look for? 480i and 480p are the exact same resolution, so if you're expecting to see things clearer on a 480p set I can see why you're confused. Displaying something in 480p over 480i will simply make movement seem smoother, but if you're playing/watching something without a lot of movement you really won't see any difference at all.

DaCocoBrova said:
In fact, the whole HD thing is such bullshit to me. I own several, and while the right source looks great on it, the same shit looked great on a nice analog set w/ component inputs. 720p is a wash for anyone w/ a CRT HDTV, but they don't tell you that either. 1080i makes most text unreadable and the image jitters a bit.

What are you talking about? While most CRT HDTVs can't display 720p natively, they are able to upconvert it to 1080i. And when a picture is displayed in 1080i, its text (as well as everything else) will be much clearer than if it were displayed in 480i/480p/720p, because it's made up of so many more pixels (480i/p = 307,200 pixels / 720p = 921,600 pixels / 1080i = 2,073,600 pixels).
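Those pixel counts check out; here's a few lines of Python against the standard frame sizes:

```python
# Standard frame sizes for each mode; 480i and 480p share a frame size,
# they just differ in whether it arrives as one frame or two fields.
modes = {"480i/p": (640, 480), "720p": (1280, 720), "1080i": (1920, 1080)}
for name, (w, h) in modes.items():
    print(f"{name}: {w} x {h} = {w * h:,} pixels")
# 480i/p: 307,200 / 720p: 921,600 / 1080i: 2,073,600
```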
 
urk said:
I think we need to keep in mind that certain people still say they can't tell the difference between VHS and DVD and in the same vein, complain about videos being "cropped" to widescreen instead of "Fullscreen."

HD makes a huge difference. I have a 36" Sony Trinitron Wega (pre-HD), grabbing up a digital signal from my local carrier, and I can still see the quality jump with HD.


This thread now also includes a bunch of people talking about watching a 480i source at 720p, so obviously the thread is completely fucked. HDTV is really complicated, but this isn't an MP3 versus CD quality difference. It is a night/day difference.

I am starting to think the original post is some kind of random, pre-emptive Revolution damage control, or was written by someone who really hasn't seen HD source material, ever.
 
I just had my HD Satellite dish installed this weekend, along with an HD-Tivo, and all I can say is, how can you not see a difference? I see a nice difference with 480p over 480i, and a vast difference when switching up to 720p or 1080i, although I don't see a huge difference between the two of them. 720p did seem better while watching a football game over the weekend.

BTW, I broke in my new HDTV and Tivo/receiver by watching the Notre Dame/USC game. That was a great way to usher in the HD era in my household! :)
 
interlaced -> progressive = subtle to no difference (depending on source and television)
standard definition -> high definition = subtle to tremendous difference (depending on source and television)

that's pretty much all that need be said.
 
The difference between interlaced and progressive is only gripping when a game is pushing 60+ frames every second ----

if below that, I notice that people have a hard time telling --- and it makes sense, if a game is somewhat choppy then there's no clarity at all.

the Cube is the best way I've found to show people progressive --- most games run very pure on its expensive component-cables and they run at great fps.
put Metroid Prime in on interlaced, then crank it to progressive, and the difference is immediately noticeable -
 
Ristamar said:
I'd just like to say that ABC's high-def feed used for their football broadcasts sucks ass.
it is still better than CBS.

well, at least better than our CBS which has to recompress down to 14Mbps to support multicasting. :(
 
On a (normal) CRT HD set, 720p and 1080i look almost identical.

There are no native 720p CRT sets. It'll get upconverted best case scenario.

I haven't done OTA or cable HD btw. This is strictly for gaming-related purposes.
 
Ristamar said:
I'd just like to say that ABC's high-def feed used for their football broadcasts sucks ass.
You're right, it pretty much does. I'd have to say that FOX is the overall worst for HD sports, though. CBS is unquestionably the best by leaps and bounds. At first I thought the discrepancies were due to the fact that CBS uses 1080i, and ABC/FOX/ESPN use 720p, but that's not it. I've seen a few fantastic looking games on ESPN, and even a couple of good ones on FOX. Those three just have lower standards, apparently, compared to CBS and my local sports network (NESN, 90% of Red Sox/Bruins/Celtics games in HD).
 
I always wondered what all the fuss was about w/ the XBOX/GC/PS2 progressive scan when all it does is accentuate the flaws (artifacting especially).
 
DaCocoBrova said:
There are no native 720p CRT sets. It'll get upconverted best case scenario.
I think that's what he was talking about. When viewing either an upconverted 720p image or a native 1080i image on a CRT (1080i native, obviously), they both look virtually the same. Probably because most CRTs can't resolve the full resolution of either standard anyway. I know that I've done a ton of A/B comparisons with an upconverting DVD player and my STB on my CRT, and there is virtually no difference when inputting 720p versus 1080i. On a CRT, the upconversion does virtually no harm to the image, unlike on any other display technology.
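In sketch form (Python, with nearest-line scaling just to keep it short; a real set filters/interpolates), the CRT's handling of a 720p input looks something like this:

```python
import numpy as np

frame_720p = np.random.rand(720, 1280)  # stand-in for one 720p frame

# Upconvert vertically: map each of the 1080 output lines to its
# nearest 720p source line.
src_rows = np.arange(1080) * 720 // 1080
frame_1080 = frame_720p[src_rows]

# Interlace for the 1080i scan: the set paints even and odd lines on
# alternating passes, 540 lines per field.
top_field = frame_1080[0::2]
bottom_field = frame_1080[1::2]
```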
 
Progressive makes the scanlines less visible but if your tv has a built-in deinterlacer, as many do today, then it doesn't make a huge difference if your game supports 480p.
 
My benchmarks are Ninja Gaiden and Rallisport Challenge 2.

480i component=Nice!

480p component=OMGWTF!!!! *cums in pants*

Those 2 pretty much justify the whole HD thing.
 
The source material is going to be the determinant. That's why I don't see the point in an upconverting DVD player.

There are hardly any 720p/1080i games on the market. The few that there are are all on the XBOX. I wonder what all you are basing your opinions on. On paper is one thing... Seeing is believing though.

I can spot frame rates with the best of them tho.

My benchmarks are Ninja Gaiden and Rallisport Challenge 2.

Rallisport 2 night courses trick my mind into thinking it's real from time to time. More because of the 60fps than the progressive output.

Another thing:

Why do I notice on the GC that enabling progressive scan puts vertical lines on the screen? Thought it was my console, but I have 2 (OG) GameCubes. Tried it on numerous display devices (NEC plasma, Panasonic Tau CRT, LCD, etc.). Same thing.
 
Aside:

CRTs don't have native or "fixed" resolutions. They are, by design, analog devices. The difference you would see in quality has to do with the design of the set, rather than any issues related to fixed-pixel devices.
 
If you have the gear, just go to the Xbox dashboard and use the both-triggers-plus-both-sticks trick to switch between interlaced and progressive. If you can't tell the diff between the 2, you need glasses.

Progressive also seems to brighten and saturate the colors a lot more than the greyish interlace blending that otherwise occurs.
 
Progressive also seems to brighten and saturate the colors a lot more than the greyish interlace blending that otherwise occurs.

That I do notice (the saturation), along with less noticeable grey gradients.

That, to date, has been my only real point of reference to tell between the two.
 
At 50+ inches, all racing games are a pixelated/shimmery mess at 480-anything, and if anti-aliased at that res, they tend to look a bit blurry.

On a BIG display for a realistic look, racing games make better use of resolution than framerate. Of course both would be ideal......
 
Warm Machine said:
If you have the gear, just go to the XBox dashboard and use the both triggers both sticks simultaneous trick to switch between interlaced and progressive. If you can't tell the diff between the 2 you need glasses.
lol.. was wondering when people like this would enter the thread.

Progressive also seems to brighten and saturate the colors alot more than the greyish interlace blending that otherwise occurs.
not to mention cures cancer and feeds the poor. or at least it seems that way.

you are all mostly talking through your ass here, seeing as you can't do a true A/B comparison of 480i/480p. Your television will either be showing a native 480p signal or showing a 480i signal upconverted to 480p. Either way it will be a 480p picture, just differing in where the progressive conversion happened. now yes, the deinterlacer on your TV may suck, and that's why there is such a huge difference in quality between the 480i and 480p inputs on your TV, but that is NOT a difference in the technology in general and NOT a difference on every (or even most) displays out there.
 
Show me...

IIRC, Princeton used to make a set. Monovision is some UK brand that isn't available here in the US.

Regardless, the cost would be a deal breaker for most. Especially considering the fact that 34" is as big as it'll get.
 
ourumov said:
People here should post pictures...to see the difference. I just know that the only title that I saw running at 720p on a Samsung LCD (Outrun 2) looked like shit...

This makes no sense; Outrun 2 is a 480p source game, not a game designed to run in 720p. Watch something that was shot in HD, or software made to run at 720p, and there is a huge difference running it on an HDTV vs. a regular TV.

I would agree with others that the difference between 480i and 480p is more subtle, but it's definitely there. I play Xbox games on both types frequently, and for something like Conker, I can definitely see a smoother and cleaner image on the 480p display.
 
I just recently bought a 30" WEGA Sony HD. It upscales all images to 1080i.

I have a really hard time telling the difference between 480i and 480p. Seeing how the television is upsampling everything to 1080i (so everything is interlaced in the end), is there any advantage to having a 480p signal fed into it?

Is the 480p signal "cleaner" than a 480i signal for this television?
 
levious said:
half of the HD sets I see in restaurants, bars, or even stores that sell them are not even set up right to display HD signals. I'm surprised there's not more of a "fuck HD" sentiment from the general public.

Most of the HD sets I see in stores must not be set up properly either. Half the time their picture doesn't look that much more impressive than a good standard set. I've heard many people complain that they don't see the big deal with HDTV. I was at Best Buy the other day with a friend and he was commenting on how he didn't see what all the fuss is over HD. He said that he thought HDTV was supposed to be "like looking through a window" and he wasn't all that impressed. Retailers need to do a better job with their displays and feed the sets true HD content. Right now it's as if they're trying to sell color TVs while displaying B&W programming.
 
What drives me crazy is on the news networks, when they do interviews in their "modern studios" with the interviewee on a 16:9 set.

They'll cut between the interviewer looking at the set asking questions, and then a full-screen view of the interviewee. The parts where the guy is on the 16:9 set are always ridiculously stretched, making his head distorted and wide... then it switches to full screen on my set and he looks normal again.

You'd think someone at the network would catch this... but they all seem to have the same problem.
 
Bars drive me nuts. Is there some rule that if you run a bar you must be an absolute tool when it comes to electronics? The number of bars that have 20 grand worth of TVs where every one of them looks like shit is amazing. I know the smoke kills them, but you can't even smoke in a pub anymore!
 
^^

Nope.

The Sony takes it to 1080i.

Still, for a CRT, that set has the best picture around.
 
I find it sometimes hard to tell the difference between i and p, if there are no horizontal lines. If the image has black bars or a ticker then I cannot stand interlaced.
 