HDTV 720p/1080i -- What's with this "UPCONVERSION" sh*t?

Deg said:
Rubbish. Photoshop is nothing like this stuff, and this stuff has been standard in high-end TVs for a few years now. But it's really amazing now.

Oh, and they have found ways to increase the framerate. It's insane. You know what? It actually works despite being fake! Philips are one of the more aggressive at this, and their image quality is effectively higher res as a result if your TV's resolution is higher than the source. A good test is to use it on someone's face being shown. See the difference it makes in detail. It is STUNNING :D

This is one side of TV tech GAF has failed to discuss.

Dude, you lap up corporate tech PR BS like it's mother's milk. Until you can explain to us how they managed such an amazing feat, shut up about it.
 
Shogmaster said:
Dude, you lap up corporate tech PR BS like it's mother's milk. Until you can explain to us how they managed such an amazing feat, shut up about it.

This is real. People have this in their homes, unlike the PS3. :D

This is bloody amazing stuff. Any red-blooded tech person would be blown away.

1_5_1_sharpness.jpg


The sharper shot is with PP2; the first shot is without.
 
Shogmaster said:
Simply put, on a CRT that can actually do 1280x720, 720p will show without any conversion, and 1080i will show as 960x540 @ 30fps progressive.

Can you point me to a site that can explain this to novices, that are literate, but unschooled?

I don't understand why 1080i wouldn't be able to go to 1280 by 720 and instead has to be cut in half. If you're going to lose half of the lines, isn't losing only 1/3 better? In neither case is the set actually doing anything intelligent, just cutting lines of display. Wouldn't the smaller cut be better?

Also, I understand that today's sets aren't the best to have this discussion on, as you say that all CRTs are 'faking it'. But what about sets late this year or early next? Or even late next?

All of this talk just seems to reinforce that HDTV itself is outstripping the ability of actual TV sets to display it. And I'm not going for any LCD display. I want a real TV that will fit in my living room, not a computer screen with less ability to display contrast and black levels.
 
Deg said:
This is real. People have this in their homes, unlike the PS3. :D

This is bloody amazing stuff. Any red-blooded tech person would be blown away.

I've seen similar upconversion technologies. The best equivalent (and probably superior) is Matrox's upconversion circuitry in their Parhelia triple-head DVI cards. The 1024x768 image upconverted to 1280x1024 was by far the best I've ever seen from a graphics card. It was an amazing feat for sure.

Of course, native 1280x1024 output beats the everlovin' shit out of it. ;)
 
sonycowboy said:
Can you point me to a site that can explain this to novices, that are literate, but unschooled?

I don't understand why 1080i wouldn't be able to go to 1280 by 720 and instead has to be cut in half. If you're going to lose half of the lines, isn't losing only 1/3 better? In neither case is the set actually doing anything intelligent, just cutting lines of display. Wouldn't the smaller cut be better?

Because there are more pixels on 1080i than 720p. Do the math: 1920x540 per field vs 1280x720.
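
A quick back-of-the-envelope check of that arithmetic (nothing fancy, just counting pixels per refresh):

# Pixels delivered per refresh: one 1080i field vs one 720p frame.
field_1080i = 1920 * 540       # one interlaced field  = 1,036,800 pixels
frame_720p  = 1280 * 720       # one progressive frame =   921,600 pixels
frame_1080  = 1920 * 1080      # a full two-field frame = 2,073,600 pixels

print(field_1080i, frame_720p, frame_1080)
print(round(field_1080i / frame_720p, 3))   # ~1.125: a 1080i field carries slightly more pixels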

Also, I understand that today's sets aren't the best to have this discussion on, as you say that all CRTs are 'faking it'. But what about sets late this year or early next? Or even late next?

The industry is moving away from CRTs due to consumer demand for a smaller footprint. I think we won't see any true 720p CRT. The only hope left is Sony's Super Fine Pitch line.

All of this talk just seems to reinforce that HDTV itself is outstripping the ability of actual TV sets to display it. And I'm not going for any LCD display. I want a real TV that will fit in my living room, not a computer screen with less ability to display contrast and black levels.

You want to know something even more depressing? How about the lack of true HD material to show on HDTVs? Except for select movies, most of the crap shown on HD channels is low-res shit merely upresed. And the true HD stuff (like some TV shows) is shit I wouldn't watch anyways. Crap shown at higher res is still crap, if you know what I'm getting at. If you are a sports fan, then you have some real reason to be happy for HD, but I hate sports so....

Until we are far into the Blu-ray era, we'll hardly have anything besides X360 and PS3 games and sports shit to show off on our HDTVs.
 
radioheadrule83 said:
That's a bit of an exaggeration though, right? VHS looks shitty for a whole bunch of reasons... it's an analog recording solution involving magnetic tape, of all things. It's shocking how bad it can look when you think back and remember it being acceptable.

What I meant to say was: if we put aside the quality of the CRT/LCD/Plasma panel itself for a second (its contrast ratio, blacks and hues, and so forth) and just talk about what basically happens in terms of pixels: wouldn't a system with a native pixel resolution of <whatever> x 480/590 simply be making a 720p image look like 480p/590p at best?

Is there a better way to think about this in order to get my head around it?


Yeah, I guess. What TV are you comparing it to, though? If we are looking at a standard analog set, the native resolution is 480i, not 480p. So if you feed a 480p signal to an analog set, it'll output it at a native 480i resolution. This will look nicer than a 480i signal but not as nice as a true 480p signal. So if you have a 720p signal being converted down to 540p in an HDTV set, it will look much nicer than on an analog set whose native output is 480i. Beyond this, HDTV is a 16:9 format, which is going to fill the screen on a widescreen CRT, whereas on an analog set the picture is going to be letterboxed. So if you have a 27" 4:3 TV, your widescreen viewing area is going to be roughly 21".
 
Shogmaster said:
I've seen similar upconversion technologies. The best equivalent (and probably superior) is Matrox's upconversion circuitry in their Parhelia triple-head DVI cards. The 1024x768 image upconverted to 1280x1024 was by far the best I've ever seen from a graphics card. It was an amazing feat for sure.

Superior? No way, if you are saying upconverting is rubbish :lol . Philips aren't the only ones, but they are more aggressive with improvements than most.

Of course, native 1280x1024 output beats the everlovin' shit out of it. ;)

No, it does not beat the shit out of it. Sure, if you can get it at higher res it will be 'better', but if you can't, then this is much better than the original source on the matching-res TV. Meaning PP2 gives you a better picture than you could possibly get :D
 
Deg said:
Superior? No way, if you are saying upconverting is rubbish :lol . Philips aren't the only ones, but they are more aggressive with improvements than most.

Sure, if you can get it at higher res at good quality it will be better, but if you can't, then this is better than the original source on the matching-res TV. :D You can't ask for more.

You don't even have a clue what you're jibber-jabbering about, do you? Stop it. You're just gonna hurt yourself.
 
Shogmaster said:
Because there are more pixels on 1080i than 720p. Do the math: 1920x540 per field vs 1280x720.

Right. That's what I'm saying. Take away every 3rd line on 1080i, horizontally and vertically, and you'd end up with 1280 by 720, right? Which the TV would natively support. And wouldn't a downsampled 1280 by 720 be better than a downsampled 960 x 540?
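
To make the "take away every 3rd line" idea concrete, here is a toy sketch of that kind of naive 2-out-of-3 decimation in Python. It only illustrates the arithmetic; as the reply below notes, there is no claim that any real set's scaler works this way (a proper downconverter would filter first to avoid aliasing).

# Naive decimation sketch: keep 2 of every 3 rows and columns, 1920x1080 -> 1280x720.
# Purely illustrative; real downconversion filters the image rather than dropping lines.

def decimate_two_thirds(frame):
    kept_rows = [row for i, row in enumerate(frame) if i % 3 != 2]
    return [[px for j, px in enumerate(row) if j % 3 != 2] for row in kept_rows]

frame_1080 = [[(x + y) % 256 for x in range(1920)] for y in range(1080)]   # dummy luma values
small = decimate_two_thirds(frame_1080)
print(len(small[0]), "x", len(small))   # 1280 x 720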
 
sonycowboy said:
Right. That's what I'm saying. Take away every 3rd line on 1080i, horizontally and vertically, and you'd end up with 1280 by 720, right? Which the TV would natively support. And wouldn't a downsampled 1280 by 720 be better than a downsampled 960 x 540?


This would work if you are willing to spend $3000 for a 34" CRT widescreen HDTV. :) Besides, the resolution difference will only matter if your TV viewing distance is under 5 feet, and that is with a 34" set. On a 30" set, the resolution difference would only matter if you were viewing from about 2 feet away.
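
For anyone wondering where figures like that come from: the usual back-of-the-envelope method assumes a viewer resolves roughly 1 arcminute of detail and computes the distance at which one scan line shrinks below that. The exact feet you get depend on the acuity figure and screen size you plug in, so treat this as a ballpark sketch rather than the poster's actual numbers:

import math

# Rough "when does extra resolution stop mattering" estimate for a 16:9 direct-view set,
# assuming ~1 arcminute of visual acuity (an assumption, not taken from the post above).
def max_useful_distance_ft(diagonal_in, vertical_lines):
    height_in = diagonal_in * 9 / math.hypot(16, 9)   # screen height of a 16:9 set
    line_pitch_in = height_in / vertical_lines        # height of one scan line
    one_arcmin = math.radians(1 / 60)
    return line_pitch_in / one_arcmin / 12             # inches -> feet

for lines in (540, 720, 1080):
    print(lines, "lines:", round(max_useful_distance_ft(34, lines), 1), "ft")
# Roughly 8.8 ft / 6.6 ft / 4.4 ft for a 34" set, under these assumptions.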
 
sonycowboy said:
Right. That's what I'm saying. Take away every 3rd line on 1080i, horizontally and vertically, and you'd end up with 1280 by 720, right? Which the TV would natively support.

I'm not sure anyone's downconversion method works that way.

And wouldn't a downsampled 1280 by 720 be better than a downsampled 960 x 540?

Problem is, there are no 26"+ CRTs now that can do 1280x720.
 
Shogmaster said:
I'm not sure anyone's downconversion method works that way.



Problem is, there are no 26"+ CRTs now that can do 1280x720.

1) Would such a downconversion, if feasible, be better? Is there a technical reason why it couldn't be done this way?

2) That's just as much (if not more) a problem for 720p as it is for 1080i, right? And again, what about sets coming out in the next 18 months? Many home theater enthusiasts seem to have great disdain for LCDs and even plasma displays. If CRTs aren't the future, what's better for overall quality and works in a living room setting?
 
Can someone dig up more info on the Sony Super Fine Pitch CRT set? It looks like it has almost all the componentry to handle a true 720p signal (lacking a tiny bit on the vertical resolution). From what I've been reading, the set renders nearly a 720p signal internally but then outputs it at 960 x 540 resolution. I'm sure this has a little to do with why people are pleased with the HD content on the set. It's processing it at a higher rate internally before outputting the signal (which is actually a little higher, too, than your standard HDTV set).
 
Shogmaster said:
You don't even have a clue what you're jibber-jabbering about, do you? Stop it. You're just gonna hurt yourself.

Shog, you seem to have little clue about TVs and are giving the wrong advice here. How many people here do you think even care about buying a 720p/1080i TV if they have the money to get a nice 1080p TV and Blu-ray? People will go for the better tech in this case. HDMI will also get pushed and people will want that. You can't stop people from taking a natural step. 720p isn't that good anymore, quite frankly, as 1080p is here, of course.

Your argument about scaling is rubbish and it's caught on a lot here at GAF. This is 2005. Things like PP2 are real and already out in high-end models with various differences. There is no need to be worried about getting a 1080p screen if it has something like this, because quite frankly the 720p image will look better on a 1080p screen with PP2, due to enhancements, than on an original 720p TV, for example. Without any enhancements it wouldn't look that good, of course, but buy a TV that has some tech like this.

See it in person if you don't agree with me. They have a demo button on the remote which splits the screen in half (two sides, one with it on, the other off). Other companies use their own names and buzzwords, but they are similar and only in high-end models. This stuff will filter down into other models eventually, but high-end models tend to showcase this tech. High-end CRT, LCD, plasma, you name it.
 
sonycowboy said:
1) Would such a downconversion, if feasible, be better? Is there a technical reason why it couldn't be done this way?

I don't see why. I'm not up on how exactly down conversions are done on these HDTVs anyway, so.....

2) That's just as much (if not more) a problem for 720p as it is for 1080i, right? And again, what about sets coming out in the next 18 months? Many home theater enthusiasts seem to have great disdain for LCDs and even plasma displays. If CRTs aren't the future, what's better for overall quality and works in a living room setting?

You do know that true 1080i is too much for anything other than LCDs and DLPs, right? Those two techs lend themselves best to true 1080 anything, because it's easier for them than for others to reach such resolutions.

So knowing that true 1080i is 1920x540 per field, I'd say it's more difficult to do than true 720p on CRTs. I think 1080i > 960x540 is a good compromise right now for CRTs (on the XBR960), since it involves a simple four-pixels-mashed-into-one downconversion, as long as you don't mind interlaced output. I do mind, so I'd rather have the slightly less neat 1280x720-to-960x540 (0.75x) down conversion, if the XBR960 is indeed able to do that.

All a guess on my part right now though.....
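
For reference, a "four pixels mashed into one" downconversion (reading 1080i as a full 1920x1080 frame going to 960x540) is just a 2x2 box average. Here is a toy sketch of that operation; whether the XBR960's scaler does anything this simple is, as the post above says, pure speculation.

# Toy 2x2 box-average downconversion: 1920x1080 -> 960x540 ("4 pixels into 1").
# Illustrative only; real TV scalers are considerably more elaborate.

def box_downscale_2x2(frame):
    out = []
    for y in range(0, len(frame), 2):
        row = []
        for x in range(0, len(frame[0]), 2):
            total = (frame[y][x] + frame[y][x + 1] +
                     frame[y + 1][x] + frame[y + 1][x + 1])
            row.append(total // 4)
        out.append(row)
    return out

frame = [[(x * y) % 256 for x in range(1920)] for y in range(1080)]   # dummy luma frame
small = box_downscale_2x2(frame)
print(len(small[0]), "x", len(small))   # 960 x 540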
 
Shogmaster said:
So knowing that true 1080i is 1920x540 per field, I'd say it's more difficult to do than true 720p on CRTs.

That's not true is it?

1080i is by definition 1920 by 1080. It's just a matter of interlacing. The 540 lines drawn in each alternating field are completely different.

But you are getting 1080 lines of resolution vs 540.
 
Shog or someone compare these 2 sets for me -

BenQ 32-inch LCD
MAG Innovision 32-inch LCD

Maybe it's just me, but the MAG almost seems like a better TV, mostly because it's so much cheaper. Would my best bet be just going to a local Best Buy and seeing what the difference is in person? I saw the BenQ and it had a fantastic picture and displayed motion far better than a 42-inch TV next to it on the shelf... the brand name of which I don't recall.

But that second one, I saw it in a flyer after I'd gotten home, and it seems too good to be true in a way....
 
Deg said:
Shog, you seem to have little clue about TVs and are giving the wrong advice here. How many people here do you think even care about buying a 720p TV if they have the money to get a nice 1080p TV and Blu-ray? People will go for the better tech in this case. HDMI will also get pushed and people will want that. You can't stop people from taking a natural step. 720p isn't that good anymore, quite frankly, as 1080p is here, of course.

Your argument about scaling is rubbish and it's caught on a lot here at GAF. This is 2005. Things like PP2 are real and already out in high-end models with various differences. There is no need to be worried about getting a 1080p screen if it has something like this, because quite frankly the 720p image will look better on a 1080p screen with PP2, due to enhancements, than on an original 720p TV, for example. Without any enhancements it wouldn't look that good, of course, but buy a TV that has some tech like this.

Look, spanky. I've been playing 480p games on 720p-native sets with plenty of that modern upressing voodoo trickery, and I CAN see the interpolation, and it looks like shit! Don't gimme this "I trust modern voodoo magic to solve even mathematically inherent problems" BS. I don't believe in magic. I do believe that there's too much tech PR BS. I've seen these promises and none have delivered. So don't push this "Philips has magical technology" BS around here. It's stinking up the place, like all other companies' upressing BS techs.

See it in person if you don't agree with me. They have a demo button on the remote which splits the screen in half (two sides, one with it on, the other off). Other companies use their own names and buzzwords, but they are similar and only in high-end models. This stuff will filter down into other models eventually, but high-end models tend to showcase this tech. High-end CRT, LCD, plasma, you name it.

Even though I've tested just about every single CRT HDTV (with Xbox 480p games, no less), I will humor you and check out the Philips crap again. But I highly doubt it will be any different from what I've experienced before.
 
sonycowboy said:
That's not true is it?

1080i is by definition 1920 by 1080. It's just a matter of interlacing. The 540 lines drawn in each alternating field are completely different.

It's effectively the same if, on each refresh, the 540 lines are showing a new frame of data.

But you are getting 1080 lines of resolution vs 540.

On CRTs, due to their analog nature, it's practically the same. On fixed-res 1920x1080 displays, it will be easier to notice the interlacing.
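
To make the interlacing point concrete, here is a toy illustration of how a single 1080-line frame is delivered as two 540-line fields drawn on alternating refreshes; just a sketch of the concept, not any particular set's behaviour.

# Interlacing sketch: a 1080-line frame becomes two 540-line fields,
# even lines on one refresh and odd lines on the next.

def split_into_fields(frame):
    even_field = frame[0::2]   # lines 0, 2, 4, ... -> 540 lines
    odd_field = frame[1::2]    # lines 1, 3, 5, ... -> 540 lines
    return even_field, odd_field

frame = [[y] * 1920 for y in range(1080)]   # dummy frame, one value per line
even, odd = split_into_fields(frame)
print(len(even), len(odd))                  # 540 540
# A fixed-res progressive panel typically weaves or line-doubles the fields
# back into full frames before display.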
 
Pachinko said:
Shog or someone compare these 2 sets for me -

BenQ 32-inch LCD
MAG Innovision 32-inch LCD

Maybe it's just me, but the MAG almost seems like a better TV, mostly because it's so much cheaper. Would my best bet be just going to a local Best Buy and seeing what the difference is in person? I saw the BenQ and it had a fantastic picture and displayed motion far better than a 42-inch TV next to it on the shelf... the brand name of which I don't recall.

But that second one, I saw it in a flyer after I'd gotten home, and it seems too good to be true in a way....

The BenQ seems to be a WXGA panel. The other's probably the same, although there's a slight chance that it's a 1280x720 panel. Either way, the site doesn't tell us what type of TFT panels they are, so you should go see them for yourself before having others take guesses over the internet.
 
Sony KD-34XBR960
http://reviews.cnet.com/4505-6481_7-30787600.html

At a glance

* Release date: July 15, 2004
* Editors' rating: 8.1 Excellent
* Editor's take: If you can deal with the bulk and don't need a huge screen, this is the perfect home-theater TV.
* The good: Highest resolution of any consumer direct-view HDTV; independent memory per input; excellent color decoding.
* The bad: Bulky tube form factor.
* What's it for: Watching TV, DVD, and HDTV.
* Who's it for: Viewers who don't mind a tube if it delivers the best picture.
* Business use: None.
* Essential extras: DVD player; HDTV tuner or CableCard.
* The bottom line: This is the reference standard for picture quality among direct-view HDTVs and also happens to be a great value.

CNET editor's review

Reviewed by: Kevin Miller
Edited by: David Katzmaier
Reviewed August 26, 2004

Sony's 34-inch wide-screen tube-based direct-view HDTV, the KD-34XBR960, is simply the best-performing television of its kind on the market. Its screen boasts an incredible 1,400 lines of horizontal resolution, which allows it to resolve more detail with high-def sources than any other direct-view tube. It can deliver deeper blacks than any non-tube TV, and it offers two key improvements over last year's excellent KV-34XBR910: accurate color decoding and independent picture memory per input. In the smaller-than-40-inch category, the KD-34XBR960 earns its place as CNET's reference HDTV. Add to all that a list price of $2,200--another improvement compared to last year--and you also have a compelling value.

I think they have their resolutions mixed up, but it does have lotsa lines. I have one; the picture at 1080i is awesome.
 
seanoff said:
Sony KD-34XBR960
http://reviews.cnet.com/4505-6481_7-30787600.html

At a glance

* Release date: July 15, 2004
* Editors' rating: 8.1 Excellent
* Editor's take: If you can deal with the bulk and don't need a huge screen, this is the perfect home-theater TV.
* The good: Highest resolution of any consumer direct-view HDTV; independent memory per input; excellent color decoding.
* The bad: Bulky tube form factor.
* What's it for: Watching TV, DVD, and HDTV.
* Who's it for: Viewers who don't mind a tube if it delivers the best picture.
* Business use: None.
* Essential extras: DVD player; HDTV tuner or CableCard.
* The bottom line: This is the reference standard for picture quality among direct-view HDTVs and also happens to be a great value.

CNET editor's review

Reviewed by: Kevin Miller
Edited by: David Katzmaier
Reviewed August 26, 2004

Sony's 34-inch wide-screen tube-based direct-view HDTV, the KD-34XBR960, is simply the best-performing television of its kind on the market. Its screen boasts an incredible 1,400 lines of horizontal resolution, which allows it to resolve more detail with high-def sources than any other direct-view tube. It can deliver deeper blacks than any non-tube TV, and it offers two key improvements over last year's excellent KV-34XBR910: accurate color decoding and independent picture memory per input. In the smaller-than-40-inch category, the KD-34XBR960 earns its place as CNET's reference HDTV. Add to all that a list price of $2,200--another improvement compared to last year--and you also have a compelling value.

I think they have their resolutions mixed up, but it does have lotsa lines. I have one; the picture at 1080i is awesome.

That was the article I was talking about. I'm fairly close to jumping on the XBR960 myself, but for that much money and girth, I wish I knew if it could actually display real 1280x720 like the old 30" Princeton......
 
Shogmaster said:
Look, spanky. I've been playing 480p games on 720p-native sets with plenty of that modern upressing voodoo trickery, and I CAN see the interpolation, and it looks like shit! Don't gimme this "I trust modern voodoo magic to solve even mathematically inherent problems" BS. I don't believe in magic. I do believe that there's too much tech PR BS. I've seen these promises and none have delivered. So don't push this "Philips has magical technology" BS around here. It's stinking up the place, like all other companies' upressing BS techs.

I know it's hard to think it's possible :lol . Well, this is 100% real. One other person has posted about this because he has a PP2 TV too, but I don't know what he has, other than mentioning it has that motion/fps-increasing feature too. I guess it's understandable that it sounds like a too-good-to-be-real kind of thing, because it's bloody good and you don't know where they get the data from. :lol



Even though I've tested just about every single CRT HDTV (with Xbox 480p games, no less), I will humor you and check out the Philips crap again. But I highly doubt it will be any different from what I've experienced before.

You will come screaming back, "I can't believe they pulled it off!" It boggles the mind how they do this. :D Check it out on the very top CRT and/or plasma Philips models; LCDs usually have it in with different features due to screen quality issues. Pixel Plus 2 minimum. Pixel Plus original is rubbish, and a few other older versions too, but you can tell, since they will be older or cheaper models. :)

1_5_1_sharpness.jpg


Remember the above picture too. That's how big the difference is. I could take pictures of my CRT and show you guys what video processing can do. :)
 
Ahhh... a 720p CRT set. The holy grail of TV. I searched long and hard for one, because of my HATRED for fixed panel displays.

- There are no 720p CRT sets. CRT displays do not possess the bandwidth necessary to generate a 720p image. That's why they go with 1080i... which they can't do either, but they can "fake it" well enough that the image is considered better. I don't consider the 2 non-existent sets that claim to do it (a Princeton and some other model that was "discovered" a few weeks ago on these forums, that is for sale NOWHERE) to be viable solutions. YES, CRT sets look better, but there comes a time when you have to say "I'm a frickin' IDIOT for spending $4500 on a 30" TV...."

- CRT HD sets are the ONLY sets that have multiple resolutions. These resolutions would be 480p/i and 1080i. All other technologies (LCD, DLP, Plasma) have one fixed resolution (virtually always 720p, although there is a move towards 1080p).

- The Sony Super Fine Pitch set doesn't do true 720p. You will not find any documentation on this, since it's not in their best interest to make this easily understandable. None of the HD CRT manufacturers do. What they do instead is try to pass off the fact that all signals are upconverted as a "feature". Toshiba calls it "CrystalScan HDSC". Samsung calls it "Dual HD". Sony calls it "Hi-Scan".

- All of this worrying will most likely not matter at all. Odds are that both the PS3 and Xbox 360, as well as any future versions, will allow you to set the resolution that the game is rendered at. There is no official documentation on this ability... but I have to believe it is there, because the current Xbox works that way. If the console renders the game internally at 720p, then all of your quality issues will disappear. It will look PERFECT on an LCD, DLP, or Plasma.

- And the final rule of HDTV sets: UPconverting (making a smaller image larger to fit your screen) is BAD. DOWNconverting (making a big image smaller to fit your screen) is GOOD. Well, not really, but it's a lot less bad. Unless you buy a CRT HD set, all of your old games will look BAD. Xbox, PS2, GameCube, anything even older... well, they will look the same as SDTV signals when they are upconverted. So just forget about it, and worry about the new games.

If I am wrong on any of these points, please correct me, as there is nothing I would like more (especially after all my searching) than to go out and be able to buy a 720p CRT set!
 
Oracle Dragon said:
And the final rule of HDTV sets: UPconverting (making the image larger to fit your screen) is BAD. DOWNconverting (making a big image smaller to fit your screen) is GOOD. Well, not really, but it's a lot less bad. Unless you buy a CRT HD set, all of your old games will look BAD. Xbox, PS2, GameCube, anything even older... well, they will look the same as SDTV signals when they are upconverted. So just forget about it, and worry about the new games.

If I am wrong on any of these points, please correct me, as there is nothing I would like more (especially after all my searching) than to go out and be able to buy a 720p CRT set!

Upconverting depends on the TV set. It's not a hard rule; you can get more than you intend if you have the right TV, as results will vary. Downconverting results in lower resolution and isn't a good thing, as you're getting something that is less than intended. In an ideal world, everything would be making the most out of your TV.
 
Shogmaster said:
I don't see why. I'm not up on how exactly down conversions are done on these HDTVs anyway, so.....



You do know that true 1080i is too much for anything other than LCDs and DLPs, right? Those two techs lend themselves best to true 1080 anything, because it's easier for them than for others to reach such resolutions.

So knowing that true 1080i is 1920x540 per field, I'd say it's more difficult to do than true 720p on CRTs. I think 1080i > 960x540 is a good compromise right now for CRTs (on the XBR960), since it involves a simple four-pixels-mashed-into-one downconversion, as long as you don't mind interlaced output. I do mind, so I'd rather have the slightly less neat 1280x720-to-960x540 (0.75x) down conversion, if the XBR960 is indeed able to do that.

All a guess on my part right now though.....

It was my understanding that fixed-panel displays like LCD, DLP, and plasma do not do an interlaced image. I don't think it really makes sense to do it... that's why it's always progressive. I've never seen any set using any of those techs display anything other than 720p. Is there one (or more) out there that is actually REAL 1080i? If so, which ones? It would be interesting to check that out!
 
Oracle Dragon said:
- The Sony Super Fine Pitch set doesn't do true 720p. You will not find any documentation on this, since it's not in their best interest to make this easily understandable. None of the HD CRT manufacturers do. What they do instead is try to pass off the fact that all signals are upconverted as a "feature". Toshiba calls it "CrystalScan HDSC". Samsung calls it "Dual HD". Sony calls it "Hi-Scan".
You're right about this. When I was shopping for a TV, I was reading the spec sheets for the new Samsung CRTs. One of them lists, "Native Resolution Display Format: 1080i/480p/720p interlaced". 720p interlaced? There's no such thing! Obviously that means that on this set, 720p signals are converted to be shown in interlaced form, but they'd never come out and say that.
 
Deg said:
Upconverting depends on the TV set. It's not a hard rule; you can get more than you intend if you have the right TV, as results will vary. Downconverting results in lower resolution and isn't a good thing, as you're getting something that is less than intended. In an ideal world, everything would be making the most out of your TV.

Uhhh, what? Upconverting doesn't depend on the TV set... some sets will of course perform the upconversion with better results. But at the end of the day, you can't produce an image of a higher resolution than your source. All you can do is clean up enough of the artifacts in the hope that it looks acceptable. Upconverting is always bad, because you are essentially trying to create information out of nothing.

Downconverting is, of course, also bad. But since you are not creating information, only removing it, it results in a superior picture compared to one that is Upconverted. Of course, it is not the ideal situation, since you are not seeing the image as originally intended. But it is a better position to be in than upconverting.

If you want to see the results first hand, head to anywhere that sells TV's. Check out as many makes and models as you like, they will all perform the same. A low resolution image (SDTV, DVD, anything 480) upscaled to 720p looks WORSE than a similar image at 1080i downscaled to 720p. A LOT worse.
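
One simple way to see the "you can't create information out of nothing" point: shrink a signal, then interpolate it back up, and the fine detail never comes back. A minimal sketch using plain linear interpolation (real scalers are fancier, but they face the same limit):

# Downscale a 1D "scanline" by 2, then linearly interpolate back up.
# The result is smooth, but the original fine detail is gone for good.

def downscale_by_2(line):
    return [(line[i] + line[i + 1]) / 2 for i in range(0, len(line) - 1, 2)]

def upscale_by_2(line):
    out = []
    for i in range(len(line) - 1):
        out.append(line[i])
        out.append((line[i] + line[i + 1]) / 2)   # invented in-between sample
    out.extend([line[-1], line[-1]])
    return out

original = [0, 255] * 8                    # fine alternating detail
round_trip = upscale_by_2(downscale_by_2(original))
print(original)
print([round(v) for v in round_trip])      # flat mid-grey values: the detail is lost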
 
Oracle Dragon said:
It was my understanding that fixed panel displays like LCD, DLP, and Plasma, do not do an interlaced image. I don't think it really makes sense to do it... thats why its always progressive.

You are right, of course. They would show interlaced shit line-doubled so it's progressive. I was trying to illustrate a point to cowboy more than anything.

I've never seen any set using any of those techs display anything other than 720p. Is there one (or more) out there that are actually REAL 1080i? If so, which ones? That would be interesting to check that out!

I've never seen anything "real" 1080i, because it would have to be a fixed-res display to show at such a resolution, and if it's fixed-res, then 1080i will be line-doubled into 1080p.

The closest to that I've seen is TI's 1080p DLP projection set demo at this year's CES. They showed 1080i shit line-doubled to 1080p. In fact, we were joking about how there was absolutely no pure 1080p source material shown at these demos, except for computers hooked up to Samsung DLPs showing an MS Excel spreadsheet comparison between a 1080p and a 720p DLP set. :lol
 
How about upconverting DVD players? Don't those result in a better picture than outputting standard 480p? Or are the DVD players just better scalers, and the whole point is to bypass what's in the TV?
 
Downconverting is, of course, also bad. But since you are not creating information, only removing it, it results in a superior picture compared to one that is Upconverted. Of course, it is not the ideal situation, since you are not seeing the image as originally intended. But it is a better position to be in than upconverting.

No. Downconverting is worse to me, as you are 'losing' detail. Upconverting at least means you can rest in peace, as you get everything. Without any enhancing or processing, you'll see the image for what it really is: a crap low-res image (good thing Lucas upgraded the SW movies :p). Generally, downconverting is nicer on the eyes because you don't know what you are missing and aren't trying to fill empty space. Try not to get in that situation if possible, as you are losing resolution. You want the best picture after all, unless you have something else in mind.

If you want to see the results first hand, head to anywhere that sells TV's. Check out as many makes and models as you like, they will all perform the same. A low resolution image (SDTV, DVD, anything 480) upscaled to 720p looks WORSE than a similar image at 1080i downscaled to 720p. A LOT worse.

Of course, because the image has poor quality. No need to get a lesser TV anyway. In that situation, stuff like PP2 comes in handy, as that crappy image will look as good as ever, providing there is no better alternative anywhere. Otherwise you wouldn't be using it. :) I fail to see why you should sacrifice better resolution for lower resolution. Not logical. But at least sane people aren't going in that direction. :)

PG2G said:
How about upconverting DVD players? Don't those result in a better picture than outputting standard 480p? Or are the DVD players just better scalers, and the whole point is to bypass what's in the TV?

Improved? There are factors and it depends, but DVD can only do so much unless HD is encoded on the damn thing. That's why we have new formats coming, with the capacity for the real deal. DVD as it is isn't enough for HD content. The new formats are better suited to the new HDTVs. DVD is fine for a low-res TV. As I said, people won't be buying HDTVs for low-res content. It's for high res.
 
PG2G said:
How about upconverting DVD players? Don't those result in a better picture than outputting standard 480p? Or are the DVD players just better scalers, and the whole point is to bypass what's in the TV?

Whether the scaler is in the DVD player, the TV, or a separate box, it's still far from native output. No way a 480p DVD output will best Blu-ray 1080p output on a 1080p HDTV set, no matter how good the scaler it's going through is. I don't care if it's a $10,000 separate Faroudja unit, or a built-in unit in Philips TVs (rolleyes).
 
Shogmaster said:
or a built-in unit in Philips TVs (rolleyes).

It's not just a scaler; it can enhance a normal image in many ways too, and increase the framerate :D Scaling is just one of the things it does. Also, it's PP2. It's only in certain top models.

The most important features of Pixel Plus 2 are the advanced sharpness & resolution enhancements. These are accomplished through Philips' unique sub-pixel-based Horizontal & Vertical Luminance Transient Improvement. First the incoming signal, from any source, is scaled up to a maximum resolution of 2,560,000 pixels. Then each individual pixel is altered to better match the surrounding pixels. The end result is an enormously sharper, crisper picture with much more depth impression.

2,560,000 is a lot of pixels to mess with.
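
Without any inside knowledge of Philips' actual algorithm, the "luminance transient improvement" in that blurb is essentially edge steepening: pixels near a luminance transition get pushed further toward the dark or bright side so the edge reads sharper. A deliberately crude, unsharp-mask-style sketch of that general idea (emphatically not Pixel Plus 2 itself):

# Crude 1D edge-steepening sketch: push each sample away from its local average.
# Only meant to show the general idea behind "transient improvement" processing;
# no claim that this resembles Philips' real Pixel Plus 2 pipeline.

def steepen_edges(line, amount=0.5):
    out = [line[0]]
    for i in range(1, len(line) - 1):
        local_avg = (line[i - 1] + line[i + 1]) / 2
        boosted = line[i] + amount * (line[i] - local_avg)
        out.append(max(0, min(255, boosted)))
    out.append(line[-1])
    return out

soft_edge = [50, 50, 50, 90, 130, 170, 210, 210, 210]   # a blurred transition
print(steepen_edges(soft_edge))   # the ramp's ends get pushed apart, so the edge looks sharper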
 
seanoff said:
Shog, nip into this thread at AVSforum.

http://www.avsforum.com/avs-vb/showthread.php?t=546984

Use the XBR960 as a second very large monitor!!! Seems to work well.


I'd be interested in your feedback if you got it up and running.

HOLY SHIT BRO!!!! THAT DUDER GOT HIS XBR960 RUNNING AT 1280x720!!!!! :D
My nipples just moistened!!

Now, in my own research world I have made what I believe to be a very large breakthrough... which was the discovery of "clone mode". This is facilitated by unchecking the "extend my Windows desktop onto this monitor" for monitor 2 (i.e. my Sony XBR).

By setting my primary (i.e. PC monitor) to 1280x720 (and 85Hz) and then activating the "clone" (i.e. the Sony), the 1280x720 resolution is imposed on the clone, and 60Hz appears to be automatically selected as the refresh rate... which corresponds to "p". In other words, the Sony clone becomes 1280x720p. Apparently the ATI drivers are able to support separate refresh rates for the primary and clone, while applying the same resolution to both. This is fortunate, as if it imposed the refresh rate as well then I'd have to set my primary to 60Hz, which would produce flicker.

Of course, setting my primary 4x3 CRT monitor to a 16x9 resolution of 1280x720 makes the display look a bit odd (characters appear a bit taller than they normally are with a 4x3 resolution), but it's perfectly acceptable if this were going to be an HTPC and not a real "work" PC.

Anyway, once clone mode is established, if you then go into the "overlay" page of Settings -> Advanced, and push the "Theater mode" button, you are establishing the clone as a "theater mode" display by checking that second radio button. For a real HTPC, this would be your real HDTV screen.

Once you check "theater mode", the second and third items become un-grayed. Leave "same as source video" checked for the second item, and check "16:9 (widescreen)" for the third item. The effect of this is to display in fullscreen on the clone any media window on the primary.

And now, amazingly, this works! When I'm not playing a DVD or a video, the clone screen is simply a fullscreen 16x9 1280x720 60hz duplicate of my primary PC monitor. Whatever is on my primary monitor (i.e. desktop, open windows, etc.) also appears on my clone (Sony). And the text is VERY VERY CLEAR. This is likely helped out by my service menu tweaks on the Sony, as well as the convergence repair (via magnets) that I had a Sony tech do when I first got the set. Anyway, text is super clear and readable... if that actually meant anything, which it doesn't since the clone is really for fullscreen media display.

When I play a DVD or a video using CinePlayer or other media player program, the clone goes into fullscreen 16x9 display mode even while the primary PC monitor still shows the media playing in a window on the desktop. 640x480 video window on the primary plays fullscreen 4x3 pillar-boxed (as it should) on the clone Sony. DVD's play on the Sony as fullscreen 16x9 720p.

Looks great!

I wonder if this is only possible through HDMI, or whether the other connections can support this feat. I wonder if you can successfully attach a DB15 VGA output to the HDMI input via an adapter. SHIT SHIT SHIT!! This is VERY interesting news! Thanks!

If all this works, I gotta see if I can squeeze another $2400 out of my budget for November.... :D
 
Shogmaster said:
HOLY SHIT BRO!!!! THAT DUDER GOT HIS XBR960 RUNNING AT 1280x720!!!!! :D
My nipples just moistened!!



I wonder if this is only possible through HDMI, or whether the other connections can support this feat. I wonder if you can successfully attach a DB15 VGA output to the HDMI input via an adapter. SHIT SHIT SHIT!! This is VERY interesting news! Thanks!

If all this works, I gotta see if I can squeeze another $2400 out of my budget for November.... :D

So what happens if you have a crap computer? :P Find another way to hack this and I'll be really excited.
 
Dragona Akehi said:
So what happens if you have a crap computer? :P Find another way to hack this and I'll be really excited.

This dude lives not that far from me. Maybe I can PM my way into his place for some intimate testing. ;)
 
Deg said:
PP2 gives you a better picture than you could possibly get :D

I'm gonna have to go with English as a second language, but this is the best line I've heard in a while.

Honestly, you are simply way off base here, kid. You would much rather watch 720p source material on a 720p set than have it "upconverted" to 1080p. You can't magically create detail that is not there.
 
Oracle Dragon said:
Ahhh... a 720p CRT set. The holy grail of TV. I searched long and hard for one, because of my HATRED for fixed panel displays.

- There are no 720p CRT sets. CRT displays do not possess the bandwidth necessary to generate a 720p image. That's why they go with 1080i... which they can't do either, but they can "fake it" well enough that the image is considered better. I don't consider the 2 non-existent sets that claim to do it (a Princeton and some other model that was "discovered" a few weeks ago on these forums, that is for sale NOWHERE) to be viable solutions. YES, CRT sets look better, but there comes a time when you have to say "I'm a frickin' IDIOT for spending $4500 on a 30" TV...."

Well, the 35" ones are around $1500-2000 now, which isn't too much since, as you said, "CRT sets look better".

- All of this worrying will most likely not matter at all. Odds are that both the PS3 and Xbox 360, as well as any future versions, will allow you to set the resolution that the game is rendered at. There is no official documentation on this ability... but I have to believe it is there, because the current Xbox works that way. If the console renders the game internally at 720p, then all of your quality issues will disappear. It will look PERFECT on an LCD, DLP, or Plasma.

On the same note you'll be able to set your system to output at 1080i on a CRT which, although the CRT fakes it, should look perfect on your CRT.

- And the final rule of HDTV sets: UPconverting (making a smaller image larger to fit your screen) is BAD. DOWNconverting (making a big image smaller to fit your screen) is GOOD. Well, not really, but it's a lot less bad. Unless you buy a CRT HD set, all of your old games will look BAD. Xbox, PS2, GameCube, anything even older... well, they will look the same as SDTV signals when they are upconverted. So just forget about it, and worry about the new games.

So if you're a major enthusiast and you plan to have almost every console ever made hooked up to your one and only HDTV... CRT is the way to go, as games should look the same as they do now (or slightly better if you're coming from a non-480p set), plus you'll get extremely good-looking quality out of the next-gen systems, though it will have slightly fewer lines of resolution than a not-as-good-looking but higher-resolution LCD/DLP.

Seems to me that if you're a person who only plays current-gen games, maybe an LCD/DLP/Plasma is in your best interest. But if you're someone who has a huge backlog of current-gen games that you plan on playing throughout the next generation, then a CRT is your best option.
 
Oh and personally I don't think size matters that much for gaming. After about 30" it's big enough for me, especially if you're going to be playing current gen and PS1/Saturn/N64/SNES games on it ^^;

But for films I feel size definitely does matter as you're trying to recreate the theater experience and films are often watched with friends/family. Whereas depending on the gamer, they might play most games alone.

So yeah, I think my personal goal for the next 10 years would be the best CRT set I could get for gaming, and then a 1080p DLP for Blu-ray films, and maybe I'd move the PS3 from the CRT to the DLP when a 1080p game comes out.
 
I have a Philips LCD set with Pixel Plus. Everything displays in 720p. I feed it almost nothing but 480i, and it looks fantastic. When the source is bad, it still looks bad, but it does a bloody excellent job of upscaling. It doesn't just interpolate; it inserts detail (fake, obviously, but a very good fake).


Can I just call out these downscaler soothsayers? "Hey, don't worry, your pretty little X360 will scale the output just fine; you won't know the difference." Well, if it's not displaying the original image, then it's been messed about with, so there will be some difference.

And even though down conversion is better than up, you still need some good technology to deal with it. Remember the old DVD players that used to letterbox by simply throwing away 1 in 4 lines? They looked shit and almost killed anamorphic pressings in the early days.

Bottom line is, no conversion at all is the best thing. And don't assume that it'll be fine to just flick a switch in the dashboard. What if the X360 scaler is crap? MS will just tell you to buy a Samsung HDTV.
 
Shogmaster said:
Whether the scaler is in the DVD player, the TV, or a separate box, it's still far from native output. No way a 480p DVD output will best Blu-ray 1080p output on a 1080p HDTV set, no matter how good the scaler it's going through is. I don't care if it's a $10,000 separate Faroudja unit, or a built-in unit in Philips TVs (rolleyes).

You're being unfair, and comparing apples and oranges.

I think the comments about Pixel Plus and other variants (e.g. Faroudja) are about getting the best out of low-res inputs on a high-res panel. Mine does a very good job of that.

Of course a higher res source will always look better because it has more pixels. But a native 720p image should look better on a 720p panel than a 1080i image downconverted.
 
I never knew the world of high-definition was so chaotic and insane.

Any guesses as to when we'll get decent sets, at a decent price, and using a single standard for resolution?

(... and no, you're not allowed to assume that the highest resolution will become the eventual standard.)
 
DavidDayton said:
I never knew the world of high-definition was so chaotic and insane.

Any guesses as to when we'll get decent sets, at a decent price, and using a single standard for resolution?

(... and no, you're not allowed to assume that the highest resolution will become the eventual standard.)

They all use one resolution. Just not all the same resolution.

Seriously though, weigh up what you'll be feeding it, which inputs are most important to you, what happens to those inputs before they are displayed, and whether that matters to you, etc.

E.g., leet gamer: 80% gaming (720p), 10% HDTV (50/50 720p/1080i), 10% DVD (480p). Wants the games to look the best they can. Likes sports (720p), so it's important they look good too. DVDs shouldn't look any worse than they do on his current (SD) set.

Buy a 720p native display - eg DLP rear projection.
 