720p Games

Yusaku

Member
I'm helping a friend shop around for HDTVs. I'm not all that familiar with them myself, so I was surprised to find quite a number of HDTVs will do 1080i, but not 720p--not even converting 720p to 1080i. Like this Panasonic one he was looking at. The manual doesn't mention 720p once.

Does this mean it won't be possible to play games like Soul Calibur 2 in 720p at all?
 
No, it can't. A lot of TVs can do 1080i but not 720p, because 720p is a higher res / better picture quality. Only DLP, LCD, and plasma sets available to consumers are capable of 720p. More options will be available next year. There is a nice Panasonic LCD projection TV available for about $2300. Awesome picture. Splurge and get it, it's well worth it.
 
That's not 100% true. My TV, a Sony CRT, does not display 720p natively, BUT it will upconvert it to 1080i. Believe me, you can tell a difference from 480p too.
 
some of the nicer Sonys and Samsungs can do 720p, but at that point you're paying $1500-2000 for a 32" CRT, when you can get a 46" DLP set for the same amount, so YMMV.
 
What's the average response time on LCD TVs these days and how are the black levels? At this point, I still think LCDs are just crap for gaming, but I'm wondering when they will start to improve...
 
I have a 40" Panasonic PT40LC12 LCD widescreen I've been using for almost two years now, and I have no complaints whatsoever. Sharp, clear, colors look good, and very little of the "fuzzy filter" effect most LCD screens have on games. Does 480p, 720p, and 1080i.
 
joshschw said:
he knows he won't get HD programming unless he has a tuner, right? 'HD Ready' doesn't have a tuner built in.

Obviously. I don't see what that has to do with this thread. My concern is about 720p. I had assumed that anything that was called HDTV could at least receive 720p and 1080i, if not display them at their proper resolution. Obviously that's not the case.
 
dark10x said:
What's the average response time on LCD TVs these days and how are the black levels? At this point, I still think LCDs are just crap for gaming, but I'm wondering when they will start to improve...
I didn't have the television long enough to get a good grasp of the black levels, but my Sony WEGA KLV-23M1 ($1500 23-inch LCD) had too much blurring for game use. It was most obvious in games at 60 fps, but I noticed it in almost everything. Bugged me enough to return the set.

Looked awesome for movies, though.

Now I have a 32-inch Sony CRT. Aside from its weight, I couldn't be happier. 480p, 720p, 1080i, big screen, all for $1000. It even seems to look better than the LCD display did.
 
Is that a CRT? Because CRTs don't have a native resolution. And if it's an "HD-capable" CRT TV, you can bet that it accepts 720p or 1080i signals. Not sure how CRTs handle progressive scan, though - but you can bet it will look better than standard NTSC 480i.

EDIT: And I believe for any TV to be considered an "HD" one (accepts ATSC signals) it must accept a signal in 720p or 1080i, if not display them at that resolution. I say this because different stations broadcast at different signals (ABC/CBS at 1080i, Fox at 720p), therefore, to see that signal OTA you must be able to accept both resolutions.
 
What's the average response time on LCD TVs these days and how are the black levels? At this point, I still think LCDs are just crap for gaming, but I'm wondering when they will start to improve...

:lol I thought about you last night while testing out my DVI-D connection by playing Shrek 2. I'm never going back to CRT until they take on a smaller form factor.
 
I was wrong. Did a little research and found out that an HD "monitor" doesn't have to accept any signal. Rather, as long as it can display one of the ATSC formats (720p, 1080i, 1080p), it can be referred to as an HD monitor. For it to be an "HDTV" it has to accept all 3 signals, even if it does some conversion to display them in its native format. Sorry about that.

And therefore, I wouldn't be surprised if that TV doesn't accept 720p signals.
 
Nerevar said:
EDIT: And I believe for any TV to be considered an "HD" one (accepts ATSC signals) it must accept a signal in 720p or 1080i, if not display them at that resolution. I say this because different stations broadcast at different signals (ABC/CBS at 1080i, Fox at 720p), therefore, to see that signal OTA you must be able to accept both resolutions.

Nonsense. HD is anything 720p and up (e.g. 1080i). Most CRTs don't automatically upconvert, but you don't really need it. Also, most people have no need for an HD tuner. The reason is that very few people (unless they're total cheapskates) will expect to get their HD from an OTA signal. Most people get their HD from satellite or cable, and the tuner is provided in their respective boxes.

Also, these set top boxes provide upconversion from 720p to 1080i, so you can see upconverted pics just fine on your 1080i set (and they downconvert 1080i as well if you own a 720p set). Many also have DVI outs, so you're looking at a lossless pass from the HD receiver to your TV.

Long story short, for most people not having 720p is not a huge loss if you have 1080i, and vice versa. Obviously having both natively would be the best, but that is not available right now at a reasonable price. And almost no one cares about tuners; fewer and fewer TV manufacturers are making sets with them, even on their higher-end models.
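To put some numbers on the formats being compared back and forth in this thread, here's a quick back-of-the-envelope sketch (the resolutions are the standard ATSC figures; the script itself is just illustrative):

```python
# Pixel budgets for the HD formats discussed in this thread.
formats = {
    "720p":  (1280, 720),   # full progressive frame
    "1080i": (1920, 540),   # one interlaced field; two fields per frame
    "1080p": (1920, 1080),  # full progressive frame
}

for name, (w, h) in formats.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels")

# A single 1080i field actually carries slightly more pixels than a
# full 720p frame, but 720p delivers complete frames every refresh --
# which is why neither format strictly beats the other.
```

Whether 720p or 1080i looks better then comes down to motion handling and the quality of the set's (or set-top box's) scaler, not raw pixel count.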
 
DaCocoBrova said:
:lol I thought about you last night while testing out my DVI-D connection by playing Shrek 2. I'm never going back to CRT until they take on a smaller form factor.

Heh heh...

I've been doing some gaming on an LCD lately (17" 16:10 "glossy" LCD with native 1680x1050 resolution -- it's even wider than my 19" CRT), and I can honestly say that I really dislike it. The ghosting does bother me quite a bit, but I had not anticipated just how much the black levels would drive me nuts. In a well lit room, the blacks CAN look VERY dark and shiny...but as soon as you turn out the lights, everything seems washed out. It really looks bad when playing a low light game.

The LCD rocks hardcore for viewing Windows and other still images. My CRT looks dirty and blurry in comparison, but for gaming, it just doesn't cut it. I must say, though, it does an amazing job at handling resolutions that aren't native. I almost think that 1024x768 looks nicer on the LCD than on my CRT (though I can play most games at 1680x1050 at a smooth framerate -- even Half-Life 2!).

Still, it's convinced me that LCDs still aren't there. Even Samsung's newest 8ms response time display has some strange issues. The 8ms only applies to low contrast situations. If you transition from white to black, for example, the response time is no longer 8ms. Seems as if we might be reaching the limits of LCD technology...
 
Inumaru said:
Nonsense. HD is anything 720p and up (e.g. 1080i). Most CRTs don't automatically upconvert, but you don't really need it. Also, most people have no need for an HD tuner. The reason is that very few people (unless they're total cheapskates) will expect to get their HD from an OTA signal. Most people get their HD from satellite or cable, and the tuner is provided in their respective boxes.

Also, these set top boxes provide upconversion from 720p to 1080i, so you can see upconverted pics just fine on your 1080i set (and they downconvert 1080i as well if you own a 720p set). Many also have DVI outs, so you're looking at a lossless pass from the HD receiver to your TV.

Long story short, for most people not having 720p is not a huge loss if you have 1080i, and vice versa. Obviously having both natively would be the best, but that is not available right now at a reasonable price. And almost no one cares about tuners; fewer and fewer TV manufacturers are making sets with them, even on their higher-end models.

HD is a series of display resolutions defined by an organization and published as the ATSC standard. This includes 720p, 1080i, and 1080p. An HDTV must accept input in all 3 formats for it to be a "true" HDTV; therefore, it must have an integrated tuner. An HD "monitor" or "HD-ready" TV doesn't have to.

And yes, the STB can do the upconversion, but that's not to say everyone uses it. My STB is outputting 1080i and my display's native resolution is 720p, but I don't notice the difference, nor do I care to go into my STB settings to fix it; my television's conversion is fine.

And your comment that fewer TV manufacturers are making sets with them is plain wrong. A year ago you were hard-pressed to find any set with an integrated tuner; now a very large portion of HDTVs include them. And "reasonable price" is very subjective. I'd say a 42" GWIV for roughly $2200 with an included tuner is reasonable, and might be worth it for receiving free OTA HDTV.
 
Nerevar said:
And therefore, I wouldn't be surprised if that TV doesn't accept 720p signals.

Here is a quote from the thread about the Sanyo:

Did get a chance to mess with the Sanyos today, and they do in fact accept a 720p signal. I don't know (and don't know how to tell) if it is upconverted to 1080i. FWIW, when I hit the info button, it shows the information for the incoming signal, either 720p or 1080i. These are the only TVs out of the 8 or so HD sets that we have that even displayed the 720p signal being output from the Dish 6000. If anyone knows how to test whether it is native 720p, let me know and I'll see what I can do.

Also, the 30" widescreen allowed me to use all picture formats in 1080i. So I could put the 1080i signal in 4:3, full, zoom, etc. I still want to try that in 720p and 480p, but since it does it in 1080i, I assume it will do picture format changes in all modes. If I hadn't bought my Philips, the 30" Sanyo would be in my bedroom as we speak.
 
Nerevar said:
I was wrong. Did a little reasearch and found out that an HD "monitor" doesn't have to accept any signal. Rather, as long as it can display one of the ATSC Formats (720p, 1080i, 1080p) it can be referred to as an HD monitor. For it to be an "HDTV" it has to accept all 3 signals, even if it does some conversion to display them in it's native format. Sorry about that.

And therefore, I wouldn't be surprised if that TV doesn't accept 720p signals.

Just wanted to mention that no one that works with or sells HD sets (that I know of) cares about this outdated definition of "monitor" versus "a real HDTV". It was once thought that the tuner issue was terribly important, but since the majority of HD users get their feed from a source that provides the tuner and upconverter for them, at a nominal fee, most discussion of HD sets now treats the "HD monitor" as an HDTV, in spite of what was once the agreed-upon definition.

Lastly, of the 18 DTV formats, six are HD. Not sure where it's said that a TV has to accept 720p, 1080i, and 1080p to be called an HDTV. Furthermore, show me more than two sets on the market today that even accept a 1080p signal at all. They are just coming out later this year!
 
dark10x said:
What's the average response time on LCD TVs these days and how are the black levels? At this point, I still think LCDs are just crap for gaming, but I'm wondering when they will start to improve...

Response time is not much of an issue on new LCDs, but image processing is. It's hard to go wrong nowadays if you use good inputs (component, DVI, HDMI), as the difference becomes less marked. For example, some Philips models use Sharp panels yet display better picture quality than comparable Sharp models; in this case what Philips is using (Pixel Plus II) is better than what Sharp is using, especially on poorer analogue inputs. The Sharp looks like it has a muddy layer on top, whereas the Philips is clean and sharp looking. The Philips also does the screen's response time (8ms) justice, whereas the Sharp seems to lag, with what almost looks like framerate issues in a game. :lol


dark10x said:
Heh heh...

I've been doing some gaming on an LCD lately (17" 16:10 "glossy" LCD with native 1680x1050 resolution -- it's even wider than my 19" CRT), and I can honestly say that I really dislike it. The ghosting does bother me quite a bit, but I had not anticipated just how much the black levels would drive me nuts. In a well lit room, the blacks CAN look VERY dark and shiny...but as soon as you turn out the lights, everything seems washed out. It really looks bad when playing a low light game.

The LCD rocks hardcore for viewing Windows and other still images. My CRT looks dirty and blurry in comparison, but for gaming, it just doesn't cut it. I must say, though, it does an amazing job at handling resolutions that aren't native. I almost think that 1024x768 looks nicer on the LCD than on my CRT (though I can play most games at 1680x1050 at a smooth framerate -- even Half-Life 2!).

Still, it's convinced me that LCDs still aren't there. Even Samsung's newest 8ms response time display has some strange issues. The 8ms only applies to low contrast situations. If you transition from white to black, for example, the response time is no longer 8ms. Seems as if we might be reaching the limits of LCD technology...

Hopefully in light of the above you can understand the situation better. Here in the UK, Samsung hasn't exactly released any great LCDs for TV viewing.

You have a weird LCD. The proportion is strange. I doubt it is any good for TV, gaming or movies. Why is it 16:10? For desktop work?
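For what it's worth, the proportion isn't arbitrary: 1680x1050 reduces exactly to 8:5, i.e. 16:10, the common desktop-monitor ratio, slightly taller than TV's 16:9. A tiny arithmetic sketch (nothing monitor-specific, purely illustrative):

```python
from math import gcd

def aspect(width, height):
    """Reduce a pixel resolution to its simplest aspect ratio."""
    g = gcd(width, height)
    return width // g, height // g

print(aspect(1680, 1050))  # (8, 5)  -> the "16:10" desktop ratio
print(aspect(1920, 1080))  # (16, 9) -> the HD television ratio
```

So a 16:10 panel showing 16:9 material either letterboxes slightly or crops; the extra vertical room is indeed aimed at desktop work.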
 
Inumaru said:
Just wanted to mention that no one that works with or sells HD sets (that I know of) care about this outdated definition of "monitor" versus "a real HDTV". It was once thought that the tuner issue was terribly important, but as the majority of HD users get their feed from a source that is providing the tuner and upconverter for them, at a nominal fee, most discussion of HD sets now treat the "HD monitor" as an HDTV, in spite of what was once the agreed-upon definition.

I'm sure no one who sells HD sets cares, because they want to assure buyers that they are in fact buying an HDTV, since all the broadcasts say "also in HDTV!" I'm just going by the CEA definition of an HDTV.

Inumaru said:
Lastly, of the 18 DTV formats, six are HD. Not sure where it's said that a TV has to accept 720p, 1080i, and 1080p to be called an HDTV. Furthermore, show me more than two sets on the market today that even accept a 1080p signal at all. They are just coming out later this year!

You're thinking of sets that can display 1080p natively. Notice how it specifies 1080i/p in the ATSC Table 3 video formats.

I refer you to the following link:

Description of the various HD and ED tuner / television designations
 
I've been doing some gaming on an LCD lately (17" 16:10 "glossy" LCD with native 1680x1050 resolution -- it's even wider than my 19" CRT), and I can honestly say that I really dislike it. The ghosting does bother me quite a bit, but I had not anticipated just how much the black levels would drive me nuts. In a well lit room, the blacks CAN look VERY dark and shiny...but as soon as you turn out the lights, everything seems washed out. It really looks bad when playing a low light game.

Analog or DVI-D connection? DVI makes a tremendous difference. I've never wanted to lick a screen before. Now I fight the urge often.
 
Inumaru said:
And I ask you to show me a set that accepts 1080p signals.

Eh, whatever, you're right - it doesn't have to accept 1080p to be a "true" HDTV (it only has to be ATSC compliant up to 1080i "or higher"). I was misreading the CEA definition of an HD tuner (which does have to accept 1080p). Regardless, it doesn't invalidate the original point which is that an HD "monitor" is not an HDTV, and cannot be marketed as such (despite what salesmen may say).
 
Response time is not much of an issue on new LCDs, but image processing is. It's hard to go wrong nowadays if you use good inputs (component, DVI, HDMI), as the difference becomes less marked. For example, some Philips models use Sharp panels yet display better picture quality than comparable Sharp models; in this case what Philips is using (Pixel Plus II) is better than what Sharp is using, especially on poorer analogue inputs. The Sharp looks like it has a muddy layer on top, whereas the Philips is clean and sharp looking. The Philips also does the screen's response time (8ms) justice, whereas the Sharp seems to lag, with what almost looks like framerate issues in a game. :lol

The Philips includes DCDi technology from Faroudja. My Dell (which I believe Philips makes) has that. Stunning IQ.
 
Yusaku said:
Obviously. I don't see what that has to do with this thread. My concern is about 720p. I had assumed that anything that was called HDTV could at least receive 720p and 1080i, if not display them at their proper resolution. Obviously that's not the case.

I realize that, but I read an article somewhere stating that around 50% of HDTV owners don't even get an HDTV broadcast, or anything for that matter, so I thought I'd stick it in there.
 
AgentCooper said:
You can see one at the Sony Style store hooked up to a blu-ray player. They exist.

If you're referring to the Qualia Q004 SXRD Front Projector, you're wrong. I believe the SXRD can upconvert to 1080p, but it will not accept a 1080p signal.

Also, that projector costs around $12K, far north of the amount the original poster was looking to spend, which proves my point, which was: "Show me a set that accepts 1080p (if you can) and I'll show you one that's prohibitively expensive." $2K-4K is one thing, $10-20K+ is quite another.
 
DaCocoBrova said:
The Philips includes DCDi technology from Faroudja. My Dell (which I believe Philips makes) has that. Stunning IQ.

After seeing that LCD, I love Philips. :D They know how to make a good TV with style.
 
AgentCooper said:
You can see one at the Sony Style store hooked up to a blu-ray player. They exist.

I saw it and the 71" rear-projection version at the Las Vegas Qualia store (part of the Sony store in Caesars Palace) in controlled conditions showing Blu-ray stuff, and they were nowhere near as impressive as seeing the 70" 1080p Samsung DLP at CES under bright show lights and stuff. The Samsung will be a fraction of the cost too.

Here's my buddy modelling for the Sammy:

[photos: EdwoodCES2005_02DLP70in.jpg, EdwoodCES2005_03DLP70in.jpg]

That's right. 10,000:1 contrast ratio, peeps.

[photos: EdwoodCES2005_04DLP70in.jpg, EdwoodCES2005_054DLP70in.jpg, EdwoodCES2005_06DLP70in.jpg]
 
Deg said:
Looks fantastic. How is the image? CRT-like? A little washed out but great? etc.

Digital stuff (like CG movies and shows) was vibrant as hell. Other material looked great too. But keep in mind all the stuff shown was hi-def (1080p).

I really couldn't see a big difference in skin tone between the 1080p DLP and the Qualia's take on LCOS. My friend said he could, but said it was very minute, in favor of the LCOS in the "natural look" department. I couldn't tell shit, frankly. They both looked fantastic.

But when it came to a pure digital source, DLP bitch slaps the LCOS up and down the street. Sweet, sweet butter.
 
Shogmaster said:
Digital stuff (like CG movies and shows) was vibrant as hell. Other material looked great too. But keep in mind all the stuff shown was hi-def (1080p).

I really couldn't see a big difference in skin tone between the 1080p DLP and the Qualia's take on LCOS. My friend said he could, but said it was very minute, in favor of the LCOS in the "natural look" department. I couldn't tell shit, frankly. They both looked fantastic.

But when it came to a pure digital source, DLP bitch slaps the LCOS up and down the street. Sweet, sweet butter.

Hot. Digital seems to be the way to go for all TVs. HDMI should be very handy.

1080p? How much of an improvement is it? I have never seen 1080p.
 
Deg said:
Hot.

1080p? How much of an improvement is it? I have never seen 1080p.

No one in the public had until this CES AFAIK.

720p is 1280x720. 1080p is 1920x1080. It's a fair improvement... :lol


Oops. I guess you might be asking if I could see the difference. Oh yeah... HELL YEAH. Big difference. The detail is positively KERA-ZY! The first thing I'd do is to hook that bitch up as a monitor for a Home Theater PC I'd build.... 71" digital monitor in front of the sofa? Yes please.
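Those numbers work out to a 2.25x jump in raw pixel count, for what it's worth -- a one-liner sketch:

```python
p720 = 1280 * 720        # 921,600 pixels per 720p frame
p1080 = 1920 * 1080      # 2,073,600 pixels per 1080p frame
print(p1080 / p720)      # 2.25 -- 1080p pushes 2.25x the pixels of 720p
```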
 
Reading AVS, it seems like even the Samsung 1080p sets won't be able to accept 1080p from DVI inputs. Kinda useless if you ask me...
 
Shogmaster said:
No one in the public had until this CES AFAIK.

720p is 1280x720. 1080p is 1920x1080. It's a fair improvement... :lol


Oops. I guess you might be asking if I could see the difference. Oh yeah... HELL YEAH. Big difference. The detail is positively KERA-ZY! The first thing I'd do is to hook that bitch up as a monitor for a Home Theater PC I'd build.... 71" digital monitor in front of the sofa? Yes please.

:lol :D Like to see it :(


"Reading AVS, it seems like even the Samsung 1080p sets won't be able to accept 1080p from DVI inputs. Kinda useless if you ask me..."

HDMI can handle the bandwidth, however. Right?
 
There is a post over there about Samsung's 2005 lineup, one of the questions is as follows:

10. Can the 68 series or 88 series 1080p sets accept a 1080p signal via the HDMI input? The general feeling is that the answer is NO ... but, did anyone ask this specific question?

Comments by HTwaits (AVS member):
My understanding is that all of the 2005 batch of 1080p HDTV sets have 1080i input capability but not 1080p input capability. I also have the impression that HD DVD players only output 1080i. If I remember correctly, de-interlacing directly from 1080i to 1080p should give a very "true to the source" image, unless a manufacturer de-interlaces 1080i to 540p instead of 1080p. I think scaling from 540p to 1080p would probably degrade the image.

Comments by Kyungkim (CES attendee):
Yes, I asked this specifically and got the reply that it does not, due to HDCP requirements. This has been pointed out to be bupkis; HDCP does not limit rez. I asked this question of a number of manufacturers planning 1080p sets, and the most telling answer came from Sony. They said they do not support 1080p in, since no consumer device is capable of outputting a true 1080p signal. Even the Blu-ray players sold in Japan only output 1080i. So I'm not holding my breath on the digital side of the 1080p input for this generation; I don't think they will be capable of it. I'm 99.99 percent positive the HDMI ports will not support 1080p, from talking to the reps. I'm holding out hope that the VGA input hasn't been crippled. I know everyone wants digital, but RGB has been the way G90 projectors have been fed 1080p for years, so I'm sure it won't be so awful.


thread
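The "true to the source" remark in that quote refers to weave de-interlacing: when both 540-line fields of a 1080i signal come from the same instant (film-sourced material), interleaving them reconstructs the full 1080-line frame losslessly. A toy sketch with lists of rows standing in for video fields (purely illustrative, not any particular set's pipeline):

```python
def weave(top_field, bottom_field):
    """Interleave two fields into one progressive frame.

    Lossless only when both fields were captured at the same instant
    (film-sourced material); for true interlaced video the fields are
    1/60 s apart, and weaving produces combing artifacts instead.
    """
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame

# Split a toy 4-line frame into its even/odd fields and reassemble it.
original = ["row0", "row1", "row2", "row3"]
top, bottom = original[0::2], original[1::2]
assert weave(top, bottom) == original
```

De-interlacing to 540p instead throws away half the lines, which is why the quote expects scaling 540p back up to 1080p to degrade the image.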
 
Thanks for the thread, PG2G. AVS forums rock. More evidence that supports my belief that people shouldn't worry much about 1080p right now. It probably ain't happenin' outside of digital movie theaters in pure form for some time.
 