
1080p - Marketing Myth

Pretty bizarre article, if you ask me.

Argument #1: TV doesn't use it.
Response: I believe "duh" would be appropriate here. But it's not about "production formats"; it's that the bandwidth just isn't there to deal with it.

Argument #2: HD doesn't support the horizontal scan rate.
Response: Peter is essentially arguing that the only TVs worth considering are the HD CRT sets currently on the market. The chicken-and-egg argument hardly washes with me when the market is already moving ahead regardless.

Argument #3: okay, I don't even really understand this one. Somehow we went from 1080i to 1080p60. Moving along...

Argument #4: well, I give up, he's retreading.

No, 1080p doesn't seem likely for broadcast anytime soon. No, 1080p is not a common item on the marketplace. But saying it's not going to happen, it's not worth trying, and you'd be a fool to buy one sounds damn shortsighted.
 
Well, I think he's just trying to say that you don't need to rush out to get a 1080p HDTV, since nobody will be able to make use of it anytime soon, and the infrastructure still has a lot of building out to do before a set will consistently be worth the money you pay for it.

I'm just now trying to learn about the differences between HDTVs, like 720i vs 540p, etc. I've got the difference between the 'i' and 'p' now... it's just the rest of it that I'd like to know.
 
While some of his argument is valid - it's not like upgrading a video card. If you buy a solid 1080p set, it's not like it's going to be obsolete any time soon or anything. You will just be an early adopter, ahead of the curve. The author seems to be arguing against being an early adopter more than anything else.
 
whytemyke said:
Well, I think he's just trying to say that you don't need to rush out to get a 1080p HDTV, since nobody will be able to make use of it anytime soon, and the infrastructure still has a lot of building out to do before a set will consistently be worth the money you pay for it.

Well, that's not really what he means (reliability doesn't really factor into it so much as other things), but you are right in that he has a far more benign point to make. It just seemed like an awful waste of words to get there.

I'm just now trying to learn about the differences between HDTVs, like 720i vs 540p, etc. I've got the difference between the 'i' and 'p' now... it's just the rest of it that I'd like to know.

There's a lot of different things to learn, but I've tried to cover at least the basics. Good luck. :)
 
Phoenix said:
While some of his argument is valid - it's not like upgrading a video card. If you buy a solid 1080p set, it's not like it's going to be obsolete any time soon or anything. You will just be an early adopter, ahead of the curve. The author seems to be arguing against being an early adopter more than anything else.


However, if you live in the US or some other country where content makers have virtual control over standards, later on down the line your "ahead of the curve" set may require modification, or may not work as seamlessly as it should, if there are any new DRM mandates related to digital TV signals.

Also, Phoenix, what do you do at TimeWarner? I worked at AOL until the last half of 2003. I was there before the merger... or I guess technically it was AOL "consuming" TW :lol what a joke that was...
 
How about Blu-ray and HD-DVD? If either format is used to store and play back live HD content, it will have to be 1920x1080i (interlaced again) to be compatible with the bulk of consumer TVs. And any progressive-scan content will also have to be interlaced for viewing on the majority of HDTV sets.
The vast majority of DVD movies are stored on disc in a form of 24 fps progressive (not strictly, since there are some MPEG quirks at hand)... which then undergo 2:3 pulldown in the player if the display is interlaced. In other words, somebody welcome this guy to 1995.

Here's why. To cut manufacturing costs, most HDTV sets run their horizontal scan at a constant 33.8 kHz, which is what's needed for 1080i (or 540p). 1080p scans pictures twice as fast, at 67.6 kHz. But most of today's HDTVs don't even support external 720p signal sources, which require a higher scan rate of 44.9 kHz.
1080p30 and 1080i60 require the same horizontal scan and pixel clock rates (~33.8 kHz and ~74.25 MHz, respectively). Unless he's saying 1080p30 must be shown as 1080p60, he's wrong here.
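
For anyone who wants to check the math, here's a quick back-of-envelope sketch (Python; the blanking-inclusive SMPTE line totals - 1125 total lines for the 1080 formats, 750 for 720p - are my assumption, not something from the article):

Code:
# Horizontal scan rate = lines traced per second.
# An interlaced field only covers half a frame's lines, which is why
# 1080i60 lands on the same line rate as 1080p30.
def h_scan_khz(lines_per_scan, scans_per_sec):
    return lines_per_scan * scans_per_sec / 1000.0

print(h_scan_khz(1125, 30))      # 1080p30 -> 33.75 kHz
print(h_scan_khz(1125 / 2, 60))  # 1080i60 -> 33.75 kHz (same deflection circuit)
print(h_scan_khz(1125, 60))      # 1080p60 -> 67.5 kHz (the doubling he cites)
print(h_scan_khz(750, 60))       # 720p60  -> 45 kHz (the article's "44.9 kHz")

# Pixel clock: ~2200 total samples per line x 33.75 kHz ~= 74.25 MHz
# for both 1080p30 and 1080i60.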
 
Even though I can't debate the various technical points he makes, you have to admit he offers some strong evidence that "next gen consoles will be stunning because of 1080p" is a claim we will probably look back on and laugh about in a few years.

While I can see a few developers offering 1080p as an option in some next gen console titles, this is probably going to be a much less significant feature than what is being hyped.

My guess is that 720p and even 1080i will be just fine for the next gen consoles....
 
Find a way to get that ps3 connected to a computer monitor at native resolution and you can get 1080p TODAY...

...if only you had a ps3 handy.
 
ddkawaii said:
However, if you live in the US or some other country where content makers have virtual control over standards, later on down the line your "ahead of the curve" set may require modification, or may not work as seamlessly as it should, if there are any new DRM mandates related to digital TV signals.

That's all a part of being an early adopter. The same thing happened with the adoption of DVD-Audio, Beta/VHS, 802.11x, etc. Being an early adopter implies a fair amount of risk, as the standards aren't all there yet. One other area where a lot of people are buying in 'early' is WiMax.


Also, Phoenix, what do you do at TimeWarner? I worked at AOL until the last half of 2003. I was there before the merger... or I guess technically it was AOL "consuming" TW :lol what a joke that was...

Software Architect in Strategic Apps. AOL is actually one of my customers and I end up doing stuff for them from time to time.
 
Hitokage said:
The vast majority of DVD movies are stored on disc in a form of 24 fps progressive(not strictly since there's some MPEG quirks at hand)... which then undergo 2:3 pulldown in the player if the display is interlaced. In other words, somebody welcome this guy to 1995.
Hm, is that correct? I've always heard the opposite.
 
Hitokage said:
Find a way to get that ps3 connected to a computer monitor at native resolution and you can get 1080p TODAY...

...if only you had a ps3 handy.

Seriously though, are there any computer monitors out there that will be able to handle a 1080p HDMI signal? (after using a DVI converter - also, how much are those DVI converters?).
I'm seriously thinking about investing in a 23" LCD monitor if I can find one capable of acting as a primary/secondary display for the PS3.

I know of one converting box that handles all types of inputs including HDMI... but that goes for $3000, so no go on that.
 
Supasso said:
Hm, is that correct? I've always heard the opposite.
It's a tad tricky. You can compress progressive video as progressive yet still store it as technically separate fields. Flags in the stream tell the decoder what to do with it. From the DVD FAQ:
A disc has one track (stream) of MPEG-2 constant bit rate (CBR) or variable bit rate (VBR) compressed digital video. A restricted version of MPEG-2 Main Profile at Main Level (MP@ML) is used. SP@ML is also supported. MPEG-1 CBR and VBR video is also allowed. 525/60 (NTSC, 29.97 interlaced frames/sec) and 625/50 (PAL/SECAM, 25 interlaced frames/sec) video display systems are expressly supported. Coded frame rates of 24 fps progressive from film, 25 fps interlaced from PAL video, and 29.97 fps interlaced from NTSC video are typical. MPEG-2 progressive_sequence is not allowed, but interlaced sequences can contain progressive pictures and progressive macroblocks. In the case of 24 fps source, the encoder embeds MPEG-2 repeat_first_field flags into the video stream to make the decoder either perform 2-3 pulldown for 60Hz NTSC displays (actually 59.94Hz) or 2-2 pulldown (with resulting 4% speedup) for 50Hz PAL/SECAM displays. In other words, the player doesn't "know" what the encoded rate is, it simply follows the MPEG-2 encoder's instructions to produce the predetermined display rate of 25 fps or 29.97 fps. This is one of the main reasons there are two kinds of discs, one for NTSC and one for PAL. (Very few players convert from PAL to NTSC or NTSC to PAL. See 1.19.)

From DivX:

DVD & TELECINE

DVDs offer a strange twist to the whole telecine and 3:2 pulldown business. Almost all DVDs will have the movie stored as whole pictures at 24 fps. This is the original format of the film with no telecine. At the start of every MPEG-2 DVD file there are certain header codes that tell it how to play back the DVD. Because it's stored digitally, it can give the fields or frames from the DVD to the hardware or software in any order it likes. It can split the movie into two fields and perform telecine instantly. To do this, there are three flags that can be applied to the header code: RFF (repeat first field) TFF (top field first) and FPS (frames per second).

For a PAL DVD the FPS flag can be set to 25 and the DVD will send the picture information to the hardware at 25 fps instead of 24 fps as is stored on the DVD.

For NTSC DVDs, the movie needs to be 29.970 fps, so the FPS flag is set to 29.970. But this looks odd because the movie is over far too soon. Imagine you're playing cards: if you throw 4 cards on the floor every second, the whole pack will be finished in half the time it would take if you threw 2 cards. The solution is to telecine the movie with 3:2 pulldown to increase the amount of "cards" we have to start with. To do this, it sets the RFF and TFF flags in the header code. By setting the DVD to repeat the first field, you make the video display the fields in the order 3, 2, 3, 2. By setting the TFF flag, you set the DVD to start from the top field so the order always goes: top, bottom, top, bottom.

Theoretically, it should be possible to patch the header code of a DVD's MPEG-2 file and make it play back at 24 fps instead of the 29.970 fps. Some people have made patches to do this, but so far, for another unknown reason, they're unreliable and the video turns out terribly.

From the documentation for mplayer:

How telecine is used. All video intended to be displayed on an NTSC television set must be 60000/1001 fields per second. Made-for-TV movies and shows are often filmed directly at 60000/1001 fields per second, but the majority of cinema is filmed at 24 or 24000/1001 frames per second. When cinematic movie DVDs are mastered, the video is then converted for television using a process called telecine.

On a DVD, the video is never actually stored as 60000/1001 fields per second. For video that was originally 60000/1001, each pair of fields is combined to form a frame, resulting in 30000/1001 frames per second. Hardware DVD players then read a flag embedded in the video stream to determine whether the odd- or even-numbered lines should form the first field.

Usually, 24000/1001 frames per second content stays as it is when encoded for a DVD, and the DVD player must perform telecining on-the-fly. Sometimes, however, the video is telecined before being stored on the DVD; even though it was originally 24000/1001 frames per second, it becomes 60000/1001 fields per second. When it is stored on the DVD, pairs of fields are combined to form 30000/1001 frames per second.

When looking at individual frames formed from 60000/1001 fields per second video, telecined or otherwise, interlacing is clearly visible wherever there is any motion, because one field (say, the even-numbered lines) represents a moment in time 1/(60000/1001) seconds later than the other. Playing interlaced video on a computer looks ugly both because the monitor is higher resolution and because the video is shown frame-after-frame instead of field-after-field.
When you watch progressive video, you should never see any interlacing. Beware, however, because sometimes there is a tiny bit of telecine mixed in where you would not expect. I have encountered TV show DVDs that have one second of telecine at every scene change, or at seemingly random places. I once watched a DVD that had a progressive first half, and the second half was telecined. If you want to be really thorough, you can scan the entire movie...
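
If the flag mechanics are hard to picture, here's a toy sketch of the 3:2 cadence (hand-rolled Python, not anything out of a real decoder; the frame labels are made up for illustration - a real player repeats actual field data when RFF is set):

Code:
# Toy 3:2 pulldown: 4 film frames (24 fps) become 10 fields (60 fields/s).
def pulldown_32(frames):
    fields = []
    top_first = True  # TFF: emit this frame's top field first
    for i, frame in enumerate(frames):
        first, second = ('top', 'bot') if top_first else ('bot', 'top')
        fields.append((frame, first))
        fields.append((frame, second))
        if i % 2 == 0:                     # RFF set on every other frame...
            fields.append((frame, first))  # ...so it contributes 3 fields
            top_first = not top_first      # the repeat flips the cadence
    return fields

print(pulldown_32(['A', 'B', 'C', 'D']))
# A-top, A-bot, A-top, B-bot, B-top, C-bot, C-top, C-bot, D-top, D-bot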
 
Not to mention some mastering studios just work off of interlaced tapes anyway. A true, 100% progressive DVD is pretty rare, because even if the movie makes it to mastering/authoring clean, something is bound to happen.

Yet another reason to like switching to a newer codec in the future.
 
Crazymoogle said:
Not to mention some mastering studios just work off of interlaced tapes anyway. A true, 100% progressive DVD is pretty rare, because even if the movie makes it to mastering/authoring clean, something is bound to happen.
Like menus.
 
Pointless article. Buy a 1080p set when they're cheap enough, and just make sure it supports 1080p natively. Ignore this twat. PEACE.
 
1080p's biggest selling point is HD-DVD and Blu-ray movies. Broadcast TV and games probably won't see wide use of it in the next 5 years.
 
COCKLES said:
Can someone clear up the difference between a 'native' set and a non-native one?

Fixed-pixel TVs (like LCD, DLP, LCoS, plasma) have a physical native resolution, as in there are a certain number of real pixels. No matter what you display on a fixed-pixel set, the content is being scaled to the resolution the set has. Where this gets awkward is that many fixed-pixel sets have a pixel count that doesn't match any of the TV standards. So while 720p is 1280x720, the set might be 1024x1024, 1366x768, or something else.

For movies this isn't really a big deal. For games it could be, since interface elements are going to get scaled in sometimes unpredictable ways.

The best way to notice this problem is by using a computer LCD screen and playing a game at something less than native resolution; even with DVI you will notice a blurrier image.

Non-fixed-pixel TVs are CRTs. Instead of having actual pixels, they have a grille or mask for directing the electron gun. That means they can resize the lines fairly easily, so handling multiple resolutions isn't a big deal. The problem is that the set can only fire so fast and resolve only a certain amount of detail, so HD signals typically lose detail as lines are squished or dropped in the firing process (when the conversion isn't already done beforehand). The detail loss is significant enough that most - if not all - 1080i programs on TV today are not using the full resolution.
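
Back on the fixed-pixel side, if you want to see why a near-native signal still blurs, compare the source grid to the panel grid. A rough Python back-of-envelope (the panel resolutions are just the common examples mentioned above):

Code:
# A 720p source on various panels: any non-integer scale factor means
# nearly every output pixel gets interpolated from several source pixels.
src_w, src_h = 1280, 720
panels = [('1280x720', 1280, 720),
          ('1366x768', 1366, 768),
          ('1024x1024', 1024, 1024)]

for name, w, h in panels:
    sx, sy = w / src_w, h / src_h
    note = '1:1 mapping' if sx.is_integer() and sy.is_integer() else 'resampled -> softness'
    print(f'{name}: x{sx:.3f} / x{sy:.3f} ({note})')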
 
Crazymoogle said:
The best way to notice this problem is by using a computer LCD screen and playing a game at something less than native resolution; even with DVI you will notice a blurrier image.

Or to plug a laptop into a conventional TV. It will look soooooo ass that you will want to take it back to the store - as I've done for several sets before giving up on CRT for high resolution and moving on to LCD/Plasma.
 