There are plenty of CRT front projectors that can do 1080p.
75% of HD in the US is 1080i. Only Fox, ABC, and ESPN use 720p60; the rest of the channels broadcast in 1080i60.
mrklaw said: If 720p is the broadcast standard, and game consoles will likely be 720p next gen, then the only reason to go 1080p is for HD-DVD (in whatever format). In which case, and IMO, that's a projection item right there. I don't need 1080p on a 32" screen; you just won't notice the difference. You want 1080p on a 100"+ screen. So I'm happy with my 720p 32" set, and will upgrade my front projector when they hit 1080p. Don't forget that budget projectors can already do 720p for $2k.

You can see resolution changes on a small monitor, and you can see them on a big TV with a larger dot pitch. I know I can see the difference up to 1600x1200, which is still more vertical lines than 1080p, so I'm sure I'm gonna be able to tell the difference when I double or triple the size of the screen. It may not be future-proof, but once 1080p becomes cheap, I know I'll be comfortable spending $1500-2000 on a good screen that should last me a good 10 years. And I don't expect either HDTV or HD-DVD formats to exceed 1080p for the next 20 years. I hope I'm not just being optimistic. PEACE.
Kleegamefan said: Thank god for that.....interlaced needs to die....

IAWTP. Interlacing should never have been included in the HDTV standard except for backwards compatibility with 480i. And I was really hoping they would have bumped the refresh rate to 85Hz. IIRC Gates wanted some of these changes and pushed for more convergence with PCs, but it wasn't given much thought. Prog. scan 4 life....says the guy with a 20" Samsung POS in the living room. :lol
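As a quick aside on what interlacing actually does (a minimal sketch in plain Python, not tied to any particular broadcast hardware, and the variable names are just illustrative): each 1080i frame is delivered as two half-height fields, one carrying the even-numbered scanlines and one the odd-numbered ones, which is exactly the legacy behavior being complained about here.

```python
# Minimal sketch of interlacing: a single 1080-line frame is delivered as
# two 540-line fields (alternating scanlines), not as one progressive frame.
frame = list(range(1080))          # stand-in for 1080 scanlines, top to bottom
top_field = frame[0::2]            # scanlines 0, 2, 4, ... (540 lines)
bottom_field = frame[1::2]         # scanlines 1, 3, 5, ... (540 lines)
assert len(top_field) == len(bottom_field) == 540
```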
Pimpwerx said: You can see resolution changes on a small monitor, and you can see them on a big TV with a larger dot pitch.

You're also probably sitting less than a foot from your small monitor.
TAJ said: >>>And I was really hoping they would have bumped the refresh rate to 85Hz.<<<
85Hz makes absolutely no sense. It wouldn't work well with either film or video sources, only new 85Hz video material. 72Hz and 96Hz would be great for film and new video, but problematic for legacy video. 120Hz would be outstanding for all sources except PAL video, but who gives a shit about that?
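For anyone wondering why those particular numbers come up: film is 24fps and NTSC video is roughly 30/60Hz, and a display refresh only shows a source without uneven pulldown if it is an integer multiple of the source's frame rate. A quick sketch of that check (plain frame-rate arithmetic, nothing more; the names and rounded source rates here are just illustrative):

```python
# Plain frame-rate arithmetic: a refresh rate shows a source judder-free
# only if it is an integer multiple of that source's frame rate.
SOURCES = {"film": 24, "NTSC video": 30, "PAL video": 25}

def clean_for(refresh_hz):
    """Return the common source rates that divide evenly into a refresh rate."""
    return [name for name, fps in SOURCES.items() if refresh_hz % fps == 0]

for hz in (72, 85, 96, 120):
    print(hz, "Hz ->", clean_for(hz) or "no common source divides evenly")
# 72 Hz and 96 Hz -> ['film'] only; 120 Hz -> ['film', 'NTSC video'];
# 85 Hz -> nothing divides evenly, hence "makes absolutely no sense".
```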
TAJ said: >>>And I was really hoping they would have bumped the refresh rate to 85Hz.<<< 85Hz makes absolutely no sense. It wouldn't work well with either film or video sources, only new 85Hz video material. 72Hz and 96Hz would be great for film and new video, but problematic for legacy video. 120Hz would be outstanding for all sources except PAL video, but who gives a shit about that?

Well, the point was to increase the refresh rate; I used 85Hz since it's basically what passes for flicker-free these days. 72Hz isn't a big enough step up to make a difference, and 96Hz would be cool. 85Hz is a 25% increase. I think it'd be great too, as tennis broadcasts and some auto-racing camera pans show the shortcomings of the current broadcast standards: you still get a strobe effect, and with the greater detail of a better screen it becomes more prominent. But it's really inconsequential for 90% of people out there, so an admitted nitpick on my part.
You're also probably sitting less than a foot from your small monitor.

No, I've just got good eyesight. Been using computers my whole life and I'm still 20/20. A 19" monitor is what a regular TV set was back when I was a kid. Even at full viewing distance I can make out the difference between Q3A running at max resolution and running at VGA; if you can't see it, you might not have very good vision. On a full-size TV the dot pitch isn't nearly as fine as even an average monitor's, so you're gonna see it there. If the source material is shot at the target resolution, the difference should be noticeable. Going from 720p to 1080p would be a 50% increase in vertical scanlines, and that's ignoring the equally important increase in horizontal resolution. PEACE.
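For reference, the arithmetic behind that last point (720p is 1280x720 and 1080p is 1920x1080, so the vertical gain is 50% and the total pixel count goes up 2.25x):

```python
# 720p vs 1080p, by the numbers.
w720, h720 = 1280, 720
w1080, h1080 = 1920, 1080

print(h1080 / h720)                        # 1.5  -> 50% more vertical scanlines
print((w1080 * h1080) / (w720 * h720))     # 2.25 -> 2.25x the total pixels
```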