dammitmattt said:Do you actually sit that close when you are using your keyboard?!?!?

Sure, it doesn't bother me. You'd be surprised at how quickly you adapt to it. I'm about 2 feet from the monitor and I can see everything fine. Going back to anything less than this size for working on a PC is just not an option at this point. Truth be told, though, I used to be about a yard away from it at my old place, but they won't let me wall-mount this sucker because of apartment restrictions.
dionysus said:The gap is as big as ever power wise, but as you get ever increasing visual fidelity you also get diminishing returns in what the consumer notices. The gap is going to be more evident in other areas as time goes on I think, especially in open world type games. I would like to see more of the R&D in the PC industry move away from graphics optimization and into other areas.
TheExodu5 said:Same goes for the GPUs. We've got DX10 available, and some GPUs that can output incredible performance, but games are staying around the console specs this time around, since developers know that's where a lot of the money is.

Vista exclusivity is probably the biggest hurdle to seeing DX10 games on the PC. Don't get me wrong, I'm sure there are a lot of them, but you have to wonder if developers are truly putting the time and effort into DX10 when they know only a select few will see the results.
CultureClearance said:Eye damage is eye damage, just like ear damage is ear damage... getting used to it is a bad thing.

I've had glasses since I was 12, and I've had this monitor ever since the PS3 launched. My prescription has stayed the same. If anything, this monitor helped: the old CRTs I used to use produced a lot of eye strain over long sessions. Not the case with this LCD.
beermonkey@tehbias said:I play Wipeout HD in 1080p on a Pioneer Kuro 1080p 50" plasma from only six feet away and I still think resolution is overemphasized. I'm all about AA, max settings, and framerate. I have an HTPC hooked up to that same plasma and I regularly bump games down from 1920x1080 if it will let me crank up the settings, the AA/AF, or get the framerate locked at 60.
Wipeout HD is a great looking game, though.
brain_stew said:It isn't adaptive supersampling though, it's just straight supersampling. The support is there for when you have extreme amounts of excess performance in the bag and can afford the luxury of the absolute best image quality. The evidence is in the screen captures, which look insanely good. It's literally realtime bullshot mode.
From nHancer's explanation of AA modes:
Dot50Cal said:That's a really silly argument. You know that every HDTV these days has either DVI or VGA. Heck, this is my setup. A 46 inch Bravia is my monitor, which also doubles as my console display. And I even have a big comfy couch behind it!!11

I was joking, Dot! Someone said jokingly - "wait until the comfy chair argument comes out" - and I was just playing the game. This thread is basically a joke now...
deepbrown said:I was joking, Dot! Someone said jokingly - "wait until the comfy chair argument comes out" - and I was just playing the game. This thread is basically a joke now...

:lol But you know someone was like "Yeah, he's right!"
TheExodu5 said:Hopefully, with the push Windows 7 is getting, we'll see the DX11 adoption rate be higher than the DX10 rate was. Truthfully though, I don't expect DX9 to die out for quite a few years.

Yep. I expect it to linger just as long as the ATI 9800 Pro did. That thing produced good performance for what seemed like years after it debuted.
Dot50Cal said::lol But you know someone was like "Yeah, he's right!"

SomeONE? Nah... this thread has 42,000 views... I expect a few thousand nodded along in their heads.
shongololo said:Again, I know what supersampling is, it's just nearly always implemented as adaptive (ie only performed on edges) due to how processor and memory intensive it is. If the new nvidia cards can spit out unfettered 3x3 supersampling with no performance hit on CoD then fair fucks to them. A lazy google search doesn't confirm them doing that, you got a link?
brain_stew said:Well the fact that they claimed that with COD4 should tell you that you can't take anything they say at face value. The PS3 version of all games running on this engine have suffered noticeably worse performance (see the videos I linked earlier), it is what it is.
If you didn't notice the sub par performance last time, I doubt you'd notice it this time either.
TheExodu5 said:@beermonkey. Yeah, I think MSAA and general IQ are more important than brute resolution. It's sad that my PC monitor has one of the worst scalers I've ever seen in my life, otherwise I wouldn't be constrained to 1680*1050. Thankfully, my new card generally has no problem with this res.

The nice thing about having a resolution that high is that you practically don't even need AA at that point. I rarely use any AA on my 1080p set. I'd much rather crank the in-game IQ options.
TheExodu5 said:While the gap may be as big power wise, it's just not being utilized where that huge gap exists: on the CPU. The i7 seems to be by far the most powerful CPU we've ever seen, and is a hell of a lot more performance than even last year's Core 2 Duos, but both CPUs tend to produce very similar benchmarks when it comes to gaming. The multiple cores and multithreading just aren't used to their potential at this point.
Same goes for the GPUs. We've got DX10 available, and some GPUs that can output incredible performance, but games are staying around the console specs this time around, since developers know that's where a lot of the money is.
To be honest though, I'm thankful for this. It means that people can buy a very performant PC for a very small amount of money compared to the past. It also means that, to IQ appreciating gamers such as myself, I can finally run the best looking games at high resolutions, with huge amounts of AA or supersampling, and still get 60fps.
---
I sit about 2' away from my 22" 1680*1050 monitor. Hell, for some games like STALKER, I even lean in towards the monitor when I'm trying to see something really far away. I could see myself getting a 24" 1080p monitor and not sitting any further away.
andycapps said:I didn't notice it last time.. I played it on both platforms and didn't see a difference, but I didn't do them side by side either. I thought it ran very smoothly and looked decent. I never thought it looked that great, but it has a solid framerate and solid gameplay. I do have some major gripes with the multiplayer, but I don't want to derail this and piss on anyone's parade. And anyway, I'll get it regardless of my gripes. So yeah, PS3 version should be fine with me.
brain_stew said:Have you got an Nvidia GPU?
dionysus said:I agree with everything you are saying. But remember the transition from first generation 3D to second generation 3D. Those kind of noticeable leaps aren't going to happen again in a span of 5 years, if ever.
Stabbing Robot said:Dot50Cal's setup is hilariously awesome. Do you wear glasses?

Yes, I have since I was 12 (before I even got into computers). My prescription hasn't changed in several years either. For certain games with poor mouse support, I usually just connect the 360 controller to my PC and play that way. Dead Space was a notable example.
beermonkey@tehbias said:No, ATI 4850 with passive cooler (HTPC must be silent). I built it before the Nvidia 260 et al had even shipped and I have no interest in replacing the GPU for some time to come. I'm plenty happy.
RankoSD said:lol at lazy devs
looks more like an expansion pack than a sequel...
dammitmattt said:On to other topics, PC gamers are more insufferable than the indignant Sony and Nintendo fans put together. I've never seen such a huge inferiority complex during my entire time posting on internet message boards. And that includes die-hard Saturn fans.
brain_stew said:I'll provide screenshots in a bit, everything is antialiased, textures, alpha textures, shadows, the lot. Looks the same as the "PR bullshots" that we tend to get for console games.
Here's the link to nHancer's explanation for now anyway:
http://www.nhancer.com/?dat=d_AA
Performance at 720p with 2x1 supersampling and raw 1080p is very similar, just as you'd expect.
shongololo said:The GTS250 has a limit of 2560x1600 which suggests 720p at 3x3 is beyond it if it were true unfettered supersampling.
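For what it's worth, the pixel arithmetic behind the last couple of posts is easy to check. A quick sketch (plain Python, illustrative numbers only; the "WxH" mode naming is informal):

```python
# Back-of-envelope render-target pixel counts for the supersampling
# modes discussed above. Not a performance model, just raw pixel math.

def pixels(w, h, sx=1, sy=1):
    """Internal render-target pixel count for sx-by-sy supersampling."""
    return (w * sx) * (h * sy)

native_1080p = pixels(1920, 1080)       # 2,073,600 pixels
ss_2x1_720p = pixels(1280, 720, 2, 1)   # 1,843,200 pixels
ss_3x3_720p = pixels(1280, 720, 3, 3)   # 8,294,400 pixels

# The 2560x1600 ceiling mentioned for the GTS 250:
max_rt = pixels(2560, 1600)             # 4,096,000 pixels

# 2x1 supersampled 720p is ~89% of the pixel load of raw 1080p,
# which fits the "very similar performance" observation.
print(ss_2x1_720p / native_1080p)

# True 3x3 at 720p implies a 3840x2160 internal target, well past
# a 2560x1600 limit -- which is the objection raised above.
print(ss_3x3_720p > max_rt)
```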
brain_stew said:FWIW, here's a direct framerate comparison. The PS3 version is clearly suffering from poorer performance, but it's still a generally high framerate, so it's less noticeable. Dropping from 60fps to the high 40s is much less noticeable than dropping from 30fps to the low 20s, for example, which is why fewer people notice it. It's another reason why people are such fans of 60fps: any drop from 30fps and things start to get into stutter territory, while the odd dropped frame at 60fps (even if it's fairly regular) and most won't notice the difference.
http://www.eurogamer.net/videos/dig...-frame-rate-analysis-clip-compilation?size=hd
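The framerate point above is clearer in frame-time terms: a drop from 60fps adds far less latency per frame than the same-looking drop from 30fps. A quick sketch (plain Python, illustrative only):

```python
# Frame-time view of the drops described above. Frame time in
# milliseconds is just 1000 / fps, so equal-looking fps drops
# cost very different amounts of added time per frame.

def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given framerate."""
    return 1000.0 / fps

# 60fps -> 48fps: ~4.2 ms of extra time per frame
drop_60_to_48 = frame_time_ms(48) - frame_time_ms(60)

# 30fps -> 22fps: ~12.1 ms of extra time per frame, nearly 3x worse
drop_30_to_22 = frame_time_ms(22) - frame_time_ms(30)

print(round(drop_60_to_48, 1), round(drop_30_to_22, 1))
```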
beermonkey@tehbias said:I play Wipeout HD in 1080p on a Pioneer Kuro 1080p 50" plasma from only six feet away and I still think resolution is overemphasized. I'm all about AA, max settings, and framerate. I have an HTPC hooked up to that same plasma and I regularly bump games down from 1920x1080 if it will let me crank up the settings, the AA/AF, or get the framerate locked at 60.
Wipeout HD is a great looking game, though.

If you're on a Kuro, you're actually only seeing about 900p, not 1080p. As far as I know, the latest Panasonic G10 displays full 1080p.
Gully State said:When was this thread ever serious?
Epic Tier 3 Engineer said:If you're on a Kuro, you're actually only seeing about 900p, not 1080p. As far as I know, the latest Panasonic G10 displays full 1080p.
Still looks absolutely phenomenal. Goes to show you how retarded the whole thing is.
brain_stew said:As promised here's some supersampled shots:
Edit: Eww, JPEG compression spoils these, sorry. You can still see all the extra detail that supersampling adds to the image, though; check out the trees. The shadows look much better this way as well.
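For anyone wondering what "straight" (non-adaptive) supersampling actually does to those shots: the game renders at a multiple of the target resolution and every block of sub-pixels is averaged down to one output pixel, so textures, alpha cutouts and shadows all get filtered, not just polygon edges. A toy sketch (plain Python, grayscale, box filter; not how the driver actually implements it):

```python
# Toy box-filter downsample: the core of straight supersampling.
# img is an (H*n) x (W*n) grid of brightness values; each n-by-n
# block is averaged into a single output pixel.

def downsample(img, n):
    """Average each n-by-n block of img down to one pixel."""
    H, W = len(img) // n, len(img[0]) // n
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            block = [img[y * n + j][x * n + i]
                     for j in range(n) for i in range(n)]
            out[y][x] = sum(block) / (n * n)
    return out

# A hard diagonal black/white edge rendered at 3x3...
hi = [[1.0 if x >= y else 0.0 for x in range(6)] for y in range(6)]

# ...comes out with intermediate gray values at 1x: the jaggy is
# replaced by a smooth ramp, which is exactly the antialiasing effect.
lo = downsample(hi, 3)
print(lo)
```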
Tokubetsu said:How does one even go about enabling SS?
Dot50Cal said:
RankoSD said:lol at lazy devs
looks more like an expansion pack than a sequel...
Cadaver Dog said:You wouldn't know the difference if they didn't spell it out.
You don't like it, don't buy the game.
-PXG- said:You guys are STILL arguing about this? Fuck...
Rez said:this thread ended when someone seriously mentioned the comfy chair.
the last few pages don't exist.