Modern Warfare 2 Will Render at 600p

dammitmattt said:
Do you actually sit that close when you are using your keyboard?!?!?
Sure, it doesn't bother me. You'd be surprised at how quickly you adapt to it. I'm about 2 feet from the monitor and I can see everything fine. Going back to anything less than this size for working on a PC is just not an option at this point. Truth be told though, I used to be about 1 yard away from it at my old place, but they won't let me wall-mount this sucker because of apartment restrictions :P
 
dionysus said:
The gap is as big as ever power wise, but as you get ever increasing visual fidelity you also get diminishing returns in what the consumer notices. The gap is going to be more evident in other areas as time goes on I think, especially in open world type games. I would like to see more of the R&D in the PC industry move away from graphics optimization and into other areas.

While the gap may be as big power-wise, it's just not being utilized where that huge gap exists: on the CPU. The i7 seems to be by far the most powerful CPU we've ever seen, and offers a hell of a lot more performance than even last year's Core 2 Duos, but both CPUs tend to produce very similar benchmarks when it comes to gaming. The multiple cores and multithreading just aren't used to their potential at this point.

Same goes for the GPUs. We've got DX10 available, and some GPUs that can output incredible performance, but games are staying around the console specs this time around, since developers know that's where a lot of the money is.

To be honest though, I'm thankful for this. It means that people can buy a very performant PC for a very small amount of money compared to the past. It also means that, as an IQ-appreciating gamer, I can finally run the best looking games at high resolutions, with huge amounts of AA or supersampling, and still get 60fps.

---

I sit about 2' away from my 22" 1680*1050 monitor. Hell, for some games like STALKER, I even lean in towards the monitor when I'm trying to see something really far away. I could see myself getting a 24" 1080p monitor and not sitting any further away.
 
Dot50Cal said:
Sure, it doesn't bother me. You'd be surprised at how quickly you adapt to it. I'm about 2 feet from the monitor and I can see everything fine. Going back to anything less than this size for working on a PC is just not an option at this point.

Eye damage is eye damage just like ear damage is ear damage...getting used to it is a bad thing.

I used to crank my cans loud and I "got used to it." Not until my friends mentioned how loud I was playing my tunes in my car did I realize what that actually means.
 
TheExodu5 said:
Same goes for the GPUs. We've got DX10 available, and some GPUs that can output incredible performance, but games are staying around the console specs this time around, since developers know that's where a lot of the money is.
Vista exclusivity. Probably the biggest hurdle to seeing DX10 games on the PC. Don't get me wrong, I'm sure there are a lot, but you have to wonder if they are truly putting the time and effort into developing it when they know only a select few will see the results.

CultureClearance said:
Eye damage is eye damage just like ear damage is ear damage...getting used to it is a bad thing.
I've had glasses since I was 12, and I've had this monitor ever since the PS3 launched. My eyesight has remained the same prescription. If anything, this monitor helped. The old CRTs I would use produced a lot of eye strain if I used them for too long. Not the case with this LCD.
 
Comfy couch am cry.

That's the resolution on my netbook and it's smaller than the 1024x768 that I used on my CRT way back in the day. Almost half of your 1080p is going to go to waste; guess we really do need a new console generation.
 
beermonkey@tehbias said:
I play Wipeout HD in 1080p on a Pioneer Kuro 1080p 50" plasma from only six feet away and I still think resolution is overemphasized. I'm all about AA, max settings, and framerate. I have an HTPC hooked up to that same plasma and I regularly bump games down from 1920x1080 if it will let me crank up the settings, the AA/AF, or get the framerate locked at 60.

Wipeout HD is a great looking game, though.

Have you got an Nvidia GPU?

If you're not a resolution whore and prefer the look of highly anti-aliased 720p graphics then you should really check out the hidden AA modes in nHancer.

For games suited to 720p display, sometimes it's better putting that excess power into increasing AA rather than increasing resolution, especially if your display has a great scaler like your Kuro does.

Quite a few modern games will play just fine with combined 2x2 (or 2x1) supersampling + 4xmsaa w/ transparency multisampling and they'll look so insanely smooth. It's quite a nice look, especially for games that have excessive shader/texture/alpha aliasing, which is becoming increasingly common.
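If you want a rough feel for why those combined modes are affordable at 720p, here's a quick back-of-the-envelope sketch (my own numbers, nothing official from nHancer or Nvidia; it only counts shaded pixels and ignores MSAA's extra coverage samples, bandwidth, ROPs, etc.):

```python
# Supersampling multiplies the number of pixels actually shaded, while MSAA
# mostly adds coverage samples on polygon edges (much cheaper per sample).

def shaded_pixels(width, height, ss_x, ss_y):
    """Pixels shaded when rendering internally at (width*ss_x) x (height*ss_y)."""
    return width * ss_x * height * ss_y

native_720p = shaded_pixels(1280, 720, 1, 1)   # 921,600
combined_2x2 = shaded_pixels(1280, 720, 2, 2)  # 3,686,400
combined_2x1 = shaded_pixels(1280, 720, 2, 1)  # 1,843,200

print(combined_2x2 / native_720p)  # 4.0 -> 4x the shading work
print(combined_2x1 / native_720p)  # 2.0 -> 2x the shading work
```

So 2x2 supersampling is roughly a 4x pixel load and 2x1 roughly 2x, which is why a modern GPU with headroom at 720p can still hold a solid framerate with these modes plus msaa on top.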
 
Dot50Cal said:
Vista exclusivity. Probably the biggest hurdle to seeing DX10 games on the PC. Don't get me wrong, I'm sure there are a lot, but you have to wonder if they are truly putting the time and effort into developing it when they know only a select few will see the results.

Yeah...they really shouldn't have banked on Vista taking over the PC gaming crowd.

Hopefully, with the push Windows 7 is getting, we'll see DX11 adoption rate be higher than the DX10 rate was. Truthfully though, I don't expect DX9 to die out for quite a few years.

@beermonkey. Yeah, I think MSAA and general IQ are more important than brute resolution. It's sad that my PC monitor has one of the worst scalers I've ever seen in my life, otherwise I wouldn't be constrained to 1680*1050. Thankfully, my new card generally has no problem with this res.
 
brain_stew said:
It isn't adaptive supersampling though, it's just straight supersampling. The support is just there for when you have extreme amounts of excess performance in the bag, and so can afford the luxury of the absolute best image quality. The evidence is in the screen captures, which look insanely good. It's literally realtime bullshot mode.

From nHancer's explanation of AA modes:

Again, I know what supersampling is, it's just nearly always implemented as adaptive (i.e. only performed on edges) due to how processor and memory intensive it is. If the new Nvidia cards can spit out unfettered 3x3 supersampling with no performance hit on CoD then fair fucks to them. A lazy google search doesn't confirm them doing that, you got a link?
 
Dot50Cal said:
That's a really silly argument. You know that every HDTV these days has either DVI or VGA. Heck, this is my setup. A 46 inch Bravia is my monitor, which also doubles as my console display. And I even have a big comfy couch behind it!!11
I was joking Dot! Someone said jokingly - "wait until the comfy chair argument comes out" - and I was just playing the game. This thread is basically a joke now...
 
deepbrown said:
I was joking Dot! Someone said jokingly - "wait until the comfy chair argument comes out" - and I was just playing the game. This thread is basically a joke now...
:lol But you know someone was like "Yeah, he's right!"
TheExodu5 said:
Hopefully, with the push Windows 7 is getting, we'll see DX11 adoption rate be higher than the DX10 rate was. Truthfully though, I don't expect DX9 to die out for quite a few years.
Yep. I expect it to linger just as much as the ATI 9800 Pro did. That thing produced good performance for what seemed like years after it debuted.
 
shongololo said:
Again, I know what supersampling is, it's just nearly always implemented as adaptive (i.e. only performed on edges) due to how processor and memory intensive it is. If the new Nvidia cards can spit out unfettered 3x3 supersampling with no performance hit on CoD then fair fucks to them. A lazy google search doesn't confirm them doing that, you got a link?

He provided the exact description of what nHancer is doing. It's not adaptive supersampling. Adaptive supersampling, as you describe it, is object-based, which is just regular anti-aliasing.
 
brain_stew said:
Well, the fact that they claimed that with COD4 should tell you that you can't take anything they say at face value. The PS3 versions of all games running on this engine have suffered noticeably worse performance (see the videos I linked earlier); it is what it is.

If you didn't notice the sub par performance last time, I doubt you'd notice it this time either.

I didn't notice it last time. I played it on both platforms and didn't see a difference, but I didn't do them side by side either. I thought it ran very smoothly and looked decent. I never thought it looked that great, but it has a solid framerate and solid gameplay. I do have some major gripes with the multiplayer, but I don't want to derail this and piss on anyone's parade. And anyway, I'll get it regardless of my gripes. So yeah, PS3 version should be fine with me.
 
TheExodu5 said:
@beermonkey. Yeah, I think MSAA and general IQ are more important than brute resolution. It's sad that my PC monitor has one of the worst scalers I've ever seen in my life, otherwise I wouldn't be constrained to 1680*1050. Thankfully, my new card generally has no problem with this res.
The nice thing about having a resolution that high is that you practically don't even need AA on at that point. I rarely use any AA on my 1080p set. I'd much rather crank the in-game IQ options.
 
shongololo said:
Again, I know what supersampling is, it's just nearly always implemented as adaptive (i.e. only performed on edges) due to how processor and memory intensive it is. If the new Nvidia cards can spit out unfettered 3x3 supersampling with no performance hit on CoD then fair fucks to them. A lazy google search doesn't confirm them doing that, you got a link?

I'll provide screenshots in a bit; everything is antialiased: textures, alpha textures, shadows, the lot. Looks the same as the "PR bullshots" that we tend to get for console games.

Here's the link to nHancer's explanation for now anyway:

http://www.nhancer.com/?dat=d_AA

Performance at 720p with 2x1 supersampling and raw 1080p is very similar, just as you'd expect.
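The pixel counts back that up (shaded pixels only; this quick check of mine ignores bandwidth and ROP differences, so treat it as a ballpark):

```python
# "2x1 supersampled 720p costs about the same as native 1080p":
# 2x1 means rendering internally at double the width, same height.
ss_720p_2x1 = 1280 * 2 * 720   # 1,843,200 shaded pixels
native_1080p = 1920 * 1080     # 2,073,600 shaded pixels

print(round(ss_720p_2x1 / native_1080p, 2))  # 0.89 -> same ballpark
```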
 
TheExodu5 said:
While the gap may be as big power-wise, it's just not being utilized where that huge gap exists: on the CPU. The i7 seems to be by far the most powerful CPU we've ever seen, and offers a hell of a lot more performance than even last year's Core 2 Duos, but both CPUs tend to produce very similar benchmarks when it comes to gaming. The multiple cores and multithreading just aren't used to their potential at this point.

Same goes for the GPUs. We've got DX10 available, and some GPUs that can output incredible performance, but games are staying around the console specs this time around, since developers know that's where a lot of the money is.

To be honest though, I'm thankful for this. It means that people can buy a very performant PC for a very small amount of money compared to the past. It also means that, as an IQ-appreciating gamer, I can finally run the best looking games at high resolutions, with huge amounts of AA or supersampling, and still get 60fps.

---

I sit about 2' away from my 22" 1680*1050 monitor. Hell, for some games like STALKER, I even lean in towards the monitor when I'm trying to see something really far away. I could see myself getting a 24" 1080p monitor and not sitting any further away.

I agree with everything you are saying. But remember the transition from first-generation 3D to second-generation 3D. Those kinds of noticeable leaps aren't going to happen again in a span of 5 years, if ever.
 
andycapps said:
I didn't notice it last time. I played it on both platforms and didn't see a difference, but I didn't do them side by side either. I thought it ran very smoothly and looked decent. I never thought it looked that great, but it has a solid framerate and solid gameplay. I do have some major gripes with the multiplayer, but I don't want to derail this and piss on anyone's parade. And anyway, I'll get it regardless of my gripes. So yeah, PS3 version should be fine with me.

Fwiw, here's a direct framerate comparison. The PS3 version is clearly suffering from poorer performance, but it's still a generally high framerate so it's less noticeable. Dropping from 60fps to the high 40s is much less noticeable than dropping from 30fps to the low 20s, for example, which is why fewer notice it. It's another reason why people are such fans of 60fps. Any drop below 30fps and things start to get into stutter territory; the odd dropped frame (even if it's quite regular) at 60fps and most won't notice the difference.

http://www.eurogamer.net/videos/dig...-frame-rate-analysis-clip-compilation?size=hd
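The frame-time arithmetic shows why the 60fps drops are harder to spot (my own illustrative numbers, using the high-40s/low-20s figures above):

```python
# A dropped framerate matters in how much LONGER each frame hangs on screen,
# not in the raw fps delta.
def frame_time_ms(fps):
    return 1000.0 / fps

drop_60_to_48 = frame_time_ms(48) - frame_time_ms(60)  # extra ms per frame
drop_30_to_22 = frame_time_ms(22) - frame_time_ms(30)  # extra ms per frame

print(round(drop_60_to_48, 1))  # 4.2  -> barely perceptible hitch
print(round(drop_30_to_22, 1))  # 12.1 -> nearly 3x the added delay, visible stutter
```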
 
brain_stew said:
Have you got an Nvidia GPU?

No, ATI 4850 with passive cooler (HTPC must be silent). I built it before the Nvidia 260 et al had even shipped and I have no interest in replacing the GPU for some time to come. I'm plenty happy.
 
Dot50Cal's setup is hilariously awesome. Do you wear glasses?
I'm a couch guy myself, and it's hard to hold a mouse/keyboard upside down over your head while lying stretched out across the sofa. Also, personally I find I can resume play quicker after sipping a beer in this position while using a controller.
This game is going to be sweet.
 
dionysus said:
I agree with everything you are saying. But remember the transition from first-generation 3D to second-generation 3D. Those kinds of noticeable leaps aren't going to happen again in a span of 5 years, if ever.

The fact of the matter is it's much easier to scale graphics than it is to scale CPU-dependent functions. You can push bleeding-edge graphics and still have your game be playable on low-end rigs; push bleeding-edge CPU technology / RAM demands and anyone outside of the high end is SOL if they want to run your game. It's a market reality.

Fwiw, as an image quality whore it's not too much of a concern, I'll always find something to waste my GPU power on. :D
 
Stabbing Robot said:
Dot50Cal's setup is hilariously awesome. Do you wear glasses?
Yes, I have since I was 12 (before I even got into computers). My prescription hasn't changed in several years either. For certain games which have poor mouse support, I usually just connect the 360 controller to my PC and play that way. Dead Space was a notable example.
 
beermonkey@tehbias said:
No, ATI 4850 with passive cooler (HTPC must be silent). I built it before the Nvidia 260 et al had even shipped and I have no interest in replacing the GPU for some time to come. I'm plenty happy.

Ah, fair enough.

If you ever do jump ship to Nvidia at some point it's a nice feature to check out, especially if (like you say) you don't mind 720p resolution but love anti-aliased graphics and have a TV with an excellent built-in scaler.

I love messing with this sort of stuff, just playing around seeing the different effects of different settings and finding all the different "looks" I can get out of a game. It's fun to experiment with. :D

Edit: I'll be joining Dot50Cal's group next academic year; my 40" 1080p Sammy HDTV will be going on my desk in pride of place next to my 1080p monitor when I make my next move. Fuck yeah, for huge 1080p visuals from two feet away! :lol

If you can't make out the individual nostril hairs of the actor on your latest BD movie then you're not sitting close enough to your HDTV!! Ultimate immersion! :lol
 
30mwges.gif

I'm outta here.
 
dammitmattt said:
On to other topics, PC gamers are more insufferable than the indignant Sony and Nintendo fans put together. I've never seen such a huge inferiority complex during my entire time posting on internet message boards. And that includes die-hard Saturn fans.

Not my fault that you clowns missed out on the greatest system of all time.

Oh wait I'm talking about the Playstation 2. Nevermind.
 
brain_stew said:
I'll provide screenshots in a bit; everything is antialiased: textures, alpha textures, shadows, the lot. Looks the same as the "PR bullshots" that we tend to get for console games.

Here's the link to nHancer's explanation for now anyway:

http://www.nhancer.com/?dat=d_AA

Performance at 720p with 2x1 supersampling and raw 1080p is very similar, just as you'd expect.

The GTS250 has a limit of 2560x1600, which suggests 720p at 3x3 is beyond it if it were true unfettered supersampling.
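For reference, the numbers behind that objection (assuming true non-adaptive 3x3 supersampling, i.e. rendering internally at 3x the width and 3x the height):

```python
# 3x3 supersampling of a 1280x720 frame means a 3840x2160 internal render.
internal_w, internal_h = 1280 * 3, 720 * 3

print(internal_w, internal_h)                        # 3840 2160
print(internal_w > 2560 or internal_h > 1600)        # True: past 2560x1600
```

Whether the 2560x1600 figure is an internal render-target limit or just a display-output limit is exactly the question raised in the reply below, so the calc only settles the "is 3x3 720p bigger than that" part.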
 
shongololo said:
The GTS250 has a limit of 2560x1600, which suggests 720p at 3x3 is beyond it if it were true unfettered supersampling.

That's a limit of the display port, is it not? These cards can output a higher resolution over 2x DVI ports.

dammitmattt said:
On to other topics, PC gamers are more insufferable than the indignant Sony and Nintendo fans put together. I've never seen such a huge inferiority complex during my entire time posting on internet message boards. And that includes die-hard Saturn fans.

You're just jealous.
 
As promised, here's some supersampled shots:

2qvwdbm.jpg

Supersampling:
2rpvevk.jpg

No supersampling:
2i932on.jpg

And for good luck:
b3szdh.jpg

Edit: Eww, jpeg compression spoils these, sorry. You can still see all the extra detail that is added to the image when you use supersampling though; check out the trees. The shadows look much better this way as well.
 
brain_stew said:
Fwiw, here's a direct framerate comparison. The PS3 version is clearly suffering from poorer performance, but it's still a generally high framerate so it's less noticeable. Dropping from 60fps to the high 40s is much less noticeable than dropping from 30fps to the low 20s, for example, which is why fewer notice it. It's another reason why people are such fans of 60fps. Any drop below 30fps and things start to get into stutter territory; the odd dropped frame (even if it's quite regular) at 60fps and most won't notice the difference.

http://www.eurogamer.net/videos/dig...-frame-rate-analysis-clip-compilation?size=hd

Checked that out, and you're right: it doesn't bother me. If I didn't see the numbers at the top showing that the frame rate was dropping, I never would have known. If it dropped down to the mid-20s from 60, then yeah, I'd probably notice. I know there are people out there that this does affect, much like how some people get motion sickness during first-person games, but it doesn't affect me. As long as the new game doesn't have a worse framerate on PS3 than COD4 did, then I'll be fine with it.
 
beermonkey@tehbias said:
I play Wipeout HD in 1080p on a Pioneer Kuro 1080p 50" plasma from only six feet away and I still think resolution is overemphasized. I'm all about AA, max settings, and framerate. I have an HTPC hooked up to that same plasma and I regularly bump games down from 1920x1080 if it will let me crank up the settings, the AA/AF, or get the framerate locked at 60.

Wipeout HD is a great looking game, though.
If you're on a Kuro, you're actually only seeing about 900p, not 1080p. As far as I know, the latest Panasonic G10 displays full 1080p.

Still looks absolutely phenomenal. Goes to show you how retarded the whole thing is.
 
Epic Tier 3 Engineer said:
If you're on a Kuro, you're actually only seeing about 900p, not 1080p. As far as I know, the latest Panasonic G10 displays full 1080p.

Still looks absolutely phenomenal. Goes to show you how retarded the whole thing is.

I know about motion resolution. And I'll keep my Kuro over the Panny. (EDIT: might get the V10 Panny though)
 
brain_stew said:
As promised here's some supersampled shots:


Edit: Eww, jpeg compression spoils these, sorry. You can still see all the extra detail that is added to the image when you use supersampling though, check out the trees. The shadows look much better this way as well.

How does one even go about enabling SS?
 
Dot50Cal said:

That's not correct, this setting only applies to alpha textures and the amount is determined by how many msaa samples you're using.

To enable true supersampling you need to download nhancer from here:

http://www.nhancer.com/

You can then choose from a range of supersampling and (my favourite) "combined" supersampling/multisampling aa modes and apply them on a per game basis.

Combined 2x2 supersampling w/ 4xmsaa w/ transparency multisampling is damn nice looking and the performance hit isn't too great at 720p with modern GPUs; quite a few games will run with this setting just fine.

You can also try "options > enable experimental modes" in nhancer to get access to some new multisampling modes including the infamous quincunx (of PS3 fame) and 4x9Tap which is a similar algorithm to quincunx only for a 4xmsaa application and the results are generally better.
 
RankoSD said:
lol at lazy devs
looks more like an expansion pack than a sequel...

God some people are so far up their own assholes. I imagine you saying that and then tapping on your long extra-light cigarette and taking a sip out of your cooler.
 
Cadaver Dog said:
You wouldn't know the difference if they didn't spell it out.

You don't like it, don't buy the game.

BS, even if you don't quite know why, it's quite easy to tell that COD games look blurrier than most games. Given the two images side by side, I'd expect at least 9 out of 10 would be able to tell that a native 720p game looked "better" even if they didn't know why.

It's not a huge earth-shattering deal, no, but don't for one minute claim it makes no difference either; losing over 30% of your visual information clearly has an impact.
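Assuming the engine's framebuffer really is 1024x600, as the "600p" headline suggests, the "over 30%" figure checks out with a straight pixel count:

```python
# Compare a 1024x600 framebuffer against a native 1280x720 one, pixel for pixel.
sub_hd_pixels = 1024 * 600   # 614,400
native_720p = 1280 * 720     # 921,600

loss = 1 - sub_hd_pixels / native_720p
print(round(loss * 100, 1))  # 33.3 -> about a third of the pixels gone
```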
 
this thread ended when someone seriously mentioned the comfy chair.

the last few pages don't exist.
 
Rez said:
this thread ended when someone seriously mentioned the comfy chair.

the last few pages don't exist.

I really don't see why that isn't a reasonable argument. I have no problem admitting that setting up a system to play from your couch in your living room has considerable advantages.

I'm not just talking about hooking it up to a 50'' Plasma, which I do with my PC and I'm typing from right now. I mean the inviting and comfortable nature of a couch, which can't be easily replicated because M/KB were clearly intended for desktop space.

Yes, it's easier to game on consoles if I have anyone over to the house, particularly women. Yes, it's easier to show my friends games. It's more inviting and comfortable. You can, for example, lay down across a couch while you play with a traditional game pad.

Why is this argument ridiculed so vehemently?
 
90% of games support the 360 pad and the rest can be tricked into accepting the 360 pad, if kicking back with a controller is what makes you happy.
 