Modern Warfare 2 Will Render at 600p

GodofWine said:
The first MW2 trailer looks great, and looks like great fun. period.

It's a little-known fact, but pixels are made of fun, so your screen resolution is directly proportional to the amount of fun entering your system. Of course, as a previous post showed in its own way, the power of fun radiation diminishes with the square of distance, so it's also very important not to have too many feet between you and your high-resolution game.
 
I really don't care about it being 600p; it's just dumb to see PS360 owners defend their sub-HD resolution while they rag about it on the Wii.
 
Guled said:
I really don't care about it being 600p; it's just dumb to see PS360 owners defend their sub-HD resolution while they rag about it on the Wii.
Just like PC owners rag about it to console owners.
The discussion itself is dumb; the game isn't going to be less fun because of the frame buffer size.

Osaka said:
Snap indeed my good friend, snap indeed.
 
NinjaFridge said:
This thread is like 20 pages too long and is just an excuse for thinly-veiled trolling. BTW about the 4850, what is it like compared to the 4770? It can be found cheaper but is the 4770 better?

The 4850 is around 5-10% faster on average, though the 4770 has a much lower power draw, which is nice. Still, with current pricing it's hard to recommend a 4770 tbh.
 
Osaka said:
288s29t.jpg


Ugh, yeah, I hate sitting at my pc desk in my big comfy chair.


I'd rather play on my comfy couch :D
23vew3s.jpg
 
OverHeat said:
I'd rather play on my comfy couch :D

That's not comfy; your ass and back will be drenched in sweat within 10 minutes of starting to play on that black leather couch.

Or should I say, your ass will be overheated
 
I really don't care about it being 600p; it's just dumb to see PS360 owners defend their sub-HD resolution while they rag about it on the Wii.
Err, 600p upscaled to 720p still looks SIGNIFICANTLY better than the awful video output on the Wii. There is nothing "dumb" about it.

Not only is the Wii rendering at 640x480 resolution, but it also tends to suffer from additional image quality flaws (dithering). Furthermore, the 600p number seems close to 480p, but in reality, it's still a widescreen resolution. 1024x600 is much cleaner than 640x480 (4:3) stretched to fill a 16:9 screen.
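The raw pixel counts back this up. A quick sketch (illustrative arithmetic only, using the resolutions mentioned above):

```python
# Quick pixel-count comparison of the two resolutions discussed above.
def pixels(width, height):
    return width * height

mw2_360 = pixels(1024, 600)   # MW2's reported sub-HD frame buffer
wii_max = pixels(640, 480)    # Wii's maximum output resolution

print(mw2_360, wii_max)        # 614400 307200
print(mw2_360 / wii_max)       # 2.0 -> exactly twice the pixels
# 640x480 is 4:3; stretched across a 16:9 panel, every pixel is
# additionally distorted by (16/9) / (4/3) = 4/3 horizontally.
print((16 / 9) / (4 / 3))      # ~1.33
```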

I'd rather play on my comfy couch
I play PC games on my couch. :O

For that reason, I hope this game supports the 360 pad this time around.
 
Osaka said:
That's not comfy; your ass and back will be drenched in sweat within 10 minutes of starting to play on that black leather couch.

I don't play naked :lol

P.S. the couch is dark brown
 
The-Warning said:
Yeah thinly veiled like a mack truck in a dress.

(I don't know what that means.)

I just think it's silly that people are comparing the jump from PS360 to PC to the jump from Wii to PS360. I mean come on that's just pushing it.

No it's not, not in any way at all. A $500 PC will render the game at 13.5x the resolution with better graphics settings whilst maintaining the same (if not better) framerate as the consoles.

That's an order of magnitude. That's not pushing it in any way at all. That's just fact.

Upper-midrange 2005 gaming technology is hugely outdated in 2009. This shouldn't be surprising; it's just the way of the world. Face up to it and move on.


dark10x said:
Err, 600p upscaled to 720p still looks SIGNIFICANTLY better than the awful video output on the Wii. There is nothing "dumb" about it.

Not only is the Wii rendering at 640x480 resolution, but it also tends to suffer from additional image quality flaws (dithering). Furthermore, the 600p number seems close to 480p, but in reality, it's still a widescreen resolution. 1024x600 is much cleaner than 640x480 (4:3) stretched to fill a 16:9 screen.


I play PC games on my couch. :O

For that reason, I hope this game supports the 360 pad this time around.

All true. Whilst 600p with 2xAA (and likely zero or 2x AF) is pretty damn disappointing, it's still a huge leap over the abhorrent image quality that the Wii outputs.

Agreed on the 360 pad support as well; there's no reason to leave it out. The code is already there, and it's already an accepted standard on the platform now. Get with the times.
 
brain_stew said:
No it's not, not in any way at all. A $500 PC will render the game at 13.5x the resolution with better graphics settings whilst maintaining the same (if not better) framerate as the consoles.

That's an order of magnitude. That's not pushing it in any way at all. That's just fact.

No, it's just bad maths.
 
kamorra said:
Compared to most people's HDTVs, 24" is tiny.

The absolute size is not a factor that should be considered when judging the effect of resolution on visuals. It's the amount of your field of view taken up by that display that matters, and a 24" monitor from a few feet away will take up more of your field of view than a 40" HDTV from 10 feet. 1080p makes a huge difference to image quality on a 23" monitor from a metre away or less.
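The viewing-angle point can be sanity-checked with basic trigonometry. This sketch assumes 16:9 panels and the rough distances mentioned above:

```python
import math

def angular_width_deg(diagonal_in, distance_in, aspect=(16, 9)):
    """Horizontal angle subtended by a display at a given distance (inches)."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)  # width from diagonal
    return math.degrees(2 * math.atan(width_in / (2 * distance_in)))

# 24" monitor at ~1 m (about 39") vs 40" HDTV at 10 ft (120")
monitor = angular_width_deg(24, 39)
hdtv = angular_width_deg(40, 120)
print(round(monitor, 1), round(hdtv, 1))  # the monitor fills the wider angle
```

The smaller screen at the shorter distance subtends roughly 30 degrees versus roughly 16.5 for the big TV across the room, which is the whole argument in two numbers.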
 
I love how one huge part of GAF talks about how this gen should go on for like five to seven more fucking years with no new hardware, and another huge part of GAF is mad that everything isn't at least 720p back buffer. :lol

These smaller resolutions are being used to make these games look MORE like the PC counterparts. Tradeoffs are being made and this is necessary because the 360 and PS3 GPUs aren't all that.
 
shongololo said:
No, it's just bad maths.

What's bad maths? 1024x600 = 614,400 pixels

1280x720 with 3x3 supersampling = 921,600 x 9 = 8,294,400

8,294,400 / 614,400 = 13.5

So yes, in my tests with 3x3 supersampling I was rendering at 13.5x the resolution, thank you very much, and on my $150 GPU I got a perfect 60fps refresh, with improved textures and 16x anisotropic filtering to boot.

A 13.5x increase is more than 10x, which is generally classed as an order of magnitude increase in power in the computer world.

So again, what's wrong with my maths?
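The arithmetic in the post above checks out, assuming a full 3x3 ordered-grid supersample where every sample is actually rendered:

```python
# Verifying the pixel arithmetic from the post above.
console = 1024 * 600            # 614,400 pixels in the console frame buffer
target = 1280 * 720             # 921,600 pixels at 720p
supersampled = target * 3 * 3   # 3x3 SSAA renders 9 samples per output pixel

print(supersampled)             # 8294400
print(supersampled / console)   # 13.5
```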
 
beermonkey@tehbias said:
I love how one huge part of GAF talks about how this gen should go on for like five to seven more fucking years with no new hardware, and another huge part of GAF is mad that everything isn't at least 720p back buffer. :lol

These smaller resolutions are being used to make these games look MORE like the PC counterparts. Tradeoffs are being made and this is necessary because the 360 and PS3 VRAM and RAM aren't all that.
Fixed.

Ok, this thread deserves its place in the classic archive.
Cheers guys, bye.
 
Fersis said:
Fixed.

Ok, this thread deserves its place in the classic archive.
Cheers guys, bye.

Fillrate and memory bandwidth limitations are a bigger factor imo. These consoles just weren't designed with 720p/60fps gameplay in mind.

You don't go cutting both your fillrate and memory bandwidth in half (like Sony did with RSX) if you're bothered about decent performance at high resolutions with nice image quality.
 
BriareosGAF said:
To be really fair, more like the 360 GPU's constrained embedded RAM and the PS3's terrible fillrate.

Yeah, that's more exact in this case.

The fact that a 1024x600 buffer with 2xMSAA tidily fits into the 360's 10MB of eDRAM is no coincidence, of course.

Honestly, RSX could have been a half-decent 1080p GPU if Sony hadn't gimped it like they did. It certainly wouldn't have had half as many sub-HD games, at the very least.
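The eDRAM point can be illustrated with a back-of-envelope buffer-size calculation. This assumes 4 bytes of color plus 4 bytes of depth/stencil per sample, a common but not universal layout, so treat it as a sketch rather than the console's exact memory budget:

```python
# Rough eDRAM footprint: color + depth/stencil bytes per MSAA sample.
# Assumes 4-byte color and 4-byte depth/stencil; real formats can differ.
def framebuffer_bytes(width, height, msaa_samples, bytes_per_sample=8):
    return width * height * msaa_samples * bytes_per_sample

EDRAM = 10 * 1024 * 1024  # the 360 GPU's 10 MB of eDRAM

print(framebuffer_bytes(1024, 600, 2) <= EDRAM)   # True: fits in one pass
print(framebuffer_bytes(1280, 720, 2) <= EDRAM)   # False: would need tiling
```

Under these assumptions, 1024x600 with 2xMSAA needs about 9.4 MB, just under the 10 MB limit, while 720p with 2xMSAA needs about 14.1 MB, which is why the sub-HD buffer choice looks deliberate.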
 
This is what I get for coming back to Gaming Discussion after being in OT for so long: PC owners bragging about resolutions and console owners fighting their righteous battle. Ugh, the game is going to be great regardless of native resolution. Is it disappointing that they still can't hit native 720p on consoles at 60 fps? Yes, but it's obvious now that these consoles aren't as powerful as we thought they were 3 years ago. It's also obvious that they'd be able to do a lot more if they were specializing in one platform versus having to build for multiple platforms.
 
brain_stew said:
What's bad maths? 1024x600 = 614,400 pixels

1280x720 with 3x3 supersampling = 921,600 x 9 = 8,294,400

8,294,400 / 614,400 = 13.5

So yes, in my tests with 3x3 supersampling I was rendering at 13.5x the resolution, thank you very much, and on my $150 GPU I got a perfect 60fps refresh, with improved textures and 16x anisotropic filtering to boot.

A 13.5x increase is more than 10x, which is generally classed as an order of magnitude increase in performance in the computer world.

So again, what's wrong with my maths?

Well, you're including an AA routine as part of your resolution for a start. It's highly likely your card uses adaptive supersampling, which won't get close to 9 times the original resolution.
 
brain_stew said:
Yeah, that's more exact in this case.

The fact that a 1024x600 buffer with 2xMSAA tidily fits into the 360's 10MB of eDRAM is no coincidence, of course.

Honestly, RSX could have been a half-decent 1080p GPU if Sony hadn't gimped it like they did. It certainly wouldn't have had half as many sub-HD games, at the very least.
yup
 
Old hardware is old. You can't expect up-to-date graphics (arguably) from ancient hardware. They could target 720p and 60fps, but it'd also look old (for an FPS).


Also, 1080p
ost

Reaaaally? :|
 
andycapps said:
This is what I get for coming back to Gaming Discussion after being in OT for so long: PC owners bragging about resolutions and console owners fighting their righteous battle. Ugh, the game is going to be great regardless of native resolution. Is it disappointing that they still can't hit native 720p on consoles at 60 fps? Yes, but it's obvious now that these consoles aren't as powerful as we thought they were 3 years ago. It's also obvious that they'd be able to do a lot more if they were specializing in one platform versus having to build for multiple platforms.

Honestly, I think there is a decent amount of specialisation going on here on the 360 side. The fact that the buffer fits precisely into the 360's eDRAM, and that the engine runs a very smooth 60fps most of the time on 360 whilst spending most of its time below that on PS3, is no coincidence. It shouldn't be surprising either, considering the game's history.

It's clearly targeted at 360 hardware and is a bit of an awkward fit for the PS3.
 
Guled said:
I really don't care about it being 600p; it's just dumb to see PS360 owners defend their sub-HD resolution while they rag about it on the Wii.

Not quite the same. People rag on the Wii because most Wii games don't even look better than FF12 or VP2 or other late-gen PS2 games. I mean, shit, the Wii is much more powerful than the PS2, so WTF? I don't remember people saying this game looks like shit because it's not in 720p; they said it looks like shit because it looked worse than or on par with a PS2 game. Now, of course, that has started to change, which is why you see much less bickering than you did 2 years ago.

Hell, non-standard resolutions aren't something that started on HD consoles; it happened on the PS2, and it probably happens on the Wii as well.

But shit, what I want to know is why we didn't have this thread when COD4 first came out and everyone knew it was 600p?


AlStrong said:
Old hardware is old. You can't expect up-to-date graphics (arguably) from ancient hardware. They could target 720p and 60fps, but it'd also look old (for an FPS).


Also, 1080p
ost

I agree. The GPUs in these consoles were developed in, what, 2004? PCs get new cards every 9 months or so, but it's always been like this since the dawn of consoles, so this is nothing new.

Either way, it would be interesting to see how COD4/MW2 runs on a PC with a 3.0GHz dual core and a 7600GT (I think that's what the RSX was equal to) at 720p, to see if it could hold 60 FPS.
 
I don't doubt for a second that Modern Warfare 2 will look significantly better on a good PC. It will sport a significantly higher resolution and all kinds of anti-aliasing and other neat graphical enhancements. I also don't doubt that any so-called hardcore gamer already knew this; he probably already knew that most new PC games look significantly better than any of his 360 or PS3 games.

But now what? Preaching for another good 10 pages that consoles are obsolete and everyone should switch to PCs? I really don't like the "but it's fun!!" argument, but in this thread I have to wonder whether the only important thing in gaming is the graphics.
 
shongololo said:
Well, you're including an AA routine as part of your resolution for a start. It's highly likely your card uses adaptive supersampling, which won't get close to 9 times the original resolution.

Supersampling through nHancer will render the game at a higher resolution. This isn't simply MSAA.
 
AlStrong said:
Old hardware is old. You can't expect up-to-date graphics (arguably) from ancient hardware. They could target 720p and 60fps, but it'd also look.. old.

Yeah 3-4 years is old.

Oh wait, to a PC gamer, if you are not spending $200 every 6 months on a video card then OH NOES my PC is teh ancient, time to upgrade it so one game can take advantage of it.

I can make hyperbolic statements too.
 
brain_stew said:
Honestly, I think there is a decent amount of specialisation going on here on the 360 side. The fact that the buffer fits precisely into the 360's eDRAM, and that the engine runs a very smooth 60fps most of the time on 360 whilst spending most of its time below that on PS3, is no coincidence. It shouldn't be surprising either, considering the game's history.

It's clearly targeted at 360 hardware and is a bit of an awkward fit for the PS3.

So they're not claiming that the game will run identically on both platforms like they did with COD4? That seems like a pretty big step down and a big slap in the face to all the people who bought COD4 on PS3. While the 360 version undoubtedly sold more, the PS3 version sold quite a bit.
 
TheExodu5 said:
Supersampling through nHancer will render the game at a higher resolution. This isn't simply MSAA.

I'm well aware of that but 3x3 adaptive supersampling isn't equivalent to 9 times the resolution.
 
andycapps said:
Yes, but it's obvious now that these consoles aren't as powerful as we thought they were 3 years ago.

On the other hand at this point last gen Half-Life 2 for the Xbox was 3-4 months from shipping, and even though it ran 640x480 it still looked like total shit compared to a weak gaming PC. The gap was bigger then than now. Expectations have changed.
 
shongololo said:
Well, you're including an AA routine as part of your resolution for a start. It's highly likely your card uses adaptive supersampling, which won't get close to 9 times the original resolution.

No, it's pure supersampling, a locked-away feature on Nvidia cards; it does exactly what it's advertised as doing. The evidence is in the image quality: even 2x2 looks phenomenal. It's real, no-BS supersampling.

I can't post any pictures atm as I'm on a different PC, but suffice to say it's smoother than a baby's bottom and an awful lot more detail is revealed.

It is what it is: similar performance from a $150 GPU at 13.5x the rendering resolution. That's progress for you.


deepbrown said:

I often wonder what a full-blown RSX would have been capable of. I always tend to take image quality and framerate over new shader effects, so I think it'd have produced some damn nice-looking games, even at 1080p.

I think Sony banked on the PS3 being the industry leader and lead platform, meaning your average developer would be pushing a lot more out of Cell than they are now, which would make the slightly underwhelming RSX a much smaller factor. It's precisely what happened last generation with the PS2, for example: market dominance meant even your average multiplatform game took good advantage of the PS2's more exotic hardware.
 
BriareosGAF said:
To be really fair, more like the 360 GPU's constrained embedded RAM and the PS3's terrible fillrate.
RSX has the higher fillrate.
The bandwidth advantage of Xenos only applies to very specific operations: alpha blending of stuff that can fit in the 10MB, and AA.
 
andycapps said:
So they're not claiming that the game will run identically on both platforms like they did on COD4? That seems like a pretty big step down and a big slap in the face to all the people that bought COD4 on PS3. While the 360 version sold more undoubtedly, the PS3 version sold quite a bit.

Well, the fact that they claimed that with COD4 should tell you that you can't take anything they say at face value. The PS3 versions of all games running on this engine have suffered noticeably worse performance (see the videos I linked earlier); it is what it is.

If you didn't notice the sub par performance last time, I doubt you'd notice it this time either.
 
GodofWine said:
What's the point of 1080p in general, really? That's just a sexy number that 99/100 people wouldn't be able to tell from a 720p display in their house.

And probably most can't tell 720p from 600p... motion blur etc. covering up the flaws like airbrushing away a little cellulite in Maxim. Still looks good.


Anyone who complains about the resolution in this thread should not buy the game, period.
The first MW2 trailer looks great, and looks like great fun. period.
Go play Wipeout HD and you'll see.
 
shongololo said:
I'm well aware of that but 3x3 adaptive supersampling isn't equivalent to 9 times the resolution.

It isn't adaptive supersampling though; it's just straight supersampling. The support is there when you have extreme amounts of excess performance in the bag and so can afford the luxury of the absolute best image quality. The evidence is in the screen captures, which look insanely good. It's literally a realtime bullshot mode.

From nHancer's explanation of AA modes:

nHancer said:
Supersampling

Supersampling is a very straight forward method, that was also first introduced on Voodoo 4/5 cards. It's currently not available for 8x00 cards.

The image is just rendered at a higher resolution internally. After a whole image has been rendered, it is then scaled down to the target resolution. While doing that, each pixel is colored from the average of all appropriate pixels of the high-resolution image.

Supersampling modes are named after the size of the internally used resolution, i.e. the mode 2x1 means that the internal resolution has twice the horizontal resolution and the same vertical resolution. The result is that each final pixel is made from the average of 2 pixels. With the 4x4 supersampling mode, each final pixel is created from the average of 16 pixels.
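The downscale step described above is just a box filter: average each NxN block of the high-resolution image into one output pixel. A minimal sketch on grayscale values (illustrative only, not nHancer's actual code):

```python
# Box-filter downscale: each output pixel is the mean of an n x n block,
# as in the supersampling description above (grayscale values for brevity).
def downsample(image, n):
    height, width = len(image), len(image[0])
    out = []
    for y in range(0, height, n):
        row = []
        for x in range(0, width, n):
            block = [image[y + dy][x + dx] for dy in range(n) for dx in range(n)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

hi_res = [[0, 0, 4, 4],
          [0, 0, 4, 4],
          [8, 8, 2, 2],
          [8, 8, 2, 2]]
print(downsample(hi_res, 2))   # [[0.0, 4.0], [8.0, 2.0]]
```

With n=2 each output pixel averages 4 samples (the 2x2 mode); with n=3 it averages 9, which is where the 9x pixel cost of 3x3 supersampling comes from.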
 
deepbrown said:
yeh, tiny.




:D
That's a really silly argument. You know that every HDTV these days has either DVI or VGA. Heck, this is my setup: a 46-inch Bravia is my monitor, which also doubles as my console display. And I even have a big comfy couch behind it!!11

pIoYgqyiV.jpg
 
beermonkey@tehbias said:
On the other hand at this point last gen Half-Life 2 for the Xbox was 3-4 months from shipping, and even though it ran 640x480 it still looked like total shit compared to a weak gaming PC. The gap was bigger then than now. Expectations have changed.

The gap is as big as ever power wise, but as you get ever increasing visual fidelity you also get diminishing returns in what the consumer notices. The gap is going to be more evident in other areas as time goes on I think, especially in open world type games. I would like to see more of the R&D in the PC industry move away from graphics optimization and into other areas.
 
Osaka said:
You mean my 24" HDMI 1080p monitor? :lol

I've never understood the "small monitor" argument when one can very easily hook up a PC to 50 inch HDTV.

If I come across some dough in the next couple of months, I will build a new PC and purchase this for it. For now, I'll be okay with playing this on console.
 
-viper- said:
Go play Wipeout HD and you'll see.

I play Wipeout HD in 1080p on a Pioneer Kuro 1080p 50" plasma from only six feet away and I still think resolution is overemphasized. I'm all about AA, max settings, and framerate. I have an HTPC hooked up to that same plasma and I regularly bump games down from 1920x1080 if it will let me crank up the settings, the AA/AF, or get the framerate locked at 60.

Wipeout HD is a great looking game, though.
 
Dot50Cal said:
That's a really silly argument. You know that every HDTV these days has either DVI or VGA. Heck, this is my setup: a 46-inch Bravia is my monitor, which also doubles as my console display. And I even have a big comfy couch behind it!!11

Do you actually sit that close when you are using your keyboard?!?!?

--------------

On to other topics, PC gamers are more insufferable than the indignant Sony and Nintendo fans put together. I've never seen such a huge inferiority complex during my entire time posting on internet message boards. And that includes die-hard Saturn fans.
 