Are current PC games a full "Generational Leap" ahead of current console games?

Arguing that the 360/PS3 versions of multiplatform games are essentially the same as the PC version is like arguing that the Wii versions of multiplatform games are the same as the 360/PS3 versions. It's simply not true.

If the console versions of BF3 are the same as the PC version, then the Wii version of MW3 is the same as the 360/PS3 version.
 
Arguing that the 360/PS3 versions of multiplatform games are essentially the same as the PC version is like arguing that the Wii versions of multiplatform games are the same as the 360/PS3 versions. It's simply not true.

This is simply not true. Are there fundamental changes happening in gameplay when comparing HD console and PC games?
 
Careful, we're reaching the point of that last thread that was about this subject. Arguments devolved into lunatics saying things like "well since you're sitting far away from your tv, it'll still look the same so they'll end up looking virtually identical".

Did Infinity Blade on iPhone arrive at that point? :P
 
I get the impression some people here are equating a generational leap with texture quality. It's not surprising some people don't seem to care about resolution, frame rate, AA, and AF, but hey ho.

Yup, exactly.

Even if you get a game like Witcher 2 on consoles, it will never, ever have the bells and whistles that make it a next-gen experience on PC.
 
People are seriously in for a rude awakening next gen. If you're even hoping for 1080p and 60fps next gen, there's a 4x requirement right there. Since a next-gen console has to stick to a limited TDP, I'd probably expect a CPU that's at most twice as fast, and a GPU that's roughly the speed of a 6850 (which is about 5x as fast as current console GPUs).

You will not be getting 1080p and 60fps. If the GPU estimate is roughly correct, you have 5x the rendering capability, and 80% of that would be required just for 1080p/60fps. Not much left over for extra visual effects. If games are 1080p/30fps, you're using 40% of the added capability and looking at nice overhead for additional shaders/effects.

Still, things are not terribly rosy. Hardware has simply not advanced as quickly this generation as it has in the past.
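The pixel arithmetic behind those figures can be sketched quickly. This is a back-of-envelope estimate assuming a 720p/30fps baseline; the exact ratio comes out a bit above the rounded 4x/80%/40% figures in the post.

```python
# Back-of-envelope pixel-throughput estimate. Assumes a 720p/30fps
# baseline; many current console games actually render below 720p,
# which would make the jump even larger.
def throughput(width, height, fps):
    """Pixels drawn per second at the given resolution and framerate."""
    return width * height * fps

base = throughput(1280, 720, 30)     # typical current-gen target
target = throughput(1920, 1080, 60)  # hoped-for next-gen target

print(target / base)  # 4.5 -> the "roughly 4x" requirement

# With a GPU ~5x current console GPUs (the 6850 estimate), 1080p/60
# alone eats most of the gain; 1080p/30 leaves real headroom.
print(target / base / 5)                      # 0.9
print(throughput(1920, 1080, 30) / base / 5)  # 0.45
```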
 
Watching youtube footage of both BF3 and Witcher 2, both on ultra settings... I can't say I'm impressed...at all. The Witcher 2 in particular. What is so demanding and/or impressive about that game?
 
Careful, we're reaching the point of that last thread that was about this subject. Arguments devolved into lunatics saying things like "well since you're sitting far away from your tv, it'll still look the same so they'll end up looking virtually identical".

But if it works and the person is satisfied, who gives a shit what the final result is? Seriously, some of you PC enthusiasts just can't stand it when somebody's opinion goes against the greater well-being of all things PC gaming related.
I swear some of you are like the gaming community's equivalent of Jehovah's Witnesses, always cramming your beliefs and ideologies down people's throats and crying foul and discontent when something goes awry in the face of PC gaming.
 
Watching youtube footage....

oh you :)


But if it works and the person is satisfied, who gives a shit what the final result is.


God fucking damn it you entered the thread....

LISTEN! This is a thread about whether A looks better than B (or, to be specific, whether A is a generational leap over B, not "are you still satisfied" BS). I (and all the other sane people) don't give a shit if you're satisfied with B, since that has nothing to do with this topic.
 
This whole thread is phrased wrong.
Next-gen console games, running at whatever resolution they run at, are going to run with more AA and at higher resolutions on current PC hardware than they will on your fancy new console.

The tech is so far ahead we have been chomping at the bit for something to let loose on, so we just keep upping the resolution and frame rate because we can.

Witcher 2 looks great but it is still missing a ton of DX11 effects that can make it run faster and look better.
 
Watching youtube footage of both BF3 and Witcher 2, both on ultra settings... I can't say I'm impressed...at all. The Witcher 2 in particular. What is so demanding and/or impressive about that game?

I have no idea why a Youtube video of The Witcher 2 could possibly be underwhelming.

I watched a badly recorded VHS version of Avatar the other day. That CG is crap.

Witcher 2 is on consoles now. Consoles have caught up.

Just fucking accept it, man. I used to have respect for you.
I. Just. Can't. Let. Go.

Call me when I can have a field of 100 cars in a console racing game :)
 
Watching youtube footage of both BF3 and Witcher 2, both on ultra settings... I can't say I'm impressed...at all. The Witcher 2 in particular. What is so demanding and/or impressive about that game?

youtube footage

lol

Is using compressed videos and downscaled, awful screenshots all these people can base their views on?
 
Yup, exactly.

Even if you get a game like Witcher 2 on consoles, it will never, ever have the bells and whistles that make it a next-gen experience on PC.

I looked at those comparison shots between PC and 360. Sure the textures are muddier on 360 and some effects are gone, but the characters and environment still have the same amount of detail. There aren't entire objects or anything missing. I think you guys don't realize that resolution, frame rate, and textures alone don't account for a whole lot in most people's eyes.

For this to be a whole gen backwards on console, the 360 version would have to end up looking closer to TW1.
 
I have no idea why a Youtube video of The Witcher 2 could possibly be underwhelming.

I watched a badly recorded VHS version of Avatar the other day. That CG is crap.


I. Just. Can't. Let. Go.
But didn't you know that watching a YouTube video is the same as playing the game in front of you? There's no quality difference at all. It's all just in our minds.
 
I'm far past the point where pure technicality impresses me... now developers, please bring on creative art direction.
 
Frankfurt said:
Nobody but hardcore PC fans play games on their desks, which is why their comments on graphics-related threads are always so baffling.

I play the most hardcore console games available at my desk, too, though! I can't play them on an HDTV.

But if it works and the person is satisfied, who gives a shit what the final result is.
People who care about the details. Enthusiasts. It's that simple. It's weird to shout about being persecuted simply because someone has different standards than you do.
 
LISTEN! This is a thread about whether A looks better than B. I (and all the other sane people) don't give a shit if you're satisfied with B, since that has nothing to do with this topic.

100% correct.

I have no idea why a Youtube video of The Witcher 2 could possibly be underwhelming.

I watched a badly recorded VHS version of Avatar the other day. That CG is crap.


I. Just. Can't. Let. Go.

I saw some guy's 240p phone video of the Arsenal - Leeds game. I can't believe how crap real life looks.
 
I have no idea why a Youtube video of The Witcher 2 could possibly be underwhelming.

I watched a badly recorded VHS version of Avatar the other day. That CG is crap.

I took into account the IQ difference with youtube before coming to that conclusion. That said, reading the specs of the rigs used, I was expecting a perfect 60fps@1080p at all times. That was not the case.
 
I looked at those comparison shots between PC and 360. Sure the textures are muddier on 360 and some effects are gone, but the characters and environment still have the same amount of detail. There aren't entire objects or anything missing. I think you guys don't realize that resolution, frame rate, and textures alone don't account for a whole lot in most people's eyes.

For this to be a whole gen backwards on console, the 360 version would have to end up looking closer to TW1.

I really hope you're not using the examples on page 2; those are horrible comparisons.

720p console shots vs. 1080p max-settings PC shots; comparisons like that are invalid.
 
Devs matter... I honestly think ND could produce a game, targeted at specs similar to a 2500K with a 580, that looked a wide margin better than Crysis 2, Witcher 2, or Battlefield 3... Sure, the hardware might be about what we are seeing now on PC, but devs will be targeting more powerful hardware. A PC port of U3 at 60fps and 1080p would instantly be one of the best-looking games on PC. It would look much nicer than on PS3, but it would fundamentally be an up-resed port, not NEXT GEN....
 
Still, things are not terribly rosy. Hardware has simply not advanced as quickly this generation as it has in the past.
I still have hope. A lot of games could be run at 1080p60 even back on an 8800GT with an original Core2Duo. I'm sure it won't be the standard, but it should definitely be more common.

Ahhh good times.
I know, this seems like a silly argument, but sitting 6-7ft away from a 50-60" screen does actually manage to artificially improve image quality through viewing distance. I game on both a 27" PC monitor and a 50" Pioneer plasma. When I use the monitor I have to crank up AA in order to produce an image that is reasonably smooth. When gaming in my primary setup, however, I often disable or reduce AA in order to eliminate small performance dips. From where I sit, the loss in image quality is minor.

Console games look very nice on that setup, but viewing them on a monitor with your face to the screen reveals their limitations.

This was NEVER the case in previous generations, however. You could fire up Quake 2 on N64 or PSX vs the PC version and there would be a WORLD of difference even if you stood 25ft away.

I'm certainly not discounting image quality here as it is VERY important, but different viewing environments and display types DO make a huge difference. It's not JUST about how many pixels you can put on the screen.
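The viewing-distance point can be quantified with a quick visual-angle calculation. The screen sizes and distances below are illustrative assumptions based on the setups described in the post, not measurements.

```python
import math

# Pixels per degree of visual angle: the higher the number, the harder
# it is to resolve individual pixels (and aliasing) from where you sit.
def pixels_per_degree(diag_in, h_px, v_px, distance_in):
    width_in = diag_in * h_px / math.hypot(h_px, v_px)  # physical width
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_px / fov_deg

couch = pixels_per_degree(50, 1920, 1080, 84)  # 50" 1080p TV at ~7 ft
desk = pixels_per_degree(27, 1920, 1080, 24)   # 27" 1080p monitor at ~2 ft

# From the couch each pixel subtends a much smaller angle, so dropping
# AA is far less visible there than at the desk.
print(round(couch), round(desk))  # roughly 66 vs 37
```

That near-2x gap in angular resolution is why the same console image can look smooth on the couch and rough with your face to a monitor.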
 
The only times graphics even start to actually feel "next gen" to me come once in a while, for a few seconds, when I'm playing Crysis 2 on DX11 with tessellation. Just for a second I'll notice an insanely detailed wall or ground texture, and I imagine tessellation is gonna be the big feature of next-gen console graphics.

TW2 to me just looks like what DOOM 3 and Half-Life 2 probably looked like back in 2004. Yes, pretty amazing, but today's console games look a lot better.
 
Nope.

If you play the same game on your 1080p TV on PC and PS360, sitting 8 feet away (as I play every game), the difference is just not there. The only people who would brag about it would be PC gamers playing both on a desk, on their monitor. Nobody but hardcore PC fans play games on their desks, which is why their comments on graphics-related threads are always so baffling.

Wow, that's just ridiculous. It's like the difference between an upscaled DVD and a BR when I did this on my HDTV before my HDMI borked on me.
 
I took into account the IQ difference with youtube before coming to that conclusion.

I must applaud your imagination, but that's a conclusion I could never reach on my own without, you know, actually seeing the difference for myself.

I've played TW2 on a 2560x1440 IPS monitor, and I've seen a 1080p 25fps .flv clip of it on YouTube, compressed to hell and back.
 
I looked at those comparison shots between PC and 360. Sure the textures are muddier on 360 and some effects are gone, but the characters and environment still have the same amount of detail. There aren't entire objects or anything missing. I think you guys don't realize that resolution, frame rate, and textures alone don't account for a whole lot in most people's eyes.

Oh, I realise that, because so many people still think the IQ of sub-HD games running at 30fps or less this gen is acceptable.

Doesn't mean they're right, and just because they can't spot the difference between a game running on consoles and a game running on a high-end PC doesn't mean the differences aren't there and very real. Elements like AA, AF, higher framerates, etc. all make a HUGE difference. Getting Witcher 2 on consoles is impressive, and it might not look ugly, but it will never match the same experience on PC by a long shot.
 
oh you :)





God fucking damn it you entered the thread....

LISTEN! This is a thread about whether A looks better than B. I (and all the other sane people) don't give a shit if you're satisfied with B, since that has nothing to do with this topic.

You brought it up in the first place, and to even accuse someone of being insane for not sharing the same beliefs as you and the other "sane" people speaks volumes in terms of how insecure some of you really are.
 
They're both roughly on the same level with PC obviously having way better image quality. 720p with no anti-aliasing or post-processing AA just looks pretty bad on modern TVs. Sub-HD with AA provides a smooth image, but at a cost of detail.
 
I must applaud your imagination, but that's a conclusion I could never reach on my own without, you know, actually seeing the difference for myself.

I've played TW2 on a 2560x1440 IPS monitor, and I've seen a 1080p 25fps .flv clip of it on YouTube, compressed to hell and back.
So you're saying its good looks rely entirely on image quality and resolution then?

Amazing visuals should still be perfectly visible when viewed as a video.
 
You brought it up in the first place, and to even accuse someone of being insane for not sharing the same beliefs as you and the other "sane" people speaks volumes in terms of how insecure some of you really are.

Mate, you have issues, and that goes beyond arguing with me on GAF about something as petty as video games. If you think my "insecurity" drives me to argue with everyone who doesn't share my "beliefs", then I'd need more time than is available.


So you're saying its good looks rely entirely on image quality and resolution then?

I... I said that? Well I guess that's what I was saying.



Amazing visuals should still be perfectly visible when viewed as a video.

So you're saying TW2 looks like a donkey's ass and is the worst looking game ever?

Oh wow whaddaya know it's a great technique :)!
 
TW2 to me just looks like what DOOM 3 and Half-Life 2 probably looked like back in 2004. Yes, pretty amazing, but today's console games look a lot better.

I... what?

Am I reading this right? you think current console games look better than TW2 on PC which was released just last year?
 
I still have hope. A lot of games could be run at 1080p60 even back on an 8800GT with an original Core2Duo. I'm sure it won't be the standard, but it should definitely be more common.

Certainly not any recent console port, though. A 6850/GTX 460 tends to be roughly the minimum required to get a 30fps console game running at 1080p/60fps (it varies by game). Even faster hardware is needed in several cases (or at least, to prevent drops below 60fps).
 
I... what?

Am I reading this right? you think current console games look better than TW2 on PC which was released just last year?

You're reading it wrong (it is poorly worded).

He's saying that current console games look better than Half-Life 2 or Doom 3, which could be considered the 2004 equivalents of The Witcher 2 in that they were clearly a large step up over what consoles were delivering at the time. Like TW2 will, they also received Xbox ports that were downgraded pretty heavily but still decent (the TW2 360 port seems to be much more accurate, however).
 
I really hope you're not using the examples on page 2; those are horrible comparisons.

720p console shots vs. 1080p max-settings PC shots; comparisons like that are invalid.

I don't think so. The comparison screens I saw were the same size, and resolution alone doesn't make the point moot.

Another comparison: Konami and Bluepoint got Metal Gear Solid 3 running at 720p and 60fps on the PS3 compared to 480p and 30fps on the PS2. The difference is fucking incredible, but MGS3 doesn't all of a sudden look like a current gen game now.
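The raw numbers behind that remaster jump, assuming the 480p/30 and 720p/60 figures from the post (PS2 games often rendered slightly below 640x480, so this is approximate):

```python
# Raw pixel-throughput jump for the MGS3 HD remaster example.
ps2 = 640 * 480 * 30    # "480p" at 30fps on PS2
ps3 = 1280 * 720 * 60   # 720p at 60fps on PS3

# A 6x throughput jump, yet the game still reads as last-gen because
# the geometry, textures, and design are unchanged.
print(ps3 / ps2)  # 6.0
```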
 
Graphically, yes. PCs are (and have been for a while) a generation ahead of consoles. The IQ is pretty incredible.

The actual games themselves, however, are not a generation ahead. That's because 3rd party games are designed around PS360 hardware. Like, Dead Space 2 looks a lot better on PC, but is still the same game that is on consoles.



I have to agree too. Gears 3 is not good looking. There are jaggies everywhere. It's just not "clean", if you know what I mean. I mean it was fun, I beat it, and it would have impressed me in 2005, but in 2011? Come on now...

eh, I think Gears 3 looks great. It's easily one of the best looking console games.
 
LISTEN! This is a thread about whether A looks better than B (or, to be specific, whether A is a generational leap over B, not "are you still satisfied" BS). I (and all the other sane people) don't give a shit if you're satisfied with B, since that has nothing to do with this topic.

According to the OP:

But the question this thread is posing is more along the lines of software. Are the games themselves that are currently available on PCs what you would consider a full generational leap over the games available on consoles? (Ignoring the theoretical power of kickass PC hardware.)

for an example, see:
It's totally unimaginable to think of a game experience like MGS3 or San Andreas running on a PS1. It's completely unthinkable to imagine, say, the Assassin's Creed engine running on a PS2, no matter how much you reduced the graphics. Each gen seems to have brought new gaming experiences that were essentially impossible on previous hardware.

Think more along the lines of whether there are any gameplay experiences on PC that are next gen, and less along the lines of graphics, is what I get. The post early on about Shogun 2 is a good example of what it sounds like the OP was asking.
 
You're reading it wrong (it is poorly worded).

He's saying that current console games look better than Half-Life 2 or Doom 3, which could be considered the 2004 equivalents of The Witcher 2 in that they were clearly a large step up over what consoles were delivering at the time. Like TW2 will, they also received Xbox ports that were downgraded pretty heavily but still decent (the TW2 360 port seems to be much more accurate, however).

Yes, thank you.
 
If you cannot tell the difference between PS3/360 games and maxed-out PC games, or do not think the difference is very big, then next gen is not going to look very "next gen" to you. If this is the case for many or most people, then I'd expect the PS4/720/etc. would have to include something fundamentally different in order to create that generational gap.

edit:
Think more along the lines of whether there are any gameplay experiences on PC that are next gen, and less along the lines of graphics, is what I get. The post early on about Shogun 2 is a good example of what it sounds like the OP was asking.

I'm not sure how this is different from the argument that 64 players vs. 24 players in BF3 makes the PC next-gen (which some people have seemed to disagree with).
 
According to the OP:

That's not what I'm arguing against in this case. What I'm arguing against (be it a generational leap in visuals or something not related to the technical prowess of, say, a PC) is people saying "what should it matter if you're happy with the result?" I find that incredibly redundant and silly. That's what I'm up against; the very reason this thread exists is that people could be happier with the results, and that shouldn't be dismissed.

Take this as a non-visual example: "there is no difference between 16p and 64p in multiplayer since I'm perfectly content with the former." And yes, I wholeheartedly agree with the Shogun example; imagine someone coming along and saying "what difference does it make if I'm happy with an army one tenth that size?"
 
Unless higher resolutions and faster frame rates = a generational leap, then no. Super Mario 64, Ocarina of Time, and Final Fantasy VII were a generational leap ahead of their predecessors. Fully explorable three-dimensional environments were a fundamental leap. There is nothing on PC today so fundamentally complex that it can't be played on current-gen consoles.
 
I have to agree too. Gears 3 is not good looking. There are jaggies everywhere. It's just not "clean", if you know what I mean. I mean it was fun, I beat it, and it would have impressed me in 2005, but in 2011? Come on now...
Really now?

I wish I could demonstrate the game on my setup for people making these statements as that isn't the case at all.

It looks quite clean on my Pioneer and flawless on the HMZ-T1 headset. On my PC monitor (which is an LCD) and my upstairs TV (also LCD) it looks much worse.

That's not what I'm arguing against in this case. What I'm arguing against (be it a generational leap in visuals or something not related to technical prowess) is people saying "what should it matter if you're happy with the result?" I find that incredibly redundant and silly. That's what I'm up against; the very reason this thread exists is that people could be happier with the results, and that shouldn't be dismissed.
I think both sides need to give a little. There is a significant jump in image quality and performance when you move to a PC, but I think developers deserve some credit for squeezing as much out of current console hardware as they have.
 