
Are current PC games a full "Generational Leap" ahead of current console games?

DKCR would be another Wii game that looks as good as any 360/PS3 game in HD:

dolphin2010-12-0400-40djyu.jpg
 
Heh, can't say I'm surprised. You hadn't posted since then so there was no way to prove you didn't have a 6970. They walked right into that one.
 
Heh, can't say I'm surprised. You hadn't posted since then so there was no way to prove you didn't have a 6970. They walked right into that one.
I have the same graphics card and I mostly agree with him. Maybe it's the card! Dun dun dun...
 
Generational? No.

Do PC games look significantly better when designed and backed by sufficient hardware? Yes.

Generational will be huge worlds of the Samaritan demo.

The Samaritan demo runs on a 3-way SLI GeForce GTX 580 setup on PC. If we are lucky, Sony and MS are using a current GPU architecture (Nvidia 500 series or AMD 6000) in a single chip. When next-gen consoles launch around 2013, the PC market will be on Nvidia's 700 series and AMD's 8000 series, powerful enough to run the Samaritan demo in a single-card configuration. So yes, it's a generational leap; people thinking they will get Samaritan-level graphics on next-gen consoles are wrong.
 
Agreed. I'd like to see an objective one though.
I added an example to my initial post, btw.

My point is that an objective definition really doesn't exist without numerous caveats and points of valid disagreement. It's very clear that this discussion has devolved into a PC vs. consoles thread (as was expected), whereas the discussion should be how far removed PC multi-plats are from their console counterparts; not in visual fidelity but in 'essence.' Is BF3 on consoles a wholly unrecognizable entity which shares only its name in common? Does one's assessment attempt to remove disparities that exist only as a consequence of the hardware they are on, rather than how much they utilize it from the ground up, i.e., the differences between a 'next-gen' game and a current-gen game running on next-gen hardware? To me, a next-gen game would be one that, at its core, fundamentally could not transition, seem feasible, or exist in any comparable fashion except on a PC. And by that, I mean as a 'core' experience, not whether it could transition at certain resolutions and with a certain amount of AA.
 
The Witcher 2, BF3, etc. look amazing on PC, and they blow the console versions out of the water. The hardware is MORE than a generation beyond console games.

That said, the software itself is nowhere near a generational leap.

As someone else posted
i3skFINVnyVk2.jpg
iL11EdRUbH9cA.jpg


Sure, there is a huge difference between the IQ of the PC and 360 versions, but they don't look anywhere near a generational leap, even when taking into account diminishing returns.

Whatever game Naughty Dog desires to make for the PS4 a couple years into the PS4's lifespan, will absolutely blow away the Witcher 2. I guarantee it.

That said, by that time, PC's will be running games that look even better.

This is the way of things.
 
I added an example to my initial post, btw.

My point is that an objective definition really doesn't exist without numerous caveats and points of valid disagreement. It's very clear that this discussion has devolved into a PC vs. consoles thread (as was expected), whereas the discussion should be how far removed PC multi-plats are from their console counterparts; not in visual fidelity but in 'essence.' Is BF3 on consoles a wholly unrecognizable entity which shares only its name in common? Does one's assessment attempt to remove disparities that exist only as a consequence of the hardware they are on, rather than how much they utilize it from the ground up, i.e., the differences between a 'next-gen' game and a current-gen game running on next-gen hardware? To me, a next-gen game would be one that, at its core, fundamentally could not transition, seem feasible, or exist in any comparable fashion except on a PC. And by that, I mean as a 'core' experience, not whether it could transition at certain resolutions and with a certain amount of AA.

I would consider BF3 on consoles on the same level as Doom 3 or HL2 on Xbox.
 
I have the same graphics card and I mostly agree with him. Maybe it's the card! Dun dun dun...

I'm not overly impressed with BF3, to be honest. The visuals in that game do nothing for me. I don't think that's a fault with PC gaming...I think it's just a very boring looking game, with some horrendous artistic choices. For example:

bf3_puking.jpg
 
Are people really trying to tell me they don't see much difference between those Witcher 2 screens and this direct feed UC3 capture?

flashmedialiveencoder20h.jpg


And to put an even bigger spanner in the works, considering that those 360 shots came out as soon as they announced the port the "console" comparison shots for TW2 were most likely captured on PC with projected settings applied. If you think the 360 version will have DOF and IQ that high-quality you craaaaaaazy.
 
ITT: 'No? Then you don't have a PC lololololol'

Solid007 bringing the truth.

Anyways. Fresquito, if you wished to continue our discussion just PM me. I've had enough stupidly unnecessary screenshots from all ends.
 
What will happen is that everyone who said The Witcher 2 wasn't all that, will suddenly act like the new consoles are bringing never before imagined visual fidelity.

PC graphics are shit right up until the moment they are reproduced by a console - then they are "stunning!"

The Witcher 2 is a gorgeous game, and when reproduced exactly on Next Box, will remain gorgeous, but will be one of those games where people will say "Is this really the leap we were expecting?". The answer would be no...first gen games never look as good as last gen games. Now, pretty much the moment we have this conversation in the future, there will be another conversation about Crysis 3 PC and whether anyone has a rig that can run it in Ultra Setting and the graphics will melt faces.
 
The Samaritan demo runs on a 3-way SLI GeForce GTX 580 setup on PC. If we are lucky, Sony and MS are using a current GPU architecture (Nvidia 500 series or AMD 6000) in a single chip. When next-gen consoles launch around 2013, the PC market will be on Nvidia's 700 series and AMD's 8000 series, powerful enough to run the Samaritan demo in a single-card configuration. So yes, it's a generational leap; people thinking they will get Samaritan-level graphics on next-gen consoles are wrong.

First off, you'll be looking at a mid-range 700-series chip at best. That will not come anywhere near 3x GTX 580 in terms of performance. 3x GTX 580 is 900W TDP. Consoles deal with ~200W total TDP. That's for everything combined, so you have to assume maybe 100-120W TDP at best for the GPU. They're not going to have cards that are 9x as efficient as a GTX 580 in two years' time. Not even close.
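A quick back-of-the-envelope sketch of the power-budget argument above. All figures are the poster's rough estimates, not official specs; the ~300W-per-card number is just the 900W total divided by three cards:

```python
# Back-of-the-envelope check of the power-budget argument.
# Figures are the poster's estimates, not official TDP specs.
samaritan_rig_tdp_w = 3 * 300   # assumed ~300 W per GTX 580 in the 3-way SLI rig
console_total_tdp_w = 200       # rough total power budget of a console
console_gpu_tdp_w = 120         # generous share of that budget for the GPU alone

# Perf-per-watt improvement a single console GPU would need to match the rig
required_gain = samaritan_rig_tdp_w / console_gpu_tdp_w
print(f"Required efficiency gain: {required_gain:.1f}x")  # prints 7.5x
```

With the less generous 100W GPU budget the required gain rises to 9x, which is where the poster's figure comes from.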
 
The Samaritan demo runs on a 3-way SLI GeForce GTX 580 setup on PC. If we are lucky, Sony and MS are using a current GPU architecture (Nvidia 500 series or AMD 6000) in a single chip. When next-gen consoles launch around 2013, the PC market will be on Nvidia's 700 series and AMD's 8000 series, powerful enough to run the Samaritan demo in a single-card configuration. So yes, it's a generational leap; people thinking they will get Samaritan-level graphics on next-gen consoles are wrong.

I am pretty sure that the demo was not very optimized at all, they said they could probably get it working on one 580. It was made by like three people or something. That is the kind of visuals I want to see next gen though... I honestly think we will get close to it, given current gen consoles can output something like Uncharted 3...if nothing else it proves that PC devs are absolutely not pushing the hardware to its limits.

When the 360 released I think it had a fairly contemporary card, did it not? They shouldn't release a new console with a 500-series card... I am willing to hold out for a while until stuff like Samaritan can be produced. I am content enough playing nice-looking games on PC for now, but I want the next consoles to offer a sizeable leap...

Jim-Jam: There are far, far more impressive scenes in U3 than that. I actually think Drake's model looks better than Geralt's, though. Certainly it animates more impressively.
The desert section for instance: Spoilers http://www.youtube.com/watch?v=ALkaw5U3rgI&feature=relmfu
 
I built my PC like 2 years ago for ~500 and it runs circles around the consoles yeah.

I built my PC when the first Crysis came out and I run practically all console ports at 1080p with 4x AA at high/very high settings.

I did upgrade from an 8800gts to a 4890 though.
 
The Witcher 2, BF3, etc. look amazing on PC, and they blow the console versions out of the water. The hardware is MORE than a generation beyond console games.

That said, the software itself is nowhere near a generational leap.

As someone else posted
i3skFINVnyVk2.jpg
iL11EdRUbH9cA.jpg


Sure, there is a huge difference between the IQ of the PC and 360 versions, but they don't look anywhere near a generational leap, even when taking into account diminishing returns.

Whatever game Naughty Dog desires to make for the PS4 a couple years into the PS4's lifespan, will absolutely blow away the Witcher 2. I guarantee it.

That said, by that time, PC's will be running games that look even better.

This is the way of things.
What's with the colours in the console pic?

Also, huge difference.
 
Unlikely. At worst, people running top end rigs today might have to cut back on crazy amounts of AA.

Not quite.

Has everyone forgotten about Epic's 2011 Unreal NEXT-GEN tech demo? It needed THREE GeForce GTX 580s in SLI to get this done.

So yes. Even MOST people running top-end rigs today might have to UPGRADE. My 6970 will overheat trying to run this demo AT THAT LEVEL OF DETAIL with a consistent ~30fps.

(EDIT: late :/ )

uRiyq.jpg
 
Are people really trying to tell me they don't see much difference between those Witcher 2 screens and this direct feed UC3 capture?

flashmedialiveencoder20h.jpg


And to put an even bigger spanner in the works, considering that those 360 shots came out as soon as they announced the port the "console" comparison shots for TW2 were most likely captured on PC with projected settings applied. If you think the 360 version will have DOF and IQ that high-quality you craaaaaaazy.

Does that look like a Wii game to you? The difference between Wii and PS3 is a generational leap. Are you saying there's the same difference between this Uncharted pic you posted and Witcher 2 that there is between a Wii game and Uncharted? Are there magical details that only the PC elitists can see which do not exist to my weary eyes?
 
So yes. Even MOST people running top-end rigs today might have to UPGRADE.

They bruteforced the tech-demo with virtually no actual optimization according to themselves. They didn't want to bother with optimizing so just went ahead with a tri sli build.
 
Well, yeah, and the fact that today's PC gamers will also have to upgrade their CPU & GPU to play next-gen console ports, even if they have an HD 6970 like I do, boggles my mind.

However, I should've known better since this is the norm. So, to answer the question, YES&NO.

YES, PC games look better.

NO, Today's PC is NOT next-gen since we will have to upgrade our CPU&GPU again to play next-gen console ports.
If you have halfway decent hardware and 720p/30fps is good enough for you then you probably won't need to upgrade. Especially considering how well PC titles scale. I just can't see that happening if the new generation of console hw comes out within two years.
 
Not quite.

Has everyone forgotten about Epic's 2011 Unreal NEXT-GEN tech demo? It needed THREE GeForce GTX 580s in SLI to get this done.

So yes. Even MOST people running top-end rigs today might have to UPGRADE.

You're being overly optimistic if you think next-gen consoles will come anywhere near the level of the Samaritan demo. Even setting aside whether it's technically feasible, the development budget just isn't there.
 
How can anyone argue that the PC is not a generational leap ahead?

Does anyone even remember when Half-Life 2 came out? Nothing on consoles even looked close back then, and the overall console experience was way behind those PC exclusives. For those who say IQ/resolution/AA doesn't matter, I really can't understand why not. If the experience isn't that different for you, that's fine, but you can't just deny the difference and say it "plays the same." The core mechanics aren't going to change; that isn't an argument for saying a generational leap didn't happen.

Half-Life 2 did a lot more than just better IQ and framerate. It had lighting effects and physics and other things on a level that just wasn't possible on the PS2 - and they still got it running on the original Xbox. Even then, current-gen games look a lot better than HL2. HL2 was like a half-step forward. Same as Doom 3, Splinter Cell: Chaos Theory, and Riddick back in 2004/5.


I think you're in for a hard awakening when next gen comes out.

For reference, just look at what you need to run BF3 at max settings, 1080p @ 60fps with AA enabled. If you really think next-gen consoles will be much more powerful than that, I repeat, you're in for a rough awakening. I'm not saying graphics won't look good, but if you want to improve on BF3's graphics, you'll need to sacrifice IQ and framerate.

PCs are using the extra power for IQ and framerate, and that's only possible because the hardware is already next gen compared to consoles. You could sacrifice IQ and framerate for better graphics; that's what consoles do, and what Crysis did back in the day, because nobody could play the game with good framerate and good IQ when it was released, it was so demanding.

You have your preferences and I have mine, but that's not the topic. The thing is: to run a game such as BF3 or TW2 at those settings you need next-gen hardware, thus they are next gen.

When the next consoles arrive and are "only" producing visuals at the level of The Witcher 2 on High at 720p/30fps I think people will start to understand what the term "diminishing returns" means.

Perhaps you're right and these diminishing returns ensure that BF3, TW2, and Crysis 2 PC really are what we get out of next gen consoles. I actually do think that's what PS4 and Xbox 720 games may look like for the first couple years after launch. Remember: a lot of us were disappointed with launch 360 graphics until Gears came out.

Still, there are people in the development community who want shit to look like Avatar, Samaritan, or maybe the CG movies from FFXIII, even if it's just at 30fps or 720p. At the very least, I think next-gen games at launch are going to tessellate the fuck out of everything, just like they overused normal mapping and called it a day early this gen.

By saying "diminishing returns" means that we're looking at "next gen" games right now on PC, you're basically admitting that the next gen leap isn't going to look as impressive as previous generational leaps.
 
Putting it into shame is a bit too much imo.

Puts it to shame considering it's a Wii title. The world geometry is comparable, but it also has a ton of foliage going for it as well. FF XIII's grass is entirely flat. It's pretty shameful.

And I will agree that Uncharted 3 looks very, very good for a console title. It has the best models I've seen for any game, period. The budget for PC games simply isn't there to produce that kind of artwork.

I think Uncharted is an artistic achievement more than anything.
 
Are people really trying to tell me they don't see much difference between those Witcher 2 screens and this direct feed UC3 capture?

flashmedialiveencoder20h.jpg
Naughty Gods just utterly destroyed Witcher 2 with that shot. That's the difference I see. PCs/games wish they could handle that level of ambient occlusion.
 
Does that look like a Wii game to you? The difference between Wii and PS3 is a generational leap. Are you saying there's the same difference between this Uncharted pic you posted and Witcher 2 that there is between a Wii game and Uncharted? Are there magical details that only the PC elitists can see which do not exist to my weary eyes?

To me that does look like a "Wii game" actually, assuming that's your benchmark for when a game on the HD consoles looks "last gen". There are no magical details:

- Textures blur beyond about 1m because of a lack of AF
- Jaggies everywhere in this bitch
- Low-poly models on the background objects
- 720p

If you use your eyes, these things are readily apparent.

By saying "diminishing returns" means that we're looking at "next gen" games right now on PC, you're basically admitting that the next gen leap isn't going to look as impressive as previous generational leaps.

Actually not quite, see above. I do think it's impressive, but I also think a lot of people in this thread are indulging in some cognitive dissonance.

The point is that we've got to the point with TW2 where you can walk up and press your nose up against a texture and see nothing blurry, where you can inspect a character model and see almost no sharp edges, where DOF actually looks like DOF and not some bad gaussian blur. At this point the things which will push games closer to "Avatar" are better image-quality and framerates, and if people don't think that PC games look a whole generation ahead they won't think it about the consoles when they come out either.
 
I am pretty sure that the demo was not very optimized at all, they said they could probably get it working on one 580. It was made by like three people or something. That is the kind of visuals I want to see next gen though... I honestly think we will get close to it, given current gen consoles can output something like Uncharted 3...if nothing else it proves that PC devs are absolutely not pushing the hardware to its limits.

When the 360 released I think it had a fairly contemporary card, did it not? They shouldn't release a new console with a 500-series card... I am willing to hold out for a while until stuff like Samaritan can be produced. I am content enough playing nice-looking games on PC for now, but I want the next consoles to offer a sizeable leap...

Jim-Jam: There are far, far more impressive scenes in U3 than that. I actually think Drake's model looks better than Geralt's, though. Certainly it animates more impressively.

As already mentioned, the demonstration ran in real-time on a 3-Way SLI GeForce GTX 580 system, but even with the raw power that configuration affords, technological boundaries were still an issue, and for that reason, Daniel Wright, a Graphics Programmer at Epic, felt that "having access to the amazingly talented engineers at NVIDIA’s development assistance centre helped Epic push further into the intricacies of what NVIDIA’s graphics cards could do and get the best performance possible out of them." Being a tightly controlled demo, Samaritan doesn’t include artificial intelligence and other overheads of an actual, on-market game, but with enough time and effort, could the Samaritan demo run on just one graphics card, the most common configuration in gaming computers? Epic’s Mittring believes so, but "with Samaritan, we wanted to explore what we could do with DirectX 11, so using SLI saved time."

http://uk.geforce.com/whats-new/art...dia-talk-samaritan-and-the-future-of-graphics

The demo was optimized by Nvidia's engineers. Next Gen is not even close to that demo. I expect Witcher 2 at maybe ultra settings no more.
 
Are people really trying to tell me they don't see much difference between those Witcher 2 screens and this direct feed UC3 capture?

flashmedialiveencoder20h.jpg


And to put an even bigger spanner in the works, considering that those 360 shots came out as soon as they announced the port the "console" comparison shots for TW2 were most likely captured on PC with projected settings applied. If you think the 360 version will have DOF and IQ that high-quality you craaaaaaazy.

This is supposed to look as good as TW2?

lol
 
To me that does look like a "Wii game" actually, assuming that's your benchmark for when a game on the HD consoles looks "last gen". There are no magical details:

- Textures blur beyond about 1m because of a lack of AF
- Jaggies everywhere in this bitch
- Low-poly models on the background objects
- 720p

If you use your eyes, these things are readily apparent.



Actually not quite, see above. I do think it's impressive, but I also think a lot of people in this thread are indulging in some cognitive dissonance.

Wow, you've been playing some pretty sweet looking Wii games. On dolphin I presume?
 
Meh, I think that Uncharted shot actually does look better than The Witcher 2, IQ aside. The lighting is beautifully done.

Sadly, that doesn't really hold true for most of the game.
 
Meh, I think that Uncharted shot actually does look better than The Witcher 2, IQ aside. The lighting is beautifully done.

Sadly, that doesn't really hold true for most of the game.

Yeah, the Yemen part killed the game for me visually. His teeth even shimmer.

A lot of it does look good though.
 
Is the lighting in the Uncharted pic pre-baked or not? I'm guessing it is.


The art looks great though. But that's what you get with a major developer on a big budget.
Lots of smoke and mirrors.
 
You are using "Wii game" to mean a game a whole generation behind what I'm playing right now, so yes it does look like a "Wii game" but not a Wii game.

So you believe that Uncharted 3 is a whole generation behind and looks as atrocious compared to The Witcher 2 as Zelda: Twilight Princess does next to Uncharted 3. It's the exact same quality difference in your eyes. Sorry, but 99% of people will not see it that way. The Witcher 2 is unquestionably beautiful, with amazing IQ, and is a step in the right direction, but IMO not a "Leap", just a nice big step.
 