Are current PC games a full "Generational Leap" ahead of current console games?

The tech is definitely there, but this generation the publishers decided that releasing identical games across as many platforms as possible was the best way to go. I personally think the big publishers trying to force platform homogenization was one of the major reasons they lost so much money this generation, so hopefully it'll be abandoned going forward, but I kind of doubt it. My opinion on the OP's question is that there already exist PC games that can't be done on consoles, but in the genres that are popular on consoles the games haven't moved much past the console baseline, because the publishers don't think they need to improve the PC versions much at all in order to get good enough sales out of them.

There are plenty of PC-exclusive games.
 
Whether 4 or 5 games today are actually a generation ahead is more a matter of opinion at this point. But what is undeniable is that the difference between console and PC games today is nowhere near as big as it was at the end of the previous gen. For instance, RAGE (despite how pretty its art looks) is nothing like what DOOM 3 was when it released. Even when Crysis was released the distance was still pretty big, but we haven't gotten much further ahead of Crysis in four years.

In a hypothetical scenario in which Crysis 2 had been made under the original's philosophy, and, as with the original, only the 2010 equivalent of an 8800GT could run it, you would get a game that is undoubtedly a gen ahead. Such a game was never made.
On the other hand, good IQ has never been cheaper. 1080p/4xAA/60fps isn't just for enthusiasts anymore. That's the trade-off.
 
It's really tough to say. On one hand, the games definitely look similar since, well, they are essentially the same game. But when you factor in 1080p, AA/AF, 60 FPS and scale (i.e. BF3), then the console versions are laughable in comparison.
 
Dirt 3, lossless PNGs, "console" shot at 720p/medium settings upscaled in Photoshop:

[Screenshot: dirt3_game2012-01-100grjpb.png (x~30)]

[Screenshot: dirt3_game2012-01-100baksq.png (x60)]
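
If anyone wants to script that last step instead of doing it in Photoshop, here's a minimal sketch using Python with Pillow; this is just my own rough equivalent of the workflow described above, the filenames are placeholders, and Lanczos is only one reasonable choice of upscale filter:

from PIL import Image  # Pillow

# Placeholder filenames; point these at your own lossless captures.
console_like = Image.open("dirt3_720p_medium.png")   # 1280x720 "console settings" shot
pc_shot = Image.open("dirt3_1080p_max.png")          # native 1920x1080 PC shot

# Upscale the 720p shot to 1080p so both images are compared at the same size.
upscaled = console_like.resize((1920, 1080), Image.LANCZOS)

# Save as PNG so no compression artifacts sneak into the comparison.
upscaled.save("dirt3_720p_medium_upscaled.png", "PNG")

Open the upscaled file next to the native 1080p shot and you're comparing like with like, the same way the shots above were prepared.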
 
I have a gaming-capable PC. i5 and 6950 and all that. Not an enthusiast kind of rig, but games run well. I consider myself 'informed' about the subject. And I agree that there is generally not a huge difference in terms of immersion.
How can immersion be measured? For me, immersion breaks when the frame-rate drops or prominent textures blur in the distance due to no filtering. A PC fixes those issues, giving me, personally, a smoother and much more immersive experience, especially if I include something like 3D Vision or hardware PhysX.
 
It's hard to say there's a generational leap when 99% of games are developed with the consoles in mind, which limits how much time is spent on the extra bells and whistles for the PC version, which are just that, bells and whistles. They make the game look A LOT better in most cases, but it's not a generational leap when you're still basically stuck with the same models as the console versions. Like it's been said, polygons are always the biggest thing people notice in a generational jump because of the detail they add. The extras on PC like better draw distance, LODs, shadows, lighting, etc. all contribute to a better-looking game, but nothing that is truly a generation ahead.

I play games on my PC and love the added detail but I can't say it's a generation ahead. The only game that really made me even possibly consider that was The Witcher 2 and that shit is just insane. Metro 2033 is also a game that blows me away in some areas compared to others.

I look at games like BF3, Metro 2033, TW2, modded Skyrim, etc. as a good indication as to where we're heading with the next consoles but not the best that they'll be able to do.

Is a generational leap possible if a huge team took on a PC exclusive that only catered to the stronger PCs out there? Hell yeah, but is that going to happen? Nope. That's why it's not a leap ahead in the end: games cater to the consoles first and don't put full resources into the PC version, and even the PC exclusives don't want to limit their market, so they make the engine flexible and appeal to a broad audience, which once again limits the time spent on the higher end.

More post processing is the trend right now. :P
 
On the other side of the spectrum, I've seen folks who are more along the lines of "meh... sure games look great at 1080p, 60 fps with tons of AA, but the difference in the overall visual package isn't that huge. Comparing a game like, say, Uncharted 3 to the Witcher 2 doesn't really yield that huge of a difference in the overall visual package."

These people are wrong, blind, or in denial.
 
Most games are aimed at consoles so there really aren't many big budget PC games that push the hardware like Crysis did back in the day.

The Witcher 2 is a good example of a stunning game on a smallish budget that showcases the PC.
 
PC games are able to run at much higher resolutions and much better frame rates. That doesn't mean they look next gen. The assets are still based on current gen console titles for the most part.
 
It's important to keep in mind that graphical potential isn't just limited by hardware. It already takes a significant amount of time and resources to create art and animation for current-gen games. While the latest PC hardware is certainly capable of producing revolutionary visuals that blow consoles out of the water, the cost of creating the necessary assets would be very steep and ultimately counter-productive. Publishers have repeatedly shown that they want shorter dev cycles and more potential buyers. This means less time creating assets and more time ensuring that those assets can be seen on the maximum number of platforms.

Games are never going to reach the cutting edge of technology because getting there is too expensive.
 
My kind of topic.

Plunged in in December, built a variant of the 1500 USD PC from the PC gaming thread, and to me there is no generational leap between consoles and PC. Especially as soon as you take anything that isn't textures and anti-aliasing out of the equation.

As I said in a similar thread before this: if the generational leap is going to be The Witcher 2 medium settings vs. The Witcher 2 ultra settings, I'm going to be so disappointed in Xbox Next and PS4.
 
There are downsides to L.A. Noire's facial animations. They are pretty low resolution, and when seen in a high resolution environment, they look like videos of actors being played on a 3D mesh. They're also capped at 60fps.

That kind of tech is promising, however, and I'd bet storage space was one of the main reasons they are not as high resolution as they could be.

I was disappointed by the PC version of L.A. Noire, because the facial animation FMVs look pretty bad when the rest of the game is running at 1920x1200 and looking pretty sharp. I started the game on the PS3 but shelved it when the PC version got announced. Honestly, since the game was so low-res and blurry on the PS3, the face tech looked better for it.
 
If the next generation is defined by IQ improvements, then what's going on in PCs right now is representative of next-gen.

But if you're looking for a true next-gen look beyond IQ improvements, I think the UE3 Samaritan demo is probably the benchmark. And I'm not sure many high end PCs can run that in real-time right now.
 
Seeing as I play both PC and console games on a 1080p TV, I would like to see a 1080p PC shot at maxed settings vs. a 720p console shot stretched to fill 1080p, as it is on my TV. Also, lossless screens are a must.

When making the comparisons though, I hope you keep in mind that a 720p image sitting two meters away from you doesn't look like a 720p image stretched to fill the monitor two feet in front of you. It's also worth noting that if JPEG compression and being rendered at 720p is all it takes to mask the differences between two versions of a game, then I don't see how anybody could possibly argue that the PC version was "next gen". Like, if you blew screenshots of Modern Warfare: Reflex up to 720p, people aren't going to suddenly confuse it with the 360 version.
 
My kind of topic.

Plunged in in December, built a variant of the 1500 USD PC from the PC gaming thread, and to me there is no generational leap between consoles and PC. Especially as soon as you take anything that isn't textures and anti-aliasing out of the equation.

As I said in a similar thread before this: if the generational leap is going to be The Witcher 2 medium settings vs. The Witcher 2 ultra settings, I'm going to be so disappointed in Xbox Next and PS4.

People always mention The Witcher 2, but I think it is not the best example when talking about tech. The game is DX9, and it's only the best-looking PC game because of its superb art direction and beautiful textures. Crysis 2, on the other hand, does all kinds of crazy effects that no other game has.
 
If the next generation of consoles' launch games only rival the best-looking PC games of today, I will be disappointed.

Next-gen consoles should surpass anything the PC can throw at them today.
 
If the next generation of consoles' launch games only rival the best-looking PC games of today, I will be disappointed.

Next-gen consoles should surpass anything the PC can throw at them today.

Why would they? Where is this magical, futuristic hardware coming from that PCs don't have access to?
 
As I said in a similar thread before this: if the generational leap is going to be The Witcher 2 medium settings vs. The Witcher 2 ultra settings, I'm going to be so disappointed in Xbox Next and PS4.

To be fair, that kind of leap would be TW2 medium, 720p and no AA/AF vs. TW2 Ultra, 1080p with 4xAA/16xAF or similar, with 3D and other goodies. From a technical perspective that's nothing to scoff at. Like I've speculated before, above all else the next generation is going to bring about better image quality. That's what hardware power will be spent on; the assets will benefit greatly because of it, but I'm betting some people will be disappointed, especially those with high-end systems.
 
If the next generation of consoles' launch games only rival the best-looking PC games of today, I will be disappointed.

Next-gen consoles should surpass anything the PC can throw at them today.

Partially agree. But the ports of those best-looking launch titles will run easily on current PC hardware (unless we are talking about 2015).
 
Sorry folks, PCs are already a generation ahead of what the next consoles look like they'll end up being. It's the nature of the business model that forces consoles to fit within a certain budget, form factor, and thermal limits. They'll look good, but even by the time the console launches the hardware inside will be over a year old. By the time devs have the ability to take advantage of the console hardware, PC tech will be several years ahead at the budget price point, let alone the higher-end machines.

It's not like any of this matters though, we're all free to make our choices.
 
People always mention The Witcher 2, but I think it is not the best example when talking about tech. The game is DX9, and it's only the best-looking PC game because of its superb art direction and beautiful textures. Crysis 2, on the other hand, does all kinds of crazy effects that no other game has.
I chose The Witcher 2 because it has dominated the "high-res PC games" thread for months.

I got it in the gog.com sale, put everything on the highest setting except for "Ubermode" and proceeded to not be impressed. In stills it looks much better than in motion, as the animation and cinematography aren't AAA-budget. Which is understandable, but it seems like a certain number of people don't think animation/animation blending matters.
Then you have clipping and dithering on everything.

I need to check out Crysis 2 then.

SparkTR said:
To be fair, that kind of leap would be TW2 medium, 720p and no AA/AF vs. TW2 Ultra, 1080p with 4xAA/16xAF or similar, with 3D and other goodies. From a technical perspective that's nothing to scoff at. Like I've speculated before, above all else the next generation is going to bring about better image quality. That's what hardware power will be spent on; the assets will benefit greatly because of it, but I'm betting some people will be disappointed, especially those with high-end systems.
I'm sure it's technically demanding, but I don't think all those things translate into great results automatically. You could (and I've done it) change people's config from 16xAF to 8xAF (you could go lower, I just did that), ask them to guess which AF level it's running at (two options), and the guessing was in line with chance.
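
For anyone who wants to run that kind of blind check a bit more formally, here's a rough sketch in Python of how the tallying could work; it's not anyone's actual test setup, and the two levels are just the ones from the paragraph above:

import random

# Blind AF comparison sketch: an operator sets the game's AF level as instructed,
# the viewer guesses, and we tally how often the guess matches.
LEVELS = ["8xAF", "16xAF"]

def run_trials(n=10):
    correct = 0
    for i in range(1, n + 1):
        actual = random.choice(LEVELS)
        # Only the operator reads this line and changes the setting.
        input(f"Trial {i}: set the game to {actual}, then press Enter... ")
        guess = input("Viewer's guess (8xAF or 16xAF): ").strip()
        correct += (guess == actual)
    print(f"{correct}/{n} correct; pure guessing would average {n / 2:.1f}.")

if __name__ == "__main__":
    run_trials()

If the hit rate hovers around half, that's the "in line with chance" result I'm describing.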

Now give me higher object density, more players, better animation and I'm in. Whether some texture in the far periphery is a little less blurred, I'm just not noticing.

I think Battlefield 3 on PC vs. consoles makes a much better argument for next-gen, because it translates into what I consider real limitations.
But still, I would peg BF3 on PC right now as a launch title for next-gen consoles. If that kind of quality were the norm next gen, I would still be disappointed.

In the end it's a semantics argument, whether or not you consider cleaner image quality to be next-gen. From my perspective: Skyward Sword with Dolphin-level graphics and no loading times, that would be next-gen for the Wii for me.
 
Let's take Skyrim, for example. Apparently it's best on PC, and rightfully so spec-wise. But are you telling me that someone playing on the Xbox 360 will have any less of an experience overall?

Yes, but not because of the graphics. Bethesda games usually have an incredible modding community in the background. There will be so much content in the next few months, it will be incredible. And no, not all of it is Asian hentai stuff.
 
I chose The Witcher 2 because it has dominated the "high-res PC games" thread for months.

I got it in the gog.com sale, put everything on the highest setting except for "Ubermode" and proceeded to not be impressed. In stills it looks much better than in motion, as the animation and cinematography aren't AAA-budget. Which is understandable, but it seems like a certain number of people don't think animation/animation blending matters.
Then you have clipping and dithering on everything.

I need to check out Crysis 2 then.

The Witcher 2 still looks better though; Crysis 2 does not really take full advantage of its more advanced tech.
 
I think the actual hardware may well be a generation ahead (definitely if you count SLI). But I can't think of a PC game that makes the current consoles look completely dated. Especially considering some of the PS3's best. The aircraft takeoff in Uncharted 3 matches the best I've seen on PC for scale and sheer bombast.
I wonder if you would say the same thing if those PS3 games were available on PCs.
 
Yes, but not because of the graphics. Bethesda games usually have an incredible modding community in the background. There will be so much content in the next few months, it will be incredible. And no, not all of it is Asian hentai stuff.

That doesn't affect the core game. That's irrelevant.

I just don't see the justification. The PC version of Skyrim simply isn't leaps and bounds ahead of its console brothers!

To be a full generational leap, it would have to be the difference between, say, Morrowind and Skyrim. Now that's a generational leap, and that's what I expect from the next consoles. I also expect to be disappointed if PC versions of games are what next gen looks like.
 
"Uncharted 3 to the Witcher 2 doesn't really yeild that huge of a difference in the overall visual package"

A lot of times people compare Uncharted 3 on a TV 7 feet away with Witcher 2 on a monitor a foot away from their eyes.

An easy comparison is to take Witcher 2 on a modern computer at 1080p with bells and whistles and play for a bit. Then turn it down to console specs of 720p or less with little if any AA/AF and play the same bit again on the same monitor. The difference will be immediate and drastic.

I really don't see how people can't see the difference between The Witcher 2 and the Uncharted games. I've played 2 and seen some videos of 3 and wasn't impressed with the graphics at all.


Yep. Even when I go back to it, it looks as mind-blowing as the first time.

I'm currently on my second playthrough. Played it first at launch. And it's still crazy. Its worst parts look like the best parts of current-gen games, while its best parts look like a CG film.

Nope.

If you play the same game on your 1080p TV on PC and PS360, sitting 8 feet away (as I play every game), the difference is just not there. The only people who would brag about it would be PC gamers playing both at a desk, on their monitor. Nobody but hardcore PC fans play games at their desks, which is why their comments on graphics-related threads are always so baffling.

I only play PC games on my comfy couch sitting 7 feet away from my TV.

Night and day between playing The Witcher 2 and any console game.

The folds in Desmond's shirt are highlighted in different ways by the really nice self-shadowing as different light sources shine on him. I'm not sure what type of visual effect this is... it's not actually rendered shadows. It's more like an ambient occlusion map or something like that. It's probably my favorite effect in that game.

Holy shit! After just getting off The Witcher 2, this game literally does look last generation. Those textures are HORRENDOUS! So is the shitty ambience (that's what it's called, right?).

And this is from the PC version?
 
There aren't any experiences out on console or PC at this point that are completely new in all aspects of what we call games. There are many games in history that felt completely new and were more than just a generation ahead. I don't feel we are going to see any of that without developers and publishers taking more risks like they have in the past.


The biggest question I have is when we will see new techniques arrive that really change what we see, hear, or become immersed in.
 
The blade on the sword doesn't look shiny or metallic like a real sword blade. It looks like it was colored in dark grey using MS Paint.
Are you kidding? That's exactly how a real sword would look in overcast lighting. Look at the subtle shine of the fuller. It's pretty much perfect.

I really don't see how people can't see the difference between The Witcher 2 and the Uncharted games. I've played 2 and seen some videos of 3 and wasn't impressed with the graphics at all.
I agree. People talked up UC3 so much that when I finally saw it all I could see were the flaws. It lacks the depth and richness of the Witcher 2 even on modest settings.
 
oh, and in b4 inevitable "yes. /thread." and "first post nails it" "didn'treadlol.gif" etc. :p

I don't think they're a generational leap ahead, and I don't think there are many games optimized for the PC. The image quality and graphics quality are much better though, and I consider that to be a generation ahead.

What I mean is that if a game were optimized for PC, some things would be quite different: there would probably be fewer "zones" in open-world games, there would probably be more models on screen, and physics might improve since the game isn't created with consoles in mind. Of course it's impossible to tell if that would be true; Skyrim would probably still be Skyrim if optimized for the PC, it depends on the dev. You'd probably still fight the same stuff, with the same models in higher-resolution textures, through the same-looking areas.

So no, in terms of gameplay we're still in the 360/PS3 era, because they made the games with that hardware in mind.
 
I agree. People talked up UC3 so much that when I finally saw it all I could see were the flaws. It lacks the depth and richness of the Witcher 2 even on modest settings.

I remember playing Uncharted 2 in its heyday, when it was being hailed as the "undisputed graphical king". I played it, and honestly it just looked like a typical console game.

I still say that the best-looking console game I ever saw was God of War III. I also remember Metal Gear Solid 4 having some impressive moments for its time (though it doesn't hold up as well now). Though to be fair, I watched God of War III right after its release, so I may be looking back at it with rose-tinted glasses.
 
Skyrim is an interesting case because mods will bring it into the next gen. Bethesda was restricted to the limits of the consoles, but modders aren't. Not only can they pile on better textures and post-processing effects, but they can make the world more alive by cramming it full of more stuff, and have more scripting going on to add all sorts of things that would make a 360 explode if it tried to run them all.
 
Pretty damn close I'd say.

Depending on the hardware in the Loop and PS4, there might be more focus on tessellation in the engines for those platforms, which could make a huge difference if the engines are designed around it.

Overall I think some are setting expectations too high on what the hardware will realistically be able to do, versus what is cutting edge today. You guys have to realize that getting those insane increases we used to get takes more and more power. Games are already being designed with shaders approximating movie tech, or as close as realtime rendering can currently get.

Just temper your expectations, my friends. The tech going into all three is a good bit more powerful than either the PS3 or 360. It has to be, just by being newer.
 