
Are current PC games a full "Generational Leap" ahead of current console games?

I'm in this boat as well. Granted, I'm usually a bit late to the party on the graphical powerhouse games (so I haven't played The Witcher 2 or BF3 yet), but I feel like most PC ports right now are very much the equivalent of the early 360 HD-ified ports. I love the benefits of current games with improved IQ and framerate, but I fully expect developers to be pushing consoles far beyond that in five years (albeit probably at lower fps/IQ standards). I mean, look at what they're doing now with nearly six-year-old console hardware.

There is no doubt in my mind that most developers are being held back by current console hardware.

Well, yes. Most of these games have been designed to run on the antiquated hardware seen in the PS3 and 360. A limiting factor in its own right.

The hardware they are running on under the best of conditions, though, is a lot more than a generational leap ahead. The engines they are running on are capable of much more than either the PS3 or 360, but are still designed to run and look serviceable on them, which is why even under the best of conditions the games end up looking like they are 90% there. Just wait until all engines are designed to use tessellation efficiently, with polys seamlessly added or removed as the scene warrants it.
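That "polys added or removed as the scene warrants" idea can be sketched in a few lines. This is a purely illustrative toy, not code from any real engine; all names and constants are made up. It picks a tessellation factor from camera distance, so near patches get dense geometry and distant ones collapse to a handful of triangles:

```python
def tess_factor(distance, near=1.0, far=100.0, max_factor=64, min_factor=1):
    """Pick a tessellation factor from camera distance (illustrative only).

    Patches at `near` or closer get the full `max_factor` subdivision;
    patches at `far` or beyond collapse to `min_factor` (flat quads).
    """
    if distance <= near:
        return max_factor
    if distance >= far:
        return min_factor
    # Linear falloff between the near and far thresholds.
    t = (distance - near) / (far - near)
    return max(min_factor, round(max_factor * (1.0 - t)))

def triangles_for_patch(factor):
    # A quad patch tessellated at `factor` yields a factor x factor grid
    # of cells, with 2 triangles per cell.
    return 2 * factor * factor
```

Real hull/domain shaders compute this per patch edge on the GPU, usually from smoother screen-space metrics, but the principle is the same: spend triangles only where the camera will actually see the detail.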

The shader-based tech is honestly well known enough that no big restructuring is needed to achieve optimal use going into "next gen". Shader libraries have never been as full and well documented as they are now. But using tessellation well is the new shiny thing.

All other areas have seen drastic improvement too. Lighting on high-end rigs is already well over a generation removed. Global illumination isn't exactly a fairy-tale, prebaked thing anymore. Shadows have precision that can border on reality under peak conditions.

And still, even with all of this tech coming easily to the new hardware, these engines are designed to run on just about anything. The hardware is most assuredly "next gen". The engines still have a little upgrading to do.

Next gen consoles, using hardware releasing this year, are going to be so far advanced technically that comparing them to the PS3 or 360 won't be entirely fair. Extreme differences will be hard to ascertain early on, but as engines are upgraded and refined to take advantage of the tech, we will see stuff in realtime rendering that would have been fiction a generation before.

That's if anyone is willing to invest the money to really push them.
 
Frame rates have little to do with generational leaps.

The things that matter are stuff like lighting, shadows, textures, shader techniques, etc.

not in my book.

framerate is the most important.

i don't care how good the game looks ...if its running at a crap framerate its not worth a damn to me.
 
Frame rates have little to do with generational leaps.

The things that matter are stuff like geometry, lighting, shadows, textures, shader techniques, etc.

That's the core of the graphics.

Then you can display it at different resolutions and at different framerates. But a 480p game at 30fps could still be a next gen game.
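A quick back-of-envelope shows why that tradeoff is plausible: pixel count scales the per-pixel workload directly, so dropping resolution frees a large multiple of shading budget. The "equal GPU cost" framing here is an illustrative simplification, not a profiler measurement:

```python
def pixels(width, height):
    # Total pixels shaded per frame at a given resolution.
    return width * height

def per_pixel_budget_ratio(res_a, res_b):
    """How much more per-pixel work res_a affords vs res_b,
    assuming (simplistically) the same total GPU cost per frame."""
    return pixels(*res_b) / pixels(*res_a)

sd = (640, 480)          # 480p: 307,200 pixels
hd = (1280, 720)         # 720p: 921,600 pixels
full_hd = (1920, 1080)   # 1080p: 2,073,600 pixels
```

By that arithmetic, a 480p frame affords 6.75x the per-pixel shading work of a 1080p frame at equal cost, which is exactly the headroom the "more effects at lower resolution" argument leans on.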

So Mario Galaxy could be a next gen game?
 
Question, just for discussion's sake: which is more impressive on PS3 from a technical standpoint: Rage (60 fps with minor tearing) or Killzone 3/U3/GOW3 (25-30 fps)?

PC gaming really made me respect what RAGE does on consoles, and I would probably prefer the choices they made resource-wise over the Sony 1st-party titles. My opinion is that all of those nice sparks and fire effects we still get on consoles aren't very effective when they are sputtering out at sub-30 fps.

GOW 3 generally runs at 45-60 fps and is also the best-looking game on console.
 
I like the part where he pretends that texture-resolution, shader-precision, LOD distance and post-processing are identical between PC and consoles and it's just a matter of resolution and aliasing.
 
Frame rates have little to do with generational leaps.

The things that matter are stuff like geometry, lighting, shadows, textures, shader techniques, etc.

That's the core of the graphics.

Then you can display it at different resolutions and at different framerates. But a 480p game at 30fps could still be a next gen game.

So if a developer like Naughty Dog dropped the resolution for their new game to 480p but threw in much more effects, then ran it at 22fps like Square does for FF for some reason, and it looked next generation, then it's a "next generation" game?

Everything is part of the equation.
 
I think what he means is that nobody would mistake Uncharted 3 for a PS2 game if it was rendered at 480p, not even if you cut textures and shader precision down as well. It's still well beyond the PS2's capabilities.

I expect a lot better from the next generation than simply today's games with better clarity. I want a game that even if rendered at 720p would clearly be beyond what PS360 can handle.

And looking over the Witcher 2 screens, a huge part of it will be materials. Fabrics need to look like fabrics, leather needs to look like leather and skin needs to look like skin. The Witcher 2 is impressively detailed, but it's all clearly painted on.
 
So if a developer like Naughty Dog dropped the resolution for their new game to 480p but threw in much more effects, then ran it at 22fps like Square does for FF for some reason, and it looked next generation, then it's a "next generation" game?

Everything is part of the equation.

So Mario Galaxy could be a next gen game?

I think what he meant is how the image is rendered, as opposed to how many pixels are in the image itself. Simply put, a movie with CGI, or even a normal one, looks far more realistic at 480p than the highest-resolution games of today because of the intricacies of lighting, texturing, shading and geometry... (mostly the lighting and shading, which change the deal altogether).
 
PC hardware is way ahead of current home consoles in pretty much every imaginable way.

PC games are not.

The problem is that, despite what some may tell you, most PC gamers don't really have a high-end gaming PC; they have some HP or Acer that works well for doing some work, surfing the net and checking e-mail.

A minority will have a proper mid-range or higher gaming PC.

In both cases the machine will be a few years old on average, meaning it's already outdated compared with what's currently available on the market.

PC devs create their games with that in mind. The result for those with really high-end gaming PCs is that games aren't really taking advantage of their machines and all they can do is crank up the resolution/AA and get smoother framerates.

The high budgets needed to create truly next generation games also depend on multiplatform development these days, which means current consoles set the standard, so you won't see an overall improvement on the PC side until next gen home consoles are on the market (and it will probably take some time after that).

As for your question, a "generational leap" is defined by overall improvements that let you have better experiences or do things that weren't possible before, which usually includes graphics too.
 
I don't have anything particularly fancy (i5 760 and gtx 460, both oc'd maybe 15%), but I started out this generation on the Wii and 360 and now have a very hard time going back. It's a generational leap for me in that I wonder why I'm wasting my time on a console game when it would be so much better on my PC. BF3 looks much better than anything I've seen on the 360 and, though it probably goes without saying, various Dolphin-emulated Wii games have been fantastic. I did get The Witcher 2, and judging by the early bits it is the prettiest game I've ever seen, but I couldn't get more than an hour or two in because I just didn't like the fighting. I had a hard time playing through Skyward Sword on the Wii, and a lot of the games I've got on the 360 just seem really blurry. I don't mean to dump on the system and say that it was always that way; I thought it was beautiful when I got it, but to my eye it's clearly way behind a lot of the things I'm playing on the PC.

For my part, I'd be pretty happy if the next Nintendo system can do what Dolphin can do, and if the next Xbox can give me something like BF3 or Witcher 2 on my computer. It's a generational leap for me, at least, though I'm not going to object if next-gen console games don't just pick up higher resolutions and whatnot and instead give me a reason to upgrade to ivy bridge or whatever comes after.

Edit: I should probably add that, despite not having a very impressive processor or graphics card, I did splurge this year on a nice big SSD. That's a next-gen experience right there.
 
I think what he meant is how the image is rendered, as opposed to how many pixels are in the image itself. Simply put, a movie with CGI, or even a normal one, looks far more realistic at 480p than the highest-resolution games of today because of the intricacies of lighting, texturing, shading and geometry... (mostly the lighting and shading, which change the deal altogether).

This is what I meant earlier! God of War 3 in SD does not look like it was possible on a PS2. That's what next gen meant to me. But according to people in this thread (who admittedly know PC hardware better than I do), we aren't going to get that. We're going to get The Witcher 2 on PC. And if I read correctly, it's not even going to be as good as that. That fucking sucks. So yeah, I still stand by my opinion that PCs are a generation ahead. Hopefully tessellation takes off and we still get that new gen "wow" factor.
 
Really now?

(More here from our very own Jediyoshi:
http://dolphinsnacks.com/screenshots/supermariogalaxy/)

yes, really
Now this is a game with simple cartoon graphics, which makes it easy to hide the hardware's weaknesses. It's a perfect example of developing with the hardware in mind: going for a very simple, flat, almost untextured look, using mostly basic shapes, etc. Like Zelda WW, this is a game that will never look old and especially benefits from a perfectly clean look.

It's very pleasing, but it's obviously a polished-up last-gen game; Banjo Nuts and Bolts or Ratchet & Clank are clearly different.
 
What have some of you hardcore PC-only gamers been doing for the past thirty years? Unable to deal with jaggies or slightly muddier textures? Have you guys just started gaming in the past few years? Have you been playing games and just shaking at the amount of jaggies and lack of full-res textures?
 
What have some of you hardcore PC-only gamers been doing for the past thirty years? Unable to deal with jaggies or slightly muddier textures? Have you guys just started gaming in the past few years? Have you been playing games and just shaking at the amount of jaggies and lack of full-res textures?

I'm going to play the numbers and say that there's a pretty high chance I've been playing games for longer than you. During that time I've watched standards for visual fidelity change continuously, and during large parts of that time the shifting standards were directly responsible for huge technological shifts and improvements which benefited me as a gamer.

So I'll turn it around and ask you in turn: how does it hurt you in any way that we have higher standards than yours? No great journey of discovery and illumination begins with contentment; you should be glad that we're actually demanding more so that you can remain complacent.
 
not in my book.

framerate is the most important.

i don't care how good the game looks ...if its running at a crap framerate its not worth a damn to me.

This. I don't care if you can pull off a benchmark picture that looks great but runs like shit. PC hardware allows for higher-rez, better A.I., 60FPS, etc. The only reason PC games are not blowing console games out of the water is that most PC games nowadays are console ports, designed to run on 6-year-old hardware. Try The Witcher 2 to see how all games would look if they were designed for PC in the first place.


What have some of you hardcore PC-only gamers been doing for the past thirty years? Unable to deal with jaggies or slightly muddier textures? Have you guys just started gaming in the past few years? Have you been playing games and just shaking at the amount of jaggies and lack of full-res textures?

Except for the last couple of years, I was playing games which were getting more and more complex every year, if not every month. So yeah, I had muddier textures, but I understood this was the first and only time I would see them; more processing power means better textures, etc. Now that has stopped. DA2 has shit PS3/X360 textures that hurt my eyes; Skyrim has shit armor textures that hurt my eyes. It hurts even more because I know the PC is fully capable of having high-rez textures, yet devs are too lazy to create them most of the time. Other times they will simply lock them down so as not to create a huge disparity between versions.
 
I'm not going to come at you; with eyesight that bad, I'd be afraid you wouldn't get out of the way in time.

What are you gonna tell me, that the cinematic moments in The Witcher 2 that factually animate like shit and have sub-amateurish direction are better? Or are you gonna complain about soupy textures? Or are you gonna post a picture of Dungeon Siege 3 again?
 
I can't tell you how good it feels to run games at 1080p with a fluid framerate. I upgraded my PC halfway through this gen and I'm still getting an 'Oooooh!' moment every time I fire up a new game and start moving around. That fidelity in motion alone is a huge jump from my 360 / PS3.
 
It may to some people, and to others it won't. 720p and below tarnishes the image so much that no matter what's going on behind that awful IQ, it's hard to look past the blurriness once you've become accustomed to higher resolutions. Downplaying the importance of IQ is silly since it affects everything.

If there was a console out right now that was outputting games at 1080p and even just 30fps, a lot more members of GAF would willingly admit just how important resolution is. But most don't have access to hardware that can consistently do 1080p, so it's easier to just say it doesn't matter.

That's bull; the 720-1080 difference is not that big of a deal.
 
What are you gonna tell me, that the cinematic moments in The Witcher 2 that factually animate like shit and have sub-amateurish direction are better? Or are you gonna complain about soupy textures? Or are you gonna post a picture of Dungeon Siege 3 again?

I haven't played it, but your assessment is sound.

But that's not exactly tech. Artistically, something with an insane budget and backing from a big manufacturer, with plenty of time, is going to be doing things with the hardware that are very impressive.

Sony Santa Monica is that kind of studio. Big budgets, great talent. But the hardware they are pulling it off on is, on paper at least, a generation removed. What games for PCs tend to lack is the budgets to go that extra mile and, though it occasionally pains me to say, the talent of such a studio.

But that veers back into the subject of artistry. Console budgets, especially for manufacturer-owned software developers, tend to be much higher than your average studio's. They've got the money to go that extra mile, or ten. Look at the stuff Nintendo devs can produce on GCN-era tech!

But that doesn't mean the hardware isn't dated.
 
What have some of you hardcore PC-only gamers been doing for the past thirty years? Unable to deal with jaggies or slightly muddier textures? Have you guys just started gaming in the past few years? Have you been playing games and just shaking at the amount of jaggies and lack of full-res textures?
It's not like PC gamers have only been enjoying comparatively superior IQ for the past 5 years; it's a long-standing benefit of PC gaming for good reason.

"Do you know that computer's resolutions are so high because they use screens three or four times as good as a television? You can't get 1024x768 with 24 million colors on a TV, let me tell you. If you tried, it would flicker horribly. The best you could do is 640x480"
-circa 1996
 
It's not like PC gamers have only been enjoying comparatively superior IQ for the past 5 years; it's a long-standing benefit of PC gaming for good reason.

Totally agree; I've always been an avid PC gamer, especially during the lulls between console cycles. But I'm mainly speaking to those that are actively against anything that is not the pinnacle of PQ perfection. Turning away gaming experiences because of a lack of AA or lower-resolution textures is absurd.
 
1) PC hardware allows for higher-rez, better A.I., 60FPS, etc.
2) The only reason PC games are not blowing console games out of the water is that most PC games nowadays are console ports, designed to run on 6-year-old hardware.

1) When has a PC version of a game shown better AI than other available versions?

2) What some of you guys don't understand is that no one is arguing otherwise. PCs could, but they don't, or very rarely do.
Yes, that's because games are designed to run on old hardware... and that's not necessarily a console; that's the average PC most people use.
 
Totally agree; I've always been an avid PC gamer, especially during the lulls between console cycles. But I'm mainly speaking to those that are actively against anything that is not the pinnacle of PQ perfection. Turning away gaming experiences because of a lack of AA or lower-resolution textures is absurd.

I agree with that, at worst it's distracting, no reason to avoid a game (in most cases). Though that's not to say I'm understating the benefits of better image quality, which I very much prefer having.
 
I haven't played it, but your assessment is sound.

But that's not exactly tech. Artistically, something with an insane budget and backing from a big manufacturer, with plenty of time, is going to be doing things with the hardware that are very impressive.

Sony Santa Monica is that kind of studio. Big budgets, great talent. But the hardware they are pulling it off on is, on paper at least, a generation removed. What games for PCs tend to lack is the budgets to go that extra mile and, though it occasionally pains me to say, the talent of such a studio.

But that veers back into the subject of artistry. Console budgets, especially for manufacturer-owned software developers, tend to be much higher than your average studio's. They've got the money to go that extra mile, or ten. Look at the stuff Nintendo devs can produce on GCN-era tech!

But that doesn't mean the hardware isn't dated.

Everything you said is true. Frankly, my post there was really unfair to CDP; they're a studio with many limitations, and what they pulled off with The Witcher 2 is really remarkable. I still stand by what I said about GOW3: it has the best visual execution I've seen yet. Sure, a lot of that is due to SSM employing artists that come from Hollywood and whatnot, but like I said before, artistry is also the main reason TW2 looks as good as it does (and why something like Two Worlds 2 looks like crap).
 
1) When has a PC version of a game shown better AI than other available versions?
Exactly; the bottleneck there is in programming, and in developers' willingness to spend resources on A.I., much more than in hardware power.
 
I'm mainly speaking to those that are actively against anything that is not the pinnacle of PQ perfection. Turning away gaming experiences because of a lack of AA or lower-resolution textures is absurd.

I agree that IQ isn't as important. JMHO, but Skyrim on PS3 (regardless of the save file bug) and lots of console games generally are not even running a steady 30 fps. I don't know much about game development, but if there was a way to sacrifice some of the visual effects for a constant 28-30 instead of the mid and low 20s like Skyrim and other open worlds, I'd be all for it.
 
What are you gonna tell me, that the cinematic moments in The Witcher 2 that factually animate like shit and have sub-amateurish direction are better? Or are you gonna complain about soupy textures? Or are you gonna post a picture of Dungeon Siege 3 again?

:lol

I actually don't remember much about the cinematics in The Witcher 2, but I do remember playable sequences like the opening of act II, which factually animates hundreds of characters gorgeously in one of the most impressive environments in gaming. But that's really beside the point.
 
qft

they're just trying to justify their two thousand dollar rigs

I'm definitely not one of them.

My main PC still runs on an integrated card not suitable for KotOR. Currently I'm mainly using a 2002 laptop. It has a few issues running modern high-res vids. In fact, anything over 360p seems to run like crap on it.

I'm just a techie.
 
qft

they're just trying to justify their two thousand dollar rigs

To be fair, you could say similar of people who downplay the gaming capabilities of a high-end rig. I've had one for a few years (mid-range now), and the bump in visual quality in multiplats, coupled with their inexpensiveness, has been worth the investment, especially now that it seems the next generation won't get into gear until 2014.
 
Ignore him, Thunder Monkey. Snuggler is always trolling PC threads.
I'm willing to forgive him because we've had mischievous fun in other threads.

I just want to make the point that it isn't some kind of delusion. Both the PS3 and 360 have a little more in common with modern hardware than the Wii, but PC tech has gotten that much more powerful than those consoles. In a pure brute-force sense, orders of magnitude more powerful. A generation removed, you might say.

I tend to prefer the finesse of console hardware overall, but I look at the power under the hoods of modern PCs and think "Jesus, even the WiiU's 2009 variant is going to blow away what the PS3 and 360 can do."

And whatever MS and Sony are using will be variants of more modern hardware, probably with more than a few bells and whistles that the WiiU lacks.

Engines being used right now in PC dev are still tailored to run on that older hardware, but once they've been given all the freedom in the world to indulge in that tech, we are going to see some very impressive things.

edit: Hell, we already have. That's with the tethers to an older generation of hardware.
 
Whenever I'm playing a console game I can't help but notice the horrible jaggies. It's so noticeable after spending time with a gaming PC.
 
the thing i notice the most with gaming on a console is the shit framerate and screen tearing.

very hard to get used to after playing @ 60 fps/vsync'd on pc.
 
1) When has a PC version of a game shown better AI than other available versions?

2) What some of you guys don't understand is that no one is arguing otherwise. PCs could, but they don't, or very rarely do.
Yes, that's because games are designed to run on old hardware... and that's not necessarily a console; that's the average PC most people use.

1. It could. With a PC you can freely expand processing power; with consoles you are working with a FIXED resource system: put in better A.I., scale down the graphics, etc.

2. No, games designed for PC are never designed for the average Joe and his PC. Sure, they run OK on most 2-year-old rigs, but the main draw is always cutting-edge tech features. Nobody will sell you a PC telling you "you will be able to play games 2 years from now!"; they will say things like "you want the best you can get NOW? Buy this PC!"

qft

they're just trying to justify their two thousand dollar rigs

I can spin this around: stop insisting that console games are not inferior and that PC games couldn't be much better looking, just because you don't have the money to buy a gaming rig. If I'm paying $2,000 for my PC, I want games to reflect that. In most cases they don't, as they are programmed for your $249 box.
 
I just realized from looking at the "Hi-res pc screenshot" thread that a lot of PC-gaf have console-exclusive game avatars. So this idea that PC-gaf is out to make all console games sound like "crap" is a bit of a hyperbole.

I can spin this around: stop insisting that console games are not inferior and that PC games couldn't be much better looking, just because you don't have the money to buy a gaming rig. If I'm paying $2,000 for my PC, I want games to reflect that. In most cases they don't, as they are programmed for your $249 box.

You got trolled. . .Snug is a PC gamer.
 
I just realized from looking at the "Hi-res pc screenshot" thread that a lot of PC-gaf have console-exclusive game avatars. So this idea that PC-gaf is out to make all console games sound like "crap" is a bit of a hyperbole.

Indeed it is. I do almost all my gaming on PC yet a console game was my favourite game last year.
 
I'm going to play the numbers and say that there's a pretty high chance I've been playing games for longer than you. During that time I've watched standards for visual fidelity change continuously, and during large parts of that time the shifting standards were directly responsible for huge technological shifts and improvements which benefited me as a gamer.

So I'll turn it around and ask you in turn: how does it hurt you in any way that we have higher standards than yours? No great journey of discovery and illumination begins with contentment; you should be glad that we're actually demanding more so that you can remain complacent.

Perfect.

I'm not going to come at you; with eyesight that bad, I'd be afraid you wouldn't get out of the way in time.

Love it.
 
I don't think I could handle the next generation of consoles to be underwhelming in power. We'd be stuck with "what if this were made for better hardware?" posts for years again.

I hope at least one of the manufacturers just goes balls-to-the-wall and crams some good fucking tech into the next round of consoles (unlikely). Or that developers migrate to a more powerful platform (unlikely).


pc junkies make this thread pretty much unbearable if you have any difference in opinion.
But opinions can be wrong.

That's bull; the 720-1080 difference is not that big of a deal.
case in point.

:p
 
case in point.
Deep down in your heart you know this to be true.

For more than 10 years, PC games have been able to sport high resolutions, AA and AF.
So why are most shots in the high-res PC gaming thread of recent games?
Why are there maybe 3-4 games that dominate the thread?

Resolution, AA & AF aren't everything. Polys, effects and all that other jazz that allows art to shine through are the real measure of graphics.

Also, LOL at the continued implication that PC gamers are the objective ones because they "also play on consoles, unlike the console people", conveniently ignoring the dissenters with big gaming PCs.

Again: it's a difference in opinion; some value IQ above most things and others value higher polygon counts.
The best case that has been made in this thread for a generational leap of a game is Shogun 2.
 
Objectively, multiplatform games on PC just look better. But I'm looking at rationalising my setup this year, and frankly the gaming PC is first on the hit list to go. The drop in quality will be noticeable, but it's something I can live with, as the experience is pretty similar otherwise. And I'm surprised that I'm saying that, as I love graphics as much as the next man.

The only thing making me pause is the thought that next-gen consoles won't be out until Xmas 2013 or even later. What's acceptable now might start feeling really long in the tooth in a year or so.
 
Deep down in your heart you know this to be true.

For more than 10 years, PC games have been able to sport high resolutions, AA and AF.
So why are most shots in the high-res PC gaming thread of recent games?
Why are there maybe 3-4 games that dominate the thread?

Resolution, AA & AF aren't everything. Polys, effects and all that other jazz that allows art to shine through are the real measure of graphics.

Also, LOL at the continued implication that PC gamers are the objective ones because they "also play on consoles, unlike the console people", conveniently ignoring the dissenters with big gaming PCs.

Again: it's a difference in opinion; some value IQ above most things and others value higher polygon counts.
The best case that has been made in this thread for a generational leap of a game is Shogun 2.
The first statement is correct. The second isn't. They're all part of the equation; you can't ignore one or the other.

Super Mario Galaxy on Dolphin easily passes as a current-generation game; Mario Party 9 on Dolphin doesn't. Uncharted 3 at 1080p and 60fps would be up there with the best games available on any platform; Bioshock at 480p looks like shit, etc.

You have to look at the total package, including IQ and framerate, but also polycount, shaders, lighting, textures, etc.
 