Are current PC games a full "Generational Leap" ahead of current console games?

Come on guys, a 1:1 comparison of PC and console specs is pointless. I know some of you are either stupid or a bit dim, but even you warriors should know this.
 
Simply setting the detail to Medium in order to emulate a console is also questionable.
Agreed. It's hard to really get a feel for 20-something fps, 98% screen tearing, and pop-in that lands like a 747 with no undercarriage from a screenshot.
 
Some of you are going to be severely disappointed when next-generation console projects are shown to the public. They're not going to look better than Shogun 2, Anno 2070, Battlefield 3 or The Witcher 2 at 1080p with some decent AA and a solid framerate.
 
Generational? No.

Do PC games look significantly better when designed and backed by sufficient hardware? Yes.

Generational would be huge worlds with the fidelity of the Samaritan demo.
 
Any genre that includes complex simulation elements is a "generation" ahead of consoles. People dismiss these games claiming they are only on PC because of controller input limitations. There is more to it than that.

Shogun 2 is borderline unplayable in my opinion because of hardware limitations, even on an overclocked i7 920, 6 GB of RAM, and a 7200 rpm hard drive. It has nothing to do with graphics.

Old games like X3:TC (2008, but the engine is even older) can have many hundreds of independent AI routines controlling turrets, ships, etc. running at the same time, all while also simulating 10+ star systems in detail (wherever you have ships) and 100+ star-system economies in general.
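To give a sense of what that kind of simulation involves, here's a toy sketch of a tick-based loop updating hundreds of independent agents spread across star systems. This is purely illustrative (the class and names are made up, not X3's actual code):

```python
import random

class Agent:
    """One independent AI routine (a turret, a ship, a trader...)."""
    def __init__(self, name):
        self.name = name
        self.state = 0

    def tick(self):
        # Each agent advances its own logic independently every frame.
        self.state += random.randint(0, 2)

# Hundreds of agents spread across simulated star systems.
systems = {f"system_{i}": [Agent(f"ship_{i}_{j}") for j in range(50)]
           for i in range(10)}

def simulate_frame():
    # One simulation tick: every agent in every system gets updated.
    for agents in systems.values():
        for agent in agents:
            agent.tick()

simulate_frame()
total = sum(len(agents) for agents in systems.values())
print(total)  # 500 agents updated per frame
```

Even this trivial version shows why such games lean on the CPU rather than the GPU: the cost scales with agent count, not with rendering quality.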

Yea, maybe shooters and other console genres have hit a wall on diminishing returns, but you can't ignore the stuff that isn't even on consoles because it doesn't fit the argument.
 
I like how people are downplaying IQ and framerate. I mean, if you downplay IQ, why did you buy a HDTV in the first place? Why don't you stick to your VHS videos? Why? IQ and framerate are the most important graphical aspects in gaming

Well, I think Avatar on VHS looks better than the original Tron in 1080p.

I remember people bragging about their PC being able to run Quake 3 at 2000x1400 at 100 fps 15 years ago. Although it DID take a generational leap in processing power compared to consoles to achieve that, that's not my idea of a generational leap (or of money well spent on PC components).

IQ is important, even more so framerate, but, like one of the guys responsible for Rallisport Challenge 2 looking so brilliant once said, finding the right balance is the key: equally distributing resources across all aspects, between rendering quality and the quality and quantity of the assets being rendered, is what matters the most.
And you can't do that when you have to worry about multiple hardware configurations.

When devs have to offer roughly the same experience across all configurations and are not willing to spend resources on something few people will experience anyway, more powerful GPUs will "only" give them the opportunity to bump all those aspects that can be improved 'for free', automatically.
Conceiving graphics for more powerful hardware from the ground up is a different thing and gives completely different results.
 
Even with all the optimization in the world, the new consoles will not be running Samaritan in anything larger than the space they created for it. It took a couple of guys 8 months to make a static street with a guy smoking.

Regardless of what the new consoles have in them, the current high-end machines will be playing games from the entire next generation at at least 720p/30fps. What's even sweeter are the rumors of multi-GPU systems, which will just make the higher-end PC systems run even better than they do now, since games will be able to thread the graphics better.
 
IQ is important, even more so framerate, but, like one of the guys responsible for Rallisport Challenge 2 looking so brilliant once said, finding the right balance is the key: equally distributing resources across all aspects, between rendering quality and the quality and quantity of the assets being rendered, is what matters the most.
And you can't do that when you have to worry about multiple hardware configurations.
That's why, back in 2001, I always thought Metal Gear Solid 2 (at 480i) looked significantly better than, say, Max Payne at 1280x1024. As you say, it's all about finding the right balance.
 
No, never. I remember when HL² was released on PC, it graphically blew away pretty much everything you could buy on any console. Today? Well, yeah, Battlefield 3 on ultra looks good, but I wouldn't say console games look like they're a generation behind.
 
That's why, back in 2001, I always thought Metal Gear Solid 2 (at 480i) looked significantly better than, say, Max Payne at 1280x1024. As you say, it's all about finding the right balance.

That's not hard to comprehend; MGS2 probably had a much higher budget. People keep forgetting that no one is spending a lot of money on PC games. If someone spent 40 million on a PC game, things would become clear. With PCs there is no balance; the sky's the limit.
 
The fact that Crysis still looks technically better than any console game released is as much proof as anything that PCs were a lot more powerful even when Crysis came out.

Crysis came out in 2007.

It's quite obvious that PCs are held back because of console ports. Nobody is making graphics-pushing big-budget games for PC anymore. The last one was Crysis. Sure, there have been games released on PC since that might look better, but they didn't have the budget Crysis had.

There might have been games on consoles with better art direction, cleverer use of mirrors and smoke, and enough experience with the hardware to come close to a game that came out in 2007.

It's a miracle how efficient developers have become at milking that 7-year-old hardware.
 
Hyperbole, The Thread.


PC games look great, and consoles are holding game development back. It's not even a debate. When things like 4GB+ of RAM are becoming standard even for budget gaming rigs, games could be much more than they are now.

I'd put money on games like Skyrim being even bigger had the next gen of consoles already come out.

I'm primarily a console gamer and I'm happy with how games look. That being said, I want them to look better. I think very few people have ever played a game and said "this looks too good, I don't want an improvement".

What I'm more interested in is the improvement in scale that bigger specs can bring. Development times are a huge factor as well.

60fps
1080p
Bigger Environments
Longer Games.

That's all I want from next-gen consoles.

I would say there are many PC users with low-end graphics cards as well. Are they holding things back too, then?
 
What do people expect from next gen consoles?
If it is 1080p/60fps with added effects and AA/AF, then PC is a gen ahead.
At this point I expect IQ to still be pretty questionable on "next gen" consoles. I don't think we'll be getting 1080p with real AA in many games. Same thing for 60 FPS.
 
I would say there are many PC users with low-end graphics cards as well. Are they holding things back too, then?

They have always been there. The difference is that most PC games now are console ports. If we are lucky, we get some added effects instead of just a better resolution and AA.

Games used to be made for the best hardware out there and were downscalable for people with lesser hardware. And if at some point (usually after about 3-4 years) you couldn't run a new game even at low settings, it was time to buy a new PC or upgrade.

Now they are slightly upscaled versions of games that are designed to run on 7 year old hardware.


For instance, my current PC is about 6 years old and, depending on the game, I can still run it at high or highest settings at 1920x1080 with 4x AA. Hell, I can mod the hell out of Skyrim and it'll still be playable on my PC with graphics far beyond what you get on consoles.
 
first page pretty much nailed it all...
got nothing to add really.

So yes, I am excited about the Next-Gen when it comes....
 
That's why, back in 2001, I always thought Metal Gear Solid 2 (at 480i) looked significantly better than, say, Max Payne at 1280x1024. As you say, it's all about finding the right balance.
You're mixing things up here. Most everything that looked great in MGS2 wasn't because of the tech, but because of the budget and the artists. That's why SMG looks fantastic, even on the Wii. And you're comparing that to what? A multiplatform game, seriously?

You could well say: "That's why Crysis 1, back in 2007, looked significantly better than anything on console... and still does".
 
Yup, exactly.

Even if you get a game like The Witcher 2 on consoles, it will never, ever have the bells and whistles that make it a next-gen experience on PC.
What's the next gen experience on PC? I must be missing it. I launch the game and am greeted with a tutorial where it says [BACK_BTN_ICON] instead of the icon for the back button.
That shit is next gen, actually displaying the icon I need to press is last gen.

Or how about being transported into a Nether realm where I do a finishing move on one person with no one else visible, and then being transported back and having 4 people surround me.

Or when I run into invisible walls everywhere and have to press a button to go down a ledge, with a very stiff animation playing instead of the character adapting to the environment.

If that is next gen, please let us stay current gen forever with RDR, Batman, Uncharted, Max Payne and the likes.
 
I like how people are downplaying IQ and framerate. I mean, if you downplay IQ, why did you buy an HDTV in the first place? Why don't you stick to your VHS videos? Why? IQ and framerate are the most important graphical aspects in gaming and you are downplaying them as if they're not important: "Yeah, they look crisp and run silky smooth and there's no sign of tearing and no jaggies and textures are so vibrant... but except for those, it looks the same... well, and some other minor details, but they look the same. The SAME."

I dare say people who need glasses should throw them away; after all, seeing clearly is not that important.

Agreed. Hell, just take a look at BP101's shots comparing 1080p with 1080p w/ 4x AA and SGSSAA. One looks not so good, and the other looks nearly photorealistic (ignore the color differences). IQ makes an enormous difference.

noaakft3.png


8xqaa8xsgssaa_0x004010cg5f.png


Keep in mind, that's just 1080p compared to 1080p 4xAA/SGSSAA. Compare it with 720p instead and the difference is enormous.

All it takes is a bit of aliasing and shimmering to break the illusion. Once you play a game with SSAA or AA+SGSSAA, you understand just how good a game can look once you eliminate these flaws. Sadly, hardware is not yet powerful enough to make SSAA or AA+SGSSAA feasible in most modern games...at least not on a single GPU.

If anyone here has a moderately powerful PC with an NVidia card, do me a favor: boot up Half-Life 2 maxed out, and then boot it up with 2x2 supersampling instead (follow BP101's posts in this thread: http://www.neogaf.com/forum/showthread.php?t=456622). The difference may surprise you in motion.
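For anyone wondering what 2x2 supersampling actually does: the scene is rendered at double the width and height, and each 2x2 block of pixels is averaged down into one output pixel, smoothing out hard edges. A minimal illustration in plain Python (grayscale values only, no real renderer involved):

```python
def downsample_2x2(image):
    """Average each 2x2 block of a 2W x 2H image into one output pixel."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block_sum = (image[y][x] + image[y][x + 1] +
                         image[y + 1][x] + image[y + 1][x + 1])
            row.append(block_sum / 4.0)
        out.append(row)
    return out

# A hard black/white edge rendered at 2x resolution...
hi_res = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
]
# ...downsamples to intermediate values along the edge: anti-aliasing.
print(downsample_2x2(hi_res))  # [[0.0, 1.0], [0.75, 1.0]]
```

The 0.75 in the output is the point: the jagged stair-step becomes a soft gradient. It also shows why SSAA is so expensive, since the GPU renders 4x the pixels for every output frame.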
 
I like how people are downplaying IQ and framerate. I mean, if you downplay IQ, why did you buy an HDTV in the first place? Why don't you stick to your VHS videos? Why? IQ and framerate are the most important graphical aspects in gaming and you are downplaying them as if they're not important: "Yeah, they look crisp and run silky smooth and there's no sign of tearing and no jaggies and textures are so vibrant... but except for those, it looks the same... well, and some other minor details, but they look the same. The SAME."

I dare say people who need glasses should throw them away; after all, seeing clearly is not that important.

I think you are massively overselling IQ and framerate. Those things may make a difference between generations, but like I said earlier, the one thing everyone seems to be missing is polygon counts. The amount of detail shown in games is still largely the same between PC and consoles, even if all the effects and IQ are better. People on GAF might be able to tell the difference, but the general gaming public cannot.

And I'd be more impressed by a game with Crysis 2 graphics running at 1280x720 than a game with Mario 64 graphics running at 4096×3112. I know it's an impossible example, but that's apples to oranges.

You'd have to ask: which one is running smooth? Which one is showing tearing? Which one loads textures in your face? PC games look crisper than console games, run smoother, load textures more properly and have less graphical artifacts all around. They look better in every possible way. It's not only resolution, because if we were talking about resolution, we wouldn't stop at 1080p.

This is exactly what I'm talking about. A lot of PS2 games this gen have been upgraded for higher IQ and framerate on both the PS3 and PC emulators, but they don't all of a sudden look a generation ahead.

Halo Combat Evolved Anniversary looks a generation ahead of its original version. Metal Gear Solid 3 in HD does not.

Has anyone mentioned ArmA 2 yet? Can't see that working too well on a console.

I mentioned it a few pages back, and I agree that it's probably the one PC game I've seen (excluding strategy games) that you probably could not port to a console. Honestly, ArmA 2 is what I hope to see in next-gen shooters.
 
What's the next gen experience on PC? I must be missing it. I launch the game and am greeted with a tutorial where it says [BACK_BTN_ICON] instead of the icon for the back button.
That shit is next gen, actually displaying the icon I need to press is last gen.

Or how about being transported into a Nether realm where I do a finishing move on one person with no one else visible, and then being transported back and having 4 people surround me.

Or when I run into invisible walls everywhere and have to press a button to go down a ledge, with a very stiff animation playing instead of the character adapting to the environment.

If that is next gen, please let us stay current gen forever with RDR, Batman, Uncharted, Max Payne and the likes.

What the fuck is this post?

I think you are massively overselling IQ and framerate. Those things may make a difference between generations, but like I said earlier, the one thing everyone seems to be missing is polygon counts. The amount of detail shown in games is still largely the same between PC and consoles, even if all the effects and IQ are better. People on GAF might be able to tell the difference, but the general gaming public cannot.

I think you are underselling it. Are you even aware of the massive pixel difference between 720p and 1080p?
 
I think you are massively overselling IQ and framerate. Those things may make a difference between generations, but like I said earlier, the one thing everyone seems to be missing is polygon counts. The amount of detail shown in games is still largely the same between PC and consoles, even if all the effects and IQ are better.
Polygon counts are meaningless if all you see on screen is a shimmering, stuttery mess. People often disregard screenshot comparisons by saying that it's not possible to see the difference in motion. In my opinion, the opposite is true. Shimmering is often worse than aliasing in destroying graphics, and the problem only gets more acute with more details (unless compensated for by good IQ).

I admit that I'm biased, since I'd personally rather see PS2 assets with pristine IQ and framerate than what most games output on the current consoles, but if the "general public" were as oblivious to IQ as you seem to imply then why would almost all publishers bother to create bullshots?
 
Watching YouTube footage of both BF3 and The Witcher 2, both on ultra settings... I can't say I'm impressed... at all. The Witcher 2 in particular. What is so demanding and/or impressive about that game?

The Witcher 2 looks amazing. The immersion created by the world (and the graphics, of course) is second to none. And yes, there were several times where I thought "dear god, this is already like the beginning of the next gen". You have to see it right before your eyes, I guess. And I didn't even have the specs to run it with ubersampling enabled, which really helps the overall IQ by another large margin.


 
So people show one of the greatest looking games ever made...

And you're bitching about lip syncing?

...ok.
Lip syncing is animation. The animation is worse than in contemporary console games.
I think that is worth mentioning. I was very surprised when I played it.

TedNindo said:
The lip-syncing does look bad. But the lighting, particle effects, use of depth of field and animations besides the lip-syncing are impressive imo.
I think the individual animations are fine. But since I've played Euphoria-powered games and Uncharted 2, I'm not impressed by one long animation routine, but rather by the way animations smoothly blend into each other and connect to the environment.

Also, I'm not singling out The Witcher 2, but it's the most commonly posted game in the PC high-res thread here on GAF, so I think it's a good example.
The transportation issue described in my other post is also present in Deus Ex 3, and there it's equally stupid.
 
It looks better in motion? The lip-syncing in that game is atrocious. But one wouldn't know that from seeing the screenshot.

The lip-syncing does look bad. But the lighting, particle effects, use of depth of field and animations besides the lip-syncing are impressive imo.
You also have to keep in mind that the game was made with a relatively small budget by a relatively small Polish developer, compared to the studios really pushing console hardware.
 
I think you are underselling it. Are you even aware of the massive pixel difference between 720 and 1080p?

Yes, I am aware that it is 2.25 times the number of pixels and takes a much bigger toll on the hardware, but the end result of higher IQ alone doesn't read as a whole console generational leap in most people's eyes. That's probably why most developers are settling for 720p and 30fps this gen. They probably don't think the end result of pushing over twice the pixels is worth the sacrifice to the hardware. If they did, we'd be seeing a lot more games this gen that basically just look like PS2 games in 1080p and 60fps.
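For the record, the exact ratio is easy to check with back-of-envelope arithmetic on the standard 16:9 resolutions:

```python
# Standard 16:9 resolutions: 720p vs 1080p
pixels_720p = 1280 * 720     # 921,600 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

ratio = pixels_1080p / pixels_720p
print(ratio)  # 2.25 -- 1080p pushes 2.25x the pixels of 720p per frame
```

So "roughly twice the pixels" slightly undersells it: every 1080p frame costs 2.25x the fill work of a 720p frame, before any AA is applied on top.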

Polygon counts are meaningless if all you see on screen is a shimmering, stuttery mess. People often disregard screenshot comparisons by saying that it's not possible to see the difference in motion. In my opinion, the opposite is true. Shimmering is often worse than aliasing in destroying graphics, and the problem only gets more acute with more details (unless compensated for by good IQ).

I admit that I'm biased, since I'd personally rather see PS2 assets with pristine IQ and framerate than what most games output on the current consoles, but if the "general public" were as oblivious to IQ as you seem to imply then why would almost all publishers bother to create bullshots?

That's the problem: you have to attain a balance. The PS2 God of War games in the HD Collection have a higher IQ and framerate than God of War III, but which game looks better?
 
Damn! It is so time for the next generation of consoles. You can always tell when ALL the hardcore PC fans come out of the woods and finally get to say, "LOL Consoles!? More like... shitsoles. AMIRIGHT?" Yes, the console tech is old. We know this. Hopefully it is remedied soon.
 
The games posted in this topic, like The Witcher 2, look better than anything I've seen on consoles.

But the fact that we even have phrases like "mainstream PC gaming" or "entry-level graphics cards" is evidence that something is always holding these companies back.
 
thread reminds me of PS360 fans vs Wii fans at the beginning of this gen

Actually seems a bit like current iPhone / dedicated portables threads that keep popping up to me. One side is annoyed with the other for putting up with too many (perceived) downgrades and restrictions and for getting money, games and attention that they would prefer their platform of choice to have. While the other side enjoys their library of games, usability and doesn't see the point in what the (perceived) improved experience has to offer.

In any case, I feel like the next generational leap is probably a little less about graphics than previous ones, which is why this argument is even being made. From where I'm standing, something like Neptune's Pride is the real next-gen PC game that I see as a generation ahead of current console games, but that might be just my odd perspective.
 
I think I've sussed it. PC hardware is pretty much a generation ahead but the software that proves that won't come until someone trying to sell a new console (with similar hardware to a fairly high-end rig of today) slings a chunk of cash at a first party dev to make something with next-gen magic and fairy dust.
 
Are the games themselves that are currently available on PC's what you would consider a full generational leap over the games available on consoles?:p

I think the majority of them aren't, which is the fault of how multiplat-centric the current industry is.
But there are numerous games that truly are next-gen, mostly those that rely on scale, like Shogun 2 or the upcoming ArmA III, Planetside 2 or MS Flight.
 
Damn! It is so time for the next generation of consoles. You can always tell when ALL the hardcore PC fans come out of the woods and finally get to say, "LOL Consoles!? More like... shitsoles. AMIRIGHT?" Yes, the console tech is old. We know this. Hopefully it is remedied soon.

And then you've got your crybaby console gamers who don't know crap about PC gaming going "No, it's not that much better! Look at this sub-HD awesome stuff! Look at the LIPSYNCING, BRO, does your PC game have that!?"

Just kidding... kinda.
 
The Witcher 3 (if that's possible, don't shoot me because I haven't played it/can't run it) should be a next-gen launch title.
 
One side is annoyed with the other for putting up with too many (perceived) downgrades and restrictions and for getting money, games and attention that they would prefer their platform of choice to have. While the other side enjoys their library of games, usability and doesn't see the point in what the (perceived) improved experience has to offer.

Exactly, just like the PS360 vs Wii debates at the beginning of this gen.


one day GAF will learn some tolerance :(
 
And then you've got your crybaby console gamers who don't know crap about PC gaming going "No, it's not that much better! Look at this sub-HD awesome stuff! Look at the LIPSYNCING, BRO, does your PC game have that!?"

Just kidding... kinda.

That's the best part about these threads. The large majority of PC-first gamers on GAF and in these threads also play console games.

I highly doubt you could say the same vice versa.
 
I love how console gamers are quick to say "THERES NO VISUAL DIFFERENCE UGHHHHHHHHH" when talking about PC games running with amazing IQ - but when PS3 and 360 versions have MINOR, MINISCULE differences between the two, they are fierce to nitpick over them.
 
Didn't that demo run on easily attainable hardware?

If it is so easily attainable, why aren't we seeing games with samaritan level graphics?

The Witcher 2 on PC looks great (at least on the right hardware), but I still wouldn't say it's generational.

Also, I don't think I'd say the supersampled car shot is 'realistic'. Yea, it looks good, but it doesn't look real.
 
If it is so easily attainable, why aren't we seeing games with samaritan level graphics?
Because devs/publishers don't want to raise the min spec too high and cut off potential sales. High-end PC parts have always been easily attainable, after all.
 