Are current PC games a full "Generational Leap" ahead of current console games?

Long story short: Most games aren't taking full advantage of what a high-end PC can do because they're console ports. PC has no monopoly on bold and innovative games, though it is a good platform for them, both because of its theoretically very high hardware ceiling and because it lacks the restrictions of retail publishing or PSN/XBLA.

EDIT: Before accusing me of "teh consolez r holding back vidja gaems!" hysteria, read my post again. The first and second sentences address completely different things.
If they actually took full advantage of what the PC could do, most systems would not be able to deliver 60 fps, lots of AA, and a high resolution.
 
I don't agree. I think 32 is the player count that works best for most maps. Some can be dialed down to 24.
Anything above those two I consider a clusterfuck.

Of course there are people that play on those servers, but to me, not being able to affect anything with two squads just isn't fun.
People in my platoon (~30 people) that play on those servers choose the 1000 ticket Metro variant and just go there to farm points.
(That this is just silly to me is a different topic.)

I just think that MAG does higher player counts better than Battlefield, and PlanetSide does it better still.

Disagree with every single point you made, it's too subjective to have any type of meaningful debate anyway.

What isn't subjective is that BF3 offers an expanded experience on the PC due solely to the superior hardware the PC provides. Whether you personally enjoy the added game-play features or visuals is again subjective and frankly, not worth discussing (in this thread).
 
If they actually took full advantage of what the PC could do, most systems would not be able to deliver 60 fps, lots of AA, and a high resolution.

I should probably expand this a bit - There are many titles that don't take advantage of the PC's other capabilities, namely those relating to patching and delivering software. Games like Project CARS, Minecraft, and Overgrowth are literally impossible to do on consoles because of the restrictions imposed on them.
 
Disagree with every single point you made, it's too subjective to have any type of meaningful debate anyway.

What isn't subjective is that BF3 offers an expanded experience on the PC due solely to the superior hardware the PC provides. Whether you personally enjoy the added game-play features or visuals is again subjective and frankly, not worth discussing (in this thread).
You're completely right.

I was just bringing up MAG because it inarguably features more players on a map than BF3 does.
That I personally prefer the way they deal with the higher player count is beside the point, but the number of players featured is higher.

But yes: BF3 on PC is the expanded experience of BF3 on consoles.
 
It sounded more like a thought experiment than a genuine proposal. 720p is already not good enough imo. Hell, 1080p is starting to feel inadequate.

I agree about 720p, but to say 1080p is inadequate? I feel like anything higher than 1080p starts hitting diminishing returns, but I haven't spent a lot of time above 1080p.
 
Played The Witcher 2 on the highest settings at 1080p for hours. Impressive, but it doesn't give off that next-gen vibe.

More like a PS2 -> Xbox transition feeling.

PC games are not there yet (I hope).
 
Looking at some old GPU articles...I think people really need to tone down their expectations.

The PS3's RSX is supposed to be slower than the 7900GT released that year. How much power did the 7900GT take under load? 50 watts.

Now, let's take a look at a high-end GPU from today: the GTX 580. How much power does it take under load? 300W.

The PS3 was a power-hungry beast and it had a 225W PSU, which means its total draw was below that. Next-generation consoles are likely going to take less power than that. Even mid-range GPUs like the 6850 or GTX 460 take 100-150W of power, so I wouldn't even expect that kind of GPU performance in a modern console anytime soon.
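A rough back-of-the-envelope using those figures (the split between components is just my own guess, not a measured number):

~200W realistic total draw for the whole box (with a 225W PSU)
- roughly half of that goes to the CPU, RAM, drive, fans, and PSU losses
= maybe 80-110W left for the GPU, which is mid-range territory at best and nowhere near a 300W GTX 580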

[Chart: high-end GPU power consumption under load]


Source: http://www.xbitlabs.com/articles/graphics/display/power-noise_4.html#sect1
 
Looking at some old GPU articles...I think people really need to tone down their expectations.

The PS3's RSX is supposed to be slower than the 7900GT released that year. How much power did the 7900GT take under load? 50 watts.

Now, let's take a look at a high-end GPU from today: the GTX 580. How much power does it take under load? 300W.

The PS3 was a power-hungry beast and it had a 225W PSU, which means its total draw was below that. Next-generation consoles are likely going to take less power than that. Even mid-range GPUs like the 6850 or GTX 460 take 100-150W of power, so I wouldn't even expect that kind of GPU performance in a modern console anytime soon.
7900GT only 50 watts?!

Launch PS3 units were 300W* if I remember correctly.

*MAX

Edit: nevermind :P
Next-gen consoles will probably all have ATI GPUs. How do they fare in power consumption?
 
M°°nblade;34192828 said:
How many watts did a 7900GT draw?

Launch PS3 units were 300W if I remember correctly.

[Chart: total system power consumption]


Source: http://www.hardcoreware.net/reviews/review-356-2.htm

That's the total consumption.

A high-end GPU alone draws over 200W during gameplay, and can peak as high as 300W. Total system consumption is well over 400W.

Computers may have gotten faster, but they're also far more power-hungry than they were in the past. Video cards now come with big two- and even three-slot coolers sporting two 90mm fans, when they used to come with small single-slot coolers with a single 50mm fan.

This is a 7900GT, a high-end card which is more powerful than the PS3's RSX, and released a few months prior to the PS3:

[Image: GeForce 7900 GT]


This is a GTX 570, a high-end card which released a year ago:

[Image: PNY XLR8 GeForce GTX 570]


Or the cooler DirectCu II version, if you prefer:

[Image: GTX 570 DirectCU II]
 
The Wii pulls eighteen watts during load? :o

People make the mistake of thinking that since Nintendo chose not to make a powerful machine this gen, they actually don't know how to make good hardware.

Nintendo are actually damn WIZARDS at making hardware, they just aren't pushing in the high-graphics direction.
 
People make the mistake of thinking that since Nintendo chose not to make a powerful machine this gen, they actually don't know how to make good hardware.

Nintendo are actually damn WIZARDS at making hardware, they just aren't pushing in the high-graphics direction.

Except for the part where the Wii in standby uses as much power as it does during load (not pictured on that graph).
 
Guys, it's winter but I ain't freezing. The hot air spewing from my GPUs keeps me warm and cozy.

Just another unsung benefit of high-end PCs. Wii couldn't heat my ass.
 
So do you think a 7900 could smoothly run Uncharted 2/3?

What do you guys think is more important, raw hardware specs or developer talent and resources? (serious question)
 
So do you think a 7900 could smoothly run Uncharted 2/3?

What do you guys think is more important, raw hardware specs or developer talent and resources? (serious question)

Raw power is very important. PC GPUs are not as inefficient as you think. Keep in mind most of us target 1080p and 60fps, and that takes about 4.5x the performance required to maintain 720p and 30fps.
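Quick math on that, assuming the standard 1280×720 and 1920×1080 framebuffers:

1920 × 1080 = 2,073,600 pixels vs. 1280 × 720 = 921,600 pixels → 2.25× the pixels per frame
60fps vs. 30fps → 2× the frames per second
2.25 × 2 = 4.5× the raw rendering work, before AA even enters the picture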
 
So do you think a 7900 could smoothly run Uncharted 2/3?

What do you guys think is more important, raw hardware specs or developer talent and resources? (serious question)

I'd like to see talented developers working on powerful hardware, so that talent goes into doing things that are incredible, period, and not merely impressive once you consider the hardware.
 
Why must there be a compromise? Progress should be made as a whole, right?

In an ideal world sure.

But the reality of the situation is that consoles are where the money is, and consequently where the developer talent, resources, and big budgets are going. So there is a compromise whether we like it or not.
 
One thing is clear from what I've shown here: the performance gap between PCs and consoles is going to be quite large even at launch, as long as PCs keep getting power-hungry high-end components like the GTX 580 and HD 7970.
 
Yes, we all know that both are very important, but which is more important in your opinion? Which of the two plays a larger role in defining things like "next gen" and "generational leap"?

Raw power, obviously. You guys are insane, and here's why.

The thing which defines a generational change is new hardware. This concept that, in spite of the fact that several generations of CPU and GPU tech have been released since the HD twins, you can use your eyes to make some determination of when "next-gen" or "generational leaps" occur is just asinine. Take a look at the Silent Hill series for a good example of why.

I understand that it's upsetting to be told that your socks won't be knocked off like they were when the PS2 came out or whatever, but assuming that you can use previous generations to dictate some acceptable level of "wow" or "pizazz" or "kablingy" for you to deem when a generational leap occurs is a form of inductive reasoning which relies on your subjective determinations and personal experience. In other words it's totally useless.

Developers don't have anything to do with when the "next-gen" starts because, by definition, it starts when the next generation of hardware is released. So you can continue to try and make some passive-aggressive point, which I'm not even sure I understand the purpose of at this stage, but it's worth pointing out that your argument makes absolutely no sense whatsoever on a basic logic level.
 
This topic really makes people extremely defensive, oversensitive and emotional.

Some want to casually dismiss and brush aside the importance of developer talent/resources while others downplay the clear and obvious benefits of more powerful hardware.

It seems like everyone is being forced to "pick a side" and there is no middle ground with anyone. Gaf truly is insane.
 
It seems like everyone is being forced to "pick a side" and there is no middle ground with anyone. Gaf truly is insane.

Says the guy who 1 post ago wanted people to pick the side of raw power or developer talent.
Yeah, truly crazy.

Both are important...
 
This topic really makes people extremely defensive, oversensitive and emotional.

Some want to casually dismiss and brush aside the importance of developer talent/resources while others downplay the clear and obvious benefits of more powerful hardware.

It seems like everyone is being forced to "pick a side" and there is no middle ground with anyone. Gaf truly is insane.
I gave you a middle-ground answer, and you demanded I pick a side. You can't put the genie back in his bottle like that.
 
This topic really makes people extremely defensive, oversensitive and emotional.

Some want to casually dismiss and brush aside the importance of developer talent/resources while others downplay the clear and obvious benefits of more powerful hardware.

It seems like everyone is being forced to "pick a side" and there is no middle ground with anyone. Gaf truly is insane.

I don't get it. If we're going to judge next-gen game quality based on the "talent" of the developers, then we'll never know when next-gen truly begins, and that metric is just about impossible to use in a logical discussion. And if we do it based on resources, then next-gen won't come until Sony/Microsoft/Nintendo spend more than before. Neither of which you can even guarantee.
 
I gave you a middle-ground answer, and you demanded I pick a side. You can't put the genie back in his bottle like that.

Okay, I'll give you that one. Chalk it up to poor wording on my part, my apologies.

What I was trying to ask for, though, is this: an explanation and weighting of the reasons why you think developer talent/resources are important, compared and contrasted with an explanation and weighting of why raw hardware power is important. In a nutshell, I want to hear PC die-hards talk about how they see the importance of developers in a more detailed and meaningful way than "both matter."

But due to my poor wording of the question, you just assumed that I was on the "console side," got angry and defensive about me trying to make some sort of imagined "passive-aggressive" point, and only gave half the answer I was looking for.
 
Both are important. There's no need to do a thorough analysis on the matter.

Developer talent is limited by performance.

Performance is limited by developer talent.

Simple as that.
 
Both are important. There's no need to do a thorough analysis on the matter.

Developer talent is limited by performance.

Performance is limited by developer talent.

Simple as that.

Okay.

So only raw hardware power is worthy of having a detailed, thorough and passionate discussion about (as evidenced by the bulk of posts in this thread.)

Discussing the role and importance of developer talent and resources, on the other hand, is not.

simple as that?
 
simple as that?
Haven't you read the posts?

Either the next gen starts when new consoles are released, or it means the next gen started when the last GPU generation was released, which was December 22nd.

I don't remember The Witcher 2 coming out post-Dec 22, which means current PC games are not a generation ahead.

Or something?! I don't know anymore. Everything is crazy now.
 
Okay.

So only raw hardware power is worthy of having a detailed, thorough and passionate discussion about (as evidenced by the bulk of posts in this thread.)

Discussing the role and importance of developer talent and resources, on the other hand, is not.

simple as that?

Developer talent doesn't get talked about as much as hardware because it's much easier to understand the basic metrics of hardware performance. You need a god damned computer science degree, and probably a specialization in computer graphics coding, to actually back up a statement like "X is better coded than Y," which is the kind of statement you really mean by "developer talent".
 
I never understood the PC master race hard-on for fidelity. Videogames are videogames, a really poor, inferior representation of visual art at this stage.

I do understand the hard-on for performance though, since gaming at 60 fps does have an undeniable benefit.
 
I never understood the PC master race hard-on for fidelity. Videogames are videogames, a really poor, inferior representation of visual art at this stage.

I do understand the hard-on for performance though, since gaming at 60 fps does have an undeniable benefit.

Fidelity has an undeniable benefit as well: you see better...you see more.

If you have 20/40 vision, do you settle or do you go out and get yourself a pair of glasses?
 
I should probably expand this a bit - There are many titles that don't take advantage of the PC's other capabilities, namely those relating to patching and delivering software. Games like Project CARS, Minecraft, and Overgrowth are literally impossible to do on consoles because of the restrictions imposed on them.
Minecraft was announced for 360, by the way.
 
Fidelity has an undeniable benefit as well: you see better...you see more.

If you have 20/40 vision, do you settle or do you go out and get yourself a pair of glasses?

Worded badly: resolution also ups the experience, of course. Especially for open-world games it has a major benefit indeed.

What I meant was: my lack of understanding for this unending love of razor-sharp textures. In the context of videogames it might be cutting edge, but when I look at these screens it just doesn't convey anything worthwhile visually. The Crysis jungle still looks like a low-budget animated series. Art design will always reign supreme, to me at least.
 
Developer talent doesn't get talked about as much as hardware because it's much easier to understand the basic metrics of hardware performance. You need a god damned computer science degree, and probably a specialization in computer graphics coding, to actually back up a statement like "X is better coded than Y," which is the kind of statement you really mean by "developer talent".

This is a fair point.

Hardware specs are something that are easily understandable to most people. On the other hand, most of us don't have the qualifications to discuss the technical nuances of coding.

But what we do have are real-world results that we can judge with our own eyes. For example:

Does anyone really believe that you can take a PC with a 7900 and 256MB of RAM and code a game like UC3 or KZ2 to run as well on that PC as it does on PS3? Rage is another example of a crazy technical achievement that doesn't get enough credit.

To me, that raises the question: even if next-gen consoles won't have dual 7970s and 16 gigs of RAM, is that really going to stop the hardware from achieving top-shelf visuals once the full weight of developer talent and publisher money is thrown behind it?

Given that visuals like U3 etc. (whether you personally think they look like shit or not) shouldn't even be possible on the shitty outdated tech of today's consoles, I don't see how developer talent is something you can just casually dismiss and cut out of the conversation.

Even though our lack of understanding of coding makes it more of a theoretical discussion than a technical one, it's still one worth having IMO. 
 
Guys, its winter but I ain't freezing. The hot air spewing from my GPUs keeps me warm and cozy.

Just another unsung benefit of high-end PCs. Wii couldn't heat my ass.

Haha I thought I was the only one that felt like this. I kick up the OC on my GPU just for that extra boost of heat in my room.
 
Does anyone really believe that you can take a PC with a 7900 and 256MB of RAM and code a game like UC3 or KZ2 to run as well on that PC as it does on PS3?
With the right team of developers who have the knowledge to code to the metal if needed, absolutely. You'd be amazed at how skilled some engineers can be.
 
Worded badly: resolution also ups the experience, of course. Especially for open-world games it has a major benefit indeed.

What I meant was: my lack of understanding for this unending love of razor-sharp textures. In the context of videogames it might be cutting edge, but when I look at these screens it just doesn't convey anything worthwhile visually. The Crysis jungle still looks like a low-budget animated series. Art design will always reign supreme, to me at least.

Well, there is the case of RAGE. Awesome art, like playing a painting, but heavily damaged by low-res textures. I'm not saying that all games should have 4096x4096 textures, but when great art is put into a game, I would prefer to see it in its full glory.
 
With the right team of developers who have the knowledge to code to the metal if needed, absolutely. You'd be amazed at how skilled some engineers can be.

In the realities of today's industry though? Aside from a few tech demos and a small handful of PC-exclusive games, that is nowhere close to happening in the current PC market.
 
In the realities of today's industry though? Aside from a few tech demos and a small handful of PC-exclusive games, that is nowhere close to happening in the current PC market.
Market realities are a separate discussion, I was simply refuting the point that it wasn't a possibility.
 