You're joking right?? Lmao. This thread is comical
Try bringing up a Final Fantasy XIII screenshot of Chapter 11. I dare you.
I have the same graphics card and I mostly agree with him. Maybe it's the card! Dun dun dun...

Heh, can't say I'm surprised. You hadn't posted since then so there was no way to prove you didn't have a 6970. They walked right into that one.
Generational? No.
Do PC games look significantly better when designed and backed by sufficient hardware? Yes.
A generational leap will be huge worlds at the level of the Samaritan demo.
I added an example to my initial post, btw.

Agreed. I'd like to see an objective one though.
I added an example to my initial post, btw.
My point is that an objective definition really doesn't exist without numerous caveats and points of valid disagreement. It's very clear that this discussion has devolved into a PC vs. consoles thread (as was expected), whereas the discussion should be how far removed PC multi-plats are from their console counterparts; not in visual fidelity but in 'essence.' Is BF3 on consoles a wholly unrecognizable entity which shares only its name in common? Does one's assessment attempt to remove disparities that exist only as a consequence of the hardware they are on, rather than how much they utilize it from the ground up, i.e., the differences between a 'next-gen' game and a current-gen game running on next-gen hardware? To me, a next-gen game would be one that, at its core, fundamentally could not transition, seem feasible, or exist in any comparable fashion except on a PC. And by that, I mean as a 'core' experience, not whether it could transition at certain resolutions and with a certain amount of AA.
I have the same graphics card and I mostly agree with him. Maybe it's the card! Dun dun dun...
Errr, you can spend half of that and still build a pretty good rig that will look better than current gen consoles.
What will happen is that everyone who said The Witcher 2 wasn't all that will suddenly act like the new consoles are bringing never-before-imagined visual fidelity.
PC graphics are shit right up until the moment they are reproduced by a console - then they are "stunning!"
The Samaritan demo is running on a 3-way SLI GeForce GTX 580 setup on PC. If we are lucky, Sony and MS are using current GPU architecture (Nvidia 500 series or AMD 6000) in a single chip. When the next-gen consoles launch around 2013, the PC market will be on Nvidia's 700 series and AMD's 8000, powerful enough to run the Samaritan demo in a single-card configuration. So yes, it's a generational leap; people thinking that they will get Samaritan-level graphics in the next gen are wrong.
I built my PC like 2 years ago for ~500, and it runs circles around the consoles, yeah.
What's with the colours in the console pic?

The Witcher 2, BF3, etc. look amazing on PC, and they blow the console versions out of the water. The hardware is MORE than a generation beyond console games.
That said, the software itself is nowhere near a generational leap.
As someone else posted
![]()
![]()
Sure, there is a huge difference between the IQ of the PC and 360 versions, but they don't look anywhere near a generational leap apart. Even when taking diminishing returns into account.
Whatever game Naughty Dog desires to make for the PS4 a couple of years into the PS4's lifespan will absolutely blow away The Witcher 2. I guarantee it.
That said, by that time, PCs will be running games that look even better.
This is the way of things.
Games? No.
Hardware? Most definitely.
ITT: 'No? Then you don't have a PC lololololol'
Solid007 bringing the truth.
Unlikely. At worst, people running top end rigs today might have to cut back on crazy amounts of AA.
Are people really trying to tell me they don't see much difference between those Witcher 2 screens and this direct feed UC3 capture?
![]()
And to put an even bigger spanner in the works: considering that those 360 shots came out as soon as they announced the port, the "console" comparison shots for TW2 were most likely captured on PC with projected settings applied. If you think the 360 version will have DOF and IQ that high-quality, you craaaaaaazy.
So yes. Even MOST people running top-end rigs today might have to UPGRADE.
If you have halfway decent hardware and 720p/30fps is good enough for you, then you probably won't need to upgrade, especially considering how well PC titles scale. I just can't see that happening if the new generation of console hardware comes out within two years.

Well, yeah, and the fact that today's PC gamers will also have to upgrade their CPU and GPU to play next-gen console ports, even with an HD 6970 like I have, boggles my mind.
However, I should've known better since this is the norm. So, to answer the question, YES&NO.
YES, PC games look better.
NO, today's PC is NOT next-gen, since we will have to upgrade our CPUs and GPUs again to play next-gen console ports.
Not quite.
Has everyone forgotten about Epic's 2011 Unreal NEXT-GEN tech demo? It needed THREE GeForce GTX 580s in SLI to get this done.
So yes. Even MOST people running top-end rigs today might have to UPGRADE.
How can anyone argue that the PC is not a generational leap ahead?
Does anyone even remember when Half-Life 2 came out? Nothing on consoles even looked close back then, and the overall console experience was way behind those PC exclusives. For those who say IQ/resolution/AA doesn't matter, I really can't understand why not. If the experience isn't that different for you, that's fine, but you can't just deny the difference and say it "plays the same." The core mechanics aren't going to change; that isn't an argument for saying a generational leap didn't happen.
I think you're in for a hard awakening when next gen comes out.
For reference, just look at what you need to run BF3 at max settings, 1080p @ 60fps with AA enabled. If you really think next-gen consoles will be much more powerful than that, I repeat, you're in for a rough awakening. I'm not saying graphics won't look good, but if you want to improve on BF3's graphics, you'll need to sacrifice IQ and framerate.
PCs are using the extra power to get IQ and framerate, and that's only possible because the hardware is already next gen compared to consoles. You could sacrifice IQ and framerate and get better graphics; that's what consoles do, and what Crysis did back in the day, because nobody could play the game with a good framerate and good IQ when it was released, it was so demanding.
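Rough back-of-the-envelope math on that point (a minimal sketch; the 720p/30fps console target and 1080p/60fps PC target are just the usual assumed figures, not measurements of any specific game):

```python
# Raw pixel throughput: assumed 720p/30fps console target vs. 1080p/60fps PC target.
console_pixels_per_sec = 1280 * 720 * 30    # ~27.6 million pixels/sec
pc_pixels_per_sec = 1920 * 1080 * 60        # ~124.4 million pixels/sec

# 4.5 -- resolution and framerate alone soak up roughly 4.5x the raw power
print(pc_pixels_per_sec / console_pixels_per_sec)
```

That factor gets spent before any extra effects even enter the picture, which is the IQ/framerate trade-off being described above.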
You have your preferences and I have mine, but that's not the topic. The thing is: to run a game such as BF3 or TW2 at those settings you need next-gen hardware, thus they are next gen.
When the next consoles arrive and are "only" producing visuals at the level of The Witcher 2 on High at 720p/30fps I think people will start to understand what the term "diminishing returns" means.
Even Xenoblade at 1080p gives 360/PS3 games a run for their money (besides the HUD):
![]()
The characters may not look as good, but the environments put FF XIII to shame.
Putting it to shame is a bit much, imo.
Naughty Gods just utterly destroyed Witcher 2 with that shot. That's the difference I see. PCs/games wish they could handle that level of ambient occlusion.

Are people really trying to tell me they don't see much difference between those Witcher 2 screens and this direct feed UC3 capture?
![]()
Does that look like a Wii game to you? The difference between Wii and PS3 is a generational leap. Are you saying there's the same difference between this Uncharted pic you posted and Witcher 2 as there is between a Wii game and Uncharted? Are there magical details that only the PC elitists can see which do not exist to my weary eyes?
By saying "diminishing returns" means that we're looking at "next gen" games right now on PC, you're basically admitting that the next gen leap isn't going to look as impressive as previous generational leaps.
Clearly you haven't played FF13 next to a Dolphin'd-up Xenoblade...
I am pretty sure that the demo was not very optimized at all; they said they could probably get it working on one 580. It was made by like three people or something. That is the kind of visuals I want to see next gen, though... I honestly think we will get close to it, given current-gen consoles can output something like Uncharted 3... if nothing else it proves that PC devs are absolutely not pushing the hardware to its limits.
When the 360 released I think it had a fairly contemporary card, did it not? They shouldn't release a new console with a 500 series... I am willing to hold out for a while until stuff like Samaritan can be produced. I am content enough playing nice-looking games on PC for now, but I want the next consoles to offer a sizeable leap...
Jim-Jam: There are far, far more impressive scenes in U3 than that. I actually think Drake's model looks better than Geralt's, though. Certainly it animates more impressively.
As already mentioned, the demonstration ran in real-time on a 3-Way SLI GeForce GTX 580 system, but even with the raw power that configuration affords, technological boundaries were still an issue, and for that reason, Daniel Wright, a Graphics Programmer at Epic, felt that "having access to the amazingly talented engineers at NVIDIA’s development assistance centre helped Epic push further into the intricacies of what NVIDIA’s graphics cards could do and get the best performance possible out of them." Being a tightly controlled demo, Samaritan doesn’t include artificial intelligence and other overheads of an actual, on-market game, but with enough time and effort, could the Samaritan demo run on just one graphics card, the most common configuration in gaming computers? Epic’s Mittring believes so, but "with Samaritan, we wanted to explore what we could do with DirectX 11, so using SLI saved time."
Are people really trying to tell me they don't see much difference between those Witcher 2 screens and this direct feed UC3 capture?
![]()
And to put an even bigger spanner in the works: considering that those 360 shots came out as soon as they announced the port, the "console" comparison shots for TW2 were most likely captured on PC with projected settings applied. If you think the 360 version will have DOF and IQ that high-quality, you craaaaaaazy.
To me that does look like a "Wii game" actually, assuming that's your benchmark for when a game on the HD consoles looks "last gen". There are no magical details:
- Textures blur beyond about 1m because of a lack of AF
- Jaggies everywhere in this bitch
- Low-poly models on the background objects
- 720p
If you use your eyes, these things are readily apparent.
Actually not quite; see above. I do think it's impressive, but I also think a lot of people in this thread are indulging in some cognitive dissonance.
Wow, you've been playing some pretty sweet looking Wii games. On dolphin I presume?
Meh, I think that Uncharted shot actually does look better than The Witcher 2, IQ aside. The lighting is beautifully done.
Sadly, that doesn't really hold true for most of the game.
Naughty Gods just utterly destroyed Witcher 2 with that shot. That's the difference I see. PCs/games wish they could handle that level of ambient occlusion.
You are using "Wii game" to mean a game a whole generation behind what I'm playing right now, so yes it does look like a "Wii game" but not a Wii game.
Is the lighting in the Uncharted pic pre-baked or not?