Uncharted 4 Trailer runs in-engine, in-game, in realtime on a single PS4 at 1080p60

Status
Not open for further replies.
I really don't get why they had to say that it came from a "single" PS4 and not just from a PS4. I guess they were afraid someone would say it came from two PS4s duct-taped together.

Well, reading through the thread, it appears that ND has used multiple consoles in the past to run their trailers. So in that regard it makes sense.
 
Everything I have highlighted here is factually incorrect or a bending of the truth. There is no part of the PS4 GPU that, from a raw-numbers perspective, makes it better than a 7970, or even comparable with a 290x. The PS4 does not use the same GCN architecture as the 290x. At all.

You might as well just write "PS4 sucks" at this point.
 
I really don't get why they had to say that it came from a "single" PS4 and not just from a PS4. I guess they were afraid someone would say it came from two PS4s duct-taped together.
You have to be super specific. There are people in the press and on GAF that interpret "running real-time in-engine on PS4" as "well, they run the pre-rendered movie they made on PC on the PS4 using Bink and then capture that"
 
I agree with BruceLeeRoy about the "coming out of nowhere". Games like Ryse (a game that runs at around 30fps), inFSS, BF4, Sunset Overdrive, KZSF, Forza 5, DF, AC:U, and The Division look like the expected (and sometimes greater) evolution in fidelity from a new generation of hardware, especially this early in its life cycle. U4's teaser not only looks like that while claiming to be real-time, but also happens to be native 1080p and targeting 60fps. Judging by the dry specs of the box, most of us expect that something has to give, as otherwise it causes massive cognitive dissonance.

Naughty Dog outclassed devs left and right last gen, and you expect current gen to be any different? Especially when UC4 is one of the first wave of games that isn't launch window software and can fully show what the hardware is capable of?
 
Everything I have highlighted here is factually incorrect or a bending of the truth. There is no part of the PS4 GPU that, from a raw-numbers perspective, makes it better than a 7970, or even comparable with a 290x. The PS4 does not use the same GCN architecture as the 290x. At all.

Please read the articles I've posted.

And let me quote this, since it further clarifies the PS4 GPGPU improvements:
“Next, to support the case where you want to use the GPU L2 cache simultaneously for both graphics processing and asynchronous compute, we have added a bit in the tags of the cache lines, we call it the ‘volatile’ bit. You can then selectively mark all accesses by compute as ‘volatile,’ and when it’s time for compute to read from system memory, it can invalidate, selectively, the lines it uses in the L2. When it comes time to write back the results, it can write back selectively the lines that it uses. This innovation allows compute to use the GPU L2 cache and perform the required operations without significantly impacting the graphics operations going on at the same time — in other words, it radically reduces the overhead of running compute and graphics together on the GPU.”

Thirdly, said Cerny, “The original AMD GCN architecture [it's the 7970 GCN architecture btw] allowed for one source of graphics commands, and two sources of compute commands. For PS4, we’ve worked with AMD to increase the limit to 64 sources of compute commands — the idea is if you have some asynchronous compute you want to perform, you put commands in one of these 64 queues, and then there are multiple levels of arbitration in the hardware to determine what runs, how it runs, and when it runs, alongside the graphics that’s in the system.”
The PS4 compute units are indeed superior to the 7970's. It's all there; you just have to read carefully.

I'm not talking about fillrate, number of ROPs, bandwidth, or whatever. I was specifically talking about the compute units, and the PS4 is strictly superior to the 7970 in that regard.

Also, the 290x did improve its compute units, thanks to Cerny's modifications; it has the same number, and I only pointed that out. And as you have said, the 290x has an updated GCN architecture (ver. 2.0), so its compute units have probably been improved as well. I wasn't talking about parity with this GPU, btw.
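For anyone who finds the "volatile bit" quote abstract, the mechanism Cerny describes can be sketched as a toy cache model. This is purely illustrative Python, with made-up class and method names (ToyL2, invalidate_volatile, etc.), not Sony's actual hardware interface:

```python
class CacheLine:
    def __init__(self, addr, data, volatile=False):
        self.addr = addr          # memory address this line mirrors
        self.data = data
        self.volatile = volatile  # set for asynchronous-compute accesses
        self.dirty = False

class ToyL2:
    """Toy GPU L2: compute can invalidate/write back ONLY its own
    ('volatile') lines, leaving the graphics lines untouched."""
    def __init__(self):
        self.lines = {}

    def access(self, addr, data, from_compute=False):
        line = self.lines.setdefault(addr, CacheLine(addr, data, from_compute))
        line.data = data
        line.dirty = True
        return line

    def invalidate_volatile(self):
        # compute wants fresh data from system memory: drop only its lines
        self.lines = {a: l for a, l in self.lines.items() if not l.volatile}

    def writeback_volatile(self, memory):
        # flush only the compute-owned lines back to memory
        for line in self.lines.values():
            if line.volatile and line.dirty:
                memory[line.addr] = line.data
                line.dirty = False

l2 = ToyL2()
l2.access(0x100, "gfx-frame-data")                    # graphics line
l2.access(0x200, "compute-result", from_compute=True) # volatile line
mem = {}
l2.writeback_volatile(mem)   # writes back only the compute line
l2.invalidate_volatile()     # the graphics line at 0x100 survives
```

The point of the quote is exactly what the toy shows: compute jobs get to flush and invalidate their own working set without nuking the whole L2 that graphics is using at the same time.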
 
the graphics are not that important. what matters is how it animates.

that face in ryse looks fantastic, but in motion, the entire illusion falls apart--dead eyes, limited range of animations. the same thing is going on with the order. for a game that is so bent on being "filmic," the facial animations are awful.

drake's facial animations are genuinely the most impressive thing i've ever seen in a video game.

Not sure if serious.

https://www.youtube.com/watch?v=8hxz8IWWzt8

I have not seen this level of detail in facial animations in anything ever. When the werewolf speaks it is especially impressive. If you watch the stage show demo when it is sniffing the air you can see its neck and lips moving as it takes in air. Absolutely amazing attention to detail.
 
I think that if devs learned how to use the SPUs on Cell, they'll learn how to leverage the PS4's compute capabilities too, if only because those can be used on the One and PC as well.

Yeh, they definitely will. I'm actually very interested in how these consoles are going to influence GPGPU algorithm development; personally I think GPGPU could have some very radical impacts on rendering tech. But there are tasks which are simply not parallelizable, and hence can't take advantage of the ACEs.
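The "not parallelizable" point is just Amdahl's law in action; a quick back-of-the-envelope calculation (with illustrative numbers, not measurements of any real workload) makes it concrete:

```python
def amdahl_speedup(parallel_fraction, n_units):
    """Maximum speedup when only part of a task can be spread across units."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_units)

# With 18 units to throw at it, a job that's 50% serial barely
# doubles in speed, while a 95%-parallel job gets close to 10x.
print(round(amdahl_speedup(0.50, 18), 2))  # ~1.89
print(round(amdahl_speedup(0.95, 18), 2))  # ~9.73
```

That's why the serial parts of a frame stay on the CPU no matter how many compute queues the GPU exposes.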
 
One way they could have done that deformable water is through playback of a cached geometry animation. Ryse does that for some of the crazier physics seen in its cutscenes and gameplay, for example:
I'm actually pretty sure that nothing in this cutscene is really physics-driven. The water at the beginning seems to be animated and moves the same way every time.


the graphics are not that important. what matters is how it animates.

that face in ryse looks fantastic, but in motion, the entire illusion falls apart--dead eyes, limited range of animations. the same thing is going on with the order. for a game that is so bent on being "filmic," the facial animations are awful.

drake's facial animations are genuinely the most impressive thing i've ever seen in a video game.
Yeah. But to be fair Drake hasn't talked yet.
 
Do you really think ND would commit suicide like that? If ND doesn't deliver on these visuals they will face a shit storm with the strength of 10,000 Watch_Dogs meltdowns.

Bullshit!

ND could come out tomorrow with gameplay that looks like it was done on an NES, and everyone here who is desperately trying to silence any doubt or skepticism will eat it up wholesale.
 
I'm actually pretty sure that nothing in this cutscene is really physics-driven. The water at the beginning seems to be animated and moves the same way every time.



Yeah. But to be fair Drake hasn't talked yet.

The bit that made me squeeeeee with excitement was ridiculous...

It's the bit where he removes the clip, looks around and then looks down over his nose with his head tilted back slightly to check it was full. Love it. :D
 
Please read the articles I've posted.

And let me quote this, since it further clarifies the PS4 GPGPU improvements:

The PS4 compute units are indeed superior to the 7970's. It's all there; you just have to read carefully.

I'm not talking about fillrate, number of ROPs, bandwidth, or whatever. I was specifically talking about the compute units, and the PS4 is strictly superior to the 7970 in that regard.

Also, the 290x did improve its compute units, thanks to Cerny's modifications; it has the same number, and I only pointed that out. And as you have said, the 290x has an updated GCN architecture (ver. 2.0), so its compute units have probably been improved as well. I wasn't talking about parity with this GPU, btw.


Don't forget Onion. The PS4 APU has modified the CPU caches and bus to be slightly different, and to their ends. It's not a huge change, but when you're coding to the metal in assembly you can perform some functions much faster with fewer resources.
 
Bullshit!

ND could come out tomorrow with gameplay that looks like it was done on an NES, and everyone here who is desperately trying to silence any doubt or skepticism will eat it up wholesale.


That ain't gonna happen and you know that. No other game studio can reach them in terms of technical capabilities. I, for one, haven't seen a game on any platform including PC (yes, I said it: PC) that comes even close to running something as impressive as the U4 teaser in real-time.

Cheers.
 
Well, reading through the thread, it appears that ND has used multiple consoles in the past to run their trailers. So in that regard it makes sense.

Oh, so that's why. Thanks; I'd only read half the thread, so maybe I missed it.

Well, I have decided that I'm going to buy a PS4. I had been thinking of building a new PC or getting a PS4. PS4 indeed!
 
Naughty Dog outclassed devs left and right last gen, and you expect current gen to be any different? Especially when UC4 is one of the first wave of games that isn't launch window software and can fully show what the hardware is capable of?
I think the 60 FPS is what most people are having trouble believing. The footage already looks borderline impossible even at 30 FPS. Like BruceLeeRoy said, throw in 60 FPS at 1080p and it's Metal Gear Solid 2 reveal-levels of "Holy Shit! There's no way!" up in here.
 
This is just plain wrong. The 79xx GPUs have 32 compute units and the 78xx GPUs have 20 CUs, both with 2 ACEs. Also, both the 7970 and the 290x destroy the PS4 GPU in terms of pure processing capability; it's really just math.

Compute units? Compute engines? Same thing? The little chart has the ACEs flowing into the 'Compute Units'

Indeed, my bad, I wrote it incorrectly: it's not "compute units", I meant ACEs. The PS4 has 8 ACEs (64 command queues); the 78xx and 79xx have 2 (4 command queues).

But my point stands: when it comes to the ACEs, the PS4 is superior to the 7970.
 
How do you know?
It's funny how some people attribute everything to her, even though she works on tools and only joined ND in November last year, 7 months ago. She is probably accustomed to the engine by now and has likely made some improvements already, but not on this scale ;)


I'd expect the foliage to be simulated, if it's indeed real-time. It's not static, and it's definitely not hand-animated.
I would assume the physics engine calculated it and the result was then dumped into a static animation, similar to how they did the ocean in Uncharted 3 and the clothing in MGS V: Ground Zeroes' first trailer.
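Baking a sim into a canned animation is straightforward in principle: run the physics offline, store every frame, and play the frames back at runtime with no simulation cost. A hypothetical sketch (this is not ND's or Kojima Productions' pipeline, just the general idea in Python):

```python
import math

def simulate_wave(n_frames, n_points, dt=1 / 30):
    """Offline step: run a (very crude) water sim and keep every frame."""
    frames = []
    for f in range(n_frames):
        t = f * dt
        # height of each surface point at time t
        frames.append([math.sin(2.0 * i + 3.0 * t) for i in range(n_points)])
    return frames

class BakedClip:
    """Runtime step: no physics at all, just look up the precomputed frame."""
    def __init__(self, frames):
        self.frames = frames

    def sample(self, frame_index):
        # clamp so the clip holds its last pose when it ends
        i = min(frame_index, len(self.frames) - 1)
        return self.frames[i]

clip = BakedClip(simulate_wave(n_frames=60, n_points=8))
heights = clip.sample(30)  # deterministic: identical on every playthrough
```

That determinism is exactly why it "moves the same" every time you watch the cutscene: you're replaying stored data, not re-running a simulation.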
 
Everything I have highlighted here is factually incorrect or a bending of the truth. There is no part of the PS4 GPU that, from a raw-numbers perspective, makes it better than a 7970, or even comparable with a 290x. The PS4 does not use the same GCN architecture as the 290x. At all.

You bolded and quoted something that was correct, you should check your facts first and stop being so defensive.

The 290 and the PS4 GPU DO have the same number of compute queues.
 
Grimløck said:
so what? people told me it's gonna look like crap when you're actually playing the game.

yuk yuk. whatev's, your people are dumb and obviously jealous of the masters.

can't wait to see Guerrilla's new IP. Naughty Dog and Guerrilla are so good at what they do.
 
They were only sometimes used to hide loading. Most of the time the next area is already streamed in when they start. They mostly load stuff during the forced slow-walking sections, combat, and ladder puzzles. That's why there are so many of them, and why a character is constantly closing a gate behind you, or you jump down a broken ladder with no means of returning: immediate unloading of the previous area, then the next one starts streaming in. If you sprint through the environment ignoring everything, like in a speedrun, the engine actually breaks down completely, and this happens rather constantly:

[gif: 2224yu2k.gif]


ND basically begs you to play slowly or their game and technology doesn't work ^^
Having speedrun the game for kicks, I have never seen this happen; might be a dying PS3.
 
Several people have replied, and as said, it's assembly coding on closed hardware, so it can be much better optimized; Windows has OS and DirectX overhead, and PC gets the "brute force" approach.

BUT, there's more. The magic is in its compute units (ACEs), also known as GPGPU:

http://www.redgamingtech.com/playst...eon-volcanic-island-gpu-compute-similarities/



All AMD GPUs from the 7850 to the 7970 only have 2 compute units, while the PS4 has 8.

Also:


Furthermore, the PS4's compute queue size has been increased to 8 per ACE, so 64 total, compared to the paltry 4 that the 7850–7970 have.

AMD has improved their GCN architecture and done the same with the 290x:

http://gearnuke.com/amd-flagship-r9-290x-has-same-number-of-aces-as-the-ps4/

Quote from that article:



Infamous SS used compute extensively for its incredible graphical effects (and who knows what else), so it's highly probable that Uncharted, The Order, Driveclub, etc. are doing the same. It's certainly the PS4's "secret sauce", and it helps tremendously with specific graphical effects.

As for the PC side, I doubt developers use these compute units extensively for games (if at all), as few GPUs have them, and only the 290x has as many as the PS4. And Nvidia is even weaker in the compute department (but they are improving that in their next architecture). As that article said, only Crytek uses ACEs, for weather systems.

And as a side note, the Xbox One has only 2 ACEs, but they do have 8 queues per unit, making a total of 16 compute queues: an improvement over PC, but far from the PS4's 64.


Compute units are another pillar of the PS4's computational power, and they allow this kind of visuals.

Team ICE said they are improving ACE (GPGPU) performance, and that is going to increase the delta with the X1... It's only gonna get better...

Very good write-up. The 64 compute queues of the PS4 will result in incredible visual effects over time. I remember reading that Killzone SF didn't rely much on GPGPU and only used a few compute jobs; inFamous used it for the lovely particle effects. Cerny was a visionary to bank so heavily on compute: he said that while the PS4 is easy to code for, its custom GPGPU capabilities will be gradually taken advantage of, and that we should expect gen-2 and gen-3 games to look better and better. It's the best of both worlds: easy to code for, and yet it still has enough ACEs up its sleeve.
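To make the "64 queues, multiple levels of arbitration" idea from the Cerny quote concrete, here's a toy model in Python. The round-robin pass below is a stand-in, not the real hardware arbitration (which Cerny only describes as "multiple levels of arbitration"), and every name here (ToyGPU, submit, tick) is invented for illustration:

```python
from collections import deque

class ToyGPU:
    """Toy model of asynchronous compute: many command queues feed
    one scheduler, which picks jobs to run alongside graphics work."""
    NUM_QUEUES = 64  # the PS4 figure from the quote above

    def __init__(self):
        self.compute_queues = [deque() for _ in range(self.NUM_QUEUES)]
        self.executed = []

    def submit(self, queue_id, job):
        # a game system drops an async compute command into one of the queues
        self.compute_queues[queue_id].append(job)

    def tick(self):
        # one scheduling pass: take at most one job from each non-empty queue
        for q in self.compute_queues:
            if q:
                self.executed.append(q.popleft())

gpu = ToyGPU()
gpu.submit(0, "particle-update")
gpu.submit(0, "particle-sort")
gpu.submit(5, "cloth-sim")
gpu.tick()  # runs particle-update and cloth-sim
gpu.tick()  # runs particle-sort
```

The takeaway is structural: with many independent queues, unrelated jobs (particles from one system, cloth from another) can be in flight without the submitters coordinating with each other or with the graphics pipeline.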
 
I really don't get why they had to say that it came from a "single" PS4 and not just from a PS4. I guess they were afraid someone would say it came from two PS4s duct-taped together.

Naughty Dog used a render farm comprised of 8 PS3s to render the cutscenes for The Last of Us and Uncharted 3.

Christophe Balestra: "Our high tech farm to render cutscenes... #uncharted3 #naughtydog"

[image: ibxzo6Q2kiKkp0.jpg]
 
Having speedrun the game for kicks, I have never seen this happen; might be a dying PS3.
Nah, it happens to me all the time when I try to run it for fun. The current WR has it happening as well; it's a pretty common occurrence. The engine is supposed to force-stop you and display a message, "Please wait. Currently loading...", and then the game resumes after a couple of seconds. If that check fails, you are able to go out of bounds. It doesn't work everywhere, though, and in some spots the window is tight, so you have to be really fast (or hope the game disc is not spinning, or is in a certain position, so the game needs longer to load).
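That "force stop plus loading message" gate is essentially a boundary check against streaming progress. A toy version (hypothetical Python, not ND's actual code; names like StreamingGate are invented) shows how a skipped check produces exactly the out-of-bounds behavior described:

```python
class StreamingGate:
    """Toy level-streaming checkpoint: if the next chunk isn't resident
    when the player crosses, hold them with a loading message."""
    def __init__(self):
        self.next_chunk_loaded = False

    def on_player_crosses(self, check_enabled=True):
        if self.next_chunk_loaded:
            return "proceed"
        if check_enabled:
            # normal path: force-stop the player until streaming catches up
            return "Please wait. Currently loading..."
        # the bug path: the check never fires, so the player
        # runs straight into unloaded space
        return "out of bounds"

gate = StreamingGate()
slow_player = gate.on_player_crosses()                    # held at the gate
speedrunner = gate.on_player_crosses(check_enabled=False) # falls through
gate.next_chunk_loaded = True
normal_play = gate.on_player_crosses()                    # chunk ready, proceed
```

A normal player almost never outruns the streamer, so the buggy branch stays invisible unless you sprint through everything, which matches the "they just didn't test it thoroughly" guess.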
 
the graphics are not that important. what matters is how it animates.

that face in ryse looks fantastic, but in motion, the entire illusion falls apart--dead eyes, limited range of animations. the same thing is going on with the order. for a game that is so bent on being "filmic," the facial animations are awful.

drake's facial animations are genuinely the most impressive thing i've ever seen in a video game.
What are you talking about? Ryse is regarded as the game with the most convincing facial expressions, both in gameplay and in cutscenes, of any video game released to date.
 
Indeed, my bad, I wrote it incorrectly: it's not "compute units", I meant ACEs. The PS4 has 8 ACEs (64 command queues); the 78xx and 79xx have 2 (4 command queues).

But my point stands: when it comes to the ACEs, the PS4 is superior to the 7970.

True, but unless you're playing a game which is literally just a series of particle effects, the 79xx's higher number of CUs and higher clock speeds are going to make them outperform the PS4 GPU handily.
 
That ain't gonna happen and you know that. No other game studio can reach them in terms of technical capabilities. I, for one, haven't seen a game on any platform including PC (yes, I said it: PC) that comes even close to running something as impressive as the U4 teaser in real-time.

Cheers.
SSM and QD reached or surpassed them on PS3 just fine. It's an endless cycle. You are crazy if you think this won't be matched or surpassed in a couple of years.


Yep, just like The Last of Us. What exactly is the news here?
Cutscenes in TLoU were pre-rendered. This is claimed to be real-time.


I have had some dying PS2s in the past and this is exactly the sort of glitch that is caused by that.
Nope, not a dying console at all. Still works fine a year later and was tested across multiple consoles.
 
True, but unless you're playing a game which is literally just a series of particle effects, the 79xx's higher number of CUs and higher clock speeds are going to make them outperform the PS4 GPU handily.

But the point is that NOTHING on any PC (hell, a 290 or 780 Ti in SLI) comes even remotely close to the vivid attention to detail, CG-like animation, and sheer "is it real or not" wizardry in this trailer; this is fact.

The only other game close is The Order: 1886, which has superb animation, facial capture, and CG-like details.
Confirmation bias at work I suppose.
Yep, appears so!
 
lol @ the Ryse comparisons. Shit's on another level.

Infamous SS was this gen's Gears of War in terms of visual breakthrough. But if Naughty Dog can translate this visual fidelity into the game... OMG.
 
[gif: ibdRXJqeM8dbYH.gif]


Am I on Punk'd?
You're telling me you think Ryse character models look like this?

This Ryse shit needs to stop. You fanboys really needed something to hold onto. We get it. But stop. Ryse was not the second coming of videogame jesus in any way. The game is fundamentally not that amazing at all. Period.

If I am hearing about Ryse in 6 more months you can officially cancel the Xbox.
 
But the point is that NOTHING on any PC (hell, a 290 or 780 Ti in SLI) comes even remotely close to the vivid attention to detail, CG-like animation, and sheer "is it real or not" wizardry in this trailer; this is fact.

The only other game close is The Order: 1886, which has superb animation, facial capture, and CG-like details.

Yeh, I've said this before in this thread: this trailer is probably the most graphically impressive real-time game I've seen. Now, PC GPUs ARE capable of much more than the PS4, but there's not much point to it at the moment when there are no real AAA PC-exclusive devs who are technically ambitious.

I guess what I'm trying to say is that the visual fidelity you see in the Uncharted trailer doesn't really come from the PS4 being a particularly powerful machine; it comes from ND being an extremely talented and well-resourced company.
 
But the point is that NOTHING on any PC (hell, a 290 or 780 Ti in SLI) comes even remotely close to the vivid attention to detail, CG-like animation, and sheer "is it real or not" wizardry in this trailer; this is fact.

The only other game close is The Order: 1886, which has superb animation, facial capture, and CG-like details.

Yep, appears so!
Well, nothing on PC is made specifically to achieve that. If Crytek would still work on PC exclusive games we probably would have seen something like that already.


So it's a repeatable bug contained in one area?
Across multiple areas. It's simply a flaw in the streaming system that triggers if you are able to reach an area before the game is able to load it. Sometimes they use ladder puzzles to make you go slow and there is nothing you can do about it. During the exploration and combat sections you can simply sprint through a lot of them, and with good movement optimization you are so fast the game can't keep up. Normally it's supposed to force-stop you with a message telling you "loading, please wait", but for some reason that check is not very well coded and doesn't come up a lot of the time. This will likely never happen if you play the game normally, though, so it seems like they just didn't test it thoroughly.


What are you talking about? Ryse is regarded as the game with the most convincing facial expressions, both in gameplay and in cutscenes, of any video game released to date.
Seems like I missed the memo. The faces look good; the animation looks creepy. There is better, or rather less creepy, stuff out there. How facial animation appeals to you is entirely subjective. Ryse might be objectively the best, but I'd take The Last of Us or Beyond over it any day of the week. Much less creepy. The ultimate best is probably still L.A. Noire; too bad it's creepy as hell.
 
This is a lot of thread to sift through... have they confirmed that this is real-time? And not just rendered in-engine on a single PS4 one frame at a time and strung together into an FMV?
 