VGLeaks rumor: Durango CPU Overview

TressFX was heavily promoted by (and co-developed with) AMD, who have a DirectCompute advantage over Nvidia. Considering the rest of Tomb Raider on PC barely looks better than the console editions, I think it's fairly unlikely Crystal Dynamics would have implemented it without AMD's support.

AMD is also powering the PS4. And the impact of TressFX is the same on AMD and Nvidia systems ( link ). Shouldn't it run worse on an Nvidia card if AMD has a compute advantage? There are four parties interested in such a feature: Sony and the third-party devs (who have to compete with Sony's first-party output), because it will help software sales; AMD, because it advertises their brand; and consumers, because games will look better. So the incentive is there. Of course it would take a little effort, but not much if the power is there. I can see something like this happening.

No, they won't.

Basically, if the Durango sells more than the PS4 (or the other way around), expect that version to get the most attention. Nobody is going to bother adding extra fluff to the lesser-selling console.

If they can improve a version which has bad sales (and the PS4 versions won't have bad sales) with very little effort because of the superior specs, don't you think those devs will do it? Also, see above.
 
The DMA engines are a hardware version of the memexport instruction; they're going to be tasked with directly skinning and texturing models, hence the large focus on JPEG for them.

What? Explain. A DMA unit cannot texture or skin models at all; it contains no compute. JPEG is also a horrible texture format, no one would use it, and I can guarantee you that the JPEG decoder will most likely not be used for textures.

To explain this further, JPEG is generally not used for textures because:

A. The hardware has no support for decoding JPEG.
B. DXTn can require only one read to get a value out, in comparison to JPEG's many, which is an advantage.

Instead, what they use is the DXTn family of texture compression. These formats support up to 4:1 compression and have hardware decompression on GPUs. That hardware decompression is really the kicker, because it allows the GPU to store up to 4x the textures in its cache: the texture doesn't actually need to be decompressed until it's used in the registers. If you did this with Durango's JPEG decoding you would end up with raw data (which is what it decompresses to); those files are up to 4x as large as the same texture compressed in DXTn, and that would destroy your texturing performance.
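To make the single-read point concrete, here is a minimal sketch of decoding one texel from a BC1/DXT1 block. The bc1_texel helper is a hypothetical illustration, not any console's actual hardware path; the point is that each 4x4-pixel block occupies just 8 bytes, so a sampler can recover any texel from one small aligned read:

```c
#include <stdint.h>

/* Expand a packed 5:6:5 color to 8-bit-per-channel RGB. */
static void unpack565(uint16_t c, uint8_t out[3]) {
    out[0] = (uint8_t)(((c >> 11) & 0x1F) * 255 / 31);
    out[1] = (uint8_t)(((c >>  5) & 0x3F) * 255 / 63);
    out[2] = (uint8_t)(( c        & 0x1F) * 255 / 31);
}

/* Decode the texel at (x, y), 0 <= x,y < 4, from one 8-byte BC1 block. */
void bc1_texel(const uint8_t block[8], int x, int y, uint8_t rgb[3]) {
    uint16_t c0 = (uint16_t)(block[0] | (block[1] << 8));
    uint16_t c1 = (uint16_t)(block[2] | (block[3] << 8));
    uint8_t p[4][3];
    unpack565(c0, p[0]);
    unpack565(c1, p[1]);
    for (int ch = 0; ch < 3; ch++) {
        if (c0 > c1) { /* 4-color mode: two interpolated colors */
            p[2][ch] = (uint8_t)((2 * p[0][ch] + p[1][ch]) / 3);
            p[3][ch] = (uint8_t)((p[0][ch] + 2 * p[1][ch]) / 3);
        } else {       /* 3-color mode: midpoint plus black */
            p[2][ch] = (uint8_t)((p[0][ch] + p[1][ch]) / 2);
            p[3][ch] = 0;
        }
    }
    /* 32 bits of 2-bit selectors, one per texel, little-endian. */
    uint32_t sel = (uint32_t)block[4] | ((uint32_t)block[5] << 8) |
                   ((uint32_t)block[6] << 16) | ((uint32_t)block[7] << 24);
    int i = (sel >> (2 * (y * 4 + x))) & 3;
    for (int ch = 0; ch < 3; ch++) rgb[ch] = p[i][ch];
}
```

A baseline JPEG, by contrast, has to be entropy-decoded sequentially from the start of a scan before any given pixel is available, which is essentially why it can't be sampled in place like this.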

tl;dr: bad idea.

EDIT:

To drive this point home even further: it would actually be quicker to move the raw data than to move the JPEG, because of the limited bandwidth of the JPEG decoder in comparison to the 25.6 GB/s of raw data movement they can perform.
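As a back-of-envelope sketch of that claim: only the 25.6 GB/s figure comes from the leak; the decoder throughput, compression ratio, and texture size below are made-up parameters for illustration.

```c
#include <stdio.h>

int main(void) {
    double texture_mb = 64.0;    /* hypothetical uncompressed texture data  */
    double raw_rate   = 25600.0; /* MB/s: the rumored raw move bandwidth    */
    double jpeg_ratio = 10.0;    /* hypothetical JPEG compression ratio     */
    double jpeg_rate  = 200.0;   /* MB/s: hypothetical decoder throughput   */

    double t_raw  = texture_mb / raw_rate;                 /* plain copy    */
    double t_jpeg = (texture_mb / jpeg_ratio) / jpeg_rate; /* decode path   */

    printf("raw copy : %.2f ms\n", t_raw * 1000.0);  /* ~2.5 ms  */
    printf("jpeg path: %.2f ms\n", t_jpeg * 1000.0); /* ~32.0 ms */
    /* With these assumptions the decoder path loses despite moving 10x
     * less data, because its throughput is over 100x below a raw copy. */
    return 0;
}
```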
 
Interesting. It needs smoothing, and also high levels of LODs. Zooming in and seeing "jaggies" on models that aren't rendering artifacts is just bad. Also, it's all completely stiff. Everything looked great from far away, but what about the foliage? In the future maybe, but not now.

The demo was on an old Acer notebook: Intel Core 2 Duo P7550, Nvidia GeForce GT 130M.

The video wasn't meant to show how good it looks, but to be an example of the type of games that will follow Minecraft in using voxels.



But you can read the Unreal Engine 4 link, where they talk about using voxels for lighting:


Please give us an overview of how the algorithm works from generating the octree, to cone tracing, to the gathering pass.


http://www.geforce.com/whats-new/ar...s-next-gen-gtx-680-powered-real-time-graphics

The technique is known as SVOGI – Sparse Voxel Octree Global Illumination, and was developed by Andrew Scheidecker at Epic. UE4 maintains a real-time octree data structure encoding a multi-resolution record of all of the visible direct light emitters in the scene, which are represented as directionally-colored voxels. That octree is maintained by voxelizing any parts of the scene that change, and using a traditional set of Direct Lighting techniques, such as shadow buffers, to capture first-bounce lighting.

Performing a cone-trace through this octree data structure (given a starting point, direction, and angle) yields an approximation of the light incident along that path.

The trick is to make cone-tracing fast enough, via GPU acceleration, that we can do it once or more per-pixel in real-time. Performing six wide cone-traces per pixel (one for each cardinal direction) yields an approximation of second-bounce indirect lighting. Performing a narrower cone-trace in the direction of specular reflection enables metallic reflections, in which the entire scene is reflected off each glossy surface.

[Editor's Note: If the above sequence seems alien to you, it's because it is. Global Illumination requires a totally new lighting pipeline. In a traditional game, all indirect lighting (light that is bounced from a surface) is calculated in advance and stored in textures called lightmaps. Lightmaps give game levels a GI-look but since they are pre-computed, they only work on static objects.

In Unreal Engine 4, there are no pre-computed lightmaps. Instead, all lighting, direct and indirect, is computed in real-time for each frame. Instead of being stored in a 2D texture, they are stored in voxels. A voxel is a pixel in three dimensions. It has volume, hence the term "voxel."

The voxels are organized in a tree structure to make them efficient to locate. When a pixel is rendered, it effectively asks the voxel tree "which voxels are visible to me?" Based on this information it determines the amount of indirect light (Global Illumination) it receives.

The simple takeaway is this: UE4 completely eliminates pre-computed lighting. In its place, it uses voxels stored in a tree structure. This tree is updated per frame and all pixels use it to gather lighting information.]
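To make the cone-trace step more concrete, here is a rough sketch of a single cone march through a pre-filtered voxel volume. This is not Epic's actual implementation: sample_voxel(), VOXEL_SIZE, MIN_STEP, and a dense mip pyramid standing in for the sparse octree are all assumptions for illustration.

```c
#include <math.h>

#define VOXEL_SIZE 0.05f  /* hypothetical world-space width of a leaf voxel */
#define MIN_STEP   0.01f  /* guard against vanishing step sizes */

typedef struct { float x, y, z; } vec3;
typedef struct { float r, g, b, a; } vec4;

/* Stub: a real renderer would sample the 3D mip level whose voxels are
 * roughly 'diameter' wide, returning pre-filtered radiance + occlusion. */
static vec4 sample_voxel(vec3 pos, float diameter) {
    (void)pos;
    vec4 s = {0.2f, 0.2f, 0.25f, 0.1f * diameter}; /* made-up values */
    if (s.a > 1.0f) s.a = 1.0f;
    return s;
}

/* March one cone: wider cones sample coarser (pre-filtered) voxels,
 * which is what makes several traces per pixel affordable. */
static vec4 cone_trace(vec3 origin, vec3 dir, float half_angle, float max_dist) {
    vec4 acc = {0.0f, 0.0f, 0.0f, 0.0f};
    float t = VOXEL_SIZE;  /* start one voxel out to avoid self-intersection */
    while (t < max_dist && acc.a < 1.0f) {
        float diameter = 2.0f * t * tanf(half_angle); /* cone footprint */
        vec3 p = { origin.x + dir.x * t,
                   origin.y + dir.y * t,
                   origin.z + dir.z * t };
        vec4 s = sample_voxel(p, diameter);
        float w = (1.0f - acc.a) * s.a; /* front-to-back compositing */
        acc.r += w * s.r; acc.g += w * s.g; acc.b += w * s.b;
        acc.a += w;
        t += fmaxf(diameter * 0.5f, MIN_STEP); /* step grows with distance */
    }
    return acc;
}
```

Six such traces in the cardinal directions approximate the diffuse second bounce, and a single narrow trace along the reflection vector gives the glossy term, as the quoted article describes.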
 
Gemüsepizza;49577040 said:
The RAM amount of the Xbox 3 is also only a rumor at this point. So why should we take this as "confirmed" and not the alleged amounts of reserved RAM? If we use all those values for now, the PS4 would have 36-50% more RAM available for games (5.0-5.5 GB vs 7.5 GB), which is significant.

So instead let's just pick arbitrary data and spin it in favor of the PS4? Really?
 
What is going on here again? Yesterday it was talk about an API advantage for the PS4; now people are talking about the memory footprint of consoles which aren't even released? And do you guys really think Sony has an advantage in OS development over MS? I hope we get some real info soon to stop all this dumb talk.
 
So instead let's just pick arbitrary data and spin it in favor of the PS4? Really?

The memory footprint of the OS is as arbitrary as the alleged amount of total RAM for the Xbox 3 - they are both rumors from sources which have been relatively reliable in the past. So not arbitrary at all.

What is going on here again? Yesterday it was talk about an API advantage for the PS4; now people are talking about the memory footprint of consoles which aren't even released? And do you guys really think Sony has an advantage in OS development over MS? I hope we get some real info soon to stop all this dumb talk.

Dude, what's your problem? This is a thread about a rumor, where people are speculating. Why are you here if you hate to speculate about rumors? Regarding the OS: The Xbox OS is rumored to have more features than the PS4 OS, that's why people think it might need more resources - not because Microsoft can't develop a solid OS.
 
What is going on here again? Yesterday it was talk about an API advantage for the PS4; now people are talking about the memory footprint of consoles which aren't even released? And do you guys really think Sony has an advantage in OS development over MS? I hope we get some real info soon to stop all this dumb talk.

Doesn't really matter, because Sony has a chip for the OS, so the fact that it's not using two of the Jaguar cores is already a big advantage. Hopefully the Xbox 3 also has a chip to help out with the OS.
 
Gemüsepizza;49618902 said:
Dude, what's your problem? This is a thread about a rumor, where people are speculating. Why are you here if you hate to speculate about rumors?

But speculating and wildly guessing, while ignoring the fact that MS has proficient knowledge when it comes to APIs and OS development, are not the same thing. I just miss educated guesses.
 
Doesn't really matter, because Sony has a chip for the OS, so the fact that it's not using two of the Jaguar cores is already a big advantage. Hopefully the Xbox 3 also has a chip to help out with the OS.

Even the Wii U has one, so you can be sure. Educated guess, I say. But it worked for the 360 without a dedicated processor, too. I would still bet on a dedicated ARM processor.
 
Even the Wii U has one, so you can be sure. Educated guess, I say.

I'm pretty sure it doesn't. Check the "Latte" thread.

EDIT: Found it for you.

http://www.neogaf.com/forum/showthread.php?p=48907455#post48907455

Finally found the reason for the lack of the multicore ARM: That chip is only found in devkits and is part of the bridge between the Wii U hardware and the host.

It also doesn't have an OS. That's why app "switching" is really slow and games lag while you do background downloads.
 
Even the Wii U has one, so you can be sure. Educated guess, I say. But it worked for the 360 without a dedicated processor, too. I would still bet on a dedicated ARM processor.

Then the 2 Jaguar cores the 720 reserves must be for Kinect or something similar.
 
But speculating and wildly guessing, while ignoring the fact that MS has proficient knowledge when it comes to APIs and OS development, are not the same thing. I just miss educated guesses.

But this is NOT wildly guessing. We have rumors from relatively reliable sources that state such a difference in reserved RAM. And this has NOTHING to do with Microsoft possibly being incompetent with regard to OS development. It's just that several rumors indicate the Xbox OS will have more features, which means it also might need more resources. But I already said that above.

Doesn't really matter, because Sony has a chip for the OS, so the fact that it's not using two of the Jaguar cores is already a big advantage. Hopefully the Xbox 3 also has a chip to help out with the OS.

An ARM CPU would make sense; they could easily use the Vita OS for the PS4 (if I remember correctly, there were rumors pointing to this).
 
Well, the ARM chip could be for encryption and system protection, as well as basic background tasks such as downloading and waking the console from a suspended state, which is what it most likely is for in both Durango and PS4.

http://www.arm.com/products/processors/technologies/trustzone.php

There has to be an ARM in the Wii U because of the compatibility with the Wii. The Wii's IOS was also executed on an ARM, and I doubt Nintendo does ARM emulation when the chip runs in Wii mode. What it's for in next gen, we have to see.
 
There has to be an ARM in the Wii U because of the compatibility with the Wii. The Wii's IOS was also executed on an ARM, and I doubt Nintendo does ARM emulation when the chip runs in Wii mode. What it's for in next gen, we have to see.

I really think it won't run the OS, and if it does, I don't think it will run all of the OS. There are some things you simply cannot do on the ARM chip that would have to be executed on the x86 cores.
 
What? Explain. A DMA unit cannot texture or skin models at all; it contains no compute. JPEG is also a horrible texture format, no one would use it, and I can guarantee you that the JPEG decoder will most likely not be used for textures.

To explain this further, JPEG is generally not used for textures because:

A. The hardware has no support for decoding JPEG.
B. DXTn can require only one read to get a value out, in comparison to JPEG's many, which is an advantage.

Instead, what they use is the DXTn family of texture compression. These formats support up to 4:1 compression and have hardware decompression on GPUs. That hardware decompression is really the kicker, because it allows the GPU to store up to 4x the textures in its cache: the texture doesn't actually need to be decompressed until it's used in the registers. If you did this with Durango's JPEG decoding you would end up with raw data (which is what it decompresses to); those files are up to 4x as large as the same texture compressed in DXTn, and that would destroy your texturing performance.

tl;dr: bad idea.

EDIT:

To drive this point home even further: it would actually be quicker to move the raw data than to move the JPEG, because of the limited bandwidth of the JPEG decoder in comparison to the 25.6 GB/s of raw data movement they can perform.

So many wrongs in this post... I don't even know where to start. So, with the barest of logical thinking: you think MS spent silicon, money, and die space to put in something that will not be used, eh? This post proves that you have absolutely no idea about textures and texturing in games, as the VGLeaks article clearly points to its use for decompressing a texture into a chroma-subsampled format, similar to a DXGI format, as an example. The fact that you think DXTn is the most used texturing format is laughable, but since you seem so convinced of what you are saying, there really is no point in correcting you. So carry on.
 
They will somehow blow up every picture and look at every crevice, examine the most minuscule pixel, and write 150 pages on why one is better than the other... Thankfully I can't see anything those people see, because gaming must suck if you notice every small graphical detail, lol.

We've been doing that in multiplatform game threads for the last six years; why would it change? If you're a dual-console owner you might as well get the best version of a multiplatform game. Although in the last couple of years the differences have generally been pretty minor.

As for PS4/Durango: we don't know what the differences will be like in reality. As much as we can't sit here and definitively say "PS4 will look noticeably better", neither can anyone state "oh, they'll both look identical". Both statements require knowledge that nobody has yet. Even current dev teams on multiplatform titles will need more time to shake these down and get a proper view of the differences.

I do disagree that publishers will strive to make them look identical. If faced with, e.g., CoD looking noticeably better on PS4 vs Durango, do you think EA would deliberately gimp BF4 for the sake of parity? Competition will drive up standards.

This generation was more about slowly pulling the PS3 up to par with the 360; I don't think the 360 was gimped because the PS3 was difficult to develop for.
 
So many wrongs in this post... I don't even know where to start. So, with the barest of logical thinking: you think MS spent silicon, money, and die space to put in something that will not be used, eh? This post proves that you have absolutely no idea about textures and texturing in games, as the VGLeaks article clearly points to its use for decompressing a texture into a chroma-subsampled format, similar to a DXGI format, as an example. The fact that you think DXTn is the most used texturing format is laughable, but since you seem so convinced of what you are saying, there really is no point in correcting you. So carry on.

I didn't say it wouldn't be used, I said it wouldn't be used for texturing. There's a difference; you would know this if you could read.

JPEG is a completely ridiculous texture format. DXTn is MADE for the job.

You're suggesting developers use a format which will DESTROY texturing performance; using JPEG for textures that get moved into the eSRAM will be ~4x worse.

DXTn and its friends BCn are what nearly everyone would use.
 
6 GB for games and 2 GB for the OS is perfect.

If they were aiming for 512 MB they'd have a relatively streamlined OS. No way would they need anywhere near 2 GB. 1 GB is probably overkill, but it gives them room to breathe, and they can always squeeze it later.

Or if they are happy with 512 MB, just stick to that. They managed to double the RAM relatively late in the process; their OS should already be fairly done.
 
So if I am playing a non-Kinect game, those 2 cores will be doing nothing?

Every breath you take
Every move you make
Every bond you break
Every step you take
Kinect will be watching you





But on a serious note, I think they will be reserved for background tasks: doing things like recording shows, serving games to your tablets and smartphones, and so on, at any time, no matter what game you are playing, which means they will have to stay reserved.
 
If they were aiming for 512 MB they'd have a relatively streamlined OS. No way would they need anywhere near 2 GB. 1 GB is probably overkill, but it gives them room to breathe, and they can always squeeze it later.

Or if they are happy with 512 MB, just stick to that. They managed to double the RAM relatively late in the process; their OS should already be fairly done.

If the PS4 OS can be done with 512 MB, I can't see why the 720's OS can't.
 
If they were aiming for 512 MB they'd have a relatively streamlined OS. No way would they need anywhere near 2 GB. 1 GB is probably overkill, but it gives them room to breathe, and they can always squeeze it later.

Or if they are happy with 512 MB, just stick to that. They managed to double the RAM relatively late in the process; their OS should already be fairly done.

7 GB is already overkill.
With OS firmware you always need more RAM.
 



 
If they were aiming for 512 MB they'd have a relatively streamlined OS. No way would they need anywhere near 2 GB. 1 GB is probably overkill, but it gives them room to breathe, and they can always squeeze it later.

Or if they are happy with 512 MB, just stick to that. They managed to double the RAM relatively late in the process; their OS should already be fairly done.

No, Sony won't want to have the same problems next gen again. They didn't have enough RAM for the system this gen, so they should reserve more to implement new features later.
1 GB is the minimum; 1.5-2 GB would be better.

512 MB probably wasn't enough, and that was also one of the reasons they increased the RAM; otherwise they'd have had only 2.5-3 GB for games.

Later they can still give more RAM back to devs, once they think there won't be any important new features they'd have to add.
 
No, Sony won't want to have the same problems next gen again. They didn't have enough RAM for the system this gen, so they should reserve more to implement new features later.
1 GB is the minimum; 1.5-2 GB would be better.

512 MB probably wasn't enough, and that was also one of the reasons they increased the RAM; otherwise they'd have had only 2.5-3 GB for games.

Later they can still give more RAM back to devs, once they think there won't be any important new features they'd have to add.


My thoughts exactly
 
No, Sony won't want to have the same problems next gen again. They didn't have enough RAM for the system this gen, so they should reserve more to implement new features later.
1 GB is the minimum; 1.5-2 GB would be better.

512 MB probably wasn't enough, and that was also one of the reasons they increased the RAM; otherwise they'd have had only 2.5-3 GB for games.

Later they can still give more RAM back to devs, once they think there won't be any important new features they'd have to add.

If they were aiming for 512 MB, then 1 GB should be plenty. People are scratching their heads wondering what on earth MS needs 3 GB for; suggesting Sony suddenly goes from 512 MB to 2 GB is just as head-scratching.
 
What more features? You know the PS4 OS already has a lot of features, even streaming games.



And? Btw, that's old.

I know it's old, but it's still from Microsoft, and for them to say that one of the limitations of the Xbox 360 was "No full fidelity AAA Games + Kinect V1 sensor" says to me that Kinect was using some of the Xbox 360's gaming resources.

But Kinect V2 is going to have its own chip or reserved CPU cores inside the Xbox 3, so it doesn't take away from the games that use it.
 
Can't wait for the forum meltdown when Sony announces the network requirements for PS4's features.

Can't see that happening.
Or what features are you talking about?

I know it's old, but it's still from Microsoft, and for them to say that one of the limitations of the Xbox 360 was "No full fidelity AAA Games + Kinect V1 sensor" says to me that Kinect was using some of the Xbox 360's gaming resources.

But Kinect V2 is going to have its own chip or reserved CPU cores inside the Xbox 3, so it doesn't take away from the games that use it.

Unless it has its own chip; if not, it still means Kinect 2 will use 720 gaming resources, even if this time they allocated resources to Kinect 2 from the start.
 