DF: Orbis vs Durango Spec Analysis

lol, 720p isn't garbage. Most people sit far enough away that they can't tell the difference between 1080p and 720p.

Sure, 1080p is better, but it still won't be a forced standard on these consoles. A good gaming PC is still needed for smooth 1080p in new games.

And you're dreaming if you think 60fps will be standard. Not enough people buying games give a shit (and those who do will get the PC version). 30fps will be used for the vast majority of next-gen console games. Bet on it.

720p is garbage.
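
For what it's worth, the "sit far enough away" point can be put into rough numbers. A quick sketch, assuming the common one-arcminute visual-acuity rule of thumb and a 50-inch 16:9 set (both assumptions for illustration, not anything from DF's analysis):

```python
import math

ONE_ARCMIN = math.radians(1 / 60)  # ~20/20 acuity limit (rule-of-thumb assumption)

def max_useful_distance(diag_inches, vertical_pixels, aspect=16 / 9):
    """Distance (inches) beyond which one pixel subtends less than one
    arcminute, i.e. the extra resolution stops being visible."""
    height = diag_inches / math.sqrt(1 + aspect ** 2)  # physical screen height
    pixel_pitch = height / vertical_pixels
    return pixel_pitch / (2 * math.tan(ONE_ARCMIN / 2))

for res in (720, 1080):
    feet = max_useful_distance(50, res) / 12
    print(f'50" {res}p: pixels resolvable out to ~{feet:.1f} ft')
```

That works out to roughly 9.8 ft for 720p and 6.5 ft for 1080p, so the claim only holds for couches past the ~10 ft mark; anyone sitting closer really does see the difference.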
 
I think the biggest reason people think that is because the biggest franchise of this gen, CoD, runs sub-720p. Also, some early 360 exclusives, including Halo 3, were sub-720p.

Sometimes this isn't a result of hardware limitations, just badly optimized engines. A quick Google search brought me to this article.

HD games on 360 that were sub HD on PS3:

GTA4
Bioshock
Guitar Hero 3
Dark Sector
Fracture
LotR: Conquest
Midnight Club: LA
MX vs ATV Untamed
Red Dead Redemption
Silent Hill: Homecoming

Could any of these have been HD on PS3? Certainly. The issue was budget and timing, not hardware.
 
Sometimes this isn't a result of hardware limitations, just badly optimized engines. A quick Google search brought me to this article.

HD games on 360 that were sub HD on PS3:

GTA4
Bioshock
Guitar Hero 3
Dark Sector
Fracture
LotR: Conquest
Midnight Club: LA
MX vs ATV Untamed
Red Dead Redemption
Silent Hill: Homecoming

Could any of these have been HD on PS3? Certainly. The issue was budget and timing, not hardware.

They could have all been HD. But the key here is being HD while matching the 360 version's on-screen graphics with comparable performance.
 
I think the biggest reason people think that is because the biggest franchise of this gen, CoD, runs sub-720p. Also, some early 360 exclusives, including Halo 3, were sub-720p.

Certainly. CoD, however, is more likely still sub-HD because further development on the engine was hindered by Infinity Ward's departure, rather than because of an actual hardware limitation.
 
720p can't be real.

I think they both have to target 1080p. Native 1080p is much better than 720p with a bunch of AA.
Agreed. Orbis' specs (bandwidth, ROPs) seem designed around targeting native 1080p, and there are a lot of rumors about Durango doing some impressive and interesting multilayered scaling to 1080p. They want that number, if only for marketing purposes. Will there be sub-1080p games? Absolutely, but all the way down to 720p is unlikely.
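
For what the ROP side of that is worth, here's a back-of-envelope fill-rate check against the leaked numbers. It assumes GCN's usual 4 ROPs per render backend, which is my assumption, not something stated in the leak:

```python
render_backends = 8        # from the leaked Orbis GPU spec
rops_per_backend = 4       # typical GCN layout (assumption)
clock_hz = 800e6

pixel_fill = render_backends * rops_per_backend * clock_hz  # peak pixels/s
pixels_1080p60 = 1920 * 1080 * 60                           # pixels/s for 1080p60

print(pixel_fill / 1e9)             # 25.6 Gpixels/s peak fill
print(pixel_fill / pixels_1080p60)  # ~206x overdraw headroom at 1080p60
```

In other words, raw fill rate is nowhere near the limiting factor for native 1080p; bandwidth and shading are where the budget actually goes.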
 
I'm confused. Without getting into the whole "secret sauce" crap... isn't the entire OP, or at least the link to the Eurogamer article, predicated on the hypothesis that the Durango GPU is an off-the-shelf AMD GCN-based GPU with no customization at all? Aren't they in fact comparing benchmarks between AMD's stock GPUs and claiming that the same will apply to whatever ends up in Orbis and Durango?
 
But then again, the guy trying to sell the Durango dev kits ended up saying that the specs released were old and that the final Durango specs are comparable to Orbis.

Did this also turn out to be bullshit or are we just still waiting?

EDIT: Like others have said, if we end up seeing a majority of 720p games next-gen, then both "next-gen" systems can go get fucked, because that's completely inexcusable at this point. I expect 1080p at 30fps to be the standard, with the occasional 1080p/60fps title showing up with its graphics turned up.
 
But then again, the guy trying to sell the Durango dev kits ended up saying that the specs released were old and that the final Durango specs are comparable to Orbis.

Did this also turn out to be bullshit or are we just still waiting?

While we're technically still waiting, the concrete information we do have isn't leaving much room for secret sauce.
 
But then again, the guy trying to sell the Durango dev kits ended up saying that the specs released were old and that the final Durango specs are comparable to Orbis.

Did this also turn out to be bullshit or are we just still waiting?

Still waiting. We'll find out soon enough.

A part of me is seeing MS "leaking" Durango specs on the 20th.
 
Yeah, yeah. So, do you already have a gaming PC?
I do, though I don't expect next-gen consoles to perform as well... I do expect them to take a step forward, though, and settling for 720p is anything but progress.

Why are half the people on here defending 720p and saying it's acceptable? It's not. We already have one Wii U. We don't need two more.
 
I'm really hoping both specs are wrong, because they're rather underwhelming. I think they will have some nice-looking games, but how long will that last before the next next generation surfaces?
 
I do, though I don't expect next-gen consoles to perform as well... I do expect them to take a step forward, though, and settling for 720p is anything but progress.

Why are half the people on here defending 720p and saying it's acceptable? It's not. We already have one Wii U. We don't need two more.

I think a lot of people don't realize how many TV manufacturers, even budget ones, are pulling the plug on 720p, or that the jump from 720p to 1080p is a good deal less of a power hog than the jump from SD to HD was. Previous generations also had resolution jumps; we just didn't have as much info about them readily available, and the jump to 1080p is in line with those standard increases. Last gen to this gen was an anomaly.
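
The pixel-count arithmetic backs that up:

```python
sd  = 640 * 480    # typical last-gen SD render target
hd  = 1280 * 720
fhd = 1920 * 1080

print(hd / sd)     # SD -> 720p: 3.0x the pixels
print(fhd / hd)    # 720p -> 1080p: 2.25x the pixels
```

Going from 720p to 1080p is a 2.25x pixel increase, against the 3x (or more) jump the SD-to-HD transition demanded.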
 
It's selling exactly the same numbers Windows 7 did.
You know, the best-selling OS ever.

Alright, I was wrong. I thought I had read that Windows 8 was underperforming. But I was wrong. Just read some articles on it saying its sales are on track with Windows 7 (which is what you guys said). My bad. I'll research first next time, before making that kind of post.
 
720p with decent upscaling is acceptable, but actually seeing a complex 3D game running natively at 1080p on a TV will change how you look at things, especially if it holds 60fps as well. Sadly, almost no console game currently exists that lets you see this. I had to hook up the PC version of Crysis 2 to a TV to finally see some of that full potential (and I couldn't even run the game in DX11). It very nearly felt like a generational leap.

Really though, I just want 60fps to become the standard for shooters like it is for fighting and racing games.
 
I don't see devs wasting bandwidth on 4x, let alone 8x MSAA. There are better uses for the resources.

I would guess we will see 1080p with no AA, 2xAA, FXAA, etc.

[benchmark chart: framerate at 1920x1080 with increasing levels of MSAA]


You can see how performance goes down with more MSAA. That 660 Ti has a 192-bit GDDR5 bus, around 144GB/s.

1280x1080 with 2xMSAA and FXAA should provide incredible IQ by console standards as well. That would look really, really nice to 99% of people.
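
To put rough numbers on why 4x/8x MSAA is such a bandwidth hog: every extra sample multiplies the color and depth storage the ROPs have to read and write. A naive sketch, ignoring the color/Z compression real GPUs apply:

```python
def framebuffer_mb(width, height, samples, bytes_per_sample=8):
    # 4 bytes color + 4 bytes depth/stencil per sample, no compression
    return width * height * samples * bytes_per_sample / 2 ** 20

for s in (1, 2, 4, 8):
    print(f"1080p {s}xMSAA: ~{framebuffer_mb(1920, 1080, s):.0f} MB of buffer")
```

The ~16 MB of buffer at 1x balloons to ~127 MB at 8x, and that memory gets touched many times per frame, which is exactly the traffic the chart above shows eating the 660 Ti's 192-bit bus.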
 
You can save yourself some disappointment and expect 1080p, 30fps, minimal AA as a baseline for most games.

This is exactly what I'm expecting. 1080p/30 (hopefully solid and with no tearing) combined with something like 2x SMAA will be a huge jump from this gen and not be too taxing on these machines. Keep in mind we'll also have tessellation and real AF support on top of that, and those will improve visuals a lot without being an enormous power hit.
 
This is exactly what I'm expecting. 1080p/30 (hopefully solid and with no tearing) combined with something like 2x SMAA will be a huge jump from this gen and not be too taxing on these machines. Keep in mind we'll also have tessellation and real AF support on top of that, and those will improve visuals a lot without being an enormous power hit.

I could live with 1080p/30... but please god let GT6 run at 60fps
 
I have no issues with 720p, just saying. If devs want to give me an amazing-looking game at 720 on D+O, I'll be okay with it. If they can push it to 1080, that's also great.

30fps locked, of course.
 
It does to me. I want nothing to do with PC gaming. I care about image quality, but I'm not willing to deal with the hurdles necessary to get it on a PC. Not to mention the large number of games that simply never get a PC release.

And what about the large number of games that never get a console release?

Setting up a PC is as easy as building Legos. Really, consoles nowadays require just as much patching and setting up the image correctly as a PC.

Open Steam, download game, press play, set up graphics to your liking, have fun.

Bullshit.

I care about image quality more than anyone else on this forum, and I will never, ever, ever game on a PC. On my 50-inch Pioneer Kuro KRP-500M (ISF-Day activated, D-Nice-calibrated), there is a HUGE visual difference between content that is native 1080p and 720p.

720p is blurry, smeary shit compared to 1:1 pixel-mapped 1080p.

Yeah, if you really cared so much about IQ, you would.
 
This is exactly what I'm expecting. 1080p/30 (hopefully solid and with no tearing) combined with something like 2x SMAA will be a huge jump from this gen and not be too taxing on these machines. Keep in mind we'll also have tessellation and real AF support on top of that, and those will improve visuals a lot without being an enormous power hit.
I wonder if Sony can use the additional CUs for MLAA or any kind of post-processing AA, just like the SPUs did on PS3. That would save some resources in the GPU. The CUs address the same address space as the GPU, so maybe it's possible.
 
I could live with 1080p/30... but please god let GT6 run at 60fps

Oh, I'm sure stuff like racing and fighting games will still prioritize 60fps. At least in Orbis' case, the bandwidth will make hitting 60fps easier, although I doubt it'll be much more common than it was this gen.

I wonder if Sony can use the additional CUs for MLAA or any kind of post-processing AA, just like the SPUs did on PS3. That would save some resources in the GPU. The CUs address the same address space as the GPU, so maybe it's possible.

It's possible. I could see them going with SMAA, since it shares a lot of the same principles as MLAA, just with a lighter power hit, and it works better. It would also be nice if they made it a standard part of their toolset, like they did with MLAA. It would be much easier to implement well than MLAA was, since that was so heavily based on the Cell's SPUs.
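
For anyone wondering what "post-process AA on spare CUs" actually involves: the first stage of MLAA/SMAA-style filters is just scanning the finished frame for luma edges, which maps naturally onto compute. A toy CPU sketch of that stage (illustration only, obviously not Sony's implementation):

```python
import numpy as np

REC709 = np.array([0.2126, 0.7152, 0.0722])  # standard luma weights

def edge_mask(rgb, threshold=0.1):
    """Stage one of an MLAA/SMAA-style pass: flag pixels whose luma
    differs sharply from the neighbour above or to the left."""
    luma = rgb @ REC709
    dx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
    dy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
    return (dx > threshold) | (dy > threshold)
```

The flagged edges then get classified into shapes (L/Z/U patterns) and blended along their length, which is the expensive part PS3's MLAA farmed out to the SPUs.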
 
Oh, I'm sure stuff like racing and fighting games will still prioritize 60fps. At least in Orbis' case, the bandwidth will make hitting 60fps easier, although I doubt it'll be much more common than it was this gen.

We might see a "60fps honeymoon" early on for games that are based on current-gen engines. Apart from that, I agree.
 
We might see a "60fps honeymoon" early on for games that are based on current-gen engines. Apart from that, I agree.

I agree with you. If there are a lot of cross-gen games, I could see them being 1080p/60 on next-gen systems with a couple of extra effects and higher-res textures.
 
lol, 720p isn't garbage. Most people sit far enough away that they can't tell the difference between 1080p and 720p.

Sure, 1080p is better, but it still won't be a forced standard on these consoles. A good gaming PC is still needed for smooth 1080p in new games.

And you're dreaming if you think 60fps will be standard. Not enough people buying games give a shit (and those who do will get the PC version). 30fps will be used for the vast majority of next-gen console games. Bet on it.

For a long time I was of the opinion that 1080p wasn't that much better than 720p and that I wouldn't mind it if next-gen games stayed at that resolution, but after buying myself a monster PC and playing maxed-out games in 1080p, my opinion has changed. I want every next-gen game to be in native 1080p. All of them. It should be mandatory, even.

Bring on the next-gen.
 
They just love taking things from GAF

I have no issues with 720p, just saying. If devs want to give me an amazing-looking game at 720 on D+O, I'll be okay with it. If they can push it to 1080, that's also great.

30fps locked, of course.
Say no to 30fps! Say no!
 
Alright, so these are the specs people are comparing?

Wii U

CPU:

  • Broadway based
  • 3 CPU cores running at 1.2GHz
  • 3MB aggregate L2 Cache size.

GPU:
  • 320 stream processors
  • 16 texture mapping units
  • 8 ROPs
  • 550 MHz
  • 352 estimated GFLOPS

Storage and Memory:

  • 2GB DDR3 RAM
  • 1GB DDR3 RAM available for games
  • 32MB onboard eDRAM
  • 50 GB discs
  • 8GB/32GB internal flash based memory storage

Durango

CPU:

  • x64 Architecture
  • 8 CPU cores running at 1.6 gigahertz (GHz)
  • each CPU thread has its own 32 KB L1 instruction cache and 32 KB L1 data cache
  • each module of four CPU cores has a 2 MB L2 cache resulting in a total of 4 MB of L2 cache
  • each core has one fully independent hardware thread with no shared execution resources
  • each hardware thread can issue two instructions per clock

GPU:
  • Custom D3D11.1 class 800-MHz graphics processor
  • 12 shader cores providing a total of 768 threads
  • Each thread can perform one scalar multiplication and addition operation (MADD) per clock cycle
  • At peak performance, the GPU can effectively issue 1.2 trillion floating-point operations per second

Storage and Memory:
  • 8 gigabytes (GB) of DDR3 RAM (68 GB/s)
  • 5-6 GB available for games (estimate)
  • 32 MB of fast embedded SRAM (ESRAM) (102 GB/s)
  • Hard drive
  • 50GB Discs

Orbis

CPU:
  • Orbis contains eight Jaguar cores at 1.6 GHz, arranged as two “clusters”
  • Each cluster contains 4 cores and a shared 2MB L2 cache
  • 256-bit SIMD operations, 128-bit SIMD ALU
  • SSE up to SSE4, as well as Advanced Vector Extensions (AVX)
  • One hardware thread per core
  • Decodes, executes and retires at up to two instructions/cycle
  • Out of order execution
  • Per-core dedicated L1-I and L1-D cache (32 KB each)
  • Two pipes per core yield 12.8 GFLOPS of performance
  • 102.4 GFLOPS for the system

GPU:

  • GPU is based on AMD’s “R10XX” (Southern Islands) architecture
  • DirectX 11.1+ feature set
  • Liverpool is an enhanced version of the architecture
  • 18 Compute Units (CUs)
  • Hardware balanced at 14 CUs
  • Shared 512 KB of read/write L2 cache
  • 800 MHz
  • 1.843 TFLOPS, 922 GigaOps/s
  • Dual shader engines
  • 18 texture units
  • 8 Render backends

Memory:
  • 4 GB unified system memory, 176 GB/s
  • 3.5 GB available to games (estimate)
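
Worth noting that the headline FLOPS figures in all three lists fall straight out of ALU count × clock × 2, since each ALU can retire one multiply-add per cycle:

```python
def peak_gflops(alus, clock_ghz, flops_per_clock=2):
    # 2 FLOPs per clock = one fused multiply-add (MADD) per ALU
    return alus * flops_per_clock * clock_ghz

print(peak_gflops(320, 0.550))      # Wii U: 320 SPs -> 352 GFLOPS
print(peak_gflops(12 * 64, 0.800))  # Durango: 12 CUs x 64 ALUs -> 1228.8
print(peak_gflops(18 * 64, 0.800))  # Orbis: 18 CUs x 64 ALUs -> 1843.2
```

So the ~1.23 vs ~1.84 TFLOPS gap people keep quoting is nothing exotic; it's six extra CUs at the same clock.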
 
Agreed. Orbis' specs (bandwidth, ROPs) seem designed around targeting native 1080p, and there are a lot of rumors about Durango doing some impressive and interesting multilayered scaling to 1080p. They want that number, if only for marketing purposes. Will there be sub-1080p games? Absolutely, but all the way down to 720p is unlikely.

Prediction:

The beginning:

1080p 30fps
Dynamic 1080p (1280-1920) 30/60fps

The mid life:

Dynamic 1080p 30/60fps

The end:

Dynamic 1080p 30/60fps
720p 60fps
720p 30fps
 
But then again, the guy trying to sell the Durango dev kits ended up saying that the specs released were old and that the final Durango specs are comparable to Orbis.

Did this also turn out to be bullshit or are we just still waiting?

According to Digital Foundry, these specs we have now are pretty much final.


Digital Foundry said:
In the case of the brush-strokes of the Durango and Orbis specs, not only do we have double-sourced information of our own, but we also have an extra form of backup in the form of these other leaks. Therefore, our belief is that the specs we are discussing are not only accurate, but very, very close - if not identical - to the make-up of the final hardware.

Holding out hope for noteworthy changes is wishful thinking at this point.
 
They could have all been HD. But the key here is being HD while matching the 360 version's on-screen graphics with comparable performance.

Not sure if you're being sarcastic here, but BioShock, Guitar Hero, Dark Sector, Homecoming, etc. are not such graphical beasts that they couldn't be rendered in HD on PS3.
 
And what about the large number of games that never get a console release?

I rarely have any interest in games that end up being PC exclusives. If it involves m/kb, I'm out.

Setting up a PC is as easy as building Legos. Really, consoles nowadays require just as much patching and setting up the image correctly as a PC.

Open Steam, download game, press play, set up graphics to your liking, have fun.

One visit to any PC thread on the day of release tells me that is not true at all. Dudes asking why they're having this problem, that problem. What drivers are you running? What GPU do you have? Nvidia runs this game better, etc, etc, etc. I see it all the time.
 
Prediction:

The beginning:

1080p 30fps
Dynamic 1080p (1280-1920) 30/60fps

The mid life:

Dynamic 1080p 30/60fps

The end:

Dynamic 1080p 30/60fps
720p 60fps
720p 30fps


Beginning - 1080p 60fps [devs sticking with current-gen asset detail to lower costs while the install base is low]

Mid-life - 1080p 60-30fps

End - 1080p 30fps - 720p 60fps
 
Can someone in the know confirm that 16X AF is effectively "free" on these consoles and that it will be a standard feature in games?

Just that alone would make games look so much better IMO.
 
Can someone in the know confirm that 16X AF is effectively "free" on these consoles and that it will be a standard feature in games?

Just that alone would make games look so much better IMO.

Just going by TUs, it should be free across the board on Orbis, and 8x should be free on Durango; maybe some games could push it to 16x.
 
So just a question on the RAM thing...

The Orbis has faster RAM, and the Durango has more RAM, right?
When people ask, "well, what if a game makes use of 6GB of RAM?", the reply is usually something like, "games like Crysis run at 1080p/60fps on PC with only 2-3GB of RAM, so more RAM wouldn't be a big deal". But aren't those PC games running at such high IQ also running on only DDR3 RAM? I know my PC has 4GB of RAM, but it's only DDR3, and I have no problem running the latest games at 1080p, so why should DDR3 on the Durango matter?
 
So just a question on the RAM thing...

The Orbis has faster RAM, and the Durango has more RAM, right?
When people ask, "well, what if a game makes use of 6GB of RAM?", the reply is usually something like, "games like Crysis run at 1080p/60fps on PC with only 2-3GB of RAM, so more RAM wouldn't be a big deal". But aren't those PC games running at such high IQ also running on only DDR3 RAM? I know my PC has 4GB of RAM, but it's only DDR3, and I have no problem running the latest games at 1080p, so why should DDR3 on the Durango matter?

The latest graphics cards utilize GDDR5 RAM.

It's the cutting edge. That's why it matters in a nutshell. Can you get by without it? Yes, but faster is better in this case.
 
So just a question on the RAM thing...

The Orbis has faster RAM, and the Durango has more RAM, right?
When people ask, "well, what if a game makes use of 6GB of RAM?", the reply is usually something like, "games like Crysis run at 1080p/60fps on PC with only 2-3GB of RAM, so more RAM wouldn't be a big deal". But aren't those PC games running at such high IQ also running on only DDR3 RAM? I know my PC has 4GB of RAM, but it's only DDR3, and I have no problem running the latest games at 1080p, so why should DDR3 on the Durango matter?
GPUs need a lot more bandwidth than other components do. A modern PC GPU has its own GDDR5 that it uses for all graphics-related functions. Durango's setup would leave it with less bandwidth for GPU functions than Orbis has. I'm not really sure how Orbis' extra RAM bandwidth would affect other components, since GDDR5 isn't normally available as a primary, all-purpose RAM.
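
One way to make those bandwidth figures concrete is per frame at 60fps. Naive peak math, ignoring caches, bus contention, and compression:

```python
def mb_per_frame(gb_per_s, fps=60):
    """Peak megabytes the bus can move in one frame."""
    return gb_per_s * 1024 / fps

print(mb_per_frame(68))   # Durango DDR3:  ~1161 MB/frame, shared by CPU and GPU
print(mb_per_frame(102))  # Durango ESRAM: ~1741 MB/frame, but only a 32 MB pool
print(mb_per_frame(176))  # Orbis GDDR5:   ~3004 MB/frame, one unified pool
```

Durango's ESRAM is fast, but render targets have to be juggled in and out of that 32 MB; Orbis just has one big pool at full speed.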
 
The latest graphics cards utilize GDDR5 RAM.

It's the cutting edge. That's why it matters in a nutshell. Can you get by without it? Yes, but faster is better in this case.

I see. I had heard that the GPUs in these new consoles were kind of midrange, so I didn't think GDDR5 vs DDR3 would matter that much, but I guess I was wrong.
 
Pixar movies are rendered at something like 8000p or even more... then downsampled to 1080p (or whatever they store their master copy in, probably 4K), then to whatever res you watch it in.
That's an insane number of pixels to sample from... more AA than you'll ever get on a PC.

It's ridiculous to compare that to native 720p.

FUD indeed... in your post.
It's FUD to say 720p upscaled to 1080p is inherently bad.

I didn't state that PS4 is capable of Pixar IQ. A Pixar Blu-ray at 720p blows away any game rendered natively at 1080p.

I merely proposed that resolution alone is not an indicator of IQ, which is not FUD. With enough horsepower dedicated to AA, AF, etc., a 720p title can look cleaner and better than 1080p. That is not FUD; it's a fact.

This "720p = bad" shit needs to go away; the quality of the source material is the most important facet.
 