[Eurogamer\DF] Orbis Unmasked: what to expect from the next-gen PlayStation.

Yes, but it would perform at least as well with fewer cores at higher clock speeds, because performance scales linearly with clock speed, while you can't get more than a 100% performance increase from doubling the number of cores.
Or in short: Given the same architecture, 4 cores @ 3.2 GHz will always be at least a little bit better than 8 cores @ 1.6 GHz.
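As a rough illustration of that reasoning (a hypothetical Amdahl's-law calculation with made-up workloads, not a benchmark): if p is the fraction of per-frame work that parallelises across cores, the two configurations only tie at p = 1.0, and anything less favours the higher-clocked quad.

```python
# Hypothetical Amdahl's-law comparison; numbers are illustrative, not measured.
def frame_time(cores, clock_ghz, p, ref_clock=3.2):
    """Relative frame time vs. a single core at ref_clock; p = parallel fraction."""
    serial_scale = ref_clock / clock_ghz          # a slower clock slows every part of the work
    return serial_scale * ((1 - p) + p / cores)

for p in (1.0, 0.9, 0.7):
    t4 = frame_time(cores=4, clock_ghz=3.2, p=p)
    t8 = frame_time(cores=8, clock_ghz=1.6, p=p)
    print(f"p={p}: 4 cores @ 3.2 GHz -> {t4:.3f}, 8 cores @ 1.6 GHz -> {t8:.3f}")
# p=1.0 is a dead heat (0.250 each); at p=0.9 it's 0.325 vs 0.425 in the quad's favour.
```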

You sure?
 
http://oxcgn.com/2013/01/20/ps4-and-xbox-720-unveiled-dev-spills-the-beans-on-reddit/

Pretty much just sums up all the rumors we've heard so far, but this part is interesting:

This would allow Microsoft to have Kinect 2.0, and the PS4 to truly be a workhorse. It should also be pointed out that Sony is investing a lot in the system’s GPU compared to that of Microsoft.
In descending order, the most powerful system would be the PS4 with the Xbox 720 in the middle and the Wii U at the bottom.

It's also posted on an Xbox-dedicated website.
 
So..?

In console gaming land, you'll never go beyond a maximum of 60fps.

The fact that these devs could deliver 100fps at 3.4 GHz with four cores is irrelevant.

It's a win-win situation for PC gamers this time round: efficient, multi-threaded console games developed for low-clocked CPUs will deliver higher frame rates in the PC versions running on more capable CPUs.

Yup, and the real victory is that because the CPU and GPU are so similar to PC hardware, practically every single third-party game will be ported to PC, and they'll all run at a higher framerate and resolution than the consoles while being more efficient than previous PC ports.
 

Compressed YouTube vids (where you can't really tell if the graphics settings match the consoles), with terrible performance due to Fraps, and the uploader stating "the game runs fine" while 90% of the comments say they get worse results, are not proof.

I'd like someone like Digital Foundry to test this stuff. If they find out that you indeed get similar results, then I'm convinced that it's true.
 
Compressed YouTube vids (where you can't really tell if the graphics settings match the consoles), with terrible performance due to Fraps, and the uploader stating "the game runs fine" while 90% of the comments say they get worse results, are not proof.

I see. Would you at least admit to the possibility that you are wrong, based on the data we do have at hand?
 
Are we really getting excited about anisotropic filtering (welcome to 2003?) and blurry pseudo post-process AA now (which at least half of GAF despises to no end)?

Jesus Christ. Lowered expectations doesn't even begin to cover it anymore.
In a console setting AF is very rare and, if present, it's a low amount.

SMAA isn't blurry; you're thinking of FXAA, and I hate that too. I'm also thinking Sony might come up with an SMAA solution similar to their excellent MLAA solution on PS3, since they use the same principles.

All I was saying in that post is that we'll be able to get huge improvements to IQ without taking away power from other improvements.
 
Can someone explain to me the concept of Flops and what they indicate?

In a GPU context, in very simplistic terms, they indicate how much stuff the developer can directly do to pixels before they reach the user's screen. Every pixel on your screen is the result of lots of operations. A large part of those operations are done by shader units on the GPU, running software written by the developer. 'Flops' is a single number summary of how much those shader units can do together.

If anyone wants to argue that 'flops' don't matter, they should make their argument more subtle than just that. Computational ability does matter. If that rumour about UE4 cutting its lighting model down because the consoles don't have the grunt to handle it is true, it's a perfect example of why 'flops' do matter.
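To put a rough number on where the headline figure comes from (a back-of-the-envelope sketch; the 18 CU / 800 MHz values are the rumoured Orbis GPU specs implied by the 1.84 TF figure quoted later in the thread, not confirmed): peak FLOPS is just ALU count x 2 ops per clock (a fused multiply-add counts as two) x clock speed.

```python
# Back-of-the-envelope peak-FLOPS estimate (rumoured Orbis GPU figures, illustrative only).
compute_units = 18     # rumoured CU count
alus_per_cu   = 64     # GCN-style: 64 ALUs per compute unit
clock_ghz     = 0.8    # rumoured 800 MHz
ops_per_clock = 2      # one fused multiply-add = 2 floating-point ops

peak_gflops = compute_units * alus_per_cu * ops_per_clock * clock_ghz
print(f"peak: {peak_gflops:.1f} GFLOPS")   # 1843.2 GFLOPS, i.e. the 1.84 TF headline figure
```

A bigger number simply means more of that per-pixel shader work is available every frame, which is the sense in which 'flops' matter.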
 
Maybe some people with a lot of insight into how technology works can answer this:

I am curious whether it will be easier or harder for developers to create games that work on both PS4 and Vita, and whether we will have cross-buy between PS4 and Vita.
 
What I'm really glad about right now is that Orbis and Durango will have AMD GPUs. It should make games run better on my AMD 7870 card, instead of many games having performance issues with AMD cards. The way it's meant to be played? More like the way of dirty business tactics D:


Tell me that devs will stop neglecting (or in some cases actively hampering, it's whispered) my rig when they port from those AMD boxes. Please.

I'm pretty okay with the rumoured machines after dwelling on them, after initial wishes for moar power.
 
I think what is being said is that peak teraflop measurements are not the best way to compare two things. For example, the A10-5800K is rated at around 700 gigaflops, a combination of the CPU and GPU in the APU. It's misleading because it's the GPU in the APU that does the grunt work, and in reality you only get to about a third of those flops.

In addition, CPU bottlenecks are less likely these days.
 
I'm hoping that the design of the HW looks similar to the old chrome phat.
Although as much as I hated pics of the super slim, I think it looks fantastic when screwed onto the silver vertical stand.
Also why do people think that Sony will have more powerful HW than MS? It just doesn't make sense.
 
Also why do people think that Sony will have more powerful HW than MS? It just doesn't make sense.

*looks at article* based entirely on just this article...
Why?

Any console maker can release a console as weak or as powerful as they'd like, depending on their internal targets, profitability forecasts, size, intended distribution network, and many other things that may be in or out of the box. The entire focus of the system is what dictates what it's like. Past precedent means little in this case.
 
Also why do people think that Sony will have more powerful HW than MS? It just doesn't make sense.
Because, reasons. Go look at the specs in the article.

Microsoft is a software company, so they will focus more on software; Sony is a hardware company, so they will focus more on the hardware. That was evident with the Cell: it was a beast hardware-wise, but they didn't take into account how difficult it was to code for until it was too late.
 
Why are people so fixated on the CPU? It wasn't a bottleneck this generation, and it won't be a bottleneck next gen either. Developers have designed engines that are pretty indicative of which hardware pieces offer the most substantial performance increase.
 
Isn't this supposed GPU-like compute module there to help out the 8 Jaguar cores? If so, won't that be a pretty decent CPU?

For a console with a limited TDP ceiling, yes they're OK (they're obviously not completely vanilla either). But for comparison the Jaguar is a tablet/netbook/etc-oriented CPU whereas Steamroller is their desktop computer chip.

I'm sure there are benches that put Bobcat vs Bulldozer out on the 'net, but if you can't find those try doing Intel Atom vs Intel i5 (or i7). Same deal. Jaguar is supposedly potent for its size (3.1mm^2 per core) but it's a completely different class of CPU than what most people here with gaming rigs are running.

thuway said:
Why are people so fixated on the CPU? It wasn't a bottleneck this generation, and it won't be a bottleneck next gen either.

Uh. Well I'm sure there are plenty of documented examples where this gen's console CPUs have been bottlenecked in some fashion lol. Thing is, the PPE was great at vectorized code this generation, while sorely lacking for general purpose code (what was it - something like 0.4IPC? I don't recall at the moment, but it wasn't good). The vanilla Jaguar is in the exact opposite situation. That's where things like compute units on the GPU (or otherwise) help out, however resources are always limited on a console so you have to decide where you're going to dedicate them.

I'd say a lot of the time developers choose to make a pretty picture before they choose other things.
 
Isn't this supposed GPU-like compute module there to help out the 8 Jaguar cores? If so, won't that be a pretty decent CPU?

I posted about that last night. If I'm reading the DF article right, the GPU section of the APU is 1.84 TF and Durante estimates the 8-core Jaguar section adds up to 102 GF.

I have read on B3D that the custom CU block could add ~200 GF to the Jaguar figure above. I don't know how plausible any of this is, or if any of the above is wrong... Durante, could you shed some light?
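For what it's worth, here is one plausible way the ~102 GF figure could be derived (an assumption about the derivation, not a confirmed spec): Jaguar's two 128-bit FP pipes give each core 8 single-precision FLOPs per cycle.

```python
# Sketch of how the rumoured Jaguar CPU figure could be derived (assumed, not confirmed).
cores           = 8
clock_ghz       = 1.6
flops_per_cycle = 8   # two 128-bit FP pipes -> 8 single-precision FLOPs per core per cycle

cpu_gflops = cores * clock_ghz * flops_per_cycle
print(f"{cpu_gflops:.1f} GFLOPS")   # 102.4 GFLOPS, in line with the ~102 GF estimate
```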
 
Why are people so fixated on the CPU? It wasn't a bottleneck this generation, and it won't be a bottleneck next gen either. Developers have designed engines that are pretty indicative of which hardware pieces offer the most substantial performance increase.

The stigma of an entry-level CPU must be overcome!
Everyone should just nod and say 'they're both doing the same', accept that this signals a good thing and that the CPU is a standard affair, and then just move on.
 
This is a nice little example of how, in today's multi-threaded development environment, more cores at a lower speed is a significant advantage over fewer cores at a higher clock rate.

[image: DiRT 3 CPU scaling benchmark chart]


It's relevant because tomorrow's Jaguar CPU is at about the same level as yesterday's Phenom II, and the Radeon 5850 used in this test will be blown away by the PS4 GPU.

So, if programmed by decent devs, those 8 low-clocked Jaguar cores will provide all the grunt needed when combined with the 7970M-class GPU.

Of course, getting 8 cores to work nicely together will still be a pain, but the potential is there.
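As a minimal sketch of what "getting 8 cores to work nicely together" means in practice (a hypothetical workload, not real engine code): split per-frame work into independent jobs and fan them out across whatever cores are present.

```python
# Minimal job-parallelism sketch (hypothetical per-entity workload, not engine code).
from concurrent.futures import ProcessPoolExecutor
import os

def simulate_entity(entity_id: int) -> int:
    # stand-in for per-entity AI/physics/animation work
    return sum(i * i for i in range(10_000))

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(simulate_entity, range(512)))
    print(f"updated {len(results)} entities across {os.cpu_count()} worker processes")
```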

You use DiRT 3 as a benchmark for how much CPU power is needed for future (or even just current) games?
I happen to own a 3-core version of the Phenom II (it's the full uncut Phenom II with one core disabled, all the cache and all the silicon intact) clocked at 3.6 GHz.
While it's the best bang-for-buck CPU I ever owned, was amazinggggg value when I bought it in early 2009, and is to this day still more than good enough to handle console ports and most PC games at 60+ fps, it absolutely SHITS itself when trying to run PC-centric games that try to do more than just corridor shootbang/racing games.

Let's play a game: you guess the average framerate I get in Planetside 2 (remember, clocked at 3.6 GHz) during a medium-sized skirmish (50-70 people; it goes all the way up to several hundred people) near a bio lab with this CPU.
It's 6-15 fps.

To start you off with a hint, Natural Selection 2 drops into the low 20s framerate-wise late in a match, and SC2 on 4-player maps does too.

NS2's dynamic infestation, PS2's scope and size, etc. are just early glimpses of what could be done with a powerful general-purpose CPU.
Fact is, this low-end netbook CPU is going to hold back gameplay design.

It's funny, you always see the Orbis thread full of people anticipating Planetside 2 since it's been hinted at.
But I'm pretty sure they are anticipating this http://www.youtube.com/watch?v=BOLZT4WHJKQ and not the heavily cut-down version they'll probably get.
 
For those of you who want backwards compatibility, have you looked at this:

Backwards Compatibility Adaptor - an add-on that enables BC.

http://www.complex.com/video-games/...and-backwards-compatibility-add-on-discovered

I assume it is a Cell processor with some other doohickeys to emulate RSX. Your thoughts?

I brought this up a few times. To my understanding, and based on the architecture and rumoured power of the next PS's GPU, you would really only need the Cell in that box. The rest should "easily" be emulated by the next PS itself. I'm just not able to envision how they'd manage to overcome the bandwidth restrictions of the connection between the box and the console. Perhaps they have some proprietary high-speed cable and connection in mind.

But I'm pretty sure they are anticipating this http://www.youtube.com/watch?v=BOLZT4WHJKQ and not the heavily cut-down version they'll probably get.

That's still tiny compared to some of the battles I've experienced at "The Crown". As an aside, I've stopped playing PS2 though - I just found the time it takes to acquire a satisfying amount of points too slow, and those points do actually make a huge difference in a game/battle.
 
For those of you who want backwards compatibility, have you looked at this:

Backwards Compatibility Adaptor - an add-on that enables BC.

http://www.complex.com/video-games/...and-backwards-compatibility-add-on-discovered

I assume it is a Cell processor with some other doohickeys to emulate RSX. Your thoughts?

I've seen this article before, and it is encouraging, but is it possible to have a small attachment containing a Cell and whatnot while maintaining thermal viability?
 
I brought this up a few times. To my understanding, and based on the architecture and rumoured power of the next PS's GPU, you would really only need the Cell in that box. The rest should "easily" be emulated by the next PS itself

That would require a fast connection between the PS4 and the BC box. EDIT: you edited after I had already quoted.

I certainly think a BC box is the better way of doing it: make those who want it pay up, and those who don't can save the money. If they change their mind, they can buy the add-on.

I just don't know if it is viable for them. I mean, how many do you produce? How much does it cost to build a chunk of a PS3? How much will people pay for it?
 
A Cell on 28nm would draw less than 20W. I don't know what bandwidth would be needed for that type of adapter...

Except that a 28nm Cell doesn't currently exist in any documented form. Work on Cell improvement has, at least according to IBM, ceased.

I brought this up a few times. To my understanding, and based on the architecture and rumoured power of the next PS's GPU, you would really only need the Cell in that box. The rest should "easily" be emulated by the next PS itself. I'm just not able to envision how they'd manage to overcome the bandwidth restrictions of the connection between the box and the console. Perhaps they have some proprietary high-speed cable and connection in mind.

Well, to say that an nvidia adapter is "easily" emulatable by a modern ATI part would show a lack of understanding of emulation. Hell, a modern Nvidia part uses a completely different architecture and wouldn't have an "easy" time. It's not like you've got a massive iOS-like abstraction layer in play here.
 
Except that a 28nm Cell doesn't currently exist in any documented form. Work on Cell improvement has, at least according to IBM, ceased.



Well, to say that an nvidia adapter is "easily" emulatable by a modern ATI part would show a lack of understanding of emulation. Hell, a modern Nvidia part uses a completely different architecture and wouldn't have an "easy" time. It's not like you've got a massive iOS-like abstraction layer in play here.

That's specifically why I placed "easily" in inverted commas - because realistically it was going to be anything but that. Far from impossible, though.
 
You use DiRT 3 as a benchmark for how much CPU power is needed for future (or even just current) games?
I happen to own a 3-core version of the Phenom II (it's the full uncut Phenom II with one core disabled, all the cache and all the silicon intact) clocked at 3.6 GHz.
While it's the best bang-for-buck CPU I ever owned, was amazinggggg value when I bought it in early 2009, and is to this day still more than good enough to handle console ports and most PC games at 60+ fps, it absolutely SHITS itself when trying to run PC-centric games that try to do more than just corridor shootbang/racing games.

Let's play a game: you guess the average framerate I get in Planetside 2 (remember, clocked at 3.6 GHz) during a medium-sized skirmish (50-70 people; it goes all the way up to several hundred people) near a bio lab with this CPU.
It's 6-15 fps.

To start you off with a hint, Natural Selection 2 drops into the low 20s framerate-wise late in a match, and SC2 on 4-player maps does too.

NS2's dynamic infestation, PS2's scope and size, etc. are just early glimpses of what could be done with a powerful general-purpose CPU.
Fact is, this low-end netbook CPU is going to hold back gameplay design.

It's funny, you always see the Orbis thread full of people anticipating Planetside 2 since it's been hinted at.
But I'm pretty sure they are anticipating this http://www.youtube.com/watch?v=BOLZT4WHJKQ and not the heavily cut-down version they'll probably get.


Luckily for them, it doesn't work that way... and it's not luck either; it's the way they design consoles that allows them to maximize both what they get from the hardware and how they design the software.

This is why the best games for PS3/360 can look so good while running on 7-8 year old hardware with minimal amounts of RAM.

Using your reasoning, the 720 won't be able to run Halo 4 smoothly.
 
For those of you who want backwards compatibility, have you looked at this:

Backwards Compatibility Adaptor - an add-on that enables BC.

http://www.complex.com/video-games/...and-backwards-compatibility-add-on-discovered

I assume it is a Cell processor with some other doohickeys to emulate RSX. Your thoughts?

I personally don't care for hardware BC. All I want is my digital content to carry over from my PS3 to my PS4. I'm guessing this device won't help with that.
 
I'd quite like it if Sony offered two models of the PS4, one with built-in BC and one without.

That way they could hit a low entry-point price to market, and make people who wanted BC pay up.

I'd happily pay up for BC, providing it was flawless (obviously).
 
We also have hard data on Orbis's memory set-up. It features 4GB of GDDR5 - the ultra-fast RAM that typically ships with the latest PC graphics cards - with 512MB reserved for the operating system. This is in stark contrast to the much slower DDR3 that Durango will almost certainly ship with. Microsoft looks set to be using an offshoot of eDRAM technology connected to the graphics core to offset the bandwidth issues the use of DDR3 incurs. Volume of RAM is the key element in Durango's favour - there'll be 8GB in total, with a significant amount (two sources we've spoken to suggest 3GB in total) reserved for the OS.

Okay, so Orbis will have 1.5GB less RAM for gaming than Durango (3.5GB vs 5GB after those OS reservations), but Orbis has a way better type of RAM than Durango.

So... which one wins on the memory front?
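A rough way to frame the trade-off (the bus widths and per-pin data rates below are assumptions for illustration, not figures from the article): bandwidth is bus width x data rate per pin, and GDDR5's much higher per-pin rate is what makes the smaller pool "way better".

```python
# Rough bandwidth comparison (assumed bus widths and data rates, illustrative only).
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin   # bytes per transfer x transfers per second

gddr5 = bandwidth_gb_s(256, 5.5)    # e.g. a 256-bit GDDR5 bus at 5.5 Gbps -> 176 GB/s
ddr3  = bandwidth_gb_s(256, 2.133)  # e.g. a 256-bit DDR3-2133 bus -> ~68 GB/s
print(f"GDDR5 ~{gddr5:.0f} GB/s vs DDR3 ~{ddr3:.0f} GB/s")
```

The eDRAM mentioned in the article is there to offset that gap on Durango's side, so which one "wins" ends up depending on the workload.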
 
I personally don't care for hardware BC. All I want is my digital content to carry over from my PS3 to my PS4. I'm guessing this device won't help with that.

You are not going to get digital content from your PS3 working on your PS4 unless they manage to get ALL PS3 games running.

They are the same games, just distributed differently.
 
It'll be interesting to see how PSN will work, because not being able to sell PS3 games via PSN on PS4 is kinda stupid. Sure, you could port GAF's game of the year, Journey, but tons of stuff already on PSN would get lost in the shuffle. BC in the digital era might be more important than before.

Edit: Actually, it's better for third parties too.
 
I'd quite like it if Sony offered two models of the PS4, one with built-in BC and one without.

That way they could hit a low entry-point price to market, and make people who wanted BC pay up.

I'd happily pay up for BC, providing it was flawless (obviously).

Yeah, I'd be willing to pay $100-$150ish for flawless BC. It'd be worth it to maintain the digital library I've built up with PS+ and normal purchases. It'd be marketing suicide to give people all these games and then tell them they can't play them once their PS3s break. It'd be even worse for people who bought a decent chunk of full-price items off the PSN (me).
 
It isn't. We're talking about x86 CPUs and AMD cards here; there's no learning curve or familiarization period for devs. Most of them have been coding on x86 all their lives and know how to get the most out of it. It's reasonable to assume that the gap between early-gen and late-gen games will be smaller.

Improvement will come less from overcoming a learning curve and more from finding ways to maximize performance (graphical fidelity) with each iteration of games within the known boundaries.

By ND's own admission, they knew the PS3 to the bone while developing U2, which looked better than U1 (learning curve). But that did not stop U3 from looking even better than its predecessor. In the same way, Epic, who essentially outlined the 360's existing specs, would have known all about the hardware by the time they started making GeoW2, if not GeoW1. Despite this, they still managed to outdo its visuals with GeoW3. Lest we forget, the 360 is closer to PC architecture than the PS3, and it was so by design, to reduce the learning curve for developers.

It'll be no different this gen. The devs will learn to utilize resources better (with new things like tessellation) and come up with the ultimate rendition of what's possible on the next gen during its twilight years. If you have reason to doubt this, then you're welcome to wait and watch.
 