[Eurogamer/DF] Orbis Unmasked: what to expect from the next-gen PlayStation.

Why do you keep posting stuff like this?

I'm just joking around. I will have a PS4 and relish in its glory.

He's been saying the same kind of stuff earlier in the thread.

I was under the impression that Dreamcast to Xbox was actually a pretty big jump? I never had a Dreamcast, and the Xbox already seemed pretty impressive compared to the other two consoles.
Someone else in this thread told me this, and it sounded reasonable:

Dreamcast > Xbox

(just a guess)

Also I apologize, I didn't realize this thread was being taken so seriously.
 
While you've been enjoying 16xAF on PC...

There's never been excellent AF on any of these consoles and arcade boards:

Model 2
Model 3
Sega Dreamcast
PS2
Gamecube
Xbox
Xbox 360
Wii
Wii U
PS3
Naomi 2
Lindbergh
RingEdge

I know, right? It just hasn't been a priority for years...

Point is, AF isn't a bulletpoint these days, at all. It's like saying this hotel room has a toilet that actually flushes, for realsies guys, it's happening.

AF is such a baseline feature (I was always very annoyed that neither the PS3 nor the Xbox 360 bothered with it) that it comes at no performance hit on any GPU released in the past 6-7 years.

You know what WOULD have been a cool bulletpoint? That real-time light bouncing feature demoed in UE4, but apparently that is now rumored to be canned because the hardware in the PS4/Xbox 3 isn't up to it.
 
Point is, AF isn't a bulletpoint these days, it's like saying this hotel room has a toilet that flushes.

AF is such a baseline feature (I was always very annoyed that neither the PS3 nor the Xbox 360 bothered with it) that it comes at no performance hit on any GPU released in the past 6-7 years.

You know what WOULD have been a cool bulletpoint? That real-time light bouncing feature demoed in UE4, but apparently that is now rumored to be canned because the hardware in the PS4/Xbox 3 isn't up to it.

It's not a rumour.
 
Point is, AF isn't a bulletpoint these days, it's like saying this hotel room has a toilet that flushes.

AF is such a baseline feature (I was always very annoyed that neither the PS3 nor the Xbox 360 bothered with it) that it comes at no performance hit on any GPU released in the past 6-7 years.

You know what WOULD have been a cool bulletpoint? That real-time light bouncing feature demoed in UE4, but apparently that is now rumored to be canned because the hardware in the PS4/Xbox 3 isn't up to it.

That, and maybe the implementation just isn't ready.

UE3 improved a lot from its initial reveal.
 
While you've been enjoying 16xAF on PC...

There's never been excellent AF on any of these consoles and arcade boards:

Model 2
Model 3
Sega Dreamcast
PS2
Gamecube
Xbox
Xbox 360
Wii
Wii U
PS3
Naomi 2
Lindbergh
RingEdge

I know, right? It just hasn't been a priority for years...

Shouldn't be an issue now. The 7970M's texture fill rate is 68 gigatexels per second, compared to 13 on RSX and 8 on Xenos. It's even more than a GTX 580 (49.4).
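If you want to sanity-check those numbers yourself: peak texture fill rate is just texture units × core clock. A quick sketch (TMU counts and clocks taken from public spec sheets; treat them as approximate):

```python
# Peak texture fill rate = texture units (TMUs) x core clock.
# TMU counts and clocks are from public spec sheets (approximate).
gpus = {
    "7970M":   (80, 850e6),   # 80 TMUs @ 850 MHz
    "GTX 580": (64, 772e6),   # 64 TMUs @ 772 MHz
    "RSX":     (24, 550e6),   # PS3
    "Xenos":   (16, 500e6),   # Xbox 360
}

for name, (tmus, clock_hz) in gpus.items():
    gtexels = tmus * clock_hz / 1e9
    print(f"{name}: {gtexels:.1f} GTexels/s")
# 7970M: 68.0, GTX 580: 49.4, RSX: 13.2, Xenos: 8.0
```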
 
I said 100+ (an important distinction). As I said, despite some similarities (small CPU, bigger GPU), the Wii U still has a hardware deficiency, and a memory one too. A 60 or 100 watt difference is, in the world of physics, a pretty large difference in percentage terms.

While downports are not utterly impossible in many cases, like they often were with the Wii, they would still be a considerable challenge.

The first one being the ROI issue that is currently leading western publishers' bean counters not to give a shit (and pigeonholing the perceived audience to an extent).

I was joking.

Although, given the specs being bandied about, I don't think Witcher 2 PC vs. Witcher 2 360 is an unrealistic expectation for Wii U downports; the caveat being, as you said, a publisher's desire to sell to the Nintendo market.

There was speculation about Nintendo's architecture/design choices taking the next-gen into account and, at a glance, it appears that Durango and Wii U do share a design philosophy. The Orbis, on the other hand, seems like the outlier. It will be interesting to see the first game released on all three at the same time.
 
Point is, AF isn't a bulletpoint these days, at all. It's like saying this hotel room has a toilet that actually flushes, for realsies guys, it's happening.

AF is such a baseline feature (I was always very annoyed that neither the PS3 nor the Xbox 360 bothered with it) that it comes at no performance hit on any GPU released in the past 6-7 years.

You know what WOULD have been a cool bulletpoint? That real-time light bouncing feature demoed in UE4, but apparently that is now rumored to be canned because the hardware in the PS4/Xbox 3 isn't up to it.
Who wants to tell me what this is about?
 
I still expect $399 as a price ceiling. $499 simply isn't viable in the marketplace.

It would amuse me if one ended up starting at $349. I can't fathom how the Wii U could be sold at a loss beyond exchange rates, poor supplier negotiation and unusually favorable retailer margins. And I think such pricing coupled with $599 in the past has distorted expectations.

£250? That's madness. You do realise an iPad is £400, don't you?

BTW they have to start at a high price, as it leaves room for lots of price cuts to keep pumping life into the console. It would be ridiculous not to take advantage of all the early adopters who will pay anything for it. $499/£350-plus or bust.

Sony cuts the price within the first year anyway.
 
Point is, AF isn't a bulletpoint these days, at all. It's like saying this hotel room has a toilet that actually flushes, for realsies guys, it's happening.

AF is such a baseline feature (I was always very annoyed that neither the PS3 nor the Xbox 360 bothered with it) that it comes at no performance hit on any GPU released in the past 6-7 years.

You know what WOULD have been a cool bulletpoint? That real-time light bouncing feature demoed in UE4, but apparently that is now rumored to be canned because the hardware in the PS4/Xbox 3 isn't up to it.
What's the source on the last part?
 
From those leaked memos or docs, they have Bungie's next game locked as a timed exclusive. I'm positive that will be their big megaton with the 720 at E3.

No they don't.

The leaked contract just gives Bungie the option not to launch on PS3 if they don't have the resources to do so.

I'm pretty sure Activision is going to be giving them all the resources they need to make it a reality.
 
It starts with a more expensive controller and the fact that, despite the negativity, it has, overall, more juice than the 7th-gen HD consoles. A look at the prices, and the lack of further shrinkage and retailer-rebate-based cost reduction strategies for the current-gen HD consoles, should tell you something as well.
Even taking the controller into account, I still can't fathom how the costs could be so high. Full tablets will retail for less than $100.

Being better than, but still ultimately comparable to, 7-year-old consoles drives them into negative margins while charging $100+ more?
 
Will we see any gameplay footage (tech demos?) by E3, or is that unlikely?
Last month, Guerrilla Games mentioned that this year is possibly going to be their biggest since they released the first Killzone. So it's highly expected they will show something a la 2005. Make sure you have an extra pair of clean pants ready by then.
 
£250? That's madness. You do realise an iPad is £400, don't you?

BTW they have to start at a high price, as it leaves room for lots of price cuts to keep pumping life into the console. It would be ridiculous not to take advantage of all the early adopters who will pay anything for it. $499/£350-plus or bust.

Sony cuts the price within the first year anyway.

He said $400, not £250. You can't just do a direct currency conversion because that never, ever works out to be the case.

If the consoles are $400 then they'll also be €400, and then about £330-350. Direct currency conversion from the dollar would make it £252 and €300, which we know nobody adheres to.

That iPad you're talking about is £400 in the UK, but only $500 in the US, not the $634 a direct currency conversion would suggest.
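To put rough numbers on that (the ~1.58 USD/GBP rate is my approximation for the period; the 20% UK VAT is the only hard figure here):

```python
# Why direct currency conversion misleads: US sticker prices exclude sales
# tax, UK prices include 20% VAT. The exchange rate is an approximation
# for the period (assumption).
USD_PER_GBP = 1.58
UK_VAT = 0.20

us_price = 500.0                    # iPad, US sticker price in USD
direct = us_price / USD_PER_GBP     # ~GBP 316 -- the naive conversion
with_vat = direct * (1 + UK_VAT)    # ~GBP 380 -- once VAT is added
print(f"direct: £{direct:.0f}, with VAT: £{with_vat:.0f}, actual: £400")
# The remaining ~GBP 20 is regional pricing/hedging, which is why
# the UK's £400 never matches a "direct" $634.
```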


Also, your post seems to suggest (apologies if I'm misreading) that they start at a high price just so they can reduce it later to keep things fresh. They start at a high price because it costs them a lot to design, manufacture and ship the product, which they then typically sell at a loss at launch. They do reduce the price to keep it ticking over, but only because the cost of manufacturing has dropped in the first place.

Compare that again to your iPad, which is sold at a handy profit for everyone involved.


My thoughts on the subject are that it'll be $400/€400/£330-350 for the lowest SKU, with a more expensive one adding some pack-ins and a larger HDD.

What I do NOT want to see is the lowest SKU being a gimped version of the console. No HDD is fine (on the logical assumption that I can put my own in), but not fewer ports or anything like we saw with the PS3 launch.
 
Even taking the controller into account, I still can't fathom how the costs could be so high. Full tablets will retail for less than $100.

Being better than, but still ultimately comparable to, 7-year-old consoles drives them into negative margins while charging $100+ more?

Yes.

Actually, he's correct. You don't judge these things from the task manager.

I wasn't.

It's much more than that: 40W on a 45nm process equals much less wattage on a 28nm process.

I thought the Wii U GPU was 40nm? Regardless...
 
Even taking the controller into account, I still can't fathom how the costs could be so high. Full tablets will retail for less than $100.

Being better than, but still ultimately comparable to, 7-year-old consoles drives them into negative margins while charging $100+ more?

Somebody estimated that the BOM was kinda off, and the Wii U is more expensive than it seems.
But I agree with you. No idea what makes the Wii U so expensive. I bet it goes down to $199 as soon as the next two hit retail.
 
When Sony showed off the Vita, we got a glimpse of the lineup at the PlayStation Meeting.

Not a full-blown reveal of anything, just a taste -- they decided to show off Uncharted a bit, without going into too many details.

I expect the same thing to happen. At the PS4 event they will show a brief trailer covering a collection of games, along with a nice showcase of a standout title that shows off the hardware.

Also -- is 2013 pretty much a lock now for the PS4? Going with Jaguar cores should allow them to avoid any issues launching this year, unless I'm missing something.
 
When Sony showed off the Vita, we got a glimpse of the lineup at the PlayStation Meeting.

Not a full-blown reveal of anything, just a taste -- they decided to show off Uncharted a bit, without going into too many details.

I expect the same thing to happen. At the PS4 event they will show a brief trailer covering a collection of games, along with a nice showcase of a standout title that shows off the hardware.

Also -- is 2013 pretty much a lock now for the PS4? Going with Jaguar cores should allow them to avoid any issues launching this year, unless I'm missing something.

They also bullshotted the hell out of Uncharted Vita :P

Looking forward to the target renders, EA Sports bullshots™ and all the 'one million troops' overselling of their laptop hardware.
 
I love this console optimization truther bullshit making the rounds on GAF atm. Try running Crysis 3 on an X1800XT at any setting and see if you get even close to 20 fps.
 
Point is, AF isn't a bulletpoint these days, at all. It's like saying this hotel room has a toilet that actually flushes, for realsies guys, it's happening.

AF is such a baseline feature (I was always very annoyed that neither the PS3 nor the Xbox 360 bothered with it) that it comes at no performance hit on any GPU released in the past 6-7 years.

You know what WOULD have been a cool bulletpoint? That real-time light bouncing feature demoed in UE4, but apparently that is now rumored to be canned because the hardware in the PS4/Xbox 3 isn't up to it.

I find it hard to believe that Epic would just "cancel" a feature and throw away all the time and money they put into it. You got a link?
 
Somebody estimated that the BOM was kinda off, and the Wii U is more expensive than it seems.
But I agree with you. No idea what makes the Wii U so expensive. I bet it goes down to $199 as soon as the next two hit retail.

I suspect there's a lot of creative accounting, with as many costs as possible put into the Wii U calculation, to create the PR "truth" that they are selling it at a loss and not ripping off consumers like they did with the Wii and 3DS.
 
They also bullshotted the hell out of Uncharted Vita :P

Looking forward to the target renders, EA Sports bullshots™ and all the 'one million troops' overselling of their laptop hardware.

Such is the case with pretty much all console launches, though.

I love this console optimization truther bullshit making the rounds on GAF atm. Try running Crysis 3 on an X1800XT at any setting and see if you get even close to 20 fps.

The PC build of Crysis 3 isn't going to be the same build that runs on the Xbox 360 (thus rendering the comparison moot), though somebody did post Crysis 2 running with console-equivalent settings on an X1900 the other day, performing better than you'd think. Don't recall who or what.

As far as "creative accounting" or "lying" about the Wii U goes - they said at an investors meeting that they're taking a small loss per unit (which does not take R&D and other things into account), so that's not the case. Please don't try to open that reckless can of worms again. Especially in a DF thread.
 
Nope. If you read the posted DF rumour, what would suggest $599 to you?
Nothing. However, my (untrained) eye sees considerably more powerful and/or more costly tech than the current guesstimates for the Wii U, with suggestions of controller gimmicks in the DS3 successor and the inclusion of Kinect 2.0.

Which implies either a consumer-unfriendly price point or a more reasonable price with substantial subsidisation, to me. Or the Wii U isn't as costly as its price point suggests.
 
Looks like you don't know what closed hardware is.

It's certainly not just a PC with similar specs running PC versions of the games, like some posters here suggest.
Games made from the ground up for those consoles will look waaay better than what we currently have.
 
Can someone explain to me the concept of Flops and what they indicate?

Wikipedia explains it well enough:

"In computing, FLOPS (or flops, for FLoating-point Operations Per Second, also flop/s, see below) is a measure of computer performance, especially in fields of scientific calculations that make heavy use of floating-point calculations, similar to the older, simpler, instructions per second."

[image: Wikipedia's formula for FLOPS]


It simply measures the peak number of floating-point operations a computer can perform in the optimal case.
Of course, in the real world the optimal case will rarely be reached, and FLOPS don't tell you anything about other parts of a system, such as memory bandwidth or other computing tasks (integer, branches...). That's why they can easily be misinterpreted by the layman.
Still, they are a good indicator of shader performance when you compare different GPUs of the same or a similar architecture, for example.
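To make the "peak" part concrete, here's the back-of-the-envelope calculation for a couple of GPUs mentioned in this thread (the 7970M figures are public specs; the Orbis line uses the rumored 18-CU GCN configuration, so treat it as a rumor, not fact):

```python
# Peak single-precision FLOPS = shader ALUs x 2 (a fused multiply-add
# counts as two operations) x clock. The Orbis figures are rumored specs.
def peak_gflops(alus, clock_ghz, ops_per_cycle=2):
    return alus * ops_per_cycle * clock_ghz

print(peak_gflops(1280, 0.85))     # 7970M: 2176.0 GFLOPS
print(peak_gflops(18 * 64, 0.8))   # rumored Orbis GPU: 1843.2 GFLOPS
```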
 
This is a nice little example of how, in today's multi-threaded development environment, more cores at a lower speed are a significant advantage over fewer cores at a higher clock rate.

[image: Dirt 2 frame-rate benchmark across CPU core counts]


It's relevant because tomorrow's Jaguar CPU is at about the same level as yesterday's Phenom II, and the Radeon 5850 used in this test will be blown away by the PS4 GPU.

So, if programmed for by decent devs, those 8 low-clocked Jaguar cores will provide all the grunt needed when combined with the 7970M-class GPU.

Of course, getting 8 cores to work nicely together will still be a pain, but the potential is there.
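For a feel of what "getting 8 cores to work nicely together" looks like structurally, here's a toy sketch; the entity list and update function are invented for illustration, and in CPython the GIL means threads won't truly run this work in parallel, so take it as shape, not benchmark:

```python
# Toy sketch: fan per-frame work out across a pool of workers.
# Everything here is invented for illustration; in CPython the GIL
# prevents true CPU parallelism for pure-Python work like this.
from concurrent.futures import ThreadPoolExecutor

NUM_CORES = 8  # e.g. eight low-clocked Jaguar cores

def update_entity(entity):
    # Placeholder for per-entity physics/AI work.
    entity["x"] += entity["vx"]
    return entity

entities = [{"x": float(i), "vx": 0.5} for i in range(10_000)]

with ThreadPoolExecutor(max_workers=NUM_CORES) as pool:
    entities = list(pool.map(update_entity, entities))
```

The pain is in the carving: the chunks have to be big enough to amortise scheduling overhead, and independent enough not to fight over shared state.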
 
This is a nice little example of how, in today's multi-threaded development environment, more cores at a lower speed are a significant advantage over fewer cores at a higher clock rate.

[image: Dirt 2 frame-rate benchmark across CPU core counts]

It's relevant because tomorrow's Jaguar CPU is at about the same level as yesterday's Phenom II, and the Radeon 5850 used in this test will be blown away by the PS4 GPU.

So, if programmed for by decent devs, those 8 low-clocked Jaguar cores will provide all the grunt needed when combined with the 7970M-class GPU.

Of course, getting 8 cores to work nicely together will still be a pain, but the potential is there.

Console developers have been building their experience up this gen: the 360 has 6 hardware threads, the PS3 has its 7 SPEs...

That article is a little old now and things have moved on. The best thing about next gen and PCs is that more cores are the future, so I suspect consoles won't be hindered at all, and PCs will benefit massively...
 
This is a nice little example of how, in today's multi-threaded development environment, more cores at a lower speed are a significant advantage over fewer cores at a higher clock rate.

[image: Dirt 2 frame-rate benchmark across CPU core counts]

It's relevant because tomorrow's Jaguar CPU is at about the same level as yesterday's Phenom II, and the Radeon 5850 used in this test will be blown away by the PS4 GPU.

So, if programmed for by decent devs, those 8 low-clocked Jaguar cores will provide all the grunt needed when combined with the 7970M-class GPU.

Of course, getting 8 cores to work nicely together will still be a pain, but the potential is there.

I hadn't really thought about that, but yeah, I guess AMD's aiming for these low-power parts to reach K8 levels and beyond. It was around 90% or thereabouts anyway... I think.

So those comparisons to the Core Duo are definitely moot.

I wonder if we'll get 2GHz cores in the end. Seems like a nice round number. ;)
 
Wikipedia explains it well enough:

"In computing, FLOPS (or flops, for FLoating-point Operations Per Second, also flop/s, see below) is a measure of computer performance, especially in fields of scientific calculations that make heavy use of floating-point calculations, similar to the older, simpler, instructions per second."

[image: Wikipedia's formula for FLOPS]

It simply measures the peak number of floating-point operations a computer can perform in the optimal case.
Of course, in the real world the optimal case will rarely be reached, and FLOPS don't tell you anything about other parts of a system, such as memory bandwidth or other computing tasks (integer, branches...). That's why they can easily be misinterpreted by the layman.
Still, they are a good indicator of shader performance when you compare different GPUs of the same or a similar architecture, for example.
Thank you. I figured I could google it, but I knew there would be some details better explained by people here.
 
Thank you. I figured I could google it, but I knew there would be some details better explained by people here.

Note those are kinda like a ceiling. You can't go beyond them; it's about getting as close to them as possible.

Also, for APUs, they add the GPU and CPU figures together for the total.
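So for an APU like the rumored Orbis chip, the headline number would be CPU peak plus GPU peak (all figures below are the commonly rumored specs, not confirmed):

```python
# Headline APU FLOPS = CPU peak + GPU peak. All figures are the commonly
# rumored Orbis specs, not confirmed numbers.
cpu_gflops = 8 * 1.6 * 8         # 8 Jaguar cores x 1.6 GHz x 8 SP FLOPs/cycle
gpu_gflops = 18 * 64 * 2 * 0.8   # 18 CUs x 64 ALUs x 2 (FMA) x 0.8 GHz
print(cpu_gflops, gpu_gflops, cpu_gflops + gpu_gflops)
# -> roughly 102.4, 1843.2, 1945.6 GFLOPS; the GPU utterly dominates the total
```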
 
This is a nice little example of how, in today's multi-threaded development environment, more cores at a lower speed are a significant advantage over fewer cores at a higher clock rate.

I'm not sure if I'm understanding you correctly, but just in case:
This diagram only shows that the workload in Dirt 2 is badly distributed across 2 cores. Apart from that, it either doesn't scale all that well with the number of cores or it quickly becomes GPU-limited.
 
I'm not sure if I'm understanding you correctly, but just in case:
This diagram only shows that the workload in Dirt 2 is badly distributed across 2 cores. Apart from that, it either doesn't scale all that well with the number of cores or it quickly becomes GPU-limited.

Presumably they [devs] were targeting 30fps for dual core and 60fps for quad core.

These CPUs aren't maxed out. And single- and triple-thread setups are less popular, I guess.
 
I'm not sure if I'm understanding you correctly, but just in case:
This diagram only shows that the workload in Dirt 2 is badly distributed across 2 cores. Apart from that, it either doesn't scale all that well with the number of cores or it quickly becomes GPU-limited.

No, it's showing that low-clocked CPUs with multiple cores are no bottleneck when programmed for properly.

The CPU used in this test normally runs at 3.4GHz, but they downclocked it to 2GHz and still get a minimum of more than 50fps...
 
No, it's showing that low-clocked CPUs with multiple cores are no bottleneck when programmed for properly.

Yes, but it would perform at least as well with fewer cores at higher clock speeds, because performance scales linearly with clock speed, and you can't get more than a 100% performance increase by doubling the number of cores.
Or in short: given the same architecture, 4 cores @ 3.2 GHz will always be at least a little bit better than 8 cores @ 1.6 GHz.
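A quick Amdahl's-law sketch of why that's true (the 90% parallel fraction is an arbitrary assumption for illustration):

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the parallel
# fraction of the work. Clock speed, by contrast, scales everything.
# p = 0.9 is an arbitrary assumption for illustration.
def relative_perf(cores, clock_ghz, p=0.9, base_clock_ghz=1.6):
    speedup = 1.0 / ((1.0 - p) + p / cores)
    return speedup * (clock_ghz / base_clock_ghz)

print(relative_perf(8, 1.6))   # 8 cores @ 1.6 GHz -> ~4.71x
print(relative_perf(4, 3.2))   # 4 cores @ 3.2 GHz -> ~6.15x
# Only with p = 1.0 (perfect scaling) do the two configurations tie.
```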
 
Yes, but it would perform at least as well with fewer cores at higher clock speeds

So...?

In console gaming land, you'll never go beyond a maximum of 60fps.

The fact that these devs could deliver 100fps at 3.4GHz with four cores is irrelevant.

It's a win-win situation for PC gamers this time round: efficient, multithreaded, console-developed games on low-clocked CPUs will deliver higher frame rates in PC versions with more capable CPUs.
 
So...?

In console gaming land, you'll never go beyond a maximum of 60fps.

But that's just this one specific case. You could easily imagine a next-gen game that has twice the CPU workload.

Anyway, the only thing I wanted to make clear, because I felt it could be misunderstood: 8 cores @ 1.6 GHz won't be better than 4 cores @ 3.2 GHz. That's all.
 