TRUTHFACT: MS having eSRAM yield problems on Xbox One

What is the source for X1 supporting 5.1 for 4 controllers and PS4 not supporting it?
all "5.1 audio through headphones" is just software mixing 5.1 over stereo, you know this right? The actual cable and hardware inside the controller is stereo.

Where is the source for the Xbone even supporting the ability to transmit anything other than chat audio to the controller (let alone 5.1 to four different gamepads)? I haven't seen this feature mentioned anywhere; the only people convinced of the possibility seem to be random forum posters.
 
He said FF15 would be exclusive to PS4.
Permaban?

The only "painful" thing here is an argument about peak FLOPS using random benchmarks.
Yeah, it's a rather weird argument, especially since devs will want to squeeze every bit of processing power out of these machines.

In general, two groups in this thread are wrong. The first is the one claiming that PS4 won't have a significant edge in performance over Xbone. The second is the one claiming that a gaming PC will give marginal (at best) performance benefits over PS4.
 
Yeah, it's a rather weird argument, especially since devs will want to squeeze every bit of processing power out of these machines.

In general, two groups in this thread are wrong. The first is the one claiming that PS4 won't have a significant edge in performance over Xbone. The second is the one claiming that a gaming PC will give marginal (at best) performance benefits over PS4.

My expectation has been that we'll see developers basically design their games to carry PC settings straight to the consoles as an easy scaling template to stick with.

Early generation will be:
medium = XB1
high = PS4
ultra = PC

by late generation we'll instead see:
low = XB1
medium = PS4
high, ultra = PC only
 
My expectation has been that we'll see developers basically design their games to carry PC settings straight to the consoles as an easy scaling template to stick with.

Early generation will be:
medium = XB1
high = PS4
ultra = PC

by late generation we'll instead see:
low = XB1
medium = PS4
high, ultra = PC only

I always found that the whole "low/medium/high/ultra" naming from PC didn't really translate all that well to consoles. Even console games today with ultra-gorgeous PC ports don't really look like "low" settings to me. They look about as good as the hardware will allow, which is what we'll see late next gen too.

My expectation though is that power won't really be a big issue, even late in the gen unless devs come up with something exceptionally computationally heavy that we haven't seen in games yet. I expect to continue to see 1080p/60fps games on PS4 well throughout its entire life span with obscenely high resolution textures and particle effects. I guess what it boils down to is I don't think devs will need substantially more than what's in the PS4 now 7 years down the line to make the games they really want. But hey, I've been wrong about that kind of stuff before so I guess we'll see :-P
 
Mistake. The 360 has 512 MB of RAM and the One has 8 GB of much faster RAM. That's 16x the RAM of the 360, and that's not even factoring in the GPU and CPU combo.

But at the same time, the Xbox One has a 500 GB hard drive, and my 360 has a 250 GB hard drive. So the Xbox One is only twice as powerful as the 360.
 
But at the same time, the Xbox One has a 500 GB hard drive, and my 360 has a 250 GB hard drive. So the Xbox One is only twice as powerful as the 360.

And at the same time, it's nearly 8 thousand times more powerful than the original Xbox 360 Core.
 
I know, but I was more asking whether eSRAM bandwidth is also influenced by the GPU clock.

In some old cards, core clock and shaders clock used to be linked with a fixed ratio. Memory clock was always independent.

Probably someone will see this message as an attack on the PS4 and reply with some nonsense. You can save it.
 
So guys, how does the power of the cloud work? What exactly is it? Does it help games have more content? If so, why use Blu-rays as opposed to DVDs?
 
So guys, how does the power of the cloud work? What exactly is it? Does it help games have more content? If so, why use Blu-rays as opposed to DVDs?

The cloud is nothing more than external compute performance, or a dedicated server.
If you have some elements in your game, let's say AI, and the player is always connected, then you could choose to offload the AI and run it in the cloud.
See it like this: instead of the server sending you back player data from the other players in a multiplayer match,
it sends back AI data in your single-player match. Think World of Warcraft and other MMOs.

You can offload some physics too; think environmental fluid dynamics or cloth simulation.
The player won't know those physics are run in the cloud, because all the X1 does is render the state it gets back from the cloud.

This can all only really work if players are always connected.
But "the cloud" is just a term marketing created some years ago for servers and software,
because we live in a more connected world and a lot of data goes over wireless or mobile networks.
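
Roughly, a minimal sketch of that split (hypothetical endpoint and protocol, not any real Xbox Live or Azure API) might look like this in Python:

Code:
import json
import socket

# Minimal sketch of the offload idea above (hypothetical endpoint/protocol,
# not a real Xbox Live / Azure API). The console ships the current world state
# to a server, the server runs the expensive AI step, and the console only
# renders whatever state comes back, just like a dedicated multiplayer server
# where the "other players" happen to be AI agents.

SERVER = ("compute.example.net", 9000)   # placeholder address

def step_remote(world_state):
    """Send the current world state, receive the next AI-updated state."""
    with socket.create_connection(SERVER, timeout=0.2) as sock:
        sock.sendall(json.dumps(world_state).encode() + b"\n")
        reply = sock.makefile().readline()
    return json.loads(reply)

def game_loop(world_state, render, step_local):
    while True:
        try:
            world_state = step_remote(world_state)   # cloud does the heavy lifting
        except OSError:
            world_state = step_local(world_state)    # fall back locally if the connection drops
        render(world_state)                          # the console only renders the result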

/Hope this wasn't a troll attempt, or else typing this out was a waste of time.
 
So guys, how does the power of the cloud work? What exactly is it? Does it help games have more content? If so, why use Blu-rays as opposed to DVDs?
Have you heard about web browsing? You send a 50 character URL into the ether, and 200~300ms later, you may get a pretty page. Actually, not quite, you get served a hunk of text, and your browser locally transforms it into a pretty page. Because your browser runs on a machine that has resources for that kind of thing.

By providing latency and bandwidth of a single-speed CD-ROM drive, cloud computing provides that authentic PSX experience that we see far too rarely today.
 
I always found that the whole "low/medium/high/ultra" naming from PC didn't really translate all that well to consoles. Even console games today with ultra-gorgeous PC ports don't really look like "low" settings to me. They look about as good as the hardware will allow, which is what we'll see late next gen too.
It does translate extremely well and straightforwardly at the moment -> PS360-level graphics are the low settings on PC. Console assets and effects are the base level that only gets improved upon.
 
Have you heard about web browsing? You send a 50 character URL into the ether, and 200~300ms later, you may get a pretty page. Actually, not quite, you get served a hunk of text, and your browser locally transforms it into a pretty page. Because your browser runs on a machine that has resources for that kind of thing.

By providing latency and bandwidth of a single-speed CD-ROM drive, cloud computing provides that authentic PSX experience that we see far too rarely today.

Hey, don't sell the PSX short. It had a 2x CD-ROM drive.
 
Where is the source for the Xbone even supporting the ability to transmit anything other than chat audio to the controller (let alone 5.1 to four different gamepads)? I haven't seen this feature mentioned anywhere; the only people convinced of the possibility seem to be random forum posters.

Xbox One does a lot of stuff I see mentioned on here that most people have no idea about... lots of very cool, small things that will get overlooked and won't be appreciated until most people are actually using one.
 
The cloud is nothing more than external compute performance, or a dedicated server.
If you have some elements in your game, let's say AI, and the player is always connected, then you could choose to offload the AI and run it in the cloud.
See it like this: instead of the server sending you back player data from the other players in a multiplayer match,
it sends back AI data in your single-player match. Think World of Warcraft and other MMOs.

This can all only really work if players are always connected.
But "the cloud" is just a term marketing created some years ago for servers and software,
because we live in a more connected world and a lot of data goes over wireless or mobile networks.

/Hope this wasn't a troll attempt, or else typing this out was a waste of time.

A good honest effort - but I find it hard to believe a dev like Turn 10 has found a way for AI to simulate human behaviour, and that it was only a lack of computing power that was stopping us.

It's pure bullshit if you ask me. If they could do this, they would have done it already.

My guess is they will have their AI, which drives around the course, and then add behaviours on top, such as the way you most likely take corners.

I'm very cynical about all these Drivatar claims. They may just smoke-and-mirror the whole thing, put people into driver classes, and whatever class you're in, it'll perform as well as it can in a live AI race.
 
Xbox One does a lot of stuff I see mentioned on here that most people have no idea about... lots of very cool, small things that will get overlooked and won't be appreciated until most people are actually using one.

That's because 10,000 "attaboys" can get wiped out by one "oh shit". And MS has been nothing but "oh shit" lately.
 
The eSRAM reads X bytes (128, I think, I dunno) per GPU clock, so yeah, it's tied to the GPU clock.

Yeah, I was questioning that because I heard it on B3D, when they tried to break down Microsoft's bandwidth figure of over 200 GB/s.
Most likely it's Microsoft's creative math, adding all sorts of bandwidth figures together :p
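
For what it's worth, the simple version of that math (treating the 800 MHz GPU clock and the 128-byte-per-cycle eSRAM interface as the rumored figures they are) works out like this:

Code:
# Back-of-envelope eSRAM bandwidth, tied to the GPU clock as discussed.
# Both inputs are rumored/reported figures, not confirmed specs.
gpu_clock_hz    = 800e6   # reported GPU clock
bytes_per_cycle = 128     # reported eSRAM width per GPU clock

one_way = gpu_clock_hz * bytes_per_cycle
print(f"{one_way / 1e9:.1f} GB/s one way")            # 102.4 GB/s

# Counting read and write at the same time, or stacking the DDR3 bus on top,
# is one way to end up with a "200+ GB/s" headline number.
print(f"{2 * one_way / 1e9:.1f} GB/s read + write")   # 204.8 GB/s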

A good honest effort - but I find it hard to believe a dev like Turn 10 has found a way for AI to simulate human behaviour, and that it was only a lack of computing power that was stopping us.

It's pure bullshit if you ask me. If they could do this, they would have done it already.

My guess is they will have their AI, which drives around the course, and then add behaviours on top, such as the way you most likely take corners.

I'm very cynical about all these Drivatar claims. They may just smoke-and-mirror the whole thing, put people into driver classes, and whatever class you're in, it'll perform as well as it can in a live AI race.

I think what Turn 10 does is collect driving data from all the players to see which corners are hard to take and where most people fail to make a clean turn, and then process that data to refine the AI and replicate human driving behaviour better.

If 80% of players can't make turn A on track B at X speed, Turn 10 can use that data to send new parameters to the in-game AI, so that when it takes turn A on track B at roughly X speed it will probably slip or crash into other cars.
The more samples you can process, the better the AI can get - just like any skill, the more you practise it the better you get. I've done some basic neural network stuff for computer vision. So yeah, it's all smoke and mirrors, but for the average player that doesn't matter; as long as they perceive the AI as advanced, it's fine. Bungie had a nice discussion on this; if I can find it again I'll post it.
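
Purely as an illustration of that idea (a guess at the shape of it, not Turn 10's actual pipeline), the aggregation could be as simple as:

Code:
from collections import defaultdict

# Illustrative sketch only: aggregate corner telemetry from many players,
# then feed the observed failure rate back to the in-game AI as a
# "slip probability" for that corner at that speed.

def corner_failure_rates(laps):
    """laps: iterable of (track, corner, entry_speed_kmh, clean) tuples."""
    attempts = defaultdict(int)
    failures = defaultdict(int)
    for track, corner, speed, clean in laps:
        key = (track, corner, round(speed, -1))   # bucket speeds to the nearest 10 km/h
        attempts[key] += 1
        if not clean:
            failures[key] += 1
    return {k: failures[k] / attempts[k] for k in attempts}

def ai_slip_probability(rates, track, corner, speed):
    """If 80% of players miss this corner at this speed, the AI driver should miss it roughly as often."""
    return rates.get((track, corner, round(speed, -1)), 0.0)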

This stuff is not exclusive to the X1, but when Microsoft gives those limited resources away for free or at a heavy discount, why not use it? I can see DICE doing their Levolution stuff on the servers and just sending back the current state of the level or a building collapsing, which will most likely be the case on every platform, because you can reduce the requirements and broaden the number of PCs that can run the game.
 
Guys, can someone confirm the PS4 CPU specs? I know this isn't the best thread for that, but I'd like to avoid opening a new topic just to ask one question. And since most of the experts are here... :)

In all the officially released Sony info, there is only a generic "8-core Jaguar CPU". What about clock, cache, IPC, etc.?
 
How would AI and physics offloading work practically anyway?

Say that there are five million users that will be using data from the cloud, and every cloud server has 10x the oomph of the Xbox One. Let's say each server costs $1000; that is five billion dollars that needs to be pulled in from somewhere. Say that each server uses a 700 W power supply and that the load, due to the global audience of the Xbox, will be roughly even over the day (in practice there will be local servers and such, but never mind that now). Counting cooling at a factor of two (and assuming everything is running at max), we arrive at 126 GWh a day; at 12 cents per kWh that is about $15 million per day in electricity costs, or roughly $20 a month per currently paying Gold subscriber.

Fast and very very dirty, so my apologies for all of the errors.
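
For anyone who wants to poke at it, here is the same estimate as a script (same assumptions as above; the 126 GWh figure works out if each server averages roughly 18 hours a day at full load):

Code:
# Reproducing the back-of-envelope estimate above with the same assumptions.
servers         = 5_000_000   # one ~10x-Xbox server per connected user
server_cost_usd = 1_000
psu_watts       = 700
cooling_factor  = 2           # cooling doubles the draw
full_load_hours = 18          # ~18 h/day at full load reproduces the 126 GWh figure
price_per_kwh   = 0.12

capex            = servers * server_cost_usd
daily_kwh        = servers * psu_watts * cooling_factor / 1000 * full_load_hours
daily_power_cost = daily_kwh * price_per_kwh

print(f"hardware:     ${capex / 1e9:.0f} billion up front")
print(f"energy:       {daily_kwh / 1e6:.0f} GWh/day")                     # 126 GWh/day
print(f"electricity:  ${daily_power_cost / 1e6:.1f} million/day")         # ~$15M/day
print(f"              ${daily_power_cost * 30 / 1e6:.0f} million/month")  # ~$450M/month on power alone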
 
How would AI and physics offloading work practically anyway?

Say that there are five million users that will be using data from the cloud, and every cloud server has 10x the oomph of the Xbox One. Let's say each server costs $1000; that is five billion dollars that needs to be pulled in from somewhere. Say that each server uses a 700 W power supply and that the load, due to the global audience of the Xbox, will be roughly even over the day (in practice there will be local servers and such, but never mind that now). Counting cooling at a factor of two (and assuming everything is running at max), we arrive at 126 GWh a day; at 12 cents per kWh that is about $15 million per day in electricity costs, or roughly $20 a month per currently paying Gold subscriber.

Fast and very very dirty, so my apologies for all of the errors.
If X players share the same world space, you can calculate all those things once and send the results to all of those players. Obviously this is meant for always-online games with multiplayer modes; most of them will likely push some sort of microtransactions, so it'll be worth it for devs and Microsoft.
 
Guys, can someone confirm the PS4 CPU specs? I know this isn't the best thread for that, but I'd like to avoid opening a new topic just to ask one question. And since most of the experts are here... :)

In all the officially released Sony info, there is only a generic "8-core Jaguar CPU". What about clock, cache, IPC, etc.?

It's two Jaguar modules, each with 4 cores. They are rumored to be clocked at 1.6 GHz, although there were also rumors about Sony clocking them at 1.8-2 GHz; it is still unconfirmed. Each core has 32 KB of L1 and 512 KB of L2 (so there is a total of 4 MB of L2). The L2 can be shared between the 8 cores, although with a latency penalty. Each core has a good FPU (the best part of this architecture), capable of running all the existing instruction-set extensions up to and including AVX.
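
For context, the peak single-precision number that falls out of those rumored specs (assuming the commonly cited 8 FLOPs per cycle per Jaguar core, i.e. one 128-bit add plus one 128-bit multiply per clock):

Code:
# Peak single-precision throughput implied by the rumored specs above.
cores           = 8
clock_ghz       = 1.6   # rumored; 1.8-2.0 GHz would scale this linearly
flops_per_cycle = 8     # one 128-bit FADD + one 128-bit FMUL per clock

print(f"{cores * clock_ghz * flops_per_cycle:.1f} GFLOPS peak")   # 102.4 GFLOPS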
 
It's two Jaguar modules, each with 4 cores. They are rumored to be clocked at 1.6 GHz, although there were also rumors about Sony clocking them at 1.8-2 GHz; it is still unconfirmed. Each core has 32 KB of L1 and 512 KB of L2 (so there is a total of 4 MB of L2). The L2 can be shared between the 8 cores, although with a latency penalty. Each core has a good FPU (the best part of this architecture), capable of running all the existing instruction-set extensions up to and including AVX.

Great, thx, that cleared things up.
 

Xbox One does a lot of stuff I see mentioned on here that most people have no idea about... lots of very cool, small things that will get overlooked and won't be appreciated until most people are actually using one.

Streaming to a completely proprietary headphone connector isn't all that cool.

That doesn't mention "four streams of 5.1 audio" anywhere - it seems more likely that the Xbone will stream a stereo virtual surround mix (which is all you need for surround with headsets anyway). Whether they will be able to serve multiple stereo streams or not, we'll have to see.

That's actually a pretty neat feature

Makes any headphone you plug in to your controller essentially wireless.

Unfortunately, the headphone connector seems to be proprietary - I'd rather not buy crappy Turtle Beach headphones (but a controller wouldn't have enough power to drive my BeyerDynamic DT 990 Pro anyway :p)

I can see DICE doing their Levolution stuff on the servers and just sending back the current state of the level or a building collapsing, which will most likely be the case on every platform, because you can reduce the requirements and broaden the number of PCs that can run the game.

Levolution has most likely been done on the server since the inception of the feature; anything else doesn't make all that much sense for a game with dedicated servers.
 
How would AI and physics offloading work practically anyway?

Say that there are five million users that will be using data from the cloud, and every cloud server has 10x the oomph of the Xbox One. Let's say each server costs $1000; that is five billion dollars that needs to be pulled in from somewhere. Say that each server uses a 700 W power supply and that the load, due to the global audience of the Xbox, will be roughly even over the day (in practice there will be local servers and such, but never mind that now). Counting cooling at a factor of two (and assuming everything is running at max), we arrive at 126 GWh a day; at 12 cents per kWh that is about $15 million per day in electricity costs, or roughly $20 a month per currently paying Gold subscriber.

Fast and very very dirty, so my apologies for all of the errors.

The cloud can allocate resources on the fly. The average core gamer, someone between 15 and 35 with work or school, can probably play about 3 hours a day, which is 21 hours a week at most; the majority probably play more like 10~15 hours a week while still paying for a Gold subscription, so it's not like they will be gaming 24/7.
But I think we are going off topic with this discussion.

Levolution has most likely been done on the server since the inception of the feature; anything else doesn't make all that much sense for a game with dedicated servers.

Most MMOs do everything on the server, and all the client does is render the game and maybe interpolate between states.
 
How would AI and physics offloading work practically anyway?

Say that there are five million users that will be using data from the cloud, and every cloud server has 10x the oomph of the Xbox One. Let's say each server costs $1000; that is five billion dollars that needs to be pulled in from somewhere. Say that each server uses a 700 W power supply and that the load, due to the global audience of the Xbox, will be roughly even over the day (in practice there will be local servers and such, but never mind that now). Counting cooling at a factor of two (and assuming everything is running at max), we arrive at 126 GWh a day; at 12 cents per kWh that is about $15 million per day in electricity costs, or roughly $20 a month per currently paying Gold subscriber.

Fast and very very dirty, so my apologies for all of the errors.

This is why the cloud is bullshit in this context. That's not how "the cloud" works; at least it won't work for things like physics that rely on very low latency. The cloud can be used to parallelize a large compute task across hundreds or thousands of machines. If there are millions of people playing a game simultaneously, it's ridiculous to assume that any meaningful processing the local machine's CPU/GPU could do would be magically offloaded to the cloud.

The cloud thing would work if, say, you had a massively multiplayer game where 1000s of users are somehow altering the environment and those changes are constantly streamed to the cloud for calculations to reflect those changes across all users. Things like that which can be queued up and batch calculated work fine and will scale even under peak usage. If you think the cloud will get you better graphics or physics you're dead wrong. That's OnLive territory and that sort of experience is sub-optimal, to put it lightly.

Source: I'm on a team that is building a giant private cloud infrastructure.
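
The latency side of that is easy to put numbers on: at 60 fps a frame is about 16.7 ms, so anything that needs a network round trip every frame simply doesn't fit, while work that can lag behind by a second or more does. A rough sketch (the round-trip times are illustrative guesses, not measurements):

Code:
# Rough latency-budget check for offloading per-frame work.
frame_budget_ms = 1000 / 60   # ~16.7 ms per frame at 60 fps

round_trip_ms = {"LAN": 5, "nearby datacenter": 30, "typical internet": 80, "bad wifi": 150}

for link, rtt in round_trip_ms.items():
    if rtt < frame_budget_ms:
        verdict = "fits inside one frame"
    else:
        verdict = f"arrives ~{rtt / frame_budget_ms:.0f} frames late"
    print(f"{link:>17}: {rtt:>3} ms round trip -> {verdict}")

# Per-frame physics and graphics can't hide multi-frame delays; slowly changing
# work (AI planning, persistent world state, batch jobs) can.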
 
A good honest effort - but I find it hard to believe a dev like Turn 10 has found a way for AI to simulate human behaviour, and that it was only a lack of computing power that was stopping us.

It's pure bullshit if you ask me. If they could do this, they would have done it already.

I read an article years ago about Forza AI and the institute that helped develop it.
Think I remember something about how the AI "learned" and actually got too smart.
Someone here can probably post the link, it was an interesting read.
 
It's two Jaguar modules, each with 4 cores. They are rumored to be clocked at 1.6 GHz, although there were also rumors about Sony clocking them at 1.8-2 GHz; it is still unconfirmed. Each core has 32 KB of L1 and 512 KB of L2 (so there is a total of 4 MB of L2). The L2 can be shared between the 8 cores, although with a latency penalty. Each core has a good FPU (the best part of this architecture), capable of running all the existing instruction-set extensions up to and including AVX.
I'm still curious as to why we don't have the PS4 CPU clock confirmed. They've gone into so much detail about every other optimization, spec, and architecture component except for that. Either they think it'll sound unimpressive or the clock speed is in flux. I'm leaning toward the former at this point.
 
I'm still curious as to why we don't have the PS4 CPU clock confirmed. They've gone into so much detail about every other optimization, spec, and architecture component except for that. Either they think it'll sound unimpressive or the clock speed is in flux. I'm leaning toward the former at this point.

Aren't the CPU and GPU clocks in a 2:1 ratio?
Or is that just a coincidence?
 
Where is the arrogance in showing how real-world applications can't match pure math figures?

The arrogance is in assuming that a theoretical increase of a certain magnitude in one part of the system should show up in some particular program's particular benchmark because, oh, you know it should, despite being corrected by several more informed people, to no avail of course.

People are giving you the math figures by which FLOPS are measured; you are giving them arbitrary benchmarks which aren't even supposed to measure the figure in question.

Your real-world applications may be bottlenecked by any kind of deficiency, possibly even including hard drive speeds. Several people have tried to steer you back to what is actually being argued, and you have blatantly ignored them and kept walking your own way.

Not only are you totally misinformed about what was being discussed, but you were arrogant enough to assume you knew better, and then managed to make a fool of yourself with your benchmarks.

I hope that was clear enough.
 
Your real-world applications may be bottlenecked by any kind of deficiency, possibly even including hard drive speeds. Several people have tried to steer you back to what is actually being argued, and you have blatantly ignored them and kept walking your own way.

Which is my point in this debate. I, like some others, reject the GAF TRUTHFACT of the PS4 being 50% faster than the Xbone just for having 50% more CUs in the GPU. Something easily shown.

Meanwhile, those who support the +50% power claim just reply with LOLs, gifs and insults.

Saying that performance increases linearly with clock speed is just like saying a car's speed increases linearly with HP. As simple as that.

Just have a read of how actual performance can come in at less than half of the theoretical peak:

Code:
                                                    Actual    (“Theoretical Peak” Mflop/s)
Intel Core 2 Q6600 (Kentsfield, 4 cores, 2.4 GHz)   13130     (38400)
AMD Opteron 275, 2.2 GHz (dual core, 4 proc)        6147      (17600)
IBM Cell BE (3.2 GHz, 32 bit)                       98.05     (204.8)

A cow is not spherical, and the universe has friction.
 
Which is my point in this debate. I, like some others, reject the GAF TRUTHFACT of the PS4 being 50% faster than the Xbone just for having 50% more CUs in the GPU. Something easily shown.

Meanwhile, those who support the +50% power claim just reply with LOLs, gifs and insults.

Saying that performance increases linearly with clock speed is just like saying a car's speed increases linearly with HP. As simple as that.

Just have a read of how actual performance can come in at less than half of the theoretical peak:

Code:
                                                    Actual    (“Theoretical Peak” Mflop/s)
Intel Core 2 Q6600 (Kentsfield, 4 cores, 2.4 GHz)   13130     (38400)
AMD Opteron 275, 2.2 GHz (dual core, 4 proc)        6147      (17600)
IBM Cell BE (3.2 GHz, 32 bit)                       98.05     (204.8)

But a 50% advantage in CUs over the X1 is 50% more powerful. It's not a clock rate difference. Or are you saying that the rumored clock rate drop, and the way it changes the power estimates, is wrong because clock rates don't scale linearly?
 
But a 50% advantage in CUs over the X1 is 50% more powerful.

No, 50% more CUs is 50% more theoretical raw processing power in the GPU alone. Let's just ignore that there are more key components inside the box than a GPU. Now you have to translate that into actual performance, and if there is one thing we know in computing, it is that more units mean less efficiency per unit. This is a pretty difficult field.

I'm pretty sure many of you have read about the move AMD made from VLIW5 to GCN to improve efficiency, because they weren't able to maintain enough instruction flow to feed every shader.

Or are you saying that the rumored clock rate drop, and the way it changes the power estimates, is wrong because clock rates don't scale linearly?

I don't know the target clock for the Xbone, nor the actual retail clock. I don't know the PS4 GPU clock either. Can you clarify this for me? Some people here go as far as throwing around numbers like 800 GFLOPS. Where does that number come from?

Usually, fewer processors means a higher frequency, simply because of the TDP headroom. Do we know whether the Xbone GPU will have a higher frequency than the PS4 GPU? There is a lot we just don't know.

Every architecture has a clock-speed sweet spot. Go beyond it, and the performance improvement goes from minor to negligible. Go under it, and performance can plummet.
 
Saying that performance increases linearly with clock speed is just like saying a car's speed increases linearly with HP. As simple as that.

Nobody says that; that's demagoguery. They are arguing that peak FLOPS increases linearly. Your real-world performance will not only depend on a crapload of other variables, but will also be impossible to measure objectively or consistently, because we won't be able to agree on what constitutes a fair benchmark application in the first place.

What people have argued so far is objective truth.

What you are arguing is convoluted nonsense.

To return to your car example:

People are arguing that Car A can travel from point X to point Y in a straight line 50% faster than Car B because its top speed is 50% higher.
You are arguing that these people are wrong because Car A cannot complete 50 laps around the Nurburgring 50% faster than Car B. Which is probably true, but not at all related, since completing 50 laps of the Nurburgring has to do with cornering performance, tyre wear, fuel consumption and so on.
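
For what it's worth, here is where the 50% figure actually comes from; it is purely a peak number, with both clocks assumed equal at the rumored 800 MHz (each GCN CU has 64 lanes doing a fused multiply-add, i.e. 2 FLOPs, per lane per clock):

Code:
# Where the "50% more" peak figure comes from. Clocks are rumored/assumed here;
# a different Xbox One clock would shift the ratio.
def peak_tflops(cus, clock_ghz, lanes_per_cu=64, flops_per_lane=2):
    return cus * lanes_per_cu * flops_per_lane * clock_ghz / 1000

ps4 = peak_tflops(18, 0.8)   # ~1.84 TFLOPS
xb1 = peak_tflops(12, 0.8)   # ~1.23 TFLOPS
print(f"PS4 {ps4:.2f} TFLOPS vs XB1 {xb1:.2f} TFLOPS -> {ps4 / xb1:.2f}x")   # 1.50x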
 
You only have to look as far as Battlefield's destruction system to see what issues "cloud" latency causes for in-game events.

You strap a building with C4 and blow it, the server processes the destruction event and updates all the clients, so what you end up with, if you have a slightly high ping, is a building blowing up and falling that appears delayed and looks janky, clipping through the floor.

Not very exciting; whilst it does add to the game, you do have to suspend disbelief a touch.
 