TRUTHFACT: MS having eSRAM yield problems on Xbox One

nope - and this is confirmed!
But you are allowed to plug in an external HDD to your all-in-one box...

So what happens when the internal HD fails? I have to mail the entire system to MS to replace it with another shitty slow HD, unless I'm willing to void my warranty? I mean what the shit...

Day 1 I am going to swap my PS4 hard drive with an SSD. 5400 RPM is bogus.
 
They wouldn't release specs even if no downclock happened (which I have no idea if it did). 1.23TF is clearly less than 1.84TF.
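
For reference, here's the back-of-envelope math those TF numbers come from (using the commonly rumored shader counts and clocks, which are not official specs):

```python
# Theoretical GPU throughput: shader ALUs x 2 FLOPs per cycle (FMA) x clock.
# Shader counts and clocks below are the rumored figures, not confirmed specs.
def gpu_tflops(shader_alus, clock_ghz):
    return shader_alus * 2 * clock_ghz / 1000.0

xbone = gpu_tflops(768, 0.8)        # ~1.23 TF
ps4 = gpu_tflops(1152, 0.8)         # ~1.84 TF
downclocked = gpu_tflops(768, 0.5)  # ~0.77 TF, the figure behind the 500 MHz rumor

print(f"XB1 {xbone:.2f} TF, PS4 {ps4:.2f} TF, gap {(ps4 - xbone) * 1000:.0f} GFLOPs")
```

That ~600 GFLOP gap at equal clocks is the same figure quoted further down the thread.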

Remember how the 360 was 300 some gflops and the PS3 was 500 some gflops? It amounted to very little.

It's not going to be 30fps on X1 while 60fps on PS4.

It's going to be slightly better AA solutions.

Slightly better shadows.

Slightly better particle effects, etc.

What is truly amazing is that both consoles, while being quite pathetic compared to gaming PCs, are pumping out so many 1080/60 games. Clearly the raw specs mean little.
 
Imagine the egg on some people's faces if the One turns out to be more powerful than the PS4, even without the cloud computing benefits.

Just consider that there might be a good reason MS is playing their cards close to the vest on this one. Still a ways until launch and best to drop the big news closer in time.

If that were the case, they wouldn't keep going on about the magic cloud.
 
Remember how the 360 was 300 some gflops and the PS3 was 500 some gflops? It amounted to very little.

It's not going to be 30fps on X1 while 60fps on PS4.

It's going to be slightly better AA solutions.

Slightly better shadows.

Slightly better particle effects, etc.

What is truly amazing is that both consoles, while being quite pathetic compared to gaming PCs, are pumping out so many 1080/60 games. Clearly the raw specs mean little.

The only reason for this is that they were different architectures and the PS3 wasn't the lead platform, due to launching at $599 and failing to gain momentum early on.

Now they are identical and the differences will be bigger.
 
Remember how the 360 was 300 some gflops and the PS3 was 500 some gflops? It amounted to very little.

It's not going to be 30fps on X1 while 60fps on PS4.

It's going to be slightly better AA solutions.

Slightly better shadows.

Slightly better particle effects, etc.

What is truly amazing is that both consoles, while being quite pathetic compared to gaming PCs, are pumping out so many 1080/60 games. Clearly the raw specs mean little.
Irrelevant in this context: releasing a spec sheet would make them look inferior to the PS4, and they have the more expensive box, so they desperately want to avoid that perception.

As for 1080p60, well, PC GPUs have been able to achieve that for quite some time; if that's the priority for devs, it's easily manageable on the new consoles as well.
 
I told you guys MS wasn't going to release specific spec info à la Nintendo. There has to be a reason why.

Because it has worse specs than the PS4, and MS doesn't want people to know this.

Still, 500 MHz for the GPU screams fake. No way in hell they would get FM5 to 1080p@60 with only a 770 GFLOP GPU.

I already predict there will be a big downgrade from that crystal-clear IQ shown in the Xbone games at E3 once the Xbone launches. I don't believe 1 TFLOP or even 1.2 TFLOPs can achieve those things.

Even if we take what Carmack said (twice the efficiency), that would only take the system to around a 2 TF power level, and those games still wouldn't run at 1080p@60 FPS looking like that.

And what does 2 TF get you on PC? I have a 2.3 TF PC and The Witcher 2 doesn't run well at 1920x1080, often dropping to 20 fps in places on max details (without Ubersampling). And most of those games look beyond The Witcher 2 even at its best.

Unless it's the power of the cloud(tm), but we know that's a horseshit PR slogan.
 
And what does 2 TF get you on PC? I have a 2.3 TF PC and The Witcher 2 doesn't run well at 1920x1080, often dropping to 20 fps in places on max details (without Ubersampling). And most of those games look beyond The Witcher 2 even at its best.
The Witcher 2 is a terrible example of good optimization.
 
They are still quite different.

Well, the PS3's Cell heavily focused on a SIMD computation model and message passing between jobs over shared memory. In addition, the PS3 had two separate memory pools, and its GPU had no unified shader programming model.

I cannot even begin to see any difference of that magnitude between the PS4's and XBone's general system architectures.
 
Imagine the egg on some people's faces if the One turns out to be more powerful than the PS4, even without the cloud computing benefits.

Just consider that there might be a good reason MS is playing their cards close to the vest on this one. Still a ways until launch and best to drop the big news closer in time.

Sounds like you had a Boner typing that out.
 
Remember how the 360 was 300 some gflops and the PS3 was 500 some gflops? It amounted to very little.

It's not going to be 30fps on X1 while 60fps on PS4.

It's going to be slightly better AA solutions.

Slightly better shadows.

Slightly better particle effects, etc.

What is truly amazing is that both consoles, while being quite pathetic compared to gaming PCs, are pumping out so many 1080/60 games. Clearly the raw specs mean little.

1) Xbox fans were partying over way smaller differences in multiplatform games.

2) When you say PC, you mean a PC more expensive than either console, not a $400 laptop.
 
The only reason for this is that they were different architectures and the PS3 wasn't the lead platform, due to launching at $599 and failing to gain momentum early on.

Now they are identical and the differences will be bigger.


For 3rd party games the differences will be small.

With the BF4 demo running on X1 dev kits, I noticed only very small differences from the first gameplay shown, which was running on something like two 7970s.

Now, if the difference is that small when comparing the X1 (7770) version to the first demo (7970 x2),

then the difference between the PS4 and X1 versions is going to be even smaller.
 
For 3rd party games the differences will be small.

With the BF4 demo running on X1 dev kits, I noticed only very small differences from the first gameplay shown, which was running on something like two 7970s.

Now, if the difference is that small when comparing the X1 (7770) version to the first demo (7970 x2),

then the difference between the PS4 and X1 versions is going to be even smaller.

Without knowing what BF4 actually ran on (rather than what marketing tells us), that's not a reasonable way to estimate the advantages (which will show up in better-performing games as well): the power gulf alone could easily amount to a 720p vs 1080p resolution difference. I don't expect to see that in the real world, but I think it's safe to say the PS4 will have an easier time achieving a stable framerate, maybe better AA, less pop-in, better textures, etc.
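
Just to put that 720p vs 1080p scenario in context, here's a quick pixel-count comparison (raw arithmetic only, says nothing about how real engines scale):

```python
# 1080p pushes 2.25x the pixels of 720p, while the rumored FLOPS gap between
# the two GPUs is only about 1.5x, so a straight 720p-vs-1080p split would
# actually overshoot the raw spec difference.
pixels_720p = 1280 * 720        # 921,600
pixels_1080p = 1920 * 1080      # 2,073,600
print(pixels_1080p / pixels_720p)   # 2.25
print(1.84 / 1.23)                  # ~1.50
```

Which is why the more modest differences (framerate stability, AA, pop-in) seem more plausible than a full resolution split.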
 
For 3rd party games the differences will be small.

With the BF4 demo running on X1 dev kits, I noticed only very small differences from the first gameplay shown, which was running on something like two 7970s.

Now, if the difference is that small when comparing the X1 (7770) version to the first demo (7970 x2),

then the difference between the PS4 and X1 versions is going to be even smaller.


The original demo was running at 60 fps locked at 3000K. It probably could have run at 100+ fps if optimized. BF4 is small fry technically compared to what a 7990 could output with optimization. All that shows is that BF4 is not going to be pushing the hardware much.
 
For 3rd party games the differences will be small.

With the BF4 demo running on X1 dev kits, I noticed only very small differences from the first gameplay shown, which was running on something like two 7970s.

Now, if the difference is that small when comparing the X1 (7770) version to the first demo (7970 x2),

then the difference between the PS4 and X1 versions is going to be even smaller.

BF4 ran on 'PCs spec'd to XBONE specs'

That's not a dev kit in any way, shape or form, unless they are still using the alpha dev kits from last year.
 
BF4 ran on 'PCs spec'd to XBONE specs'

That's not a dev kit in any way, shape or form, unless they are still using the alpha dev kits from last year.

Most of the Xbone games shown were running on PCs with Windows 7, not actual devkits. There's that tweet where a dude said a demo crashed and he saw Windows 7, and it was on an HP PC.
 
It truly is a sad day now that 1080p/60 has become a hardware spec in people's eyes lol. I can run Pong at that on very cheap hardware. Take that, consoles :P
 
What are you smoking? The Witcher 2 can run on low settings on a very low-spec machine. When you turn the graphics up to 11, it's a problem for even the best PC out there.
I'm not smoking anything; on my laptop Crysis 2 runs smoothly on medium settings but The Witcher 2 chokes on low settings, and Crysis 2 looks better to begin with.

It's funny, because even with the cloud, at the specs we know (800 MHz / 1.6 GHz for both), the XBONE is still behind by around 300 GFLOPs.
600 GFLOPs, actually.
 
They wouldn't release specs even if no downclock happened (which I have no idea if it did). 1.23TF is clearly less than 1.84TF.
Yeah, I agree with this. Not releasing specs is not evidence of a downclock.

However, it is a very good indication that those specs are inferior.
 
Crysis 2 probably makes better use of your CPU than the Witcher 2 does.
Probably. CryEngine 2/3 is amazing in that it scales all the way down and still has good performance, and it scales all the way up for the highest-end hardware. UE3 is also very impressive performance-wise, but it plateaus at a much lower hardware level.
 
I was reading the newspaper and saw an Xbox One article and they posted some specs:
[image: photographed newspaper spec list for the Xbox One]

Notice the GPU at 800 MHz.
 
Imagine the egg on some people's faces if the One turns out to be more powerful than the PS4, even without the cloud computing benefits.

Just consider that there might be a good reason MS is playing their cards close to the vest on this one. Still a ways until launch and best to drop the big news closer in time.
You're a predator, you know that?
 
I must say, pretty goddamn impressive looking X1 games even if the system turns out to be 400 flops.

1080p, 60fps, great looking visuals.

Thank you.

Now fucking get rid of the Kinect and lower the price.
 
Imagine the egg on some people's faces if the One turns out to be more powerful than the PS4, even without the cloud computing benefits.

Just consider that there might be a good reason MS is playing their cards close to the vest on this one. Still a ways until launch and best to drop the big news closer in time.

When did NeoGaf become a popular place to post your Xbox fan-fiction? I figured that would be more of an OT sort of thing...
 
Imagine the egg on some people's faces if the One turns out to be more powerful than the PS4, even without the cloud computing benefits.

Just consider that there might be a good reason MS is playing their cards close to the vest on this one. Still a ways until launch and best to drop the big news closer in time.

You're the best I have ever seen. Simply the best.
 
Is it possible that this 3GB reserved RAM is partly due to video streaming, done without a separate streaming card and thus using actual main RAM itself?

I don't think MS knew Sony would include a separate video streaming card until the actual announcements and they had to have something similar, but by that point they probably already had the hardware in production and couldn't add a card.
 
Why don't Sony push the DDR5 in the press releases or conferences?

Most of the tech sites and podcasts have no clue about the difference between DDR3 and DDR5. For instance, when the DDR5 bomb dropped, The Verge live chat did not even mention it! And every time people are discussing it on TWiT etc., no one even mentions the DDR3/5 difference! Instead they are hailing the Xbone as the superior hardware because of the cloud and Kinect...
rage
 
Why don't Sony push the DDR5 in the press releases or conferences?

Most of the tech sites and podcasts have no clue about the difference between DDR3 and DDR5. For instance, when the DDR5 bomb dropped, The Verge live chat did not even mention it! And every time people are discussing it on TWiT etc., no one even mentions the DDR3/5 difference! Instead they are hailing the Xbone as the superior hardware because of the cloud and Kinect...
rage

Neither do you, apparently. It's GDDR5, not DDR5.
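
For what it's worth, here's the rough bandwidth math behind the DDR3 vs GDDR5 fuss (the 2133 MT/s and 5500 MT/s figures and 256-bit buses are the widely reported ones, not official spec sheets, and this ignores the One's small 32 MB eSRAM pool):

```python
# Peak memory bandwidth = transfer rate (MT/s) x bus width in bytes.
def bandwidth_gb_s(mt_per_s, bus_bits):
    return mt_per_s * (bus_bits / 8) / 1000.0

ddr3 = bandwidth_gb_s(2133, 256)    # ~68 GB/s  (Xbox One main RAM)
gddr5 = bandwidth_gb_s(5500, 256)   # ~176 GB/s (PS4 main RAM)
print(f"DDR3: {ddr3:.0f} GB/s vs GDDR5: {gddr5:.0f} GB/s")
```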
 
Is it possible that this 3GB reserved RAM is partly due to video streaming, done without a separate streaming card and thus using actual main RAM itself?

I don't think MS knew Sony would include a separate video streaming card until the actual announcements and they had to have something similar, but by that point they probably already had the hardware in production and couldn't add a card.

No, the amount of RAM needed for encoding a stream is not very large, about 100-200 MB.

The PS4 does not have a streaming card; it has an encoder chip that likely sends the data to main RAM, same as the Xbone.
 
No, the amount of RAM needed for encoding a stream is not very large, about 100-200 MB.

The PS4 does not have a streaming card; it has an encoder chip that likely sends the data to main RAM, same as the Xbone.

Do you have a source for this?
 
Do you have a source for this?

TiVos have 512 MB of RAM and a shitty CPU. They manage to record 3 HD streams at once with no issue. It really isn't a heavy-duty thing to do if you have dedicated encoding hardware - you're not saving it in RAM, you'd save it off to the HDD.
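
A quick back-of-envelope supporting that, just to show why a hardware encoder doesn't need gigabytes of RAM (the bitrate and buffer sizes here are my own assumptions, not anything MS or Sony have stated):

```python
# Rough memory footprint for encoding one 1080p stream with dedicated hardware:
# a short window of raw frames feeding the encoder, plus a buffer of encoded
# output before it goes out to the network or the HDD.
raw_frame_mb = 1920 * 1080 * 1.5 / 1e6      # ~3.1 MB per YUV 4:2:0 frame
raw_window_mb = 30 * raw_frame_mb           # ~93 MB for a 30-frame window
stream_mbps = 10                            # assumed encoded bitrate
encoded_buffer_mb = stream_mbps / 8 * 60    # ~75 MB for a full minute buffered
print(raw_window_mb + encoded_buffer_mb)    # ~168 MB -- in the 100-200 MB ballpark
```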
 