
Digital Foundry vs Xbox One S

Yeah, but most living rooms have background noise greater than these levels; it will be hardly noticeable, and you would need a really quiet room to hear it.

Remember 44 dB is the lowest level of urban sound, as quiet as you can get in most environments.
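A quick sketch of why that's true: decibel levels combine on a power basis, not linearly, so a source several dB below the room's background barely moves the total. The room/console figures below are just illustrative:

```python
import math

def combine_spl(*levels_db):
    # Incoherent noise sources add as acoustic power, not as dB values
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

# A 44 dB console in a 50 dB living room raises the total to ~51 dB,
# i.e. barely ~1 dB above the room's own background noise.
total = combine_spl(50, 44)
```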

Nice noise levels and power draw. Impressed.
 
The difference is there, but dB is often regarded as a poor measurement of how loud something actually sounds.

Huh... who are these people that make it 'often regarded'?

It all depends on which weighting is used (A, B, C, D, & Z) and which application. A is the most common for assessing environmental noise and hearing damage.

You need to elaborate on this now.

Yeah, but most living rooms have background noise greater than these levels; it will be hardly noticeable, and you would need a really quiet room to hear it.

Remember 44 dB is the lowest level of urban sound, as quiet as you can get in most environments.

Nice noise levels and power draw. Impressed.

Agreed.
 

The real question of the whole article is why on earth did they settle for 12 compute units still with so low power draw and undetectable noise..

So many XB1 games from 3rd parties now have scaling; they could have matched the PS4 quite easily.
 
The real question of the whole article is why on earth did they settle for 12 compute units still with so low power draw and undetectable noise..

So many XB1 games from 3rd parties now have scaling; they could have matched the PS4 quite easily.
Form factor is the answer. They'd either need to make the box larger to fit a bigger cooling system, or accept more noise.
 
Huh... who are these people that make it 'often regarded'?

It all depends on which weighting is used (A, B, C, D, & Z) and which application. A is the most common for assessing environmental noise and hearing damage.

You need to elaborate on this now.

You are telling me I need to elaborate? This forum...
dB is a measurement of sound pressure difference, not really loudness, and the logarithmic scale doesn't help in that regard.
Here you can read about it: http://www.sengpielaudio.com/calculator-levelchange.htm
A better measurement is the sone (https://en.m.wikipedia.org/wiki/Sone), which is a better scale because it's linear.
"These people" are, for example, the authors of technical articles in c't, the most technically proficient computer magazine in Germany.
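For reference, the usual rule of thumb linking the two scales (a rough model, valid only above roughly 40 phon, where a phon corresponds to dB SPL at 1 kHz) is that perceived loudness in sones doubles for every 10 phon increase:

```python
def phon_to_sone(phon):
    # Stevens' rule of thumb: loudness doubles per +10 phon above 40 phon;
    # 40 phon is defined as 1 sone. Not accurate below ~40 phon.
    return 2 ** ((phon - 40) / 10)

# +10 phon doubles perceived loudness, even though the dB change looks small
quiet, louder = phon_to_sone(40), phon_to_sone(50)
```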
 
So MS said it would not affect games, and you'd not notice a difference without a 4k display, yet it does make a significant difference? Oh man. So, my guess is they will phase out OG XB1 pretty quickly after this.
 
Surprising they actually increased performance on the slim model (even if it's not noticeable)

Maybe this is their answer to Neo, and Scorpio really is the Xbox 2?
 
So MS said it would not affect games, and you'd not notice a difference without a 4k display, yet it does make a significant difference? Oh man. So, my guess is they will phase out OG XB1 pretty quickly after this.
I don't see how the difference is significant???

From what I see, games will gain at best 1 FPS, or the dynamic res will drop a few pixels less... it's good and all, but not significant.

These differences are unnoticeable in 90% of cases.
 
Surprising they actually increased performance on the slim model (even if it's not noticeable)

Maybe this is their answer to Neo, and Scorpio really is the Xbox 2?

If this is their solution to compete with NEO, it's not a good one.

NEO is in between this and Scorpio, but that's a huge gap for most titles. This is about 100 GFLOPS stronger than XB1; NEO is about 3,000 GFLOPS more than XB1.

This bodes significantly well for NEO, because this is just with a clock speed increase of the original GPU without any changes to the CPU clock. NEO is replacing the old GPU with a far more power efficient and stronger GPU, and a significant upclock of Jaguar to boot.

Remember how Xbox 360 S got a huge bump in power just by putting the same 360 components on a SoC and shrinking down to 45nm? The only reason that didn't show in games was because MS forced the S to run at the same speed of the OG 360.

This is the same thing they are doing now for XB1 S, with the shrinking down to 16nm, except they are allowing that speed boost due to the 16nm process, even while using the same exact components as OG XB1.
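The GFLOPS figures in this thread follow directly from the GCN peak-throughput formula (64 shaders per CU, 2 FP32 ops per shader per clock), assuming the widely reported 853 MHz (OG) and 914 MHz (S) GPU clocks:

```python
def peak_gflops(compute_units, clock_mhz):
    # GCN peak FP32: 64 shaders/CU x 2 ops (fused multiply-add) per clock
    return compute_units * 64 * 2 * clock_mhz / 1000.0

og_xb1 = peak_gflops(12, 853)  # ~1310 GFLOPS
xb1_s = peak_gflops(12, 914)   # ~1404 GFLOPS, a ~7% / ~94 GFLOPS bump
```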
 
You are telling me I need to elaborate? This forum...
dB is a measurement of sound pressure difference, not really loudness, and the logarithmic scale doesn't help in that regard.
Here you can read about it: http://www.sengpielaudio.com/calculator-levelchange.htm
A better measurement is the sone (https://en.m.wikipedia.org/wiki/Sone), which is a better scale because it's linear.
"These people" are, for example, the authors of technical articles in c't, the most technically proficient computer magazine in Germany.

It... it was sarcasm. (Thought you would have picked up on that with how I had the top and bottom sentence structured, lol.)

Loudness is subjective, whereas dB is, as you said, an objective pressure measurement.

Hence why some say it's a 'jet engine' and others, 'a whine'. lol

In other words, I completely agree.
 
If this is their solution to compete with NEO, it's not a good one.

NEO is in between this and Scorpio, but that's a huge gap for most titles. This is about 100 GFLOPS stronger than XB1; NEO is about 3,000 GFLOPS more than XB1.

Remember how Xbox 360 S got a huge bump in power just by putting the same 360 components on a SoC and shrinking down to 45nm? The only reason that didn't show in games was because MS forced the S to run at the same speed as the OG 360.

This is the same thing they are doing now for XB1 S, with the shrinking down to 16nm, except they are allowing that speed boost due to the 16nm process, even while using the same exact components as OG XB1.

I get that you want to make this look like a bigger deal than it actually is, but you do understand that outside GAF (and honestly even for a large portion of GAF) most people would not even be able to notice this "boost", right?
 
I get that you want to make this look like a bigger deal than it actually is, but you do understand that outside GAF (and honestly even for a large portion of GAF) most people would not even be able to notice this "boost", right?

It's a DECENT boost in power, man, that's all I can say. 5 to 10 FPS is no joke, and it would surely be made a big deal of if it were between XB1 and PS4 in a DF comparison.

It's true that it won't really matter to casual fans, sure, but I wasn't really speaking about them, you know? I was musing on the technical specifications in general.
 
Digital Foundry forgot to ask if the GPU is still GCN 1.1 instead of GCN 1.2. The bandwidth efficiency of GCN 1.2 would have had as big an impact, if not bigger, on future games taking advantage of it than the 7% overclock.
 
If this is their solution to compete with NEO, it's not a good one.

NEO is in between this and Scorpio, but that's a huge gap for most titles. This is about 100 GFLOPS stronger than XB1; NEO is about 3,000 GFLOPS more than XB1.

This bodes significantly well for NEO, because this is just with a clock speed increase of the original GPU without any changes to the CPU clock. NEO is replacing the old GPU with a far more power efficient and stronger GPU, and a significant upclock of Jaguar to boot.

Remember how Xbox 360 S got a huge bump in power just by putting the same 360 components on a SoC and shrinking down to 45nm? The only reason that didn't show in games was because MS forced the S to run at the same speed of the OG 360.

This is the same thing they are doing now for XB1 S, with the shrinking down to 16nm, except they are allowing that speed boost due to the 16nm process, even while using the same exact components as OG XB1.
Their solution to compete with Neo is something like 1800 GFLOPs more powerful than Neo.
 
Their solution to compete with Neo is something like 1800 GFLOPs more powerful than Neo.

Puma at 2.1 GHz.
AMD Baffin, 14 CUs at 1090 MHz. 2 teraflops.
64 MB of ESRAM, 260 GB/s.
8 GB of DDR4, 100 GB/s.

Would have been > PS4 by quite a substantial margin. Perfect 1080p performance.

But we're getting Scorpio instead, so not that disappointing to have the current Xb1 S.
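Running the hypothetical spec above through the standard GCN peak-throughput formula (64 shaders per CU, 2 FP32 ops per shader per clock) shows where that "2 teraflops" figure comes from:

```python
def peak_gflops(compute_units, clock_mhz):
    # GCN peak FP32: 64 shaders/CU x 2 ops (fused multiply-add) per clock
    return compute_units * 64 * 2 * clock_mhz / 1000.0

# 14 CUs at 1090 MHz works out to ~1953 GFLOPS, i.e. just shy of 2 TFLOPS
hypothetical_s = peak_gflops(14, 1090)
```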
 
Their solution to compete with Neo is something like 1800 GFLOPs more powerful than Neo.

Sure, that's why I'd disagree with the notion that this XB1S is in any way designed to compete with NEO; hence my response to the original poster.

Puma at 2.1 GHz.
AMD Baffin, 14 CUs at 1090 MHz. 2 teraflops.
64 MB of ESRAM, 260 GB/s.
8 GB of DDR4, 100 GB/s.

Would have been > PS4 by quite a substantial margin. Perfect 1080p performance.

But we're getting Scorpio instead, so not that disappointing to have the current Xb1 S.

You're basically saying they should change every single component of the unit. Why the hell would they do that for a minimal refresh taking advantage of 16nm? Do you know how much that would cost? It would totally eat into Scorpio's R&D.
 
Seems like a nice upgrade and an interesting chance for those who don't have an Xbox One yet. The unexpected extra horsepower is a very welcome bonus, even if they don't intend to turn it into a selling point.

Aesthetically I like it more than the vanilla Xbox One, but I still don't find it beautiful. Ultimately, I will pass on this and wait for Scorpio (will the S offer some cues on its design?).

Glad to see Penello back up front. They covered him through the Xbox One launch turbulence.
 
Sure, that's why I'd disagree with the notion that this XB1S is in any way designed to compete with NEO; hence my response to the original poster.
You're basically saying they should change every single component of the unit. Why the hell would they do that for a minimal refresh taking advantage of 16nm? Do you know how much that would cost? It would totally eat into Scorpio's R&D.

I was under the impression that porting from 28nm to 16nm FinFET would have been costly anyway, based on the rumors surrounding the motivation behind Neo.
 
It's great that performance in games is a little better, and a nice surprise, given MS originally said the performance would be the same as the original XB1.

I'm disappointed with the media performance, though not totally surprised. Being loud when playing UHD Blu-rays and still no Dolby Atmos support is a big no-no for me. I guess that's why it's so cheap, when a normal UHD player is at least twice as much. On the plus side, it's good for someone who doesn't care about having the very best audio quality.
 
Puma at 2.1 GHz.
AMD Baffin, 14 CUs at 1090 MHz. 2 teraflops.
64 MB of ESRAM, 260 GB/s.
8 GB of DDR4, 100 GB/s.

Would have been > PS4 by quite a substantial margin. Perfect 1080p performance.

But we're getting Scorpio instead, so not that disappointing to have the current Xb1 S.

That would be 1.95 teraflops.
 
I was under the impression that porting from 28nm to 16nm FinFET would have been costly anyway, based on the rumors surrounding the motivation behind Neo.

Sony doesn't have the money to eat that cost for both a marginal revision of the original hardware on 16nm and a NEO-type revision at the same time, and more importantly, they probably didn't think the cost was worth it to bring out two SKUs of different performance levels on 16nm, as opposed to pushing one single new SKU and keeping the OG PS4 at the original node size.

MS has the advantage both of money and of being more desperate for market share, hence opening up the market to more of their hardware via the marginal S upgrade and a brand spanking new console in Scorpio.

NEO will fit in between the Xbox S unit and Scorpio in terms of architecture and power (same Jaguar cores but upclocked, better GPU, higher GDDR5 bandwidth, 4K compatibility, etc.)
 
I was under the impression that porting from 28nm to 16nm FinFET would have been costly anyway, based on the rumors surrounding the motivation behind Neo.
Going to a smaller node always costs money. When the CELL for the PS3 went from 90nm to 65nm, it cost money to adapt the chip to the new node. When the Xenon (360) went from 90nm to 65nm, it cost money for the same reason.
 
Lets all pray Neo gets the TSMC 16 nm treatment.

Hmm, not sure how they would do that, especially when the Polaris 10 chip (RX 480), which is rumored to be at the heart of the NEO, is built on 14nm FinFET. If anything, the entire APU would be built on Global Foundries' (GloFo) 14nm FinFET process. We'll find out soon enough, I suppose.
 
Which most 4K TV apps already do, no?

I can't speak for anyone else, but the way my setup is, it is beyond a hassle to use the 4K TV apps for any streaming. They are slow and unintuitive, not to mention the added inconvenience of using ARC or running an optical cable to get audio back to the receiver, which cuts out certain formats of HD audio and is generally messy.

I bought a XB1 at launch, and I am thrilled to replace it with the XB1-S. The X1 is already what runs my media center, so to have an integrated 4K box with UHD BR playback is incredible. It sucks that I bought the Samsung UHD player, but I'll find someone who can use it.
 
Going to a smaller node always costs money. When the CELL for the PS3 went from 90nm to 65nm, it cost money to adapt the chip to the new node. When the Xenon (360) went from 90nm to 65nm, it cost money for the same reason.

Interesting. Exactly how much did it cost?
 
I can't speak for anyone else, but the way my setup is, it is beyond a hassle to use the 4K TV apps for any streaming. They are slow and unintuitive, not to mention the added inconvenience of using ARC or running an optical cable to get audio back to the receiver, which cuts out certain formats of HD audio and is generally messy.

I bought a XB1 at launch, and I am thrilled to replace it with the XB1-S. The X1 is already what runs my media center, so to have an integrated 4K box with UHD BR playback is incredible. It sucks that I bought the Samsung UHD player, but I'll find someone who can use it.

Ah yes, forgot about the audio with receivers. I guess that makes sense.
 
I was under the impression that porting from 28nm to 16nm FinFET would have been costly anyway, based on the rumors surrounding the motivation behind Neo.
I never understood this way of thinking, to be fair.

Lower process = smaller chip = lower cost to produce each chip.

You spend a bit on R&D to adapt the chip to the new process, but that is way cheaper than creating a new chip, and most of the shrink cost will be covered by the first batch of chips produced.

That is how it works... if you can shrink, then do it... you will cut costs a lot.

Sony doesn't have the money to eat that cost for both a marginal revision of the original hardware on 16nm and a NEO-type revision at the same time, and more importantly, they probably didn't think the cost was worth it to bring out two SKUs of different performance levels on 16nm, as opposed to pushing one single new SKU and keeping the OG PS4 at the original node size.
That is bullshit.

Sony will make the OG PS4 chip at 16nm to cut costs too... there is no other path if they want to cut prices in the future.

The idea of not having the money to shrink to a smaller process, with a forecast of selling over 20 million more chips, is ridiculous... Sony selling 20 million at 16nm will give them way, way more money than 20 million at 28nm... the cost to adapt the chip to the new process will also be far lower than the savings on producing them.
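The trade-off being argued here is a one-off engineering (NRE) cost against a per-chip saving; a toy break-even calculation (all figures invented for illustration, not real numbers):

```python
def break_even_units(nre_cost, unit_cost_old, unit_cost_new):
    # Units after which the shrink's one-off port cost pays for itself
    saving = unit_cost_old - unit_cost_new
    if saving <= 0:
        raise ValueError("the new process must lower the per-unit cost")
    return nre_cost / saving

# e.g. a hypothetical $50M port saving $10 per chip breaks even at
# 5 million consoles, well inside a 20M+ sales forecast
units = break_even_units(50e6, 100.0, 90.0)
```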
 
All set up! All green ticks, with the 4K/HDR/10-bit support too :)

So... the One S still can't bitstream HD audio formats so I get the nice DTS or DD audio lights on my AV receiver like most players out there? :/ Damn, even the PS3 could do that. I replaced my 4K-upscaling Blu-ray player, which supports all the HD audio formats, with the One S.
Guessing I'll pick uncompressed 5.1 and let the receiver do the work, although the receiver now says Multi In.
 
HDR mode will require different calibration settings than other modes due to the extreme brightness requirements for the standard. If you aren't using an OLED, you'll need to turn on local dimming, otherwise the backlight brightness will wash out the blacks. If your set is an LED TV without local dimming, HDR is not worth it anyway.
I understand that part. The issue is that streaming 4K HDR from Netflix or Amazon does not cause this problem; it only happens when playing discs.
 
That is bullshit.

Sony will make the OG PS4 chip in 16nm to cover costs too... there is no other path to go if they want to cut prices in the future.

Yeah, lol. It read like a hidden 'war chest' comment. Sony has the money to do it (most of the time it is already pre-planned/budgeted when the initial ones ship), just like their hardware of the past. Only this time, they are profiting by far in the console division, more so than before.
 
That's crazy, didn't they say there would be no difference? I'd hate to be someone who bought an older model assuming there wouldn't be a difference.

So it begins.

Before the Neo even launches...iterative consoles are here guys right now starting with the XBO S.
 
You guys are underestimating its ability to stream 4K; that alone makes it worth the upgrade.
If you already have a 4K/HDR capable TV :/ But many of us are still sitting on the fence on 4K which is why we're debating on how big of an upgrade this is.

Personally I'd love to play movies in 4K, but I'd need a new AVR and a projector capable of 4K and HDR before there's any point in jumping in on the One S just for 4K movies. I have an old 1080p Marantz SR6006 right now, and Marantz launches their 4K/HDR-capable SR6011 this September, which would be a great choice for me, but we're talking about $1,400 just for the AVR. And I don't know if there are any true 4K/HDR-capable projectors under like $10,000 yet. I guess I could get a new 4K TV instead, but that would be a downgrade in size.

So it's all about the total cost for me at this point, and it's just way too expensive to jump in on 4K right now imo.
 
I have the old model and I don't think I'll be upgrading. I only really used it for exclusives, which incidentally are now heading to PC so I don't really know what to do with it (since I have a powerful PC).

I'm kinda surprised to see a performance difference. I thought Microsoft would have capped the performance so it would be identical to the old model, but since they didn't, I can't help but wonder why they didn't add a few extra compute units to make it on par with the PS4. The small difference only makes me worried that future games will target the new model and underdeliver on the old one, unless Microsoft intends to raise the clock speed like Sony did with the PSP. The fat X1 is very quiet already, so even if it were overclocked, it would probably still be cooler and quieter than the PS4.
 
It... it was sarcasm. (Thought you would have picked up on that with how I had the top and bottom sentence structured, lol.)

Loudness is subjective, whereas dB is, as you said, an objective pressure measurement.

Hence why some say it's a 'jet engine' and others, 'a whine'. lol

In other words, I completely agree.

Sorry for not getting that ;-)
 
Sure, that's why I'd disagree with the notion that this XB1S is in any way designed to compete with NEO; hence my response to the original poster.

This may not be made specifically to compete with Neo, but if they launch close to each other, anybody deciding between an XB1S and a PS4 Neo may pick the significantly more powerful Neo as their 4K console, depending on the price. They would compete whether that was MS's intention or not.
 
So this is some cut-down 480? Not surprised by the perf difference; I mean, this is not your usual chip downsize but a move to FinFET.
 
DF's comparison of Rise of the Tomb Raider was interesting, with increased frame rates and reduced tearing. But I'd be interested to see whether the input lag they identified on the One was eliminated as a result of the increased performance and/or change in architecture.

Could anyone with both copies do a quick test? Or DF, are you listening?
 
I can't speak for anyone else, but the way my setup is, it is beyond a hassle to use the 4K TV apps for any streaming. They are slow and unintuitive, not to mention the added inconvenience of using ARC or running an optical cable to get audio back to the receiver, which cuts out certain formats of HD audio and is generally messy.

I bought a XB1 at launch, and I am thrilled to replace it with the XB1-S. The X1 is already what runs my media center, so to have an integrated 4K box with UHD BR playback is incredible. It sucks that I bought the Samsung UHD player, but I'll find someone who can use it.

Which TV are you using?
 
I can't speak for anyone else, but the way my setup is, it is beyond a hassle to use the 4K TV apps for any streaming. They are slow and unintuitive, not to mention the added inconvenience of using ARC or running an optical cable to get audio back to the receiver, which cuts out certain formats of HD audio and is generally messy.

I bought a XB1 at launch, and I am thrilled to replace it with the XB1-S. The X1 is already what runs my media center, so to have an integrated 4K box with UHD BR playback is incredible. It sucks that I bought the Samsung UHD player, but I'll find someone who can use it.

Hmm, which TV are you using? I'm using a Samsung JS9500 and definitely sticking with it for the 4K streaming apps. I had issues with ARC, as sometimes it wouldn't switch to the audio, but I'm using optical now and it's fine. The JS9500 uses an octa-core processor, so it's very speedy too! I'm not going to use the One S for the apps (no 4K support on Amazon anyway), only Blu-rays and UHDs (although there's still no HD audio option for bitstream, so I'll miss the HD audio signs on the AV receiver, which I do like seeing! I've chosen uncompressed 5.1 instead).
 