Digital Foundry vs Xbox One S

Side note.

I said months ago that the Xbox Slim, PS4 Slim, Neo, and Scorpio would use TSMC 16nm FF+ instead of GF 14nm... to be fair, I said in the DF thread that Neo would use a TSMC 16nm version of Polaris (Scorpio too).

Looks like a pretty good decrease in size.

TSMC 16nm is a big surprise though. Does "this year" mean they'll switch to GloFo/Samsung 14nm next year? Or someone else that does 16nm FinFET?
Big surprise?

On every tech forum I've posted in since last year, I said TSMC 16nm FF+ is way better for clocks and performance than GF 14nm. The technical aspects of the two processes were already known, so the result was expected.

Sad that AMD is tied to GF until 2022.
 
Kinda bummed that it's not as small as the E3 presentation made it out to be. I knew there was some Apple-esque fuckery going on with those renders, but damn.

At least it looks fantastic in white.
 
Does the speed boost make the OS more responsive (less laggy)? It's my only pet peeve of the console.


Very much looking forward to the S when the 500GB is out.
 
Someone needs to try the framerate-unlocked Tomb Raider 2013 on the Xbox One S. Wasn't that game 60fps with drops?

Xbox version had a 30fps limit.

Agreed. I'm pretty sure the DF team is already working on that; I love their content.

They are always at 60fps (aside from sections involving streaming/loading, which aren't GPU-limited anyway), and there's no automated pixel counting for thousands of frames, so it's somewhat of a moot exercise IMO.

343i would have to release engine stat recordings, and even then, a 7% boost in pixel count is somewhat meaningless to observe.
 
Yes, because USB 3.0 is much faster than the SATA II interface inside. And the faster HDD speed is a bonus as well.

The biggest reason, in reality, is the lack of OS conflict. Having faster drives is better, and yeah, USB 3 is faster than SATA, but in practice the biggest boost comes from the OS's ability to reserve the internal drive for system caching while using the external drive exclusively for reading game data.
 
Does the speed boost make the OS more responsive (less laggy)? It's my only pet peeve of the console.


Very much looking forward to the S when the 500GB is out.

Not from what I've seen so far. The UI is still a frustratingly slow experience.


Is anyone else waiting to talk to support about a Kinect adapter for the S? This queue is really slow.
 
The biggest reason, in reality, is the lack of OS conflict. Having faster drives is better, and yeah, USB 3 is faster than SATA, but in practice the biggest boost comes from the OS's ability to reserve the internal drive for system caching while using the external drive exclusively for reading game data.

Nice, just as you would on the PC.
 
What is it that's broken? I'm not using full RGB, but when playing a UHD movie with HDR I noticed the black bars from the letterboxing are not pure black. I have a Samsung UHD player as well, and when playing a movie on it the black bars are pure black. I think it's the Blu-ray app causing this, since gaming etc. is fine. The black bars are a lighter shade instead of pure black.

My set is professionally calibrated, and this only happens when using the S for UHD HDR playback. If I adjust the black level I can make the bars pure black, but I shouldn't need to. It seems the Blu-ray app is trying to compensate for the brighter image that results when HDR is activated. I'm going to try turning off HDR to test whether the black bars stay the same. For now, though, something is wrong somewhere, since black bars should be pitch black like they are on my Samsung player.

HDR mode will require different calibration settings than other modes due to the extreme brightness requirements for the standard. If you aren't using an OLED, you'll need to turn on local dimming, otherwise the backlight brightness will wash out the blacks. If your set is an LED TV without local dimming, HDR is not worth it anyway.
 
Side note.

I said months ago that the Xbox Slim, PS4 Slim, Neo, and Scorpio would use TSMC 16nm FF+ instead of GF 14nm... to be fair, I said in the DF thread that Neo would use a TSMC 16nm version of Polaris (Scorpio too).

Looks like a pretty good decrease in size.


Big surprise?

On every tech forum I've posted in since last year, I said TSMC 16nm FF+ is way better for clocks and performance than GF 14nm. The technical aspects of the two processes were already known, so the result was expected.

Sad that AMD is tied to GF until 2022.

It is said AMD will be able to use TSMC for some of their future products. I believe Zen is one, possibly Vega as well.
 
I'm going to wait and get a Scorpio, but it bodes well for it that so many games are already seeing boosts even on the S.
 
Dead Rising 3 had some shocking performance when it released. Not sure if a patch or the lack of Kinect has helped it since, but it would be interesting to see a comparison on the Xbox One S.
 
Big surprise?

On every tech forum I've posted in since last year, I said TSMC 16nm FF+ is way better for clocks and performance than GF 14nm. The technical aspects of the two processes were already known, so the result was expected.

Sad that AMD is tied to GF until 2022.

It was to me! You would think AMD would use TSMC even for a small number of chips if their process is so much better. I assume AMD's GF contract doesn't completely tie them to it?

That AMD's customers get the better process seems a bit of a facepalm...
 
It was to me! You would think AMD would use TSMC even for a small number of chips if their process is so much better. I assume AMD's GF contract doesn't completely tie them to it?

That AMD's customers get the better process seems a bit of a facepalm...
AMD will use TSMC for what they need most: Vega and Zen.

About the contract, I "guess" they can only use another fab once they've hit the production volume their contract requires.
 
The biggest reason, in reality, is the lack of OS conflict. Having faster drives is better, and yeah, USB 3 is faster than SATA, but in practice the biggest boost comes from the OS's ability to reserve the internal drive for system caching while using the external drive exclusively for reading game data.
Albert, can you say anything about 4K with 30-bit and 36-bit output? It seems to be broken, or I'm doing something massively wrong. The resulting picture is washed out and the blacks are grey, similar to when you have an RGB range mismatch between two devices.
 
Albert, can you say anything about 4K with 30-bit and 36-bit output? It seems to be broken, or I'm doing something massively wrong. The resulting picture is washed out and the blacks are grey, similar to when you have an RGB range mismatch between two devices.
I'll check these settings when I get home, but I'd like to know if anyone else is seeing the same thing.
 
Thanks, would appreciate it. Seeing Hawk269's post, it seems like there's indeed something wrong, as UHD-BD uses 4K at 30 bits.
I'd be hard pressed to keep the Xbox One S if this is the case. I already have the Samsung UHD player, so it's not like I'd be missing out on much.

Though Gears of War 4 HDR is reason enough to keep the console. Haha!
 
Do we have any confirmation on whether the Xbox One S somehow resolves the crushed-blacks issue for some games? Like, is it actually putting out the appropriate color range? I expect it'll be just fine in HDR mode, but I'm wondering if lower-end displays see an improvement.
 
Do we have any confirmation on whether the Xbox One S somehow resolves the crushed-blacks issue for some games? Like, is it actually putting out the appropriate color range? I expect it'll be just fine in HDR mode, but I'm wondering if lower-end displays see an improvement.
No, MS still hasn't fixed it. The incorrect gamma curve when using the full-rgb range is still there.
 
The biggest reason, in reality, is the lack of OS conflict. Having faster drives is better, and yeah, USB 3 is faster than SATA, but in practice the biggest boost comes from the OS's ability to reserve the internal drive for system caching while using the external drive exclusively for reading game data.

For Scorpio, slap a small SSD in there for exclusive OS use
 
No, MS still hasn't fixed it. The incorrect gamma curve when using the full-rgb range is still there.

Gotcha, thanks for the reply.

Minor disappointment overall. Glad to have doubled my storage and gotten that tiny performance boost, and it matches my PS4 aesthetically with the white.
 
No, MS still hasn't fixed it. The incorrect gamma curve when using the full-rgb range is still there.

I think the poster is referring to specific games that have black crush when using limited range like CoD: Ghosts, Dead Rising 3, and a few others. If that's the case, then no. The games would have to be patched to use the Xbox One's specific gamma curve.
 
It was to me! You would think AMD would use TSMC even for a small number of chips if their process is so much better. I assume AMD's GF contract doesn't completely tie them to it?

That AMD's customers get the better process seems a bit of a facepalm...

Idk, it didn't seem like TSMC 16nm FF increased the perf/watt that much for the XB1 S.

I think the Pascal/Maxwell architectures have more to do with how much better Nvidia GPUs are in perf/watt.
 
As for noise: I don't notice a noteworthy difference in produced sound when I compare the XB1 with the XB1 S.
I think it's kind of a lottery. Mine is audible, but not to an annoying degree, except in silent scenes, where it does get annoying; otherwise the sound it produces is easily drowned out by the game audio.
Judging by the article it's roughly 3dB louder; I had to Google a bit to understand how much that is.

Reality
Physics tells us that for every doubling of acoustical energy, there is a 3dB increase. Conversely, a 3dB decrease means the sound is cut in half. So, 3 is the magic number right? Well, not so fast. This is where we see a conflict between scientific calculations and perceived sound levels. “Perceived” sound levels report how our ears and brain interpret the sound. In other words, perception answers the question of “What sounds ‘twice as loud’?”

Perception
Sound studies tell us time and again that a 3dBA increase in sound level is barely noticeable to the human ear. In fact, you have to raise a sound level by 5dBA before most listeners report a noticeable or significant change. Further, it takes a 10dBA increase before the average listener hears “double the sound.” That’s a far cry from 3dB.
http://www.acousticsbydesign.com/acoustics-blog/perception-vs-reality.htm

Sounds okay to me, even though I had hoped the One S would be even quieter than the already quiet OG XB1; the internal PSU is probably the culprit here.
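The rules of thumb in that quoted article can be put into two small formulas. A Python sketch (the 10 dB-per-perceived-doubling rule is the approximation the article describes, not an exact law):

```python
def db_to_energy_ratio(delta_db: float) -> float:
    """Acoustic energy (power) ratio for a dB difference: +10 dB = 10x the energy."""
    return 10 ** (delta_db / 10)

def db_to_perceived_loudness(delta_db: float) -> float:
    """Rough perceived-loudness ratio: +10 dB sounds about twice as loud."""
    return 2 ** (delta_db / 10)

# +3 dB doubles the acoustic energy...
print(db_to_energy_ratio(3))        # -> ~2.0
# ...but is only ~23% "louder" to the ear, i.e. barely noticeable
print(db_to_perceived_loudness(3))  # -> ~1.23
```

Which matches the article: the physics says 3dB is a doubling, but perception says you need about 10dB before something sounds twice as loud.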
 
No, MS still hasn't fixed it. The incorrect gamma curve when using the full-rgb range is still there.

Of course it would be. But to be fair, their devkit fixes this problem in all games released after about CoD: AW anyway. It just sucks for apps in general, like Internet Explorer. I wonder, if we ever get emulation, whether it would be a problem.

What is it that's broken? I'm not using full RGB, but when playing a UHD movie with HDR I noticed the black bars from the letterboxing are not pure black. I have a Samsung UHD player as well, and when playing a movie on it the black bars are pure black. I think it's the Blu-ray app causing this, since gaming etc. is fine. The black bars are a lighter shade instead of pure black.

My set is professionally calibrated, and this only happens when using the S for UHD HDR playback. If I adjust the black level I can make the bars pure black, but I shouldn't need to. It seems the Blu-ray app is trying to compensate for the brighter image that results when HDR is activated. I'm going to try turning off HDR to test whether the black bars stay the same. For now, though, something is wrong somewhere, since black bars should be pitch black like they are on my Samsung player.

Please make a thread about this issue. I want MS to get off their butt and fix this shit.
 
Does the speed boost make the OS more responsive (less laggy)? It's my only pet peeve of the console.


Very much looking forward to the S when the 500GB is out.

Unfortunately, most of the hang-ups seem to be from network requests, rather than code performance. That's my impression, anyway.
 
Idk, it didn't seem like TSMC 16nm FF increased the perf/watt that much for the XB1 S.

I think the Pascal/Maxwell architectures have more to do with how much better Nvidia GPUs are in perf/watt.
Well, you did have a near-50% increase in perf/watt... ~47% to be exact, with the move from 28nm to 16nm.

Xbone: 1.31 TFLOPS using 109W
Xbone S: 1.4 TFLOPS using 79W

That is a ~47% increase in perf/watt.

BTW, that is indeed way better than what we saw with Polaris on 14nm, for example.
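For anyone who wants to verify the arithmetic, a quick check in Python using the wattage figures above:

```python
# Perf/watt from the figures above (TFLOPS / watts)
xbone   = 1.31 / 109  # original Xbox One
xbone_s = 1.40 / 79   # Xbox One S

print(f"{xbone_s / xbone - 1:.0%}")  # -> 47%
```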
 
The biggest reason, in reality, is the lack of OS conflict. Having faster drives is better, and yeah, USB 3 is faster than SATA, but in practice the biggest boost comes from the OS's ability to reserve the internal drive for system caching while using the external drive exclusively for reading game data.

Hah, that's what I said several times. A drive dedicated to games is of course faster, because the OS's disk access stays on the internal drive.
 
Well, you did have a near-50% increase in perf/watt... ~47% to be exact, with the move from 28nm to 16nm.

Xbone: 1.31 TFLOPS using 109W
Xbone S: 1.4 TFLOPS using 79W

That is a ~47% increase in perf/watt.

Well, to be fair, you have to exclude the power draw of the HDD, Blu-ray drive, RAM, etc. to get the actual increase, which should be bigger than 50%.

I was hoping for a near-100% jump, something like Kepler to Pascal.
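To illustrate that point with a sketch: subtract a hypothetical fixed draw for the HDD, Blu-ray drive, RAM, and fans (the 15W below is purely a placeholder, since the real breakdown was never published) and the SoC-only gain comes out higher than the whole-console ~47%:

```python
FIXED_W = 15.0  # hypothetical non-SoC power draw (HDD, Blu-ray, RAM, fans);
                # the real breakdown was never published

soc_old = 1.31 / (109 - FIXED_W)  # original Xbox One, SoC-only perf/watt
soc_new = 1.40 / (79 - FIXED_W)   # Xbox One S, SoC-only perf/watt

print(f"{soc_new / soc_old - 1:.0%}")  # -> 57% with this placeholder value
```

The larger the fixed overhead you assume, the bigger the SoC-only improvement, which is exactly the poster's point.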
 
Well, to be fair, you have to exclude the power draw of the HDD, Blu-ray drive, RAM, etc. to get the actual increase, which should be bigger than 50%.

I was hoping for a near-100% jump, something like Kepler to Pascal.
Yeah, but we don't have that data.

If you want a better comparison, it would be GCN 28nm to GCN 14nm, or in the case of Nvidia, Maxwell 28nm to Pascal 16nm.

The Polaris (AMD GCN) comparison looks to be worse than Xbox vs S, while for Nvidia I have no idea, but keep in mind Nvidia is close to 2 generations ahead of AMD in perf/watt.
 
Well, you did have a near-50% increase in perf/watt... ~47% to be exact, with the move from 28nm to 16nm.

Xbone: 1.31 TFLOPS using 109W
Xbone S: 1.4 TFLOPS using 79W

That is a ~47% increase in perf/watt.

BTW, that is indeed way better than what we saw with Polaris on 14nm, for example.

Let's all pray Neo gets the TSMC 16nm treatment.
 
In most cases it is 3dB louder, and they measured it right on the console. I doubt anyone will notice the difference between 44dB and 47dB when sitting at an average distance. But technically it is louder than the Xbox One.
Distance will indeed attenuate the difference, but 44dB versus 47dB isn't negligible. That's about a 40% rise in amplitude.

(For anyone who isn't aware, decibels are a logarithmic scale. Each decibel added is a 12.2% rise in amplitude versus the previous level.)
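A quick check of that amplitude math (decibels use 20·log10 for amplitude, 10·log10 for power):

```python
delta_db = 47 - 44  # measured One S minus original Xbox One

# Amplitude ratio for a dB difference: 10^(dB/20)
print(f"{10 ** (delta_db / 20) - 1:.1%}")  # -> 41.3% rise for +3 dB
print(f"{10 ** (1 / 20) - 1:.1%}")         # -> 12.2% rise per single dB
```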
 
Distance will indeed attenuate the difference, but 44dB versus 47dB isn't negligible. That's about a 40% rise in amplitude.

(For anyone who isn't aware, decibels are a logarithmic scale. Each decibel added is a 12.2% rise in amplitude versus the previous level.)

The difference is there, but dB is often regarded as a not-so-good measurement of how loud something actually sounds.
 