If the S is as quiet and cool as the fat model despite the smaller size and internal power supply, then it will be a legendary feat of engineering.
Panos don't give a fuck about boundaries
I heard he gave Isaac Newton's corpse a good kicking.
Like the games will look stunning, but we can all agree it's not native 4K.
Nope. Native 4K, or in fact any native resolution, simply means the output isn't being upscaled by the monitor or TV.
The whole point of the term is to describe an output data-stream in the absence of an external resizing mechanism.
How the image is composed is irrelevant, because ultimately it's not a useful or meaningful piece of data when there are so many factors potentially in play regarding perceptual image quality.
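To make that distinction concrete, here's a minimal sketch (Python with Pillow; the resolutions and the library choice are my own and purely illustrative) of a game-side upscale: the frame is rendered at 1600x900, but the console hands the display a 1920x1080 signal, so the TV never resizes anything itself.

    from PIL import Image

    RENDER_RES = (1600, 900)    # internal render-target size, chosen by the game
    OUTPUT_RES = (1920, 1080)   # resolution of the signal sent over HDMI

    frame = Image.new("RGB", RENDER_RES)    # stand-in for a rendered frame
    output = frame.resize(OUTPUT_RES)       # game/console-side upscale (default filter)

    print(output.size)  # (1920, 1080) -- this is all the display ever sees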
If Baffin is 16CU, it doesn't make sense for AMD not to launch the full GPU with the 460.
What? According to your logic, Ryse: Son of Rome (for example) is native 1080p, because the XB1 outputs that game at 1080p and the upscaling is handled by the game internally instead of by the XB1 or the TV/monitor.
Edit: You seem to be talking solely about the native resolution of a display, not that of a game. Two different things!
Resolution contributes towards image-quality but isn't the totality of it. There's a massive difference between an unprocessed and a heavily anti-aliased image of the same dimensions. Then there's the major issue that within any size of frame-buffer, render resolution per-element is arbitrary; UI may be different from scene res, shadow and vfx res may be different again, texture size and texel density (texture res versus pixel size when scaled, transformed and projected onto a surface) play a huge role, and so on.
It's fundamentally a meaningless metric by itself.
The only time it really comes into play is when the geometry and resolution of the display device are different from those of the output device, as that lets another layer of distortion and degradation creep in: it's an additional transformation step applied to a finished image.
Point being: there is no such animal as native resolution within a rendered image, because it's a mosaic of components at arbitrary sizes. Hence "native" refers to native to the output device, as opposed to the display device, which is completely external to the rendering process.
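A quick sketch of that mosaic point (plain Python; every number here is hypothetical, not taken from any actual game): one output frame can be assembled from buffers rendered at completely different sizes, so no single figure describes the resolution of the image itself.

    OUTPUT_RES = (1920, 1080)           # what gets sent to the display

    render_targets = {                  # hypothetical per-element resolutions
        "scene colour": (1600, 900),    # main 3D render target
        "shadow map":   (1024, 1024),   # per-light, sized for performance
        "half-res fx":  (800, 450),     # particles/transparency at quarter cost
        "UI / HUD":     (1920, 1080),   # often composited at full output res
    }

    for name, (w, h) in render_targets.items():
        sx, sy = OUTPUT_RES[0] / w, OUTPUT_RES[1] / h
        print(f"{name:12s} {w}x{h}  scaled ~{sx:.2f}x{sy:.2f} into the final frame")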
I understand what you mean. However, my point is that for a long time we've used "native" in the context of a game to mean the rendered resolution: Ryse 1600x900 (despite outputting a 1920x1080 image), BF4 1280x720 (XB1) / 1600x900 (PS4), Knack 1920x1080, for example.
How's the power consumption of the Slim compared to the OG Xbone? Apologies if this was in the OP - I suck at unscrambling GPU-speak.
Agreed. This is a fair point, but I think, given how the technology and range of techniques in use have expanded in recent years, it's gotten to a state where it's not really a viable metric of quality anymore. It just creates more confusion.
I expect the waters to get even muddier in future as average resolutions creep upwards towards 4K (and ultimately beyond), making the perceptual difference increasingly difficult to discern and changing the pro/con balance of employing internal re-projection techniques in the render pipeline.
Bottom line for me is that while I believe most gamers do care about image quality, clinging to simple numerical metrics like resolution just gives the peanut gallery fodder to try and one-up each other with.
A website in another language that I've never heard of, and they said nothing about Baffin.
There isn't a single rumour there with a source; it's just the writer's opinion.
Why are people still doubting it's 28nm?
From reddit.
So it's not 28nm.
What?
showmethereceipts.gif
Just FYI, AMD's other design win is some sort of ARM SoC, so prolly not powering SlimBone.
"In its most recent earnings call AMD announced a brand new semi-custom design beyond the two semi-custom chips that the company had announced last year. This means that the company is working on three semi-custom chips. One of the two that were announced last year is x86-64bit based and the other is ARM 64-bit based. AMD confirmed that the latest design win announced in the Q2 earnings call is also x86 based making this the companys fourth design to incorporate the x86-64bit CPU architecture. The previous two being the XBOX ONE and the PS4 and the third being the one announced last year."
http://wccftech.com/amd-making-processor-nintendo-nx/
Sony's new handheld. Calling it now.
lol Actually, I forgot I was gonna say "PSP3?" at the end of my post.
Takes them "beyond gaming," I believe they said. Pretty sure they also said it was one of the x86 designs though, actually… =/A long time ago AMD said that one custom design is not related to gaming.
Xbox One real-world power usage sits at around 120W, which is not a huge jump from a 260X once you account for the lower clocks of the Durango GPU plus the Jaguar cores and the rest of the subsystem. If the same scaling translates to the Xbox One S, we could be looking at ~80W, or around 40% less consumption than the OG PS4.
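Rough back-of-envelope for that estimate (Python; the 120W and ~80W figures come from the post above, while the ~135W OG PS4 in-game draw is my own assumption, so treat the percentage as illustrative only):

    xbox_one_watts = 120       # measured real-world draw quoted above
    xbox_one_s_watts = 80      # the post's ballpark for the die-shrunk S
    ps4_watts_assumed = 135    # assumed OG PS4 in-game draw (not from the post)

    saving_vs_ps4 = 1 - xbox_one_s_watts / ps4_watts_assumed
    print(f"Xbox One S vs assumed PS4 draw: ~{saving_vs_ps4:.0%} lower")  # ~41% lower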
4Kinda
This is the correct term.