XBOX ONE Reveal: UI faked from the start. Very choppy, and CBoaT

Could explain why they never stated what the clocks were. I'll remain optimistic that they'll land on 1.6GHz/800MHz in the end.

Nope. Clocks can only go down at this stage as they try to get acceptable yields.

Releasing useless info such as transistor count but not clock speeds should tell you all.
 
And I can go to that channel in one second with the Xbox... options!! People love options!!

Or... you can grab your phone, unlock it, pull up the internet, and then search for the TV guide or whatever...

I'll just tell my Xbox to go to ESPN so I can watch the game...

But by all means... keep doing things the hard way.

My DVR has a dedicated iPhone app. I can search the channel guide and set the DVR, with Siri voice control if I want to be a tool.

I can even use the iPhone as a horrible touchscreen remote if I want!
 
Maybe at first. It's going to be another long generation and the games will reflect a vast difference if the hardware has a vast difference in power. People will buy the better system eventually. I mean when was the last time an undeserving system was the biggest seller for an entire generation?

The last time weaker hardware sold the most systems? ... Wii. Before that the PS2, before that the PS1, etc.
 
The original in game dashboard was trash in PGR3, and look how amazing it is now. I am confident they will surpass whatever they showed during that presentation.

Oh you. You have lost all cred on this site for all the crap you pulled. Keep drinking that milk.

and woo wee at the damage control up in here. Faking a reveal gets excuses these days I guess
 
it's 1 am

everyone else's asleep

you're still awake

you just wanna watch some television

are you seriously going to shout at your console to switch to thirteen

or are you just going to press the numbers one and three on your remote

That's the thing. The den (aka, the man cave) is next to my daughters' bedroom. When I game late at night, I have to turn the sound down or use headphones. When playing online, I generally don't use a mic after their bedtime.

So I'm sure as hell not going to sit there yelling at the TV.
 
Maybe at first. It's going to be another long generation and the games will reflect a vast difference if the hardware has a vast difference in power. People will buy the better system eventually. I mean when was the last time an undeserving system was the biggest seller for an entire generation?

Almost every time? Wii over PS360? PS2 over Xcube? DS over PSP?

They weren't "undeserving", but certainly weaker systems.
 
I'm not trying to bring the MS Defense Force down on me, but I've heard the GPU clocks might be downgraded: 8-900 gigaflops for gaming. The APU is big. This isn't 100% confirmed, though, and it's being done to improve yields.

CBoat do you mind adding anything?

If this is true... X-box Done
 
Nope. Clocks can only go down at this stage as they try to get acceptable yields.

Releasing useless info such as transistor count but not clock speeds should tell you all.

Was the rumor about Sony considering a boost to 2.00GHz credible at all, then? The only way they could consider it is if they had some really efficient cooling system in place.
 
Did MS release a FLOP spec for the GPU at the reveal? I know the official PS4 spec sheet from its reveal had an actual TFLOP rating in it, but I can't recall reading an official one for the Xbone. If that's the case, it's a definite sign that MS is having problems hitting targets with their version of the APU right now.
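For what it's worth, the TFLOP ratings people quote are just peak shader throughput: shader count × 2 FLOPs per cycle (one fused multiply-add) × clock. A quick sketch, assuming the rumored 768-shader (12 CU) Durango GPU - the shader count is a rumor, not an official figure:

```python
def gpu_gflops(shaders, clock_ghz):
    # Peak single-precision throughput: each shader ALU retires one
    # fused multiply-add (2 FLOPs) per cycle.
    return shaders * 2 * clock_ghz

# Rumored Durango GPU: 768 shaders (12 CUs x 64) -- assumed, not official.
print(gpu_gflops(768, 0.8))  # at an 800MHz target: ~1229 GFLOPS
print(gpu_gflops(768, 0.6))  # a downclock toward 600MHz lands right
                             # in the rumored "8-900 gigaflops" range
```

So a downclock of that size would be exactly the kind of change you'd hide by publishing transistor counts instead of clocks.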
 
Almost every time? Wii over PS360? PS2 over Xcube? DS over PSP?

They weren't "undeserving", but certainly weaker systems.

Yes, but they were deserving. I personally don't see how a system with better-quality first-party output, a better relationship with indie devs, no charge for online play, and a "purely for gaming" focus could be considered the worse console this time. And that's not even counting the draconian anti-consumer crap like 24-hour internet checks.
 
Nope. Clocks can only go down at this stage as they try to get acceptable yields.

Releasing useless info such as transistor count but not clock speeds should tell you all.

We're basing all of this on murmurings and rumors. We have no definite proof that they have yield issues or that they've downclocked, which is why I remain optimistic. Obviously, if either is true, then yeah, they can only downclock at this point.

TL;DR I refuse to believe until it's in mah face. D:
 
People want to overanalyze a simple console OS before release, so I get to overanalyze a gif that is clearly a dishonest portrayal of what is happening in the video. They try to make it seem like there is a huge delay between her voice command and something happening, when in fact she gives two commands, only one of which we actually get to see transition in real time.


Microsoft's reveal of the Kinect's TV functions, which was prerecorded and faked, with comments about how fast it worked while it was really not working at all, was an honest portrayal? MS tried to make the OS look snappy and fluid, when in fact... oh, forget it, your argument is awful.
 
Oh my god, Thuway if that's true. What a god damn clusterfuck...

Almost every time? Wii over PS360? PS2 over Xcube? DS over PSP?

They weren't "undeserving", but certainly weaker systems.

Yet you are completely ignoring release date and price, which are arguably much more important. Also, the PS2 was extremely powerful at the time, and it can be argued that the Wii didn't win the console war the traditional way (though Kinect etc. could allow the same to be repeated).
 
I'm not trying to bring the MS Defense Force down on me, but I've heard the GPU clocks might be downgraded: 8-900 gigaflops for gaming. The APU is big. This isn't 100% confirmed, though, and it's being done to improve yields.

CBoat do you mind adding anything?

KO!
 
I can look it up on the app on my smartphone, or do a search in the cable box. Or use the menu to navigate to the sports tier.

I actually did this because someone was over the other day and wanted to watch the military channel. Took all of 10 seconds to find.

And pretty soon, you'll be able to tell your Xbox to turn ESPN on!
 
Jaguar cores top out at 1.8GHz.

Not true, you can get Jaguar at 2.0GHz; AMD's new Opteron X line has a 2.0GHz variant. The problem is that the TDP increase from 1.6GHz to 2.0GHz is 66%, which is why it's not happening, despite what everyone thinks.
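That 66% figure tracks with how dynamic power scales: roughly P = C·V²·f, so a clock bump that also needs a voltage bump compounds. A rough sketch with illustrative numbers - the ~15% voltage increase is an assumption chosen to match the quoted 66%, not a known figure:

```python
def tdp_ratio(f_old, f_new, v_old, v_new):
    # Dynamic power scales roughly as P = C * V^2 * f, so the ratio
    # between two operating points is (f_new/f_old) * (v_new/v_old)^2.
    return (f_new / f_old) * (v_new / v_old) ** 2

# A 25% clock bump (1.6 -> 2.0 GHz) that also needs ~15% more voltage
# (illustrative guess) compounds to roughly the 66% TDP jump quoted above.
print(round(tdp_ratio(1.6, 2.0, 1.00, 1.15), 2))  # ~1.65x
```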


 
I don't get the anger at people questioning the OS.

Fact 1: The XBONE OS demo was faked; we did not know this previously.

Fact 2: The XBONE OS doesn't have good performance right now; in fact, it is pretty slow and choppy.

Fact 3: Microsoft has 6 months to change this

All 3 are true, and some anger at fact 1 due to being deceived is also warranted. Fanboys defending Microsoft just because of fact 3 make no sense.
 
The MTV Xbox 360 reveal was faked as well. That should have been obvious from the fact that they only plugged in the power cable, without any video cable. But that UI ended up being just fine upon release.
 
Not entirely surprising, but it IS surprising that the frame rate of the UI is so low when so many resources are set aside for it. You'd expect the pre-release UI to be a bit buggy, but you wouldn't expect a simple UI to be this inefficient, not unless it's much earlier in development than it should be by this point.

If this is true, it would seem, at least on its surface, to confirm that the X-Bone's software development is still fairly early. If so, I wouldn't expect a lot of gameplay videos at Microsoft's E3 event. If everything is fairly early in development, I'd expect lots of target-render videos, very short clips, and video taken from purely cinematic scenes.

In short, I'd expect the sort of clips we largely saw at the PS4 event three months ago. Sony probably had a little more to show off, since they seem to be a few months further into development. I'd expect both to talk about established platforms as much as possible though, if only to minimize how much footage they need to display to fill up their time slots.
 
It's perfectly fair to criticize it when MS stated and SHOWED the device running smooth as silk.

MS lied. Plain and simple. Nobody is in the wrong here for calling them out. Do you even understand what's going on? We're not bashing the box for what it is right now - we are bashing the box for being shown something entirely different: a falsification. I expect major overhauls before release, but lying about a product is a no-no. We were there in 2005 - no need to do this shit all over again.

MS didn't lie. You need to understand that this was a demo. Did you know that the Xbox 360 games at E3 2005 were buggy, framey, and running on Mac Pros? Did MS lie? Was it a falsification? They're still making the damn things. It was probably running on a bunch of high-end dev units to overcompensate for the software bugs.

The product isn't done yet. It was a demo. Really, this is how things work dude.
 
I'm not trying to bring the MS Defense Force down on me, but I've heard the GPU clocks might be downgraded: 8-900 gigaflops for gaming. The APU is big. This isn't 100% confirmed, though, and it's being done to improve yields.

CBoat do you mind adding anything?

Isn't it a little too late to do that? Aren't $1 billion worth of games (plus 3rd-party games) already in development?
 
People grilled the Wii U for having a shitty OS, and now we see that the XBone might have some OS issues of its own.

I also don't think we've seen any real-time footage of the PS4 OS in action either... wonder why.

E3 can't come soon enough.
 
MS didn't lie. You need to understand that this was a demo. Did you know that the Xbox 360 games at E3 2005 were buggy, framey, and running on Mac Pros? Did MS lie? Was it a falsification? They're still making the damn things. It was probably running on a bunch of high-end dev units to overcompensate for the software bugs.

The product isn't done yet. It was a demo. Really, this is how things work dude.
It's lying, since they tried to sell it as a real live demo. They even said they were using a Comcast feed.
 
As Proelite put it, that could be the reason why they never released GPU specs, only transistor count and other bullshit. Goddamn, Microsoft.
 
Power has never killed a system. Developers aren't upset by this, btw; they can easily scale down. 1080p might not be de facto for the Xbone.

While that is true, the PS4 might not only be cheaper or equal in price, but also release first. Further, Sony currently stands as a white knight for gamers. I mean, at this point there might not be a single reason to buy an Xbone unless you're hardcore into their first party (or have problems with using a remote). Kinect 2.0 in combination with the Oculus Rift is the only thing that might be interesting, and Kinect 2.0 can be used on PC (or so I heard).

1080p60 vs 720p60, or 720p60 vs 720p30, is HUGE.
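The gap is easy to quantify: pixel throughput scales with resolution times frame rate. A quick sketch:

```python
def pixel_rate(width, height, fps):
    # Pixels the GPU must render per second at a given resolution/frame rate.
    return width * height * fps

r1080p60 = pixel_rate(1920, 1080, 60)
r720p60 = pixel_rate(1280, 720, 60)
r720p30 = pixel_rate(1280, 720, 30)
print(r1080p60 / r720p60)  # 1080p60 pushes 2.25x the pixels of 720p60
print(r720p60 / r720p30)   # and 720p60 pushes 2x the pixels of 720p30
```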
 
While that is true, PS4 might not only be cheaper or equal in pricing, but also release first. Further, Sony stands as a white knight for gamers currently. I mean, at this point there might be not a single reason to buy a box unless you're hardcore into their first party (or have problems with using a remote). Kinect 2.0 in combination with the Occulus Rift is the only thing that might be interesting, and Kinect 2.0 can be used on Pc (or so I heard).

1080p60 vs 720p60, or 720p60 vs 720p30, is HUGE.
I don't think we'll see a 720p game on the PS4.
 
Microsoft's reveal of the Kinect's TV functions, which was prerecorded and faked, with comments about how fast it worked while it was really not working at all, was an honest portrayal? MS tried to make the OS look snappy and fluid, when in fact... oh, forget it, your argument is awful.

Especially when that was the thing that non-gaming tech sites really zeroed in on: just how fast the UI was and how quick the voice recognition was. Microsoft should really be put on blast for this; it's the exact same stunt they pulled with the original Kinect reveal.
 
If you could somehow change the data of what was posted, what would the actual breakdown of performance versus usage look like?

Basically, what kind of performance versus RAM usage would be realistic and more acceptable?
 