No, just no. They didn't put GDDR5 because they chose a different path.
No. DDR3 is better than GDDR5. The lower the number the lower the heat. There's even an extra letter there which indicates it needs more stuff so it will give off more heat.
lol, I'm just trying to find a method to the madness, that's all.
I never say stuff like this because I hate adding to the anonymous negativity on the Internet, but... Marcus is a shithead and I don't trust him at all. He sounds like he's complaining because the press hasn't had access to retail units. While I agree that those should be out by now, I also think Marcus is the type to whine about it and spread FUD as part of his cranky shtick.
Just watched the video.
He's hearing from a couple of sources that the retail kits are prone to overheating, and believes that everything we have seen so far is running on debug units, which look identical.
My question would be: shouldn't the debug units be overheating too? The same units that are on show for hours and hours at trade shows?
He's making valid points about HDCP and about nobody having access to retail boxes, though. I don't buy the overheating claim. It's not like hiding it now would save Sony any face; they'll get dragged over the coals once people find out at home. Even more so because of RROD.
Albert had to know this was coming, right?
I mean, there's a few possibilities:
1) He didn't know it'd be 720p vs. 1080p, even though devs have been saying as much since E3. That would mean he was honestly out of the loop on what's going on with 3rd parties and really not a useful source of information.
2) He knew it was going to explode like this, and chose to make posts and statements like "We invented DirectX" and "Not more than 30%, no" that would make him look insane later.
3) He thought the discrepancy would somehow be lessened by launch, or at least ignored. That's the most plausible read of his thinking, but it was never going to work, as the 1080p rumors had been bubbling for a while and he played dumb.
4) Or... ultimately, he's a PR account (even though he says he isn't PR), a Major Nelson Jr., and basically has to toe the company line, so he knew what he was being forced to say was an out-and-out lie.
So he was either out of the loop, insane, foolish, or a liar.
I'm still waiting for that list of Kinect voice commands that work in which countries...
Bull.
Shit.
As a gamer who has been around for every console launch since the 2600, I can assure you, the old days sucked, and nobody had fun when they figured out that they fell for some slick marketing bullshit and bought the dud system of the generation, even if they had multiple consoles.
Having access to more real, factual information for consumers to make an informed purchase decision is ALWAYS better than the roll of the dice consumers had to do in the old days.
I will ask two questions of the detractors, honest questions.
1. What piece of information would you want that I could provide that would convince you there is not a huge delta in performance?
2. If it comes out after we launch that the difference between 3rd party games is maybe single-digit FPS between the two platforms, will I get an apology or concession?
Are you sure he said that? Maybe we're just misremembering it...
He's clearly not PR, so I wonder why he can't explain why all the stuff he told us about his own product turned out to be bullshit.
Albert hasn't realized that we control the outcome: HE needs us; WE don't need him or his product. Not to be overdramatic or anything, but the internet has helped topple governments and brought the spotlight to other consumer-unfriendly things, like pink slime, forcing plants to close, etc. So you can be sure it can help make the xbone a failure.
The word of mouth is going to kill them.
Digital Foundry: Obviously though, you are limited to just 32MB of ESRAM. Potentially you could be looking at say, four 1080p render targets, 32 bits per pixel, 32 bits of depth - that's 48MB straight away. So are you saying that you can effectively separate render targets so that some live in DDR3 and the crucial high-bandwidth ones reside in ESRAM?
Andrew Goossen: Oh, absolutely. And you can even make it so that portions of your render target that have very little overdraw... For example, if you're doing a racing game and your sky has very little overdraw, you could stick those subsets of your resources into DDR to improve ESRAM utilisation. On the GPU we added some compressed render target formats like our 6e4 [six-bit mantissa and four-bit exponent per component] and 7e3 [seven-bit mantissa and three-bit exponent per component] HDR float formats that were very, very popular on Xbox 360, where instead of doing a 16-bit float per component 64bpp render target, you can do the equivalent with us using 32 bits - so we did a lot of focus on really maximizing efficiency and utilisation of that ESRAM.
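Digital Foundry's numbers can be sanity-checked with simple arithmetic (a rough Python sketch; it assumes a plain linear layout, while real allocations add tiling and alignment overhead, so the strict total comes out closer to 40MB than the quoted 48 - but either way it overflows the 32MB of ESRAM):

```python
# Back-of-the-envelope render target sizes at 1080p.
# Assumes a plain linear layout; real hardware adds tiling/alignment overhead.

WIDTH, HEIGHT = 1920, 1080
PIXELS = WIDTH * HEIGHT                      # 2,073,600 pixels

def target_mb(bits_per_pixel):
    """Size of one full-screen render target in (binary) megabytes."""
    return PIXELS * bits_per_pixel / 8 / (1024 * 1024)

color_32bpp = target_mb(32)                  # one 32bpp color target, ~7.9 MB
depth_32bpp = target_mb(32)                  # one 32-bit depth buffer, ~7.9 MB

# Four color targets plus depth, as in the Digital Foundry example:
gbuffer_total = 4 * color_32bpp + depth_32bpp
print(f"one 32bpp target: {color_32bpp:.1f} MB")
print(f"4 color + depth:  {gbuffer_total:.1f} MB vs 32 MB of ESRAM")
```

Whatever the exact overheads, the point of the question stands: a full 1080p multi-target setup does not fit in ESRAM on its own.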
Give us reasons to buy. Chasing your tail on multiplatform technical parity is a dead end. It has been for a while, really, and you should have known it wouldn't fly with this forum. So why not try a positive alternative?
Hi guys, I have a theory. What do you think?
Some believe 3rd parties could do 1080p/60fps on the xbone, maybe after a few years.
We have Forza. It exists, and so we know 1080p/60fps on xbone is technically feasible.
Then I remembered this from one of Eurogamer's interviews.
So what is this 6e4/7e3 HDR format? Perhaps the Microsoft engineer is suggesting that using the xbone's custom render target format is what lets things fit into the 32MB of ESRAM.
So how are render target formats typically handled? What is the popular format nowadays? How different is it from this 6e4/7e3 format?
Would adding an additional texture format greatly increase the project cost? (Artist time, art assets, technical difficulty, ...)
Anyway, my theory is: IF the cost to use the format is high, and using the xbone's custom format is the only way to do 1080p on xbone...
For 1st parties, there would be no problem using this 6e4/7e3 format. They work on xbone only, and the engine can be purpose-built around it. (Forza: 1080p)
For 2nd-party devs, they may also only have to worry about the xbone platform, but their engine wasn't designed for it. Perhaps they can add some enhancements so it runs a bit better? (Ryse, Crytek's CryEngine: 900p)
For 3rd parties, well, they may choose not to use the custom format because of the high project cost, leaving the game running at 720p. (Activision's COD, EA's BF4)
If true, I find this ironic, since Microsoft openly thinks 720p upscaled to 1080p is perfectly acceptable.
So this is a big IF, but if it's true, what's next for xbone?
Does Microsoft pay all 3rd parties to use the 6e4/7e3 format?
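The footprint argument this theory leans on can be sketched with simple arithmetic (illustrative Python; the format names follow the Eurogamer quote, everything else is back-of-the-envelope):

```python
# Compare HDR render target sizes: a conventional FP16 target (16 bits per
# component, RGBA = 64bpp) versus a packed 6e4/7e3-style format carrying
# equivalent HDR data in 32bpp, as described in the interview.

def target_mb(width, height, bits_per_pixel):
    """Size of one full-screen render target in (binary) megabytes."""
    return width * height * bits_per_pixel / 8 / (1024 * 1024)

for label, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    fp16 = target_mb(w, h, 64)     # FP16 RGBA HDR target
    packed = target_mb(w, h, 32)   # 6e4/7e3-style packed HDR target
    print(f"{label}: FP16 {fp16:.1f} MB, packed {packed:.1f} MB")

# A 1080p FP16 target is ~15.8 MB: two of them plus a depth buffer already
# bust the 32 MB ESRAM budget. The packed format halves that per target,
# which is exactly the headroom the theory depends on.
```

Whether any given engine can actually switch formats that cheaply is a separate question - that's the "project cost" unknown the theory raises.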
I'm already sold on the exclusives. At this point the only road block I see is the price. It's just so hard to justify when the more powerful system is $100 cheaper. Hell, I have already accepted that it is not as powerful. Just let the price reflect that and I bet there will be far fewer complaints. As we all know that it will still have games that are both gorgeous, as well as fun. It is just not a deal at $500.
This Reddit thread can't be real:
http://www.reddit.com/r/xboxone/comments/1pqs7b/battlefield_4_1080p_and_60_fps_footage_for_xbox/
If you can afford to wait, just wait it out. The games aren't going anywhere.
Personally, I'm waiting until XB1 has 12-15 must-have exclusives. (I previously said 10, but that felt a bit low after reviewing the exclusives I've enjoyed across all consoles.)
Thus far only Crimson Dragon is my must-have. Project Spark and Titanfall are on PC, so your loss there on my check-list, MS.
If it's on the pricing side, both consoles will drop eventually.
I assume the main problem MS has with that surprisingly big power difference is that the Xbox core audience was built on the original console on the back of a power advantage. Then the 360 launched and, again, the core audience had a reason to stick with Xbox, as it was the most powerful console for a year and then kept having the best multiplatform ports.
This may be one of the dumbest things I have ever heard a high-ranking employee of a company say.
Albert, you have got to be kidding. You have the audacity to plead for an apology because we weren't buying into your line that the Xbox One isn't far off in power from the PS4, and now you want to vindicate yourself by comparing games at two different resolutions? That is some serious mental gymnastics you're performing there.
You should be ashamed that there is only a "single-digit fps difference between the two platforms" when one is pushing 125% more pixels and, in some cases, more effects and better IQ.
Microsoft just needs to admit the power difference and move on to focus on the positive aspects of their console's design. You know, like how Nintendo dominated the industry with a less powerful console than all of its competitors. Focusing on (read: lying about) your console's power (or lack thereof) does nothing but shine more positive light on Sony's offerings.
Let 'em dream!
Wouldn't surprise me. No straw left ungrasped.
Jesus, Mary and Joseph. Wow. WOW.
Wow, my DVDs are 1080p already then. Thank you, TV. Goodbye, Blu-ray.
Blu-ray? Throw out the HDMI too.
I think I found his source.
http://gamingbolt.com/major-nelson-...encing-don-mattrick-and-how-ps4-isnt-a-threat
Maybe the reason MS didn't put GDDR5 in their console was overheating issues, and they believed Sony couldn't manage it either?
I would, but I need that for when I get my 4K TV. It's a 4K-rated HDMI, don't you know?
I read that more as "when there isn't a lot going on on-screen, the 32MB can be stretched more effectively." And I don't know how willing 3rd-party devs will be to go to such extra lengths when they can just drop the resolution to 720p in order to maintain parity.
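Goossen's suggestion - parking low-overdraw portions of a target in DDR3 - is essentially a budget-splitting exercise. A hypothetical sketch (the 40% sky fraction is a made-up illustrative number, not anything Microsoft has stated):

```python
# Hypothetical: split a 1080p 32bpp render target between ESRAM and DDR3,
# keeping only the high-overdraw portion in ESRAM. Numbers are illustrative.

TARGET_MB = 1920 * 1080 * 4 / (1024 * 1024)   # ~7.9 MB per 32bpp target
SKY_FRACTION = 0.40                            # assumed low-overdraw screen area

esram_part = TARGET_MB * (1 - SKY_FRACTION)    # stays in fast ESRAM
ddr_part = TARGET_MB * SKY_FRACTION            # demoted to slower DDR3
print(f"ESRAM: {esram_part:.1f} MB, DDR3: {ddr_part:.1f} MB "
      f"(frees {ddr_part:.1f} MB of the 32 MB ESRAM budget per target)")
```

Even under generous assumptions the savings are a few MB per target, which is why it reads as an optimisation for engines already close to fitting, not a fix for ones that aren't.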
How sad is it that even MS is now running with the upscaler being the secret sauce? I mean, really. Let's take a moment to think about what that means.
These guys have been throwing so much sand in their customers' eyes. Anything to prevent those pre-orders from being cancelled.
It's disgusting and dishonest.
Sorry to hijack this topic, but it seems Marcus Beer (Annoyed Gamer) is claiming via sources that the PS4 is prone to overheating... and that this is why no retail units are available for reviewers, etc.
anybody else heard this?
It makes little sense, considering debug kits are identical to retail units apart from the fact that they can run unsigned code. So there's no reason why they should be fine while retail units die left and right.