EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

Hold on, hold on, hold on...

The figures in the article suggest the GPU advantage and memory advantage are cumulative, so in real-world terms the PS4 could have an 80-100% advantage.

Here's the math:

1920 × 1080 = 2,073,600
1600 × 900 = 1,440,000
2,073,600 / 1,440,000 = 1.44
(30 / 24) × 1.44 = 1.8

If we assume the twenty-something is around 24, then the PS4 has an 80% advantage. If it's actually 21-22, then the PS4 advantage is DOUBLE.

Am I totally off-base here? Can someone please check the maths.

Could PS4 really be double the graphical power of XBone in real-world scenarios?
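
A quick sanity check of that arithmetic, as a minimal sketch (the 24fps and 21fps figures are the post's assumptions about Xbox One framerates, not confirmed numbers):

```python
# Sanity check of the resolution x framerate advantage math above.
# The 24fps / 21fps Xbox One figures are assumptions from the post.
ps4_pixels = 1920 * 1080              # 2,073,600
xb1_pixels = 1600 * 900               # 1,440,000
res_ratio = ps4_pixels / xb1_pixels   # = 1.44
print((30 / 24) * res_ratio)          # = 1.8  -> ~80% combined advantage
print((30 / 21) * res_ratio)          # ~2.06 -> roughly double at 21fps
```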
 
What has price and release date got to do with the massive sales for the last 4 years?

There is brand loyalty in the UK, both to Sony and MS. The fact Sony sold any consoles at all when it launched is testament to that.

Are you being serious??
Price matters a whole lot if all you care about is 3rd party games.
 
Hold on, hold on, hold on...

The figures in the article suggest the GPU advantage and memory advantage are cumulative, so in real-world terms the PS4 could have an 80-100% advantage.

Here's the math:

1920 × 1080 = 2,073,600
1600 × 900 = 1,440,000
2,073,600 / 1,440,000 = 1.44
(30 / 24) × 1.44 = 1.8

If we assume the twenty-something is around 24, then the PS4 has an 80% advantage. If it's actually 21-22, then the PS4 advantage is DOUBLE.

Am I totally off-base here? Can someone please check the maths.

Could PS4 really be double the graphical power of XBone in real-world scenarios?
Your math is correct. And as I said earlier in the thread, I think the advantage in the OP is overstated. I believe it will be closer to either/or on the framerate/resolution advantage, which, on a cheaper and much smaller system, is still quite embarrassing.
 
Your math is correct. And as I said earlier in the thread, I think the advantage in the OP is overstated. I believe it will be closer to either/or on the framerate/resolution advantage, which, on a cheaper and much smaller system, is still quite embarrassing.

You're one of the tech gods here, if I'm not mistaken. What's your take on what JonnyLH has been saying?

Edit: Sorry for double post.
 
This is the way I see it: one is cheaper and more powerful (how much more is up for debate). If you like Halo, DR3 and other Xbox titles, you buy the Xbone; if you like Sony stuff and prefer multiplats having better performance (again, how much is up for debate), then you go with the PS4.
 
Penello said it.

Did he say it was better than PS4?

1) How does he know it's better than the PS4?
2) Even after talking to his people, Penello is still messing up something as simple as a bandwidth number, and to this day his numbers still contradict MS's own documented numbers. And now you expect me to believe he can judge two audio chips?
3) Sorry, the guy is a non-tech PR mouthpiece who can't even back up his statements. He has no cred.
 
Hold on, hold on, hold on...

The figures in the article suggest the GPU advantage and memory advantage are cumulative, so in real-world terms the PS4 could have an 80-100% advantage.

Here's the math:

1920 × 1080 = 2,073,600
1600 × 900 = 1,440,000
2,073,600 / 1,440,000 = 1.44
(30 / 24) × 1.44 = 1.8

If we assume the twenty-something is around 24, then the PS4 has an 80% advantage. If it's actually 21-22, then the PS4 advantage is DOUBLE.

Am I totally off-base here? Can someone please check the maths.

Could PS4 really be double the graphical power of XBone in real-world scenarios?

Maybe... if you factor in not only GFLOPS but also the DDR3-vs-GDDR5 bandwidth and ROP differences.

Remember: that tidbit came before the Durango GPU overclock.
 
This is the way I see it: one is cheaper and more powerful (how much more is up for debate). If you like Halo, DR3 and other Xbox titles, you buy the Xbone; if you like Sony stuff and prefer multiplats having better performance (again, how much is up for debate), then you go with the PS4.

Or...you like games and get both ;)
 
Your math is correct. And as I said earlier in the thread, I think the advantage in the OP is overstated. I believe it will be closer to either/or on the framerate/resolution advantage, which, on a cheaper and much smaller system, is still quite embarrassing.

Truth is, console frame rates are either 30 or 60, so even if a game on X1 is 25 to 30fps and PS4 is 40 to 45fps, it's not going to matter.
They're just going to lock the PS4 version and call it a day without doing anything extra, is what I expect to happen if both are at the same res.
So in a way the extra power doesn't matter, at least until later.

EDIT: of course, talking about 3rd party games.
 
Or...you like games and get both ;)

Some people love games and can't afford to get both.

The price of a PS4 is ~6.66 games
The price of an X1 is ~8.33 games

Some people would rather get more games for one console than invest in 2.

Yes, I know that console prices go down as the generation progresses. This is just based on the prices right now.
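
For what it's worth, here's that games-per-console arithmetic as a minimal sketch, assuming $399.99/$499.99 launch prices and $59.99 games (the post only gives the ratios, so the exact prices are assumptions):

```python
# Games-per-console math. The $399.99 / $499.99 console prices and
# $59.99 game price are assumptions; the post only states the ratios.
ps4_price, xb1_price, game_price = 399.99, 499.99, 59.99
print(ps4_price / game_price)  # ~6.67 games per PS4
print(xb1_price / game_price)  # ~8.33 games per X1
```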
 
Hold on, hold on, hold on...

The figures in the article suggest the GPU advantage and memory advantage are cumulative, so in real-world terms the PS4 could have an 80-100% advantage.

Here's the math:

1920 × 1080 = 2,073,600
1600 × 900 = 1,440,000
2,073,600 / 1,440,000 = 1.44
(30 / 24) × 1.44 = 1.8

If we assume the twenty-something is around 24, then the PS4 has an 80% advantage. If it's actually 21-22, then the PS4 advantage is DOUBLE.

Am I totally off-base here? Can someone please check the maths.

Could PS4 really be double the graphical power of XBone in real-world scenarios?

no.
 
There is no way I can read through almost 2,500+ posts. Can someone summarize what I've missed?

X1 sucks, PS4 rulez (it's cheaper and stronger) in hardware terms. If you're a PC gamer, one sucks more than the other.
Something about latency.
Something about clock speed and turbo mode.
No dual-APU and dual-GPU talk.
Move engines and eSRAM stuff.
You know, your average console specs thread.
Oh yeah, console GAF thinks PC GAF are dicks and should stop posting in threads about console performance because console GAF gets salty.
 
I have a question I'd like answering because I have absolutely no tech knowledge whatsoever (and I mean even on a rudimentary level): if Sony decided to offer the same tech as the PS3 but just upscaled to 1080p with better AA (baked-in shadows and such), would it be feasible to run it at 120fps? It might be a daft question because I'm pretty daft when it comes to these things. Go easy on me, GAF.
 
I have a question:

Why do people say that the 40% theoretical difference won't be seen in the real world?

Is it because it's easier for the XB1 to reach its theoretical max of 1.31TF vs the PS4's 1.84TF, due to efficiency being non-linear?
 
Some people love games and can't afford to get both.

The price of a PS4 is ~6.66 games
The price of an X1 is ~8.33 games

Some people would rather get more games for one console than invest in 2.

Yes, I know that console prices go down as the generation progresses. This is just based on the prices right now.

True... I have debated selling my PS4 when it comes, since I only like a few exclusives from Sony... but I will probably keep it ;)
 
I have a question I'd like answering because I have absolutely no tech knowledge whatsoever (and I mean even on a rudimentary level): if Sony decided to offer the same tech as the PS3 but just upscaled to 1080p with better AA (baked-in shadows and such), would it be feasible to run it at 120fps? It might be a daft question because I'm pretty daft when it comes to these things. Go easy on me, GAF.

Maybe if it's stickmen fighting? :P Even at that, I don't think TVs generally support it, and only a few monitors do.
 
The DDR vs GDDR myth primarily comes from the CAS for the respective memory.

The CAS for GDDR tends to be 2-4x that of DDR. However, that is offset by the higher frequency that GDDR runs at compared to DDR.

As an example, Hynix (H5GQ2H24AFR) GDDR5 runs at a CAS of 5-20 (depending on configuration/bus speed), whereas their DDR3 has a CAS of 5-11. In all likelihood the Xbox One will be running a CAS around 10, as that's pretty normal at those bus speeds (http://en.wikipedia.org/wiki/DDR3_SDRAM).

The PS4 memory clock is a bit more than 2x that of the Xbox One, so if we take the worst-case CAS of the Hynix memory, the latency of both the Xbox and the PS4 will be the same at about 10ns.
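
A minimal sketch of that latency arithmetic, using the post's own worst-case CAS figures and its "a bit more than 2x" clock claim (the clock values below are illustrative assumptions, not datasheet numbers):

```python
# CAS latency in ns = CAS cycles / clock (MHz) * 1000.
# Clocks are illustrative, per the post's "a bit more than 2x" claim.
def cas_latency_ns(cas_cycles, clock_mhz):
    return cas_cycles / clock_mhz * 1000

print(cas_latency_ns(10, 1066))  # Xbox One DDR3, CAS 10  -> ~9.4 ns
print(cas_latency_ns(20, 2133))  # PS4 GDDR5 at ~2x clock -> ~9.4 ns
```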

As far as the GDDR5-for-CPU issue goes, I don't think that's a big issue anyway. Anyone concerned with CPU performance optimizes their workloads to be local-cache (L1/L2) bound anyway. The Jaguar cores have 2MB of L2 cache. With the higher bandwidth of GDDR5 vs DDR3 (176GB/s vs 68GB/s), the 2MB cache can be flushed and reloaded roughly 2.6x faster. So even if latency on the PS4 were 2x that of the Xbox... it would be a wash.

Now there certainly are workloads that are purely latency-bound and can't utilize the L2 cache. But I don't see those workloads being very likely, or at least not "maskable", in a system that has an end-user latency of ~16.7ms per frame (@60fps)... whereas we're talking about latency differences in the tens of ns.

Now the Xbox One certainly has some advantages; that 32MB memory pool is fast. And if they can fit a workload into eSRAM, it can certainly outperform the PS4. Think of a workload where the GPU reads from DDR3 and eSRAM, does some processing, then feeds the data back to eSRAM for the next phase of processing. In this case the Xbox One can certainly hit that ~272GB/sec bandwidth (i.e. the theoretical max). The question is how much of that workload can actually be achieved?

Also: this is my first GAF post after lurking for many, many years (been lurking since the 360 launch), so please don't shoot me. Also, I preordered BOTH systems... I'm a gamer and games matter... and the Xbox One will have some great exclusives. But IMHO it's pretty clear which system I'll be buying most multiplatform titles for this generation.
 
I have a question I'd like answering because I have absolutely no tech knowledge whatsoever (and I mean even on a rudimentary level): if Sony decided to offer the same tech as the PS3 but just upscaled to 1080p with better AA (baked-in shadows and such), would it be feasible to run it at 120fps? It might be a daft question because I'm pretty daft when it comes to these things.

Do you mean using 720p?
You can calculate the pixel fillrate; if I'm not mistaken it's ROPs × clock = fillrate.
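
As a rough sketch of that formula with the commonly reported ROP counts and clocks (32 ROPs @ 800MHz for the PS4, 16 ROPs @ 853MHz for the Xbox One; treat these as assumptions):

```python
# Pixel fillrate = ROPs x clock. The ROP counts and clocks below are
# the commonly reported specs, taken here as assumptions.
ps4_fill = 32 * 800e6        # 25.6 Gpixels/s
xb1_fill = 16 * 853e6        # ~13.6 Gpixels/s
target = 1920 * 1080 * 120   # ~0.25 Gpixels/s of final output at 1080p/120
print(ps4_fill, xb1_fill, target)
```

Raw output pixels are far below either fillrate; overdraw, shading cost and memory bandwidth are the real limits, which is why the formula alone can't answer the 120fps question.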
 
What has price and release date got to do with the massive sales for the last 4 years?

There is brand loyalty in the UK, both to Sony and MS. The fact Sony sold any consoles at all when it launched is testament to that.

The point he's making - which is correct - is that historic sales show brand loyalty is limited in the UK and not going to guarantee success. Sony went from a PS2 high to a weak PS3 launch and only clawed their way back in the UK via strong games and price cuts. Nintendo went from the weak GameCube to the dominant Wii. The 360 went from the weak Xbox to a decent start to strong legs. Of most recent relevance: very strong Wii sales (it sold more than the 360 in less time, remember) didn't translate over to the Wii U despite strong brand recognition.

Price and date matter because UK historic sales also show price is a key factor in broader success, and date matters because a lot of the initial advantage the 360 had over the PS3 came from launching with a more than 12-month lead (and two Christmas peak sales).

In other words, all the market evidence says that when the PS4 and XB1 launch in the UK it'll be a reset for the majority of purchasers, and price and games vs brand will make the difference.

Don't get me wrong, the XB1 will pick up a decent launch via Xbox fans, but the whole point is that that isn't the majority - the bulk of the UK game market has been proven multiple times not to follow previous-generation brand trends.
 

There's a twist here:

One basic example we were given suggested that without optimisation for either console, a platform-agnostic development build

...he talks about "real world scenarios" while the example refers to a "platform-agnostic development build".

That's where Cerny made it clear that the PS4 was "easy" to program for, because you don't need to optimize as much to get performance.
 
The DDR vs GDDR myth primarily comes from the CAS for the respective memory.

The CAS for GDDR tends to be 2-4x that of DDR. However, that is offset by the higher frequency that GDDR runs at compared to DDR.

As an example, Hynix (H5GQ2H24AFR) GDDR5 runs at a CAS of 5-20 (depending on configuration/bus speed), whereas their DDR3 has a CAS of 5-11. In all likelihood the Xbox One will be running a CAS around 10, as that's pretty normal at those bus speeds (http://en.wikipedia.org/wiki/DDR3_SDRAM).

The PS4 memory clock is a bit more than 2x that of the Xbox One, so if we take the worst-case CAS of the Hynix memory, the latency of both the Xbox and the PS4 will be the same at about 10ns.

As far as the GDDR5-for-CPU issue goes, I don't think that's a big issue anyway. Anyone concerned with CPU performance optimizes their workloads to be local-cache (L1/L2) bound anyway. The Jaguar cores have 2MB of L2 cache. With the higher bandwidth of GDDR5 vs DDR3 (176GB/s vs 68GB/s), the 2MB cache can be flushed and reloaded roughly 2.6x faster. So even if latency on the PS4 were 2x that of the Xbox... it would be a wash.

Now there certainly are workloads that are purely latency-bound and can't utilize the L2 cache. But I don't see those workloads being very likely, or at least not "maskable", in a system that has an end-user latency of ~16.7ms per frame (@60fps)... whereas we're talking about latency differences in the tens of ns.

Now the Xbox One certainly has some advantages; that 32MB memory pool is fast. And if they can fit a workload into eSRAM, it can certainly outperform the PS4. Think of a workload where the GPU reads from DDR3 and eSRAM, does some processing, then feeds the data back to eSRAM for the next phase of processing. In this case the Xbox One can certainly hit that ~272GB/sec bandwidth (i.e. the theoretical max). The question is how much of that workload can actually be achieved?

Also: this is my first GAF post after lurking for many, many years (been lurking since the 360 launch), so please don't shoot me. Also, I preordered BOTH systems... I'm a gamer and games matter... and the Xbox One will have some great exclusives. But IMHO it's pretty clear which system I'll be buying most multiplatform titles for this generation.

Great first post, welcome :) - Nothing wrong with owning both consoles :P
 
Do you mean using 720p?
You can calculate the pixel fillrate; if I'm not mistaken it's ROPs × clock = fillrate.

I was just wondering how much extra power is necessary for games like KZS and Driveclub to offer realtime lighting and other technical words that leave me bewildered, like sub-surface scattering (actually I do know roughly what that is) :). If all that was removed and they just used a last gen approach, would it be 'feasibly' possible to run at 120fps? what exactly prevents it if it can't?
 
The point he's making - which is correct - is that historic sales show brand loyalty is limited in the UK and not going to guarantee success. Sony went from a PS2 high to a weak PS3 launch and only clawed their way back in the UK via strong games and price cuts. Nintendo went from the weak GameCube to the dominant Wii. The 360 went from the weak Xbox to a decent start to strong legs. Of most recent relevance: very strong Wii sales (it sold more than the 360 in less time, remember) didn't translate over to the Wii U despite strong brand recognition.

Price and date matter because UK historic sales also show price is a key factor in broader success, and date matters because a lot of the initial advantage the 360 had over the PS3 came from launching with a more than 12-month lead (and two Christmas peak sales).

In other words, all the market evidence says that when the PS4 and XB1 launch in the UK it'll be a reset for the majority of purchasers, and price and games vs brand will make the difference.

Don't get me wrong, the XB1 will pick up a decent launch via Xbox fans, but the whole point is that that isn't the majority - the bulk of the UK game market has been proven multiple times not to follow previous-generation brand trends.

Have you told MS this? Someone should tell them that it is all a foregone conclusion.
 
I'm not saying the Xbox will dominate, far from it. I'm saying it will be very close. The Xbox brand is very powerful in the US and UK; I don't see that changing. Microsoft's mistakes will cost them the lead they would otherwise have had after such a successful generation, but that is all. People who expect PS4 domination are just letting their feelings cloud their judgment.

Just like how the PSOne and PS2 brands were, right?
 
This is referring to GDDR vs DDR latencies inside the GPU, which, as I've said, don't matter inside the GPU. The CPU doesn't have the bus to be able to handle GDDR, hence why it skips clock cycles.

EDIT: It's also very different inside a unified architecture, because the GPU doesn't have the RAM sat next to it; its bus is decreased and latencies would be similar.

My head now hurts, I'm off to play some games.

I'm not a tech person, so I may be off base here, but it seems Mark Cerny is saying GDDR5 latency isn't much higher than for DDR3:

Digital Foundry: Developers tell us that they love the GDDR5, they love the bandwidth but there are questions on latency. How do you cope with that in your set-up? It's not something that developers have much experience with in terms of interfacing with a CPU.

Mark Cerny: Latency in GDDR5 isn't particularly higher than the latency in DDR3. Also, GPUs are designed to be extraordinarily latency tolerant so I can't imagine that being much of a factor.

Maybe I'm misunderstanding, but if not, I tend to trust him over some guy on GAF (no offense). It also seems to line up with what others are saying here about how the latency issue is overblown.

http://www.eurogamer.net/articles/digitalfoundry-face-to-face-with-mark-cerny

The DDR vs GDDR myth primarily comes from the CAS for the respective memory.

The CAS for GDDR tends to be 2-4x that of DDR. However, that is offset by the higher frequency that GDDR runs at compared to DDR.

As an example, Hynix (H5GQ2H24AFR) GDDR5 runs at a CAS of 5-20 (depending on configuration/bus speed), whereas their DDR3 has a CAS of 5-11. In all likelihood the Xbox One will be running a CAS around 10, as that's pretty normal at those bus speeds (http://en.wikipedia.org/wiki/DDR3_SDRAM).

The PS4 memory clock is a bit more than 2x that of the Xbox One, so if we take the worst-case CAS of the Hynix memory, the latency of both the Xbox and the PS4 will be the same at about 10ns.

As far as the GDDR5-for-CPU issue goes, I don't think that's a big issue anyway. Anyone concerned with CPU performance optimizes their workloads to be local-cache (L1/L2) bound anyway. The Jaguar cores have 2MB of L2 cache. With the higher bandwidth of GDDR5 vs DDR3 (176GB/s vs 68GB/s), the 2MB cache can be flushed and reloaded roughly 2.6x faster. So even if latency on the PS4 were 2x that of the Xbox... it would be a wash.

Now there certainly are workloads that are purely latency-bound and can't utilize the L2 cache. But I don't see those workloads being very likely, or at least not "maskable", in a system that has an end-user latency of ~16.7ms per frame (@60fps)... whereas we're talking about latency differences in the tens of ns.

Now the Xbox One certainly has some advantages; that 32MB memory pool is fast. And if they can fit a workload into eSRAM, it can certainly outperform the PS4. Think of a workload where the GPU reads from DDR3 and eSRAM, does some processing, then feeds the data back to eSRAM for the next phase of processing. In this case the Xbox One can certainly hit that ~272GB/sec bandwidth (i.e. the theoretical max). The question is how much of that workload can actually be achieved?

Also: this is my first GAF post after lurking for many, many years (been lurking since the 360 launch), so please don't shoot me. Also, I preordered BOTH systems... I'm a gamer and games matter... and the Xbox One will have some great exclusives. But IMHO it's pretty clear which system I'll be buying most multiplatform titles for this generation.

Awesome first post. I think mine was a fart gif. *sigh*

Anyway, I think the guy you were talking to disappeared.
 
Truth is, console frame rates are either 30 or 60, so even if a game on X1 is 25 to 30fps and PS4 is 40 to 45fps, it's not going to matter.
They're just going to lock the PS4 version and call it a day without doing anything extra, is what I expect to happen if both are at the same res.
So in a way the extra power doesn't matter, at least until later.

EDIT: of course, talking about 3rd party games.

They'll add in more effects on PS4 until it falls to a stable 30 and take away effects/resolution until the One reaches a stable 30.
 
X1 sucks, PS4 rulez (it's cheaper and stronger) in hardware terms. If you're a PC gamer, one sucks more than the other.
Something about latency.
Something about clock speed and turbo mode.
No dual-APU and dual-GPU talk.
Move engines and eSRAM stuff.
You know, your average console specs thread.
Oh yeah, console GAF thinks PC GAF are dicks and should stop posting in threads about console performance because console GAF gets salty.

So basically just another day at the office then? Good to know I haven't missed much.
 
As far as the GDDR5-for-CPU issue goes, I don't think that's a big issue anyway. Anyone concerned with CPU performance optimizes their workloads to be local-cache (L1/L2) bound anyway. The Jaguar cores have 2MB of L2 cache. With the higher bandwidth of GDDR5 vs DDR3 (176GB/s vs 68GB/s), the 2MB cache can be flushed and reloaded roughly 2.6x faster. So even if latency on the PS4 were 2x that of the Xbox... it would be a wash.

I don't believe that's the case. So far, for both systems, the CPUs look to be standard Jaguars, so the L2 can only consume what a standard Jaguar core cache can. Without heavy modification, it's not going to be able to consume more or write more.

In the vgleaks diagram, the CPU bus from the GDDR is marked as <20GB/sec. If that's true, then the CPU doesn't benefit from the faster RAM. (Not that it really matters.)
 
When I shop for a PC I look for something that has more power. I apply the same principle when I shop for a console. I played Xbox 360 most of last gen. About a year ago I finally bought a PS3, popped in God of War 3, and never turned back. Definitely going with the PS4. Edit: First post, just wanted to state my opinion. Thanks for approving, mod. Cheers!
 
They'll add in more effects on PS4 until it falls to a stable 30 and take away effects/resolution until the One reaches a stable 30.

You think so? Maybe later on, but for the first year or two I don't see it.
To me the extra power is going to come in handy mid-gen,
when devs keep on adding more and more stuff to a game while not giving a damn about res and frame rate.


Nice first post and welcome.
 
Truth is, console frame rates are either 30 or 60, so even if a game on X1 is 25 to 30fps and PS4 is 40 to 45fps, it's not going to matter.
They're just going to lock the PS4 version and call it a day without doing anything extra, is what I expect to happen if both are at the same res.
So in a way the extra power doesn't matter, at least until later.

EDIT: of course, talking about 3rd party games.

Fluctuating 25-30 FPS vs. rock stable 30 FPS doesn't matter? Well, I disagree.
 
I had thought MS did their overclocks purely for marketing reasons, but now I'm thinking that, after getting feedback from developers about Xbone power vs. PS4 power, they are clawing for every little bit of power they can get to make the gap less obvious.
 
Great first post, welcome :) - Nothing wrong with owning both consoles :P

Thank you... but it does seem I made a mistake I want to correct. According to the "leaked" architecture diagrams, the Xbox One has a ~30GB/sec bus to the DDR3 (i.e. it's not the full 68GB/sec).

The PS4 has ~20GB/sec. But the PS4 also has the Onion/Onion+ buses for another 10GB/sec to the GDDR5. So for CPU memory access the two seem to be equivalent... if not the Xbox One potentially being a bit faster. (Wish I had a dev kit to play with.)
 
I have a question:

Why do people say that the 40% theoretical difference won't be seen in the real world?

Is it because it's easier for the XB1 to reach its theoretical max of 1.31TF vs the PS4's 1.84TF, due to efficiency being non-linear?
Actually, it's the other way around. Because One has a more complex memory architecture which must be managed correctly, it's harder for One to reach its theoretical max than it is for PS4. That's exactly why the article gives an example with nearly double the performance on PS4 (see posts above), even though the developers are estimating the difference at ~50%.

I don't know why people say the difference won't be seen in the real world. There's plenty of evidence to the contrary:

- If on-paper differences really weren't meaningful, buying a new GPU with better specs would never make sense.
- Historically, games have always shown tech differences; it's why Digital Foundry and Lens of Truth can exist, or why Genesis vs. SNES arguments filled schools.
- I think the games already show it. The best in-engine stuff on PS4 looks more advanced to me than the best in-engine stuff on One. The best gameplay on PS4 looks or runs better than One titles (even Albert Penello agrees).
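
For reference, the ~40% on-paper figure falls out of the shader counts and clocks, shown here as a minimal sketch (the counts and clocks are the widely reported specs, taken as assumptions):

```python
# GFLOPS = shader ALUs x 2 ops/clock x clock (GHz). Shader counts and
# clocks are the widely reported specs, taken here as assumptions.
ps4_gflops = 1152 * 2 * 0.800  # = 1843.2 GFLOPS (~1.84 TF)
xb1_gflops = 768 * 2 * 0.853   # = 1310.2 GFLOPS (~1.31 TF, post-overclock)
print(ps4_gflops / xb1_gflops) # ~1.41 -> the "40%" theoretical gap
```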
 
Why are people afraid that MS will moneyhat companies to make lower-end PS4 ports? It doesn't make sense, and I thought it was illegal anyway. It is some tinfoil hattery to think this way.

The consoles survive because game makers make games for them. Can you imagine if MS tried to strong-arm Activision into making CoD inferior or merely on par, and then Activision said "F you. No CoD for Xbone this rotation"? You don't think Kotick would do that, given his past of sticking his corporate peen into companies when they became unruly?

I think the big game companies could damage MS more than MS could damage them.

If anything, given that PS4 is purportedly easier to dev for, it would simply be a reversal of the 360/PS3 multiplat situation we have this gen.

I agree. Microsoft's under no obligation to do that sort of stuff. They don't own the gaming industry like many Xbox fanboys think.

3rd party publishers can easily ignore Microsoft's bribes and can even make games PS4-exclusive should the PS4 have a much higher install base than the Xbox One.

Judging by the Wii, maybe a metric ton of them, if it's used correctly. I firmly believe that bundling the Kinect was the right decision. A console can't survive on specs alone, and the PS4 has nothing to attract the casual crowd with.

Hate to rain on your parade, but most consumers aren't gonna shell out $500 w/tax just for another version of Kinect.

Also, the popularity that Kinect used to have is long gone now. Just because it was popular on Xbox 360, doesn't mean it'll be the same for Xbox One.
 
Thank you... but it does seem I made a mistake I want to correct. According to the "leaked" architecture diagrams, the Xbox One has a ~30GB/sec bus to the DDR3 (i.e. it's not the full 68GB/sec).

The PS4 has ~20GB/sec. But the PS4 also has the Onion/Onion+ buses for another 10GB/sec to the GDDR5. So for CPU memory access the two seem to be equivalent... if not the Xbox One potentially being a bit faster. (Wish I had a dev kit to play with.)

Wouldn't it be nice if they sent DevKits out to do things like theoretical benchmarking that could actually be published to the public? That'll be the day.
 
I don't believe that's the case. So far, for both systems, the CPUs look to be standard Jaguars, so the L2 can only consume what a standard Jaguar core cache can. Without heavy modification, it's not going to be able to consume more or write more.

In the vgleaks diagram, the CPU bus from the GDDR is marked as <20GB/sec. If that's true, then the CPU doesn't benefit from the faster RAM. (Not that it really matters.)

Yes, you are correct. (See my 2nd post correcting my error.)
 
Have you told MS this? Someone should tell them that it is all a foregone conclusion.

Y'know, you're just not willing to accept anything that might seem negative for MS, are you?

MS would know this. But they expected (I'm sure) to launch from a position of goodwill, with reasonable price parity (because all signs pointed to the PS4 also launching with a camera packed in, which would have brought the prices much closer), more or less power parity (all signs pointed to Sony having 4GB RAM) and on the back of a strong reveal.

They got caught badly twice: by Sony (who clearly dropped the camera late in the day and doubled their RAM size) and by the market.

Now I'm not saying it's a foregone conclusion, but I am saying you are flat out wrong if you think the UK has the kind of brand loyalty you think it has, and I am saying the odds are against MS right now due to current circumstances (in the UK).

It's all chance and probability, but you can see patterns in markets, and the UK is not that brand-driven for games consoles.

But I won't bother replying again, because you're clearly not able to take on board anything that doesn't fit into an MS-looking-good reality. One last thing though - I've no actual preference; this gen I had a PC, 360, PS3 and Wii. I'm simply pointing out local market realities, and it's not my - or anyone else's - fault that the MS reveal went badly, that they suffered a consumer backlash, or that they've ended up more expensive than their key competitor. That's just how it played out, and to ignore the likely impact this will have at market is just to put blinkers on.

Few people (outside real Sony/Nintendo diehards) are raising concerns and issues around the XB1 because they're biased - they're just talking realistically about where the console is right now.

Last gen Sony struggled for ages coming off the (still) best-selling and most popular home console ever, because they made some mistakes and got caught by surprise by the competition in some areas. Right now MS is the one in that position, and we don't have to look much further than how MS was able to take advantage last gen (from a weaker brand standpoint than Sony had then) to see how the market will likely play out, unless MS pulls off some strong reversals (given the MS brand now is still comparatively weaker than Sony's was coming off the PS2).
 