VGLeaks rumor: Durango CPU Overview

LukasTaves said:
Is it really that common?
Using LZO has been standard practice for most/all game assets on disc since the PS1 days - so yes, it's an extremely common usage pattern, and it's also the main reason why hardwiring the algorithm into new consoles is cost-efficient/sensible.

The leaked performance figures are fine - the usual usage pattern is to speed up external media (HDD, optical or network), and 200MB/s will keep up with all but the fastest HDDs (the rumors also indicate a slightly more demanding algorithm than LZO, so that factors in as well).
I'll agree, though, that decompression is not a big deal - but rumors suggest the unit can also do compression, which makes it much more interesting IMO.
 
I have read some of the patent, and everything I have come across so far has been DMA + compression; there was nothing other than regular DMA + compression mentioned by vgleaks, so why expect something else?

If you look at the building blocks, they do seem to be only DMA (with tiling and untiling capabilities) plus compression, but the operations described there do not sound like a regular DMA to me (perhaps it's just the wording, I hate patent language xD).

I'm not expecting them to do much else than move data around, but I'm expecting them to do so more intelligently than by just following CPU commands to read/write data (even though they can still function as that too).
 
DXTCn is far, far better for textures than JPEG; a shame that it won't decode into it.

GPUs do realtime decompression of DXT, despite these "rumors" and "secret sauce" claims to the contrary. This is why DXT is used for texture data.

JPEG creates smaller sizes for texture storage, but it then requires decompression and recompression into DXT before sending the data to the GPU. It is not just CPU-intensive; it also introduces another lossy format into the mix, which lowers the overall quality of the textures.

What developers currently do is zip their textures into one file and then stream from it when needed. Zip is lossless, so no data is lost. PNG uses the same compression algorithm as Zip. With intelligent compression, you can get great results from lossless compression of DXT textures.

200 MB/s for a zlib decoder is extremely fast. It is so fast that it doesn't even need to be that fast: unless you have an SSD, your BD drive and HDD can't feed enough data to make full use of it.

So that's very good... It allows for many things that will make our experience on the PS4/720 better - from faster loading to "faster" installation of games, both things the PS3 has struggled with, for instance. In fact, I would say that right now that 200 MB/s will be more useful than the 8 GB everyone is focusing on :-).
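The lossless round trip described above is easy to sketch with Python's zlib module; the texture bytes here are a made-up stand-in for real DXT data:

```python
import zlib

# Stand-in for DXT-compressed texture blocks packed into a game archive;
# real texture data is similarly structured and repetitive.
texture_data = bytes(range(64)) * 1024          # 64 KiB of patterned bytes

packed = zlib.compress(texture_data, 9)         # DEFLATE, max effort
unpacked = zlib.decompress(packed)

assert unpacked == texture_data                 # lossless: bit-identical
print(f"{len(texture_data)} -> {len(packed)} bytes")
```

On a hardware decoder like the one rumored, the decompress side of this would presumably run at the quoted 200 MB/s without touching the CPU.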
 
spwolf said:
200 MBs for zlib decoder is extremely fast. It is so fast that it is not needed to be that fast
It's worth looking beyond external device I/O, though. E.g., fast JPEG on PS2 had the benefit of being usable as an on-demand decompressor from RAM to eDRAM, and with the hw extensions for virtualizing memory access, decompression can get even more mileage.
 
GPUs do realtime decompression of DXT, despite these "rumors" and "secret sauce" claims to the contrary. This is why DXT is used for texture data.

JPEG creates smaller sizes for texture storage, but it then requires decompression and recompression into DXT before sending the data to the GPU. It is not just CPU-intensive; it also introduces another lossy format into the mix, which lowers the overall quality of the textures.

What developers currently do is zip their textures into one file and then stream from it when needed. Zip is lossless, so no data is lost. PNG uses the same compression algorithm as Zip. With intelligent compression, you can get great results from lossless compression of DXT textures.

200 MB/s for a zlib decoder is extremely fast. It is so fast that it doesn't even need to be that fast: unless you have an SSD, your BD drive and HDD can't feed enough data to make full use of it.

So that's very good... It allows for many things that will make our experience on the PS4/720 better - from faster loading to "faster" installation of games, both things the PS3 has struggled with, for instance. In fact, I would say that right now that 200 MB/s will be more useful than the 8 GB everyone is focusing on :-).



Is the zlib decompression hardware in the PS4 the same as the hardware that's in the Xbox 3?
 
The leaked performance figures are fine - the usual usage pattern is to speed up external media (HDD, optical or network), and 200MB/s will keep up with all but the fastest HDDs (the rumors also indicate a slightly more demanding algorithm than LZO, so that factors in as well).
I'll agree, though, that decompression is not a big deal - but rumors suggest the unit can also do compression, which makes it much more interesting IMO.

200 MB/s is extremely fast... the problem is that it is probably theoretical performance - in memory. Nevertheless, with a dedicated engine for it, they will probably have an SDK for streaming data on demand from one big game file, so that's going to be very good.

Supposedly MS is using LZX... it is 2x more powerful than LZO at compression.

Compression doesn't matter, and 200 MB/s is certainly not for compression.

Overall, it is a very important bit of the 720/PS4.
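A minimal sketch of that "one big game file" idea - all names here are hypothetical, and a real SDK would add alignment, hashing and async I/O on top:

```python
import zlib

# Toy packed-archive format: each asset is zlib-compressed and appended to
# one blob, with a table of (offset, compressed_size) entries so individual
# assets can be streamed and inflated on demand.

def pack_assets(assets):
    """Concatenate zlib-compressed assets; return (blob, table)."""
    blob, table = bytearray(), {}
    for name, data in assets.items():
        comp = zlib.compress(data)
        table[name] = (len(blob), len(comp))
        blob += comp
    return bytes(blob), table

def read_asset(blob, table, name):
    """Stream one asset back out of the blob and inflate it."""
    offset, size = table[name]
    return zlib.decompress(blob[offset:offset + size])

blob, table = pack_assets({"level1.tex": b"A" * 4096, "level1.mesh": b"B" * 2048})
assert read_asset(blob, table, "level1.tex") == b"A" * 4096
```

With a hardware inflate unit, the `zlib.decompress` step is exactly the part that could be offloaded from the CPU.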
 
Is the zlib decompression hardware in the PS4 the same as the hardware that's in the Xbox 3?

Depending on which algorithm it implements it could even be better, but chances are it's the same. Durango implements LZ77. [PS4's zlib is probably using DEFLATE (it seems to be zlib's decompression method), which is an extension of LZ77.]
 
It's worth looking beyond external device I/O, though. E.g., fast JPEG on PS2 had the benefit of being usable as an on-demand decompressor from RAM to eDRAM, and with the hw extensions for virtualizing memory access, decompression can get even more mileage.

Yeah, but DXT changes that... since GPUs take DXT, you have to recompress the data in memory, which sucks. GPUs don't do realtime DXT compression, just decompression.

I didn't run any benchmarks, but, for instance, crunch is a much better solution than using JPEG to store textures on disk. And their solution for textures might be very similar to crunch in nature, where a DXT-compressed texture is smartly compressed with a lossless codec that is tuned to the nature of the DXT format/data.
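The core trick behind crunch-style coding can be illustrated with a toy model (this is not crunch's actual codec): a DXT1 block is 8 bytes - two 16-bit colour endpoints plus 32 bits of 2-bit selector indices - and grouping like fields together before a generic lossless coder usually helps, because neighbouring blocks have similar endpoints.

```python
import random, zlib

# Fake DXT1-ish blocks: smoothly varying "endpoints" + noisy "indices".
random.seed(0)
blocks = []
for i in range(4096):
    endpoints = (i % 256).to_bytes(1, "little") * 4           # slowly varying
    indices = bytes(random.randrange(256) for _ in range(4))  # noisy selectors
    blocks.append(endpoints + indices)
raw = b"".join(blocks)

# Deinterleave into planes: all endpoint bytes first, then all index bytes.
planes = b"".join(b[:4] for b in blocks) + b"".join(b[4:] for b in blocks)

print("interleaved:", len(zlib.compress(raw, 9)))
print("planed:     ", len(zlib.compress(planes, 9)))
assert zlib.decompress(zlib.compress(planes, 9)) == planes    # still lossless
```

The planed layout gives the LZ77/Huffman stages long runs of similar bytes to chew on, which is the same intuition behind compressing DXT with a format-aware lossless codec.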
 
So there aren't any specs on it besides VGleaks saying it's there?

Nope, but I just checked, and it is most likely DEFLATE, which improves upon LZ77 - though by how much I dunno (it adds Huffman coding to the LZ77-compressed stream, AFAICT).
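The two stages of DEFLATE (LZ77 match-finding, then Huffman coding of the resulting symbols) can be teased apart with Python's zlib, whose strategy flag can disable the LZ77 stage. Sizes here are illustrative, not a benchmark:

```python
import zlib

data = b"the quick brown fox " * 500   # 10 KB of repetitive text

def deflate(payload, strategy):
    # compressobj(level, method, wbits, memLevel, strategy)
    c = zlib.compressobj(9, zlib.DEFLATED, zlib.MAX_WBITS, 9, strategy)
    return c.compress(payload) + c.flush()

huff_only = deflate(data, zlib.Z_HUFFMAN_ONLY)   # Huffman, no LZ77 matches
full = deflate(data, zlib.Z_DEFAULT_STRATEGY)    # LZ77 + Huffman

print(len(data), len(huff_only), len(full))
assert len(full) < len(huff_only) < len(data)
```

On repetitive data the LZ77 stage does most of the work; Huffman coding alone only squeezes out the skew in the byte distribution.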

Yeah, but DXT changes that... since GPUs take DXT, you have to recompress the data in memory, which sucks. GPUs don't do realtime DXT compression, just decompression.

I didn't run any benchmarks, but, for instance, crunch is a much better solution than using JPEG to store textures on disk. And their solution for textures might be very similar to crunch in nature, where a DXT-compressed texture is smartly compressed with a lossless codec that is tuned to the nature of the DXT format/data.

Yeah, and to take this even further: from what I have read, modern graphics cards don't have to decompress the data until it's needed, so it can sit in their caches compressed (L1, I think, but I'm not too sure) - but you lose all of this if you have uncompressed textures, which is exactly what the Move Engine would produce from a JPEG.
 
spwolf said:
Compression doesnt matter, and 200 MBs is certainly not for compression.
The amount of data writes (or sends) in modern games is increasing at a rapid pace - it's becoming more important to compress those, and a dedicated unit can be a major win there. Of course I don't expect it to run at 200MB/s - but then it doesn't need to.

Yeah, but DXT changes that... since GPUs take DXT, you have to recompress the data in memory, which sucks.
Only if you're doing it on highly persistent data.
Decompression on demand into eDRAM doesn't need to keep the data around - so any output the GPU can read will do (and I believe the leaks specified 16-bit YUV as an option).

I didn't run any benchmarks, but, for instance, crunch is a much better solution than using JPEG to store textures on disk.
Very true - but that's a different usage pattern, where your data remains resident in memory for a long time.
 
I have been trying to follow it - looked through the past 3 pages... so JPEG... 373 MB/s?
Modern GPUs don't have texture compression support?

Please bear with me, I am trying to learn.

Textures are sometimes stored and decoded as JPEG in games. To decode them, you normally assign that work to your CPU, or to a compute-capable GPU; this is done to keep the file size small. After that, the data is fed into the GPU, which converts it to a texture format it supports. Reading the Durango summit papers, it seems the GPU supports several standard and some proprietary texture formats. What this does is save the CPU or GPU resources (read: ALU/FLOPs) you would otherwise spend doing this. So, like the other fixed-function units in both systems (audio unit, video decoder/encoder, etc.), it is there to save CPU/GPU resources that are better spent doing other things.
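The two loading paths under debate can be sketched side by side. The decode/encode functions below are stubs standing in for real codecs (libjpeg, a DXT compressor) so the flow is runnable; their names and behaviour are invented for illustration:

```python
import zlib

def decode_jpeg(data):          # stub: a real decoder returns raw RGBA pixels
    return b"RGBA" + data

def encode_dxt(pixels):         # stub: a real encoder is lossy and CPU-heavy
    return b"DXT" + pixels

def load_via_jpeg(jpeg_bytes):
    # Two CPU passes, and a second lossy step on already-lossy data.
    return encode_dxt(decode_jpeg(jpeg_bytes))

def load_via_dxt_zlib(packed):
    # One lossless inflate; the GPU reads the DXT blocks directly.
    return zlib.decompress(packed)

texture = zlib.compress(b"\x00" * 64)   # pretend DXT blocks, deflated on disk
assert load_via_dxt_zlib(texture) == b"\x00" * 64
```

This is why the thread keeps coming back to zlib-over-DXT: the JPEG route spends ALU time and quality that the lossless route never touches.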
 
Textures are sometimes stored and decoded as JPEG in games. To decode them, you normally assign that work to your CPU, or to a compute-capable GPU; this is done to keep the file size small. After that, the data is fed into the GPU, which converts it to a texture format it supports. Reading the Durango summit papers, it seems the GPU supports several standard and some proprietary texture formats. What this does is save the CPU or GPU resources (read: ALU/FLOPs) you would otherwise spend doing this. So, like the other fixed-function units in both systems (audio unit, video decoder/encoder, etc.), it is there to save CPU/GPU resources that are better spent doing other things.

Why in God's name would you want to move the JPEG to eSRAM, decompress it into a raw format, and then re-encode it again? That's an incredible waste of CPU/GPU performance - why not have it compressed as DXTCn in the first place? Also, the Durango GPU supports everything the GCN cards do; that's it, no special formats.
 
GPUs do realtime decompression of DXT, despite these "rumors" and "secret sauce" claims to the contrary. This is why DXT is used for texture data.

JPEG creates smaller sizes for texture storage, but it then requires decompression and recompression into DXT before sending the data to the GPU. It is not just CPU-intensive; it also introduces another lossy format into the mix, which lowers the overall quality of the textures.

What developers currently do is zip their textures into one file and then stream from it when needed. Zip is lossless, so no data is lost. PNG uses the same compression algorithm as Zip. With intelligent compression, you can get great results from lossless compression of DXT textures.

200 MB/s for a zlib decoder is extremely fast. It is so fast that it doesn't even need to be that fast: unless you have an SSD, your BD drive and HDD can't feed enough data to make full use of it.

So that's very good... It allows for many things that will make our experience on the PS4/720 better - from faster loading to "faster" installation of games, both things the PS3 has struggled with, for instance. In fact, I would say that right now that 200 MB/s will be more useful than the 8 GB everyone is focusing on :-).

Really nice post. It seems like you now understand what I am saying. These things will make these consoles better, irrespective of the need to compare or trivialize their functions.
 
Why in God's name would you want to move the JPEG to eSRAM, decompress it into a raw format, and then re-encode it again? That's an incredible waste of CPU/GPU performance - why not have it compressed as DXTCn in the first place? Also, the Durango GPU supports everything the GCN cards do; that's it, no special formats.

Read what fafalada just posted. He is a game developer and has more knowledge about these things than I do. And you should read the Durango papers as well. The Xbox 360/PS3 also have special texture formats. Maybe I should use the word "custom" to make it clearer. By custom I certainly do not mean it's better than what is already available.
 
Questions for the tech guys :D

Are we 100% sure that the CPU is from AMD?

When I read vgleaks I lean more toward IBM.

Is it possible that the CPU is an IBM A2?

Quote:

Such facilities are handled by the AXU, which has support for any number of standardized or customized macros, such as floating point units, vector units, DSPs, media accelerators (Move Engines?) and other units with instruction sets and registers not part of the Power ISA. The core has a system interface unit used to connect to other on-die cores, with a 256-bit interface for data writes and a 128-bit interface for instruction and data reads at full core speed.

Quote:

The Blue Gene/Q processor is an 18 core chip running at 1.6 GHz with special features for fast thread context switching, quad SIMD floating point unit, 5D torus chip-to-chip network and 2 GB/s external I/O. The cores are linked by a crossbar switch at half core speed to a 32 MB eDRAM L2 cache. The L2 cache is multi-versioned and supports transactional memory and speculative execution. A Blue Gene/Q chip has two DDR3 memory controllers running at 1.33 GHz, supporting up to 16 GB RAM

Or is this wrong?

Thnx guys :)

Jo
 
VGleaks specifically says x86, so we're pretty sure it isn't PowerPC.

I see :)

You know, we are just hunting for rumors and speculation because right now it's totally dead :D

Who knows - maybe it's AMD, maybe it's IBM. Nothing is 100% until MS reveals the console, right?

I just hope, for us gamers, that whatever they release will keep us happy 10 years down the road. Not another Wii U BS.
 
I see :)

You know, we are just hunting for rumors and speculation because right now it's totally dead :D

Who knows - maybe it's AMD, maybe it's IBM. Nothing is 100% until MS reveals the console, right?

I just hope, for us gamers, that whatever they release will keep us happy 10 years down the road. Not another Wii U BS.

If it's an APU, it's an AMD CPU.
 
If it's an APU or SoC, why is there a northbridge showing in the VGleaks documents?


memory_system.jpg


durango_arq1.jpg


Shouldn't the CPU & GPU already be connected?

APUs have northbridges inside them. Even most CPUs have integrated northbridges these days, ever since memory controllers were moved into the processor. The memory interface and highspeed interconnects (AGP, PCI, PCI-Express) used to be handled by a separate northbridge chip, but ever since the Athlon 64 era those functions became too performance sensitive to be off-chip functions.
 
Maybe they are going to go down the shorter-lifespan route and copy the PC/Apple model of releasing a slightly updated version every year or two. Using generic hardware could make this more feasible for developers. This is easily managed with PC hardware, so why not with a more generic console?
 
Maybe they are going to go down the shorter-lifespan route and copy the PC/Apple model of releasing a slightly updated version every year or two. Using generic hardware could make this more feasible for developers. This is easily managed with PC hardware, so why not with a more generic console?

Because it will be as confusing as shit to consumers; the N64 memory Expansion Pak was bad enough, but an entirely new console? Every year? No.

This is ignoring the fact that it takes nearly a year just to design a console.
 
Maybe they are going to go down the shorter-lifespan route and copy the PC/Apple model of releasing a slightly updated version every year or two. Using generic hardware could make this more feasible for developers. This is easily managed with PC hardware, so why not with a more generic console?

I can't think how this would work with 3+ year dev cycles. Which model would devs target?
 
it's unnecessary since the thread has come back on topic. Just because you feel you should reply to something doesn't mean it's wise to actually do so
 
I can't think how this would work with 3+ year dev cycles. Which model would devs target?

Which model do PC devs target?


I'm not saying this will happen, just saying it's entirely possible. Because these new consoles are built on direct PC architecture, it would be easier than ever to just release updated models every few years. Devs could just program two or three "settings" of the game, to run on each iteration of the hardware.

It would be like PC development of today, but even easier because you're still not dealing with dozens of different configurations, you're dealing with a few.
 
Which model do PC devs target?


I'm not saying this will happen, just saying it's entirely possible. Because these new consoles are built on direct PC architecture, it would be easier than ever to just release updated models every few years. Devs could just program two or three "settings" of the game, to run on each iteration of the hardware.

It would be like PC development of today, but even easier because you're still not dealing with dozens of different configurations, you're dealing with a few.

I've speculated before that I expect console cycles to decrease for this particular reason. However, it is highly dependent on how successful the consoles are and how large the transitions are. Ideally the companies would probably like to end up in an Android, or rather more so an iOS, scenario where software runs across generations.
 
Because it will be as confusing as shit to consumers; the N64 memory Expansion Pak was bad enough, but an entirely new console? Every year? No.

This is ignoring the fact that it takes nearly a year just to design a console.

Apple manages it with the iPad and iPhone, and consumers deal with that easily. Direct download and XBLA would be straightforward, with a "not compatible with your device" notice as with iTunes or Google Play.

Plenty of other electronic devices go through this design-and-release cycle over short periods of time; I don't see this being an issue if the base design is generic enough - again, like PCs/Android/Apple.
 
Apple manages it with the iPad and iPhone, and consumers deal with that easily. Direct download and XBLA would be straightforward, with a "not compatible with your device" notice as with iTunes or Google Play.

Plenty of other electronic devices go through this design-and-release cycle over short periods of time; I don't see this being an issue if the base design is generic enough - again, like PCs/Android/Apple.

This wouldn't work because of the different market and cost variations of apps/games. A 5-year refresh rate would work just fine, but annual would kill the console makers, and games would never truly advance, because devs/pubs would always focus on the lowest common denominator (i.e. the first gen), which is what most people would have.
 
This wouldn't work because of the different market and cost variations of apps/games. A 5-year refresh rate would work just fine, but annual would kill the console makers, and games would never truly advance, because devs/pubs would always focus on the lowest common denominator (i.e. the first gen), which is what most people would have.

Right. There are console games that take years to develop.
 
Ah, gaming forums... where if you don't hate something 100% then you are part of a "defense force".

Very mature.

Well, it is a CPU thread, and someone mentions Kinect and suddenly the usual suspects rush in and try to convince everyone it is a good thing. So ya, it is a defense force. Don't want to be lumped in? Don't post so predictably about the same topics so much.
 
Which model do PC devs target?


I'm not saying this will happen, just saying it's entirely possible. Because these new consoles are built on direct PC architecture, it would be easier than ever to just release updated models every few years. Devs could just program two or three "settings" of the game, to run on each iteration of the hardware.

It would be like PC development of today, but even easier because you're still not dealing with dozens of different configurations, you're dealing with a few.

Consoles! But yes this is technically possible but I don't think either Sony or Microsoft are really looking into this.
 
Saying it won't work makes little sense, though; this has worked for PC gaming for decades, with multiple platforms and multiple operating system versions. Apple has proven this works, with the iPad having gone through 4 generations since 2010.

I'm not saying it's a good thing, just that I wonder if MS is looking at the likes of Apple and Android and thinks it can shoehorn this type of product release schedule into the home living-room box product.

They seem to be looking to create a product that covers a wide variety of uses, and being able to get a regular hardware purchase and offer customers the most up-to-date hardware for an affordable price IMO fits into that model better than one new box every decade.

I agree a yearly release wouldn't be great, but that doesn't mean something like it won't happen; maybe 2 or 3 years is more feasible.
 
it's unnecessary since the thread has come back on topic. Just because you feel you should reply to something doesn't mean it's wise to actually do so

Sorry for replying to a message directed towards me.

I thought that was the whole idea of forums.


Well it is a CPU thread and someone mentions Kinect and suddenly the usual suspects rush in and try to convince everyone it is a good thing. So ya, it is a defense force. Don't want to be lumped in, don't post so predictably about the same topics so much.

Not trying to convince anyone of anything. Just made a simple reply to what was being discussed.

Don't see why it's so hard to have an actual discussion instead of lumping people in due to different views.

But whatever -- never meant to derail the spec-speculation thread.
 
Saying it won't work makes little sense, though; this has worked for PC gaming for decades, with multiple platforms and multiple operating system versions. Apple has proven this works, with the iPad having gone through 4 generations since 2010.

...I agree a yearly release wouldn't be great, but that doesn't mean something like it won't happen; maybe 2 or 3 years is more feasible.

Different markets. Different expectations from consumers in terms of where their money is going.

Consoles have been around for a pretty long time now and people are used to spending $300-400 every 5 years or so on a console for video game entertainment.

Doing something every 2 years would put the Xbox brand in a similar position to Sega during the mid '90s -- many consoles, customer confusion, devs not knowing what to support, etc. Just wouldn't be that smart to do imo.
 
This wouldn't work because of the different market and cost variations of apps/games. A 5-year refresh rate would work just fine, but annual would kill the console makers, and games would never truly advance, because devs/pubs would always focus on the lowest common denominator (i.e. the first gen), which is what most people would have.

Either way, in 5-6 years the lowest common denominator will be the X720/PS4.
But with more frequent releases we would get more capable options more often, instead of one big jump every 5 years.
Let's say that after the X720, MS releases an updated console every 2 years: same architecture, just better specs (faster GPU/CPU, more RAM). They support the last 3 versions, and developers must do so too.
All new games are compatible with the last 3 versions, and because of DirectX, maybe old games can be made to run better on new consoles (more AA, SSAO, etc.) without updates. Just like PC developers make low/mid/high settings, they could do the same for consoles - even easier, because they have only 3 very similar targets.

I would prefer that business model over the current one. Waiting 6 years for a new console is bullshit. I buy a new iPad every 2 years and I want the same with consoles.
I hope MS tries something like this. It is time someone changed the console business.
 
Different markets. Different expectations from consumers in terms of where their money is going.

Consoles have been around for a pretty long time now and people are used to spending $300-400 every 5 years or so on a console for video game entertainment.

Doing something every 2 years would put the Xbox brand in a similar position to Sega during the mid '90s -- many consoles, customer confusion, devs not knowing what to support, etc. Just wouldn't be that smart to do imo.

I think that's the thing, though: this box isn't just targeted at video game development, hence the reason MS could go down this route. Sure, the nextbox could be under-specced compared to Sony's, but in 2 years nextbox gen 2 is out with full BC and a spec that goes past Sony's for the same price, and two years after that they have the strongest living-room box in terms of power, and MS has been taking in double, sometimes triple, the hardware sales. Again, I am not saying it's the right way to go, but if it were me I would certainly be looking at it as an option.
 
Sorry for replying to a message directed towards me.

I thought that was the whole idea of forums.




Not trying to convince anyone of anything. Just made a simple reply to what was being discussed.

Don't see why it's so hard to have an actual discussion instead of lumping people in due to different views.

But whatever -- never meant to derail the spec-speculation thread.

You do come in at certain times... just saying. If you took part in the discussion, no one would be ragging on you like this. You only post when it favors you? I could be wrong, though.
 
Either way, in 5-6 years the lowest common denominator will be the X720/PS4.
But with more frequent releases we would get more capable options more often, instead of one big jump every 5 years.
Let's say that after the X720, MS releases an updated console every 2 years: same architecture, just better specs (faster GPU/CPU, more RAM). They support the last 3 versions, and developers must do so too.
All new games are compatible with the last 3 versions, and because of DirectX, maybe old games can be made to run better on new consoles (more AA, SSAO, etc.) without updates. Just like PC developers make low/mid/high settings, they could do the same for consoles - even easier, because they have only 3 very similar targets.

I would prefer that business model over the current one. Waiting 6 years for a new console is bullshit. I buy a new iPad every 2 years and I want the same with consoles.
I hope MS tries something like this. It is time someone changed the console business.

Not a good idea. Consoles benefit from low-level coding. If you do this multiple-platforms thing, they'd have to use a higher-level methodology, which would crap out performance. It makes a console into a PC.
 
Either way, in 5-6 years the lowest common denominator will be the X720/PS4.
But with more frequent releases we would get more capable options more often, instead of one big jump every 5 years.
Let's say that after the X720, MS releases an updated console every 2 years: same architecture, just better specs (faster GPU/CPU, more RAM). They support the last 3 versions, and developers must do so too.
All new games are compatible with the last 3 versions, and because of DirectX, maybe old games can be made to run better on new consoles (more AA, SSAO, etc.) without updates. Just like PC developers make low/mid/high settings, they could do the same for consoles - even easier, because they have only 3 very similar targets.

I would prefer that business model over the current one. Waiting 6 years for a new console is bullshit. I buy a new iPad every 2 years and I want the same with consoles.
I hope MS tries something like this. It is time someone changed the console business.

No thank you - that would mean shorter dev cycles for games, more rush jobs shoved out the door before they were ready, and fewer risks taken with cutting-edge hardware overall.

It would also mean higher console prices, as the razor/razor-blade model would have to be dropped entirely, so you'd either end up with lower overall hardware quality or higher initial prices.

No thanks.
 
I think that's the thing, though: this box isn't just targeted at video game development, hence the reason MS could go down this route. Sure, the nextbox could be under-specced compared to Sony's, but in 2 years nextbox gen 2 is out with full BC and a spec that goes past Sony's for the same price, and two years after that they have the strongest living-room box in terms of power, and MS has been taking in double, sometimes triple, the hardware sales. Again, I am not saying it's the right way to go, but if it were me I would certainly be looking at it as an option.

Problem is, you end up in the same situation as the PC now: Titan-equipped i7 PCs (the latest-iteration Xbox) being held back by the consoles (the first-iteration Xbox). Exaggerating, of course, but you get the point.

Also, it would require chip nodes to shrink at 2-year intervals to deliver a more powerful console at the same TDP, I would think. How long have we been at 28nm now? And how long from 20nm in, say, late 2015 until 14nm comes along?

I like simplicity. 2013 / 2TF / 5 years. Developers, squeeze what you can out of this thing. Rinse/repeat.
 
Maybe they are going to go down the shorter-lifespan route and copy the PC/Apple model of releasing a slightly updated version every year or two. Using generic hardware could make this more feasible for developers. This is easily managed with PC hardware, so why not with a more generic console?

Durango is a generational leap on its own and the differences between PS4 games and games on Durango won't be obvious to the general market, even if they're things that seems substantial to enthusiasts. Durango isn't underpowered, PS4 is just more powerful. There's a big difference. There's way too much money invested in its research and development to drop it after a couple years, it'll be here to stay for the whole gen. It'll take a game or two for devs to even get a handle on how to really push it, and that'll be 2-4 years into the gen.
 
I'm not saying it's a good thing, just that I wonder if MS is looking at the likes of Apple and Android and thinks it can shoehorn this type of product release schedule into the home living-room box product.
Big problem with this is pricing. Consoles are usually sold at really low or negative margins, in the hope of making money on software and, later on, through reduced manufacturing complexity. This would not be possible if new consoles had to be designed and sold every year - they'd constantly be losing money on hardware. The only reason this model works for Apple is that they managed to convince people it's worth paying so much for tablets and phones that they make a large profit on each device sold. No one will pay such a premium for a game console or set-top box device; the PS3 proved that very clearly.
 
Durango is a generational leap on its own and the differences between PS4 games and games on Durango won't be obvious to the general market, even if they're things that seems substantial to enthusiasts. Durango isn't underpowered, PS4 is just more powerful. There's a big difference. There's way too much money invested in its research and development to drop it after a couple years, it'll be here to stay for the whole gen. It'll take a game or two for devs to even get a handle on how to really push it, and that'll be 2-4 years into the gen.

The differences between X360 games and PS3 games have driven plenty of 360 sales.
 
The differences between X360 games and PS3 games have driven plenty of 360 sales.

We can confidently assume a small bump in sales due to publicized differences, anyway - and that despite being hit over the head with sledgehammers about how the PS3 is a significantly more powerful platform on paper, and about its free online play. However, most of the difference probably comes from preference for the platform's own strengths and from the larger, more connected userbase of gamers, who go where their friends and others are, leading to greater and more effective word of mouth. Of course, it helps immensely that the X360 significantly leads in installed base in the largest, most software-sales-heavy market in the world. Barely anyone outside of forum-dwellers reads or strongly cares about DF or LoT analyses. If there's one thing the PS3 proved quite clearly, it's that highly-touted spec differences can mean very little to the end products' commercial performance.
 