
VG247 - PS4: new kits shipping now, AMD A10 used as base, final version next summer

i-Lo

Member
GAF will be glorious once PS4 specs are released. The meltdowns are already happening.

Here's the bucket of tears, Your Highness. Now be silent and drink from the fountain of sorrow, for it tastes delicious.

PS4 may have decent specs when unveiled, but the true uproar will commence when the XB3's specs are revealed (provided they are superior).
 

Sid

Member
Here's the bucket of tears, Your Highness. Now be silent and drink from the fountain of sorrow, for it tastes delicious.

PS4 may have decent specs when unveiled, but the true uproar will commence when the XB3's specs are revealed (provided they are superior).
Yeah, I'm hoping at least one of them will be a true beast. I'm getting both and a gaming PC either way, just like this gen.
 

Jburton

Banned
Here's the bucket of tears, Your Highness. Now be silent and drink from the fountain of sorrow, for it tastes delicious.

PS4 may have decent specs when unveiled, but the true uproar will commence when the XB3's specs are revealed (provided they are superior).

According to you, the PS4 would be as weak as the Wii U; now it will have decent specs!

Which one is it?

Also, 'no word on RAM and GPU in the PS4' is a falsehood you have been spouting in this thread.
 

KageMaru

Member
I agree. Sony would be extremely short-sighted to release an underpowered machine. They fostered an entire first party on the bleeding edge of visual fidelity. Both consoles should be within about 10% of each other. The true differences will lie with the first parties, who will code specifically for their platforms.

You act like either company would know if they released hardware that was underpowered. There's little chance either company knows about the competition beyond the same gossip and rumors we read on the internet.

Not saying it's going to happen, but it's possible one could release a much more powerful console than the competition and no one would know until it's revealed.

Also, I agree with gofreak 100%. These vague measurements or multipliers are pretty much meaningless.
 

quest

Not Banned from OT
Insider knowledge or just a guess?

I would say expectations from most people for both next-generation consoles. People are expecting 4-8 GB of GDDR5, 2.5-3.0 TF GPUs or 16-thread CPUs. I fully expect the next-gen consoles to be a half-generation leap, a ways off from what the Dreamcast was over the PS1 and N64. I don't see any of the console makers wanting to take major losses while also wanting a sub-$399.99 SKU. Add to that that both are probably going to include a gimmick (Kinect 2, EyeToy 2), taking away even more from the CPU/GPU/RAM budget. My expectations are very low right now for the PS4 and 720. I am just looking forward to the small things, like much improved texture filtering and fewer sub-HD games.
 
Some folks are really ignorant. BC through Gaikai? BC through an add-on? It's freaking hard, if not outright impossible.

The only technically feasible way of including BC in the PS4 is adding the Cell to the system. Of course, that would make too much sense for the idiots in charge of SCE, if we are to believe the rumours.
The only way I see BC happening is if an actual piece of hardware is included in the PS4 that can run it. Just as you say.

Yeah, no. The last thing we need is to increase the cost of the system by throwing Cell in there and either cutting back on performance or passing the cost on to the consumer.
As crazy as it sounds, Jeff has shown AMD's interest (well, you didn't need Jeff to say it) in providing a system that can plug modules together into a single coherent system. If they can provide Jaguar (or whatever the rumored cores are) along with something that can handle PS3 BC, it could work well. It wouldn't be used JUST for PS3, but for PS4 games as well.

They don't have to use the PS3 version. They could use the PC version if one exists. PS-only/first-party games may be a problem, but they could address those with an emulator or a rewrite for PC.
That would NEVER happen. The undertaking of licensing PC versions of games when they already have the PS3 versions is just silly. And once again, first party would be a huge problem, and that is one of their biggest selling points. They aren't going to throw millions into first party and then just dump them.

Don't cloud-gaming services such as Gaikai/OnLive use high-end PCs? A PS3 would be cheaper than that. They'd probably nickel-and-dime the process too and end up making money on the PS3 back catalogue.

The transition from the PS3 to a more PC-type environment in the PS4 means the end of the BC difficulty from next gen onwards.

There are things to worry about, though. Will the process of transferring my PSN account over to PS4 make my digital-only PS3 games obsolete?

High-end PCs don't have the bandwidth to play PS3 games. Cell blade servers can, if paired with countless GPUs, but IBM doesn't make those anymore.

MASSIVE amounts of emulation would be required if using x86. MASSIVE amounts. Not only that, but providing BC for still highly relevant games would require lots of costly bandwidth on their end. I just don't see it happening.

But I also don't see them dropping BC at all.
 

StevieP

Banned
If you don't have a Cell with the memory and interfaces being pretty much exactly the same (timings being the most important), you won't get hardware BC.

Anything else would require re-licensing software - at your cost, probably.
Where is brain_stew when you need him!
 

eastmen

Banned
General numbers like 10% are a little meaningless.

Our current porting twins have differences far larger than that in some cases, in different aspects, going both ways.

But importantly they're still capable of sharing content.

I think something like '10%' - depending on what you're actually talking about - is far too tight a margin to place on things before there'd actually be a question mark over multiplatform-ism between the two.

The problem with the PS3 is simple: Blu-ray. It carried a premium price at the start of the gen and caused the PS3 to release a year later, and since it took a slice of the budget, it stopped them from increasing the RAM amount.

A PS3 with a DVD drive and even just a 256/512 RAM configuration over the 256/256 it shipped with would have slaughtered the Xbox 360 in terms of graphics.

At the same time, a PS4 with, say, 4 gigs of system RAM and 2 gigs of VRAM could get slaughtered visually by an Xbox 720 with 6 gigs of system RAM and 3 gigs of VRAM, or worse, 4 gigs.

They could also go with more and faster RAM, which would increase the visual gap even more.

Right now we have video cards with 2/3/4 gigs of RAM on them, but that RAM is just used to display higher resolutions. If a developer actually targeted these RAM amounts you would see a huge increase in graphical fidelity, but of course you'd give up the higher resolutions and FSAA amounts.


If the PS4 ships in November 2013 and the Xbox is delayed until 2014, MS could potentially be on the right side of a process (micron) drop, meaning they would be able to use bigger chips and more RAM at prices similar to Sony's console. So they might not even have to take such huge losses to put out a better console.

The reason people credit these advantages to MS and not Sony is the problems Sony is having as a company. MS can eat a few quarters or years of losses in the Xbox division because the rest of the company makes billion-dollar profits.

Sony, on the other hand, has been losing money or selling off parts of itself to stay afloat, so big losses on the PS4, coupled with weak demand for games, could end up sinking them.


It will be really interesting to see what happens next generation.
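
A rough sketch of the die-shrink arithmetic behind that "right side of a micron drop" point, in Python (idealized scaling with illustrative node numbers; real shrinks fall short of this):

def relative_die_area(old_nm, new_nm):
    # Die area scales with the square of the linear feature size.
    return (new_nm / old_nm) ** 2

shrink = relative_die_area(28, 20)
print(f"Same chip at 20nm: {shrink:.2f}x the area")     # ~0.51x
print(f"Chips per wafer: ~{1 / shrink:.1f}x as many")   # ~2.0x
print(f"Transistor budget at the old die size: ~{1 / shrink:.1f}x")

The point being: whoever lands on the newer node can buy roughly twice the logic for the same silicon cost, at least on paper.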
 

eastmen

Banned
Impossible? Would they have really bothered patenting something twice if it was impossible?

Gaikai works because you pay for it. So Sony would have to charge you for BC on the PS4 if they use Gaikai, or it will cost them money.

Gaikai works the same way OnLive works: they have numerous server farms running the games and compressing the framebuffer to send out to the end user, and it still ends in a subpar game experience.
 

Zoe

Member
Gaikai works because you pay for it. So Sony would have to charge you for BC on the PS4 if they use Gaikai, or it will cost them money.

Gaikai works the same way OnLive works: they have numerous server farms running the games and compressing the framebuffer to send out to the end user, and it still ends in a subpar game experience.

I was referring to the attachment there.
 

gofreak

GAF's Bob Woodward
At the same time, a PS4 with, say, 4 gigs of system RAM and 2 gigs of VRAM could get slaughtered visually by an Xbox 720 with 6 gigs of system RAM and 3 gigs of VRAM, or worse, 4 gigs.


I wasn't saying no possible difference would be meaningful. I was saying that putting a window like '10%' on it - which is rather different from outlining some specific issue - is a bit arbitrary and meaningless.
 
If you don't have a Cell with the memory and interfaces being pretty much exactly the same (timings being the most important), you won't get hardware BC.

Anything else would require re-licensing software - at your cost, probably.
Where is brain_stew when you need him!

Well yeah, but I just don't see it happening through Gaikai. I see Gaikai being used for other features. As I've said before, PlayStation Home would be a fantastic use of this. It doesn't need something super high bandwidth. Just re-code Home (one program) to run on x86 servers.

Want to get on Home? Bam, instant loading. No need to download spaces since it's all there.
 

Elios83

Member
The problem with the PS3 is simple: Blu-ray. It carried a premium price at the start of the gen and caused the PS3 to release a year later, and since it took a slice of the budget, it stopped them from increasing the RAM amount.

A PS3 with a DVD drive and even just a 256/512 RAM configuration over the 256/256 it shipped with would have slaughtered the Xbox 360 in terms of graphics.

At the same time, a PS4 with, say, 4 gigs of system RAM and 2 gigs of VRAM could get slaughtered visually by an Xbox 720 with 6 gigs of system RAM and 3 gigs of VRAM, or worse, 4 gigs.

They could also go with more and faster RAM, which would increase the visual gap even more.

Right now we have video cards with 2/3/4 gigs of RAM on them, but that RAM is just used to display higher resolutions. If a developer actually targeted these RAM amounts you would see a huge increase in graphical fidelity, but of course you'd give up the higher resolutions and FSAA amounts.


If the PS4 ships in November 2013 and the Xbox is delayed until 2014, MS could potentially be on the right side of a process (micron) drop, meaning they would be able to use bigger chips and more RAM at prices similar to Sony's console. So they might not even have to take such huge losses to put out a better console.

The reason people credit these advantages to MS and not Sony is the problems Sony is having as a company. MS can eat a few quarters or years of losses in the Xbox division because the rest of the company makes billion-dollar profits.

Sony, on the other hand, has been losing money or selling off parts of itself to stay afloat, so big losses on the PS4, coupled with weak demand for games, could end up sinking them.


It will be really interesting to see what happens next generation.

Blu-ray was the major problem cost-wise; it alone added an extra $100 to the retail price, and it caused delays. But Cell wasn't cheap either: yields were poor at the time and they were forced to use one SPE as redundancy to improve them. They also ended up downclocking the RSX from 550MHz to 500MHz to get better yields and reduce the system's total power consumption.

As for the next gen, at this point no one knows: we have no idea what kind of product these companies are making, the retail prices they're targeting, or the profitability they're targeting. As I said before, I'm sure that gaming-wise the next gen will just continue the same situation we have with the PS3 and 360: similar hardware power, the same games, first-party exclusives defining the systems.
But the assumption a few people are making that MS can do no wrong is just bullshit and can blow up in their faces.
Microsoft is of course much healthier than Sony, but they're under severe pressure as well, for different reasons. They have missed all the expansion opportunities they got in the past, Windows 8 so far is a disappointment far from having the success they hoped for, and their core businesses are declining. Investors are not happy with them because they're struggling to remain a major player in a changing market. For different reasons than Sony, Microsoft cannot afford to lose billions in their entertainment division either; they simply can't justify it to their investors.
Also, delays and manufacturing problems happen every time and can ruin everyone's party.

Rumors so far seem to indicate that Microsoft is focused on creating a Windows 8 media center with Kinect as part of their control interface. Hence it's likely that a lot of their silicon budget will have to go towards CPU multithreading, and they'll need a good amount of memory allocated just to the OS. How that will affect the gaming side of the system we don't know, as we don't know what kind of price they're targeting and whether they're seriously counting on subscription deals to make it cheaper (an iPhone-like business model).

I really hope that official announcements are around the corner (CES, GDC?). I believe that both Microsoft and Sony will make announcements before summer, leaving E3 as the first venue for playable games.
 

i-Lo

Member
According to you, the PS4 would be as weak as the Wii U; now it will have decent specs!

Which one is it?

Also, 'no word on RAM and GPU in the PS4' is a falsehood you have been spouting in this thread.

You'll know soon.

As for 'no word on RAM or GPU', I did state that so far nothing is concrete (taking into account the old rumour document and even the latest rumours of 8 and 16GB dev kits). The only somewhat certain info we have is Crytek's lust for more RAM and the representative's mathematical insinuations, from which it can be deduced (still an assumption, but a safer one than anything on Sony's side) that MS may have met their demands.

One thing is not a falsehood: it's people's expectations (rightly so) set against their pricing threshold, beyond which yells of "waaa... PS4 will be DOA if it comes out at over $My-price-range" can be heard loud and clear. As such, it's best to contain expectations for next gen. I for one would be most surprised if the PS4 (alone) can surpass the visual fidelity set by, say, BF3 on high (no, not even ultra) settings (with AA, AF and mitigated pop-in) in a non-linear game at a steady 30fps. At best we'll get 90% of TLoU's cinematic fidelity during gameplay in an open-world scenario, and that would be a true generational shift. Simply imagine playing the vanilla editions (which console owners are stuck with) of games like Fallout 4 or an Elder Scrolls game with the graphical fidelity of TLoU or Uncharted 3 (with further cosmetic improvements), as opposed to what's on the table today.
 
The core building blocks of the PS4 (Thebe) and Xbox 3 (Kryptos) will be based on mobile designs, not current A10 PC designs.

Wide IO (512-bit-wide memory) was first a mobile standard and should appear at the same time the next-generation consoles are released. In a handheld mobile SoC, Wide IO memory is clocked slower to increase efficiency. In a notebook, laptop or game console, the same design will be clocked higher, using stacked DDR4 or LPDDR3 to get GDDR5+ bandwidth with much less drive current and TDP. This is just about 100% confirmed for the PS4, and I have no doubt that the Xbox 3 will use the same.
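
For a rough sense of those numbers, here is a sketch of the peak-bandwidth arithmetic in Python (illustrative figures, not confirmed specs for any console): peak bandwidth is just bus width times effective clock.

def bandwidth_gbs(bus_bits, effective_mhz):
    # Bytes per transfer times transfers per second, in GB/s.
    return bus_bits / 8 * effective_mhz / 1000.0

print(bandwidth_gbs(256, 5500))   # 256-bit GDDR5 @ 5.5Gbps: ~176 GB/s
print(bandwidth_gbs(512, 800))    # mobile Wide IO @ 800MHz: ~51 GB/s
print(bandwidth_gbs(512, 3200))   # the same 512-bit bus clocked up: ~205 GB/s

That is the whole argument in one function: keep the wide bus, raise the clock, and you reach GDDR5-class bandwidth at lower power per bit.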

AMD's GNB core (Graphics North Bridge) is based on AMD Fusion core technology. The GNB is a fusion of graphics processor, power optimizer, audio processor, southbridge and northbridge, which share a common interface with system memory, and it will first be seen in the 2013 Samara mobile APU. It is a 20nm AMD building block and should be the coming 2014 technology.

Sweetvar26, who has proved to be very accurate and knew the Kryptos and Thebe AMD project names, has said that BOTH will be using Jaguar CPU packages. The HWinfo.com site confirms Thebe and Kryptos exist (Oct 8, 2012) as AMD products, but they are not listed anywhere in AMD articles. A developer leak has the Xbox 3 with multiple CPUs at 1.6 GHz (Jaguar CPU speed), and 4 packages of 4 cores appears in another leak.

That there are mobile GPUs (solar-system naming convention) and that Thebe is a moon of Jupiter (an Orbis/orbit reference) would tend to support the PS4 using mobile components like the GNB @ 20nm. And if there are mobile GPUs, then there has to be some plan to attach a second GPU to a mobile APU, which we haven't seen yet. Further, with a new Wide IO memory, mobile GPUs will no longer use GDDR5; if a faster memory is used for mobile GPUs it will be Ultrawide IO (512-1024 bits wide), and likely they will be 20nm GPUs and memory. Currently there is no official information on 8000-series mobile GPUs, and the guess is they are 28nm because they will be released in 2013. Samara is to be released in 2013 and is at least in part 20nm. With Wide IO memory available, a performance laptop (mobile) design can have a faster clock for the memory, and both the APU and a second GPU can use the same memory pool.

There is no AMD naming convention that could apply to Kryptos, the AMD-Microsoft Xbox 3 SoC project. I've speculated that Thebe is hidden in Kryptos, and mistercteam has speculated that a Volcanic Islands 20nm GPU family might be in Kryptos (kryptos means hidden, and a cryptodome is a hidden lava dome).

I find little thought is being given to this; the assumption is that high-performance designs (desktop CPUs and the like) will be used, with no thought to mobile being ultra-high performance per watt at lower clocks. Just increase the clock speeds of mobile designs and you have next-generation game console performance at a reasonable cost.

Thebes was stated by an outgoing AMD financial officer to be the Sony project name. Thebes is a city in Egypt, and the Kryptos statue in front of CIA Langley includes Howard Carter's account of the discovery of King Tut's tomb at Thebes, Egypt. Thebe, not Thebes, is the AMD project name, and it shows up in HWinfo.com as Thebe. Samara is a river name as well, and is a 2013 mobile APU with at least the GNB @ 20nm.

Any way you try to look at the Thebe name and AMD naming conventions, it comes up as a mobile design, and likely Thebe is part of the Kryptos SoC. Thebe does not preclude a second mobile GPU, which probably uses the same Ultrawide IO memory pool. There is also a chance it's 20nm, with a 2014 release for BOTH consoles.

Neptune and Sun, as well as Samara, are listed in HWinfo.com, so it's possible that Samara+Neptune and Samara+Sun are the PS4 and Xbox 3.

At the present time there are 7xxxM AMD graphics cards @ 28nm for mobile that use 256-bit-wide GDDR5.
Since the 6990M ran at 100W and was based on the 6870 core, I assumed that the card used as the flagship of the 7000M series would have a TDP similar to the desktop 6870's.

The card that best matches these criteria is the 7950, which does consume a little more than the 6870, but it's very close.

Thus, the next thing to do was to look at power efficiency, and the 7950 is 20-25% more power efficient than the 6870.
Samara (20W) + a 7950M-class chip from an 8000 or 9000 series GPU (~70W), using a common Ultrawide IO memory pool.
 
So what do you think they are targeting, 120 W as a whole?
That was just an example. The key here is that the ultrawide IO mobile standard is going to change laptop, notebook and phone designs.

In the AMD forums there has been mention of the PCIe bus missing on some APUs. PCIe was previously used to connect to I/O; in Kabini (to a lesser extent) and Samara, I/O is fully supported without a PCIe port. With no PCIe port, the standard way of connecting an add-on mobile GPU is gone.

On another note, Microsoft is advertising Windows 8 using a Sony 19" tablet with a touch screen (Sony logo prominently displayed). I suspect, from its initial placement in a 6-year-old's room, that it's also got TV display ability. Is this the start of microsoft-sony.com? Windows 8: so easy to use a 6-year-old can use it.
 

GopherD

Member
Now, answer the question: why are they both proceeding with customised mobile parts? No, it's not to keep in line with possible energy ratings or law changes. It's not cost-per-watt, because when you increase the clock, you decrease yields and decrease the cost-benefit ratio you were hoping for by going mobile. Technology efficiency gains are more achievable in-engine on more open hardware than by forcing through locked-down hardware design choices. Keep digging, Jeff.
 

mrklaw

MrArseFace
Now, answer the question: why are they both proceeding with customised mobile parts? No, it's not to keep in line with possible energy ratings or law changes. It's not cost-per-watt, because when you increase the clock, you decrease yields and decrease the cost-benefit ratio you were hoping for by going mobile. Technology efficiency gains are more achievable in-engine on more open hardware than by forcing through locked-down hardware design choices. Keep digging, Jeff.

Performance per watt? Even if they go with a launch-PS3 power envelope, that's maybe 120W for the GPU (a guess). What performance would you get from a desktop part at that power level vs. a mobile part?
 
Now, answer the question: why are they both proceeding with customised mobile parts? No, it's not to keep in line with possible energy ratings or law changes. It's not cost-per-watt, because when you increase the clock, you decrease yields and decrease the cost-benefit ratio you were hoping for by going mobile. Technology efficiency gains are more achievable in-engine on more open hardware than by forcing through locked-down hardware design choices. Keep digging, Jeff.
A game console is a locked-down device and is more efficient than a PC. That is primarily software, but it's similar thinking for the hardware in embedded designs.

There will be some customization in game console designs. The APUs and GPUs I'm speculating about can be used in laptops, and the only reason they might not appear in desktops is configurability. You can't change ultrawide IO memory size, as it must be soldered to an interposer/carrier. The same is going to be true for a second GPU if it's sharing the same memory pool. For a few years PCs will have slower memory, and as a result slower GPUs, than laptops and game consoles (in apples-to-apples comparisons).

mrklaw said:
Performance per watt? Even if they go with a launch-PS3 power envelope, that's maybe 120W for the GPU (a guess). What performance would you get from a desktop part at that power level vs. a mobile part?
It's the wide IO bus that is the ONLY difference, barring TDP power limitations in a laptop. The wide IO bus allows full HSA at 200GB/s-1TB/s, rather than the current A10 with 27GB/s and a second GPU with GDDR5 @ 200GB/s that has a PCIe bottleneck to the A10.
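
The same arithmetic as the sketch earlier in the thread, applied to this comparison (illustrative configurations; the A10 figure assumes dual-channel DDR3):

def bandwidth_gbs(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz / 1000.0

print(bandwidth_gbs(128, 1866))    # A10 APU, 2x64-bit DDR3-1866: ~30 GB/s
print(bandwidth_gbs(256, 6000))    # discrete GDDR5 card: ~192 GB/s
print(bandwidth_gbs(4096, 2000))   # very wide stacked bus: ~1 TB/s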
 
PS4 may have decent specs when unveiled, but the true uproar will commence when the XB3's specs are revealed (provided they are superior).

It'll be fun either way. You know what else will be fun? When people finally get to play the games and see how the specs translate into real-life performance. It was very entertaining the last time around (well, let's not forget the Wii U either).


A PS3 with a DVD drive and even just a 256/512 RAM configuration over the 256/256 it shipped with would have slaughtered the Xbox 360 in terms of graphics.

No shit. Likewise, a 360 with 768 MB of shared memory and an HDD in every SKU would have slaughtered the PS3.
 

Ashes

Banned
@jeff_rigby: Do you have a benchmark for a mobile APU that is anywhere near realistically plausible for a 1.8TF+ next-gen console?

The performance just isn't there.
 

mrklaw

MrArseFace
@jeff_rigby: Do you have a benchmark for a mobile APU that is anywhere near realistically plausible for a 1.8TF+ next-gen console?

The performance just isn't there.

The AMD 7970M is supposed to be around 7850-7870 levels.

Eurogamer said:
The Pitcairn core is fairly small, occupying 212mm2 of area. Compare that with the 240mm2 of the RSX in the launch version of the PlayStation 3 and the 180mm2 of the Xbox 360's original 90nm Xenos GPU and we have a ballpark match. Of more interest is power consumption: at full tilt, the 7970M sucks up around 65 watts of power. That's not going to be especially good news for a laptop running on battery power alone, but considering that the launch versions of the Xbox 360 and PS3 both consumed around 200W in total, again we see an eminently suitable match.

http://www.eurogamer.net/articles/df-hardware-radeon-7970m-alienware-m17x-r4-review

What would an 8xxxM bring to the table?
 

mrklaw

MrArseFace
Maybe I worded it wrong. But we're not discussing mobile GPU variants. I'd be interested to see the best of the best APUs.

I thought we were expecting a custom APU based on customer requirements? So you could put a bunch of Jaguar CPU cores alongside a GPU of your choice, with optional eDRAM?
 
I thought we were expecting a custom APU based on customer requirements? So you could put a bunch of Jaguar CPU cores alongside a GPU of your choice, with optional eDRAM?
eDRAM can be ultrawide IO. In the past it sat on the logic or GPU, connected via an interposer. If you have 512-bit ultrawide IO on a 2.5D interposer, outside on a combination interposer/MCM carrier, you don't need it on top of the GPU or CPU.

GopherD, perhaps I misunderstood. Only the RAM clock and registers inside the GNB would run faster, but for the MMU no faster than a standard DDR3 interface, and that is a 20nm part. In 2011 the AMD CTO was talking about the southbridge eventually being 22nm. Typically it's scaled behind the GPU and CPU, as previously it did not need the efficiency. BUT if the southbridge is now the GNB and needs to be 22nm, then either it's being used with a 14nm or smaller CPU and GPU (if past trends hold), OR it's pushing a lot more bandwidth to graphics hardware because the memory bottleneck is gone.

Current mobile GPUs are binned PCI-card GPUs (the most efficient are cherry-picked) attached to a PCIe card that plugs in or is soldered to a laptop motherboard. With Wide IO laptops, there may be no PCIe interface. USB3 is just about fast enough for most applications except a GPU.
 

Cornbread78

Member
I figure this may be a good place to ask a general question on next-gen tech...

I bought a Samsung 7000 series 3D LED (1080p, 240Hz) in the spring, and I've noticed with some games that the graphics actually look worse at 240Hz than they did at 60Hz. Will the tech be upgraded enough on consoles to utilize the extra refresh rate on newer TVs next gen, or should that not matter at all?

Like I said, it may be a dumb question, but it's definitely something I've noticed.
 

Globox_82

Banned
4GB of RAM. Or more...

Early rumours suggest PS4 will have 2 to 4GB of RAM (that's four to eight times the size of PS3's 512MB of RAM, which is split between 256MB for video and 256MB for systems). Wii U has 2GB: 1GB for menus and systems and 1GB for video. However, the latest rumours suggest Microsoft's next console will have a staggering 8GB.

Why is RAM important? It allows your console to keep more data in main memory at any one time, rather than swapping (via loading), hugely raising performance. If your hard drive is the size of your filing cabinets, RAM is how much you can fit on your desk. Differences in the RAM structure of PS3 (2x 256MB) and 360 (unified 512MB) were one factor cited for the notorious Skyrim lag.

More RAM allows higher-resolution textures and less loading - like when you enter buildings in an open world. When we saw Square Enix's Luminous next-gen engine demo at E3, all we were told was that it was running on high-end PC specs with 'a lot of RAM'. If Microsoft opts for 8GB of RAM, it may force Sony's hand - even 4GB might cause issues when porting code across consoles. The downside is that RAM is expensive, but Sony can't afford to scrimp.

It'll use a Quad-Core AMD chip

Sony will opt for AMD's quad-core APU (accelerated processing unit) codenamed 'Liverpool', according to multiple reports in June. It's tipped to be built on a 28-nanometre process. The smaller this number, the more transistors can be fitted into the same space on the chip and the lower the power consumption, but the more complicated the chip is to build. For context, PS3's Cell processor shrank from 90nm to 45nm over the console's six-year life.



The clock speed is 3.2GHz, which, while not lightning fast, will be supplemented by powerful graphics hardware - the Radeon HD 7970 (currently £300 on its own) is being linked to PS4.

Sony will be looking to assemble PS4 from 'off the shelf' PC parts, reducing costs and making it easier to program for. This is in contrast to PS3's Cell chip, which its creator Ken Kutaragi once envisioned appearing in freezers and TVs as part of a parallel processing network. And look how that worked out.

AMD's chips allow for easy porting of code, theoretically preventing the issues surrounding, say, the PS3 port of Skyrim compared to Xbox 360. It'd be easier for developers to get PS4 games up and running, without waiting years for them to learn its unique tricks.


How many TFLOPS does the 7970 have?
http://www.computerandvideogames.co...-of-the-next-playstation-4/?page=1#top_banner
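
On the RAM point in that article, a quick back-of-the-envelope illustration in Python (my own numbers, not from the article) of why texture resolution eats memory so fast: an uncompressed 32-bit RGBA texture costs width x height x 4 bytes, plus roughly a third more for its mipmap chain.

def texture_mb(width, height, bytes_per_pixel=4, mipmapped=True):
    size = width * height * bytes_per_pixel
    if mipmapped:
        size = size * 4 / 3   # a full mip chain adds ~1/3
    return size / (1024 * 1024)

print(texture_mb(1024, 1024))   # ~5.3 MB
print(texture_mb(2048, 2048))   # ~21.3 MB: 4x the memory for 2x the resolution
print(texture_mb(4096, 4096))   # ~85.3 MB

Double the texture resolution across a whole scene and you quadruple the texture memory, which is why the jump from 512MB consoles to 4-8GB matters so much.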
 

onQ123

Member
Now, answer the question: why are they both proceeding with customised mobile parts? No, it's not to keep in line with possible energy ratings or law changes. It's not cost-per-watt, because when you increase the clock, you decrease yields and decrease the cost-benefit ratio you were hoping for by going mobile. Technology efficiency gains are more achievable in-engine on more open hardware than by forcing through locked-down hardware design choices. Keep digging, Jeff.

Always-on set-top boxes.
 
Single precision 3.7 TFLOPS, double precision 947 GFLOPS. Don't expect this to be in the PS4; it's a big standalone chip that eats 250W. The PS4 will most likely use only an APU with a GPU section in line with the 7850/7870 at most [1.8-2.4 TFLOPS].
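
A back-of-the-envelope check on those figures: theoretical single-precision throughput for a GCN GPU is stream processors x 2 ops per clock (fused multiply-add) x clock speed. A quick sketch in Python, using the cards' known shader counts and stock clocks:

def sp_tflops(stream_processors, clock_ghz):
    # Each stream processor can retire one FMA (2 FLOPs) per clock.
    return stream_processors * 2 * clock_ghz / 1000.0

print(sp_tflops(2048, 0.925))   # HD 7970: ~3.79 TFLOPS
print(sp_tflops(1024, 0.860))   # HD 7850: ~1.76 TFLOPS
print(sp_tflops(1280, 1.000))   # HD 7870: ~2.56 TFLOPS

A downclocked 7870-class part is how a console would land inside that 1.8-2.4 TF window.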

The 7970 has a 250W TDP... so even with a shrink and a move to the HD 8xxx series, that won't cut it.
 
I figure this may be a good place to ask a general question on next-gen tech...

I bought a Samsung 7000 series 3D LED (1080p, 240Hz) in the spring, and I've noticed with some games that the graphics actually look worse at 240Hz than they did at 60Hz. Will the tech be upgraded enough on consoles to utilize the extra refresh rate on newer TVs next gen, or should that not matter at all?

Like I said, it may be a dumb question, but it's definitely something I've noticed.

The 120 and 240Hz settings aren't allowing your games to run at a higher frame rate natively. What happens on higher-end TVs (like yours) is that the TV takes a 30 or 60fps source, in this case your game, and then interpolates, creating new images in between the native frames to make motion appear smoother.

Let's say you have a video feed of a ball rolling across the screen. Here are two frames:
[_O_____]
[_____O_]

What the 120 and 240Hz settings do is create more frames in between by analyzing the before and after images... so you get:
[_O_____]
[___O___]
[_____O_]

or even better:
[_O_____]
[__O____]
[___O___]
[____O__]
[_____O_]

The more frames you have, the smoother it looks. The problem is that, because the TV (NOT THE CONSOLE) creates the new images, those new images often have flaws, especially with the extremely 'jaggy', low-image-quality console games we have today. The created frames will look worse.

Unless next gen brings more 60fps games, you won't see a frame rate increase. 120fps is used in maybe a couple of games, and one of those is Super Stardust HD on PS3, and that's only because of 3D.
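
For anyone curious, here's a toy sketch of that idea in Python. Real sets estimate per-block motion vectors from the two frames; here the motion is handed to the code, which is exactly why this toy version has none of the artifacts the real thing produces when its motion estimate is wrong.

def render(pos, width=7):
    # Draw a one-pixel "ball" at integer position pos on a strip.
    return "[" + "".join("O" if i == pos else "_" for i in range(width)) + "]"

def in_between_positions(pos_a, pos_b, inserted):
    # Evenly spaced positions for the synthesized frames.
    step = (pos_b - pos_a) / (inserted + 1)
    return [round(pos_a + step * i) for i in range(1, inserted + 1)]

pos_a, pos_b = 1, 5              # ball positions in the two real frames
for inserted in (1, 3):          # 1 new frame ~ 120Hz, 3 new frames ~ 240Hz
    print(f"{inserted} inserted frame(s):")
    print(render(pos_a))
    for p in in_between_positions(pos_a, pos_b, inserted):
        print(render(p))
    print(render(pos_b))

Run it and you get exactly the two ladders of frames drawn above.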
 

iceatcs

Junior Member
Why is it that a lot of these rumours are so old and repetitive?

MS has to watch out to keep double the memory of the PS4, because that's what the media is expecting; otherwise they'll get hate.
 

DieH@rd

Banned
The 7970 has a 250W TDP... so even with a shrink and a move to the HD 8xxx series, that won't cut it.

According to the leaked data, the 8850 will have 2.99 TFLOPS at ~130W and the 8870 will have 3.9 TFLOPS at ~160W... But all this is not confirmed.
http://en.wikipedia.org/wiki/Compar...essing_units#Sea_Islands_.28HD_8xxx.29_series


If it's true, then strip away the GDDR5 RAM, board chips and display adapters, and you end up with very efficient and very powerful GPU chips. Hopefully the 88xx series will end up in next-gen console APUs or dedicated GPUs.
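
Taking those leaked numbers at face value (again, unconfirmed), the efficiency jump is easy to see as GFLOPS per watt, sketched in Python:

for name, tflops, watts in [("HD 7970", 3.79, 250),
                            ("HD 8850 (leaked)", 2.99, 130),
                            ("HD 8870 (leaked)", 3.90, 160)]:
    print(f"{name}: {tflops * 1000 / watts:.0f} GFLOPS/W")

# HD 7970: ~15 GFLOPS/W; HD 8850: ~23 GFLOPS/W; HD 8870: ~24 GFLOPS/W

That ratio is why a 7970 "won't cut it" in a console power budget while an 88xx-class chip might.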
 

Cornbread78

Member
The 120 and 240Hz settings aren't allowing your games to run at a higher frame rate natively. What happens on higher-end TVs (like yours) is that the TV takes a 30 or 60fps source, in this case your game, and then interpolates, creating new images in between the native frames to make motion appear smoother.

Let's say you have a video feed of a ball rolling across the screen. Here are two frames:
[_O_____]
[_____O_]

What the 120 and 240Hz settings do is create more frames in between by analyzing the before and after images... so you get:
[_O_____]
[___O___]
[_____O_]

or even better:
[_O_____]
[__O____]
[___O___]
[____O__]
[_____O_]

The more frames you have, the smoother it looks. The problem is that, because the TV (NOT THE CONSOLE) creates the new images, those new images often have flaws, especially with the extremely 'jaggy', low-image-quality console games we have today. The created frames will look worse.

Unless next gen brings more 60fps games, you won't see a frame rate increase. 120fps is used in maybe a couple of games, and one of those is Super Stardust HD on PS3, and that's only because of 3D.


Thank you!

Very nice, that makes sense to me. So the images manufactured by the TV are what's creating the "bubbles" I'm seeing around characters at times in games? The worst offender so far is WKCII, but I've seen the bubbles/shadows in other games as well. I guess it just brings out the imperfections even more. I never noticed these before on my older Samsung that ran at 60Hz.
 

DieH@rd

Banned
Unless the next gen brings more 60fps games, you won't see a frame rate increase. 120fps is used in maybe a couple games, and one of those is Super Stardust HD on PS3, and that's only because of 3D.

Super Stardust has a frame sequential 1080p "60fps gameplay" 3D? :-O
 
Super Stardust has a frame sequential 1080p "60fps gameplay" 3D? :-O

Yes sir. Whoops, I was thinking 720p, but you said 1080. Not 1080 3D, lol.
http://www.eurogamer.net/articles/super-stardust-3d-720p120-confirmed-article

Thank you!

Very nice, that makes sense to me. So the images manufactured by the TV are what's creating the "bubbles" I'm seeing around characters at times in games? The worst offender so far is WKCII, but I've seen the bubbles/shadows in other games as well. I guess it just brings out the imperfections even more. I never noticed these before on my older Samsung that ran at 60Hz.

Yeah, the more frames, the worse it gets, because it'll create the middle image, then it'll create the image between the first and the middle one. Since the middle one itself was created by the TV, the second image gets even more imperfections. Also, if you play with 'Motionflow' or whatever the brand may call it, it creates a lot of lag because of the image processing the feature requires. It may be fine in more cinematic or slower games, but if you were to do this while playing Street Fighter online, or even Battlefield, you're giving yourself a huge disadvantage.
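
Rough numbers on why that lag happens (illustrative arithmetic; actual processing delay varies by set): to interpolate, the TV has to hold back at least one real frame before it can compute the in-betweens.

frame_time_ms = 1000 / 60    # one 60fps source frame: ~16.7 ms
buffered_frames = 2          # assume the set buffers two real frames
processing_ms = 10           # assume ~10 ms for motion estimation
added_lag = buffered_frames * frame_time_ms + processing_ms
print(f"~{added_lag:.0f} ms of extra input lag")   # ~43 ms, nearly 3 frames

A couple of frames of extra delay is nothing for a film, but it's a real handicap in a fighting game or shooter.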
 