
vg247-PS4: new kits shipping now, AMD A10 used as base, final version next summer

mrklaw

MrArseFace
Agreed. If not 3+, 2.5 TF will put us in a fantastic middle ground. If they end up going for conservative machines, I would hope the prices would reflect it. The more people with these machines in hand, the better.

Even if they go for a reasonable price, it'll still be more expensive than PS3/360 and therefore rule out a large segment of consumers for a few years. Meanwhile the core is screaming out for something new. So I'd rather they go for power at a 'not outrageous' price, and aim to cost reduce in 18-24 months
 

thuway

Member
Even if they go for a reasonable price, it'll still be more expensive than PS3/360 and therefore rule out a large segment of consumers for a few years. Meanwhile the core is screaming out for something new. So I'd rather they go for power at a 'not outrageous' price, and aim to cost reduce in 18-24 months

I know what you mean. I've been harping on about a 3TF+ system in 2014 for the last 6 months. The problems associated with a hot, heavy, and expensive box would give us another round of ill will like the RROD and YLOD fiascos. It's just so very hard to justify a 25% increase in power for 70% reliability over 4 years.
 
YLOD wasn't related to power/heat. And RROD was just due to piss poor placement. They could have easily had launch 360s not RROD without changing anything about the power of the console, just by putting more thought into where to place the components.

Even if they go for a reasonable price, it'll still be more expensive than PS3/360 and therefore rule out a large segment of consumers for a few years. Meanwhile the core is screaming out for something new. So I'd rather they go for power at a 'not outrageous' price, and aim to cost reduce in 18-24 months

Agreed. The casuals will stick with the previous gen for a few years, they always do. And no matter how conservative the power, it won't be priced below $399 for the not severely gimped version.

They would be better off making it as powerful as conceivable at a 'not outrageous' price and forgetting about being certified as energy star compliant (future revisions years from now can be brought below the 250W or whatever they need for energy star certification).
 

thuway

Member
Agreed. No matter how conservative the power, it won't be priced below $399 for the not severely gimped version. The casuals will stick with the previous gen for a few years, they always do.

They would be better off making it as powerful as conceivable at a 'not outrageous' price, forgetting about energy star compliance (future revisions years from now can be brought below the 250W or whatever they need for energy star certification).

Reliability, noise, and poor yields have to be taken into account. An upper limit of 200-250W sounds reasonable, but given the current economic situation, neither Sony nor Microsoft (last I checked, Ballmer and Co. aren't exactly happy with Windows 8 sales) is in a position to be forking over blank checks.
 
My speculation:

130W GPU (AMD HD8xxx @ 2.5 TFlops)
25W CPU (4 Jaguar Cores @ 2GHz)
45W 4GB (G)DDR(4,5) RAM, Drive, etc.
------------------------------------------
200W

That would mean a console I could live with, probably doing around 3+ TFlops depending on the CPU. Anything below 2 TFlops would be a disappointment to me, because if you go with current tech you have to launch earlier than late 2013/early 2014, especially if you want to beat MS in at least some parts of the world.
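For anyone who wants to sanity-check the arithmetic, here's a minimal Python sketch that just adds up the component budget speculated above (the wattage figures are the post's own estimates, not confirmed hardware numbers):

    # Speculative power budget from the post above; all figures are estimates.
    components = {
        "GPU (AMD HD 8xxx-class, ~2.5 TFLOPS)": 130,  # watts
        "CPU (4 Jaguar cores @ ~2 GHz)": 25,
        "4GB (G)DDR RAM, drive, etc.": 45,
    }

    total = sum(components.values())
    for name, watts in components.items():
        print(f"{name:40s} {watts:4d} W")
    print(f"{'Total system budget':40s} {total:4d} W")  # -> 200 W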

More like

110W GPU (AMD HD 78XX @ 2~3TFlops)
45W CPU (4-6 Cores @ ~2Ghz)
45W 3-4 GB RAM, Drive, etc
--------------
200W

?
 
Why are we limited to 200W? The original launch PS3 had a 380W power supply, while the slim had a 250W one. Or does that have to do with something else?
 
Why are we limited to 200W? The original launch PS3 had a 380W power supply, while the slim had a 250W one. Or does that have to do with something else?

That's the power supply rating, but the actual power consumption was never that high.

Edit: 1st-generation PS3s used around 200W maximum.

 

i-Lo

Member
Why are we limited to 200W? The original launch PS3 had a 380W power supply, while the slim had a 250W one. Or does that have to do with something else?

Power supply rating and actual power consumption limits are different things, with the former having higher capacity to provide flexible headroom during gameplay. In actuality, the first-gen PS3, which also happened to be the most power-hungry, topped out at around 209W during gameplay.

PS4 will not launch in 2013. The only ones in a position to do so are Microsoft. Expect PS4 March-May 2014.

Enlighten me as to why you think MS is certainly at an advantage over Sony when it comes to the release schedule?
 
More like

110W GPU (AMD HD 78XX @ 2~3TFlops)
45W CPU (4-6 Cores @ ~2Ghz)
45W 3-4 GB RAM, Drive, etc
--------------
200W

?

Why the delay for a 78xx? In 2013/14 an HD8850 (~3TF, 130W) should be the obvious choice. Beef it up to get more power at the same TDP, or the same power at a lower TDP. 45W for Jaguar seems high? Kaveri with Steamroller cores is estimated by AMD at 15-35W and Kabini with Jaguar at up to 25W, so my estimate is high enough, I think.
 

i-Lo

Member
More like

110W GPU (AMD HD 78XX @ 2~3TFlops)
45W CPU (4-6 Cores @ ~2Ghz)
45W 3-4 GB RAM, Drive, etc
--------------
200W

?

More probable with the CPU and GPU TDP. However, the PS4 will not have a 78XX-series GPU. Why? The 8XXX will be better because of lower power consumption and less heat production for similar if not better performance than Pitcairn at the same price (or lower).
 
Why the delay for a 78xx? In 2013/14 an HD8850 (~3TF, 130W) should be the obvious choice. Beef it up to get more power at the same TDP, or the same power at a lower TDP. 45W for Jaguar seems high? Kaveri with Steamroller cores is estimated by AMD at 15-35W and Kabini with Jaguar at up to 25W, so my estimate is high enough, I think.

Great points. A 3TF GPU is easily doable for a sub 200W console.

But IMO, it's not the end of the world if they release a 250-300W console. If doing so lets them squeeze in an even more powerful GPU that gives them a leg up on the competition, by all means they should.
 

RoboPlato

I'd be in the dick
Why the delay for a 78xx? In 2013/14 an HD8850 (~3TF, 130W) should be the obvious choice. Beef it up to get more power at the same TDP, or the same power at a lower TDP. 45W for Jaguar seems high? Kaveri with Steamroller cores is estimated by AMD at 15-35W and Kabini with Jaguar at up to 25W, so my estimate is high enough, I think.

This is actually making me think that 3 TFlops is possible when I originally thought 2.5 would be the max, especially when you consider there will be nips and tucks to the CPU and GPU which will lead to further reduced wattage and board complexity.
 
This is actually making me think that 3 TFlops is possible when I originally thought 2.5 would be the max, especially when you consider there will be nips and tucks to the CPU and GPU which will lead to further reduced wattage and board complexity.

I think the limiting factor is cost in that case. How much money is Sony willing to spend to subsidize the new console and to develop and tune the hardware to their liking - basically, how far is Sony able to lean out the window before falling?
 

RoboPlato

I'd be in the dick
I think the limiting factor is cost in that case. How much money is Sony willing to spend to subsidize the new console and to develop and tune the hardware to their liking - basically, how far is Sony able to lean out the window before falling?

I don't think an 8850/8870 base would be too expensive, which is what I'm expecting/hoping for.
 
doniewhalberg,

That may be your reasoning, I'm just saying it's not rational. If money is so tight that you would prefer a far weaker console just to save $10-15 a year on electricity, how can you justify spending $400 upgrading to a next gen console?

Gaming is a very expensive hobby. Too expensive to be taken up by anyone who would consider $15 a year too big a price to pay for much better hardware and games. If that were the case, then $400 for a new console, $60 for Xbox Live and $60 per game would all be too big a price to pay as well.

Seriously, still? Don't agree = irrational in your mind, I guess.

Did I at any point say I lack money? No. It's about not being selfish and thinking only about myself, though. Money is not tight, but I won't spend money unless it's justified and necessary. Here's an example if we shift to games for a second. After bills, food, petrol, things the little one needs for school for the week, and so on, I probably have enough personal money left for about 2 brand new games every week. That's my money, my girlfriend doesn't care what I spend it on. I instead have a Lovefilm account, rent new games, then buy them when they've come down. Is it because money is tight? No. It's because I want to be able to take them out for dinner, and take the little guy out for the day, and so on. If you can extrapolate ANYTHING from that example, apply it to the original question, and there are your reasons.

However, I digress. Somewhere, around the point you started editing stuff in after my responses, you added the *given* that the wattage would be sensible and as efficient as possible. That point wasn't there in the first post, and that's what I was trying to respond to, and to show you some realistic ways people look at the world against your "why is gaf so fucking dumb for wanting the wattage to be at a certain level". Again, you asked, I answered man. I didn't expect you to agree, but c'mon, did you genuinely want to know, or did you just want to argue with whoever took the bait? The reasons are fine and rational to me, I don't need them to be rational for you. You wanted to know, I told you. End of. We're taking up space on some absurd tangent now, so let's leave it, I can't illuminate this any other way. Take what you want from it.
 

THE:MILKMAN

Member
My speculation:

130W GPU (AMD HD8xxx @ 2.5 TFlops)
25W CPU (4 Jaguar Cores @ 2GHz)
45W 4GB (G)DDR(4,5) RAM, Drive, etc.
------------------------------------------
200W

That would mean a console I could live with, probably doing around 3+ TFlops depending on the CPU. Anything below 2 TFlops would be a disappointment to me, because if you go with current tech you have to launch earlier than late 2013/early 2014, especially if you want to beat MS in at least some parts of the world.

I can't see them doing another 200W+ console.

120W APU (4/8 Jaguar cores + 8850) Or 30-40W APU + separate 70-80W 8850.
40-50W for everything else.

The PS3's chips at launch were I believe ~50W Cell and ~80W RSX yet it consumed over 200W at the wall.

130W for the GPU alone is unrealistic IMO.

Dark_AnNiaLatOr said:
Oh ok, thanks. It makes sense, but at almost 180W of leeway, that is quite a bit.

It is to do with efficiency. A PSU works most efficiently at 50-60% load. With the launch PS3, that is around 200W.
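A quick illustration of that efficiency point, assuming the commonly quoted 50-60% sweet spot (a rule of thumb, not a spec) and the launch PS3's 380W supply:

    # A PSU is typically most efficient around half load, so a 380 W supply
    # is "sized" for a console drawing roughly 190-230 W at the wall.
    psu_rating_w = 380          # launch PS3 power supply rating
    sweet_spot = (0.50, 0.60)   # assumed optimal load range

    low, high = (psu_rating_w * f for f in sweet_spot)
    print(f"Most efficient draw: {low:.0f}-{high:.0f} W")  # ~190-228 W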

systemfehler said:
Why the delay for a 78xx? In 2013/14 a HD8850 (~3TF, 130W) should be the obvious choice.

Wait..We're getting confused here. Let's say the 130W figure is correct for the 8850. That is what I understand to be total board power which includes 2GB of very fast/high power GDDR5 memory plus other stuff that isn't needed.
 
doniewhalberg

Can't believe we're still talking about this. You make it sound like electricity is expensive. It's cheap as fuck to power a console. It costs $20.10 to power a launch 200W Xbox 360 for a year. A 300W console would cost $30.30 to power for a year.

If that extra $10 a year is something you can't justify for a far better GPU and far better looking games with better physics, better AI, more characters and bigger worlds, then how could you possibly justify spending $400 to upgrade to a next gen console in the first place? If you're spending $400 to upgrade, wouldn't you want the leap in graphics, physics, AI, and game size to be as big as possible, even if it means an extra $10 in power consumption?
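A rough reproduction of those cost figures (the hours per day and price per kWh are assumptions chosen to land near the quoted numbers, not anything stated in the thread):

    # Back-of-envelope yearly electricity cost for a console.
    def yearly_cost(watts, hours_per_day=2.75, usd_per_kwh=0.10):
        kwh_per_year = watts * hours_per_day * 365 / 1000
        return kwh_per_year * usd_per_kwh

    for w in (200, 300):
        print(f"{w} W console: ~${yearly_cost(w):.2f}/year")
    # ~$20 vs ~$30, so the gap is on the order of $10/year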
 
doniewhalberg

Can't believe we're still talking about this. You make it sound like electricity is expensive. It's cheap as fuck to power a console. It costs $20.10 to power a launch 200W Xbox 360 for a year. A 300W console would cost $30.30 to power for a year.

If that extra $10 a year is something you can't justify for a far better GPU and far better looking games with better physics, better AI, more characters and bigger worlds, then how could you possibly justify spending $400 to upgrade to a next gen console in the first place? If you're spending $400 to upgrade, wouldn't you want the leap in graphics, physics, AI, and game size to be as big as possible, even if it means an extra $10?

Everything is expensive dude ha. It's my responsibility to keep a handle on this stuff. The electricity, the gas, the cable, all that stuff adds up. How does that not mean anything to you? What you were asking is why do people want their electrical appliances not to just go batshit using power? Because I can literally use any penny of that money on something better.

Does this make more sense if I put it this way: If they announced the ps4, and it took say 450-500w, I would go and buy a gaming pc. Because that would be justified. That would do a shitload more than a console does, and look better while doing it. It would also do a million non-game applicable things too, that the others in the house could benefit from, instead of just me. This isn't about an unwillingness to pay more, this is about justifying it against what the product is and does. Does that make sense to you?
 

Elios83

Member
Everything is expensive dude ha. It's my responsibility to keep a handle on this stuff. The electricity, the gas, the cable, all that stuff adds up. How does that not mean anything to you? What you were asking is why do people want their electrical appliances not to just go batshit using power? Because I can literally use any penny of that money on something better.

Does this make more sense if I put it this way: If they announced the ps4, and it took say 450-500w, I would go and buy a gaming pc. Because that would be justified. That would do a shitload more than a console does, and look better while doing it. It would also do a million non-game applicable things too, that the others in the house could benefit from, instead of just me. This isn't about an unwillingness to pay more, this is about justifying it against what the product is and does. Does that make sense to you?

That risk is zero, the PS3 fat was at the limits of what is possible to do with a console before it turns into a desktop PC.
200W is the power budget they have.
 
I can't see them doing another 200W+ console.

120W APU (4/8 Jaguar cores + 8850) Or 30-40W APU + separate 70-80W 8850.
40-50W for everything else.

The PS3's chips at launch were I believe ~50W Cell and ~80W RSX yet it consumed over 200W at the wall.

I guess the GDDR3/XDR memory, HDD and Blu-ray drive added to the total power draw as well. I don't have any real numbers, just that the system used about 200W at launch. Furthermore, a Kabini APU (4 Jaguar cores + GPU) takes 9-25W according to AMD slides. So there is more than enough room for a "big" GPU.

130W for the GPU alone is unrealistic IMO.



It is to do with efficiency. A PSU works most efficiently at 50-60% load. With the launch PS3, that is around 200W.



Wait..We're getting confused here. Let's say the 130W figure is correct for the 8850. That is what I understand to be total board power which includes 2GB of very fast/high power GDDR5 memory plus other stuff that isn't needed.

I only found figures for the whole graphics card with the HD8850 - so the interface, ports, GDDR5, etc. are part of that 130W. I just left it at 130W because I don't know how much the bare GPU alone would take - so it is a rather conservative approach. Furthermore, I believe that an APU, a stacked design, etc. is easier to cool, and there have been some improvements in cooling and PSU efficiency.
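To make the "total board power vs. bare chip" distinction concrete, here's an illustrative split; the memory and VRM/fan figures are assumed for the sake of the example, not AMD specifications:

    # Illustrative breakdown of a card's total board power.
    board_power_w = 130       # quoted total board power for the HD8850-class card
    memory_w = 20             # assumed: 2 GB of high-clocked GDDR5
    vrm_fan_misc_w = 15       # assumed: VRM losses, fan, display outputs, etc.

    gpu_only_w = board_power_w - memory_w - vrm_fan_misc_w
    print(f"Bare GPU estimate: ~{gpu_only_w} W of the {board_power_w} W board")  # ~95 W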
 
That risk is zero, the PS3 fat was at the limits of what is possible to do with a console before it turns into a desktop PC.
200W is the power budget they have.

Oh, I know man, I'm just trying to get this point across :p His original Q was along the lines of why don't they just put in 2014 pc parts with the wattage that would come with it, why would anyone ever have a problem with a wattage increase. I was attempting to answer the last part. I'm not sure I'm not being just as dumb though at this point.
 
Everything is expensive dude ha. It's my responsibility to keep a handle on this stuff. The electricity, the gas, the cable, all that stuff adds up. How does that not mean anything to you? What you were asking is why do people want their electrical appliances not to just go batshit using power? Because I can literally use any penny of that money on something better.

Does this make more sense if I put it this way: If they announced the ps4, and it took say 450-500w, I would go and buy a gaming pc. Because that would be justified. That would do a shitload more than a console does, and look better while doing it. It would also do a million non-game applicable things too, that the others in the house could benefit from, instead of just me. This isn't about an unwillingness to pay more, this is about justifying it against what the product is and does. Does that make sense to you?

Don't throw up straw men. No one is talking about a 450W console. There is a 0% chance of that happening. We are talking about 200W vs 300W.

Your argument doesn't make sense to me because of the amounts we are talking about.

It's like saying you can justify spending $400 on a coat but couldn't justify spending $410 on an otherwise identical coat made out of far better material that looks better and will last longer before it looks dated.
 

Killthee

helped a brotha out on multiple separate occasions!
The link says those standards are voluntary. Energy Star certification isn't required or mandated for anyone. I can't think of a single serious gamer that would opt not to buy a next gen console because it's not Energy Star compliant.

It would be much more environmentally friendly to release a non-Energy Star compliant console with the power to satisfy console gamers into 2020 than to release a weak ass console that will force console gamers to constantly upgrade their PC just to keep up with the graphics despite not liking PC gaming anywhere near as much.

Rigby is under the belief that some California regulations being proposed right now will set the standards for next gen power consumption. I'm not sure where he's getting the Energy STAR standards = new CA standards, but I'm guessing he's just assuming the standards will be on a similar level. I also have no idea how strict at enforcing these new standards CA will be and whether or not they will ban products that don't meet them, but I'm pretty sure these standards would at least have a grace period with exemptions for consoles launching within the next couple of years so they would probably only apply to future revisions.


Here's what I've been able to dig up regarding the issue:



I'm by no means an expert in this area, but it just seems to me that his speculations on this topic are making assumptions without many facts behind them.
 
Don't throw up straw men. No one is talking about a 450W console. There is a 0% chance of that happening. We are talking about 200W vs 300W.

I specifically told you I was using an extreme example to try and illustrate the point, in case that helped. I see it didn't.

Your argument doesn't make sense to me because of the amounts we are talking about.

It's like saying you can justify spending $400 on a coat but couldn't justify spending $410 on an otherwise identical coat made out of far better material that looks better and will last longer before it looks dated.

It doesn't make sense to you because your example here hits the nail on the head. You've specifically picked a personal, me-only example. Something that only affects me. If we then talk about games, that works. But the house's bills, the house's electricity, that's for everyone. And if I accepted every appliance jumping up, I'd suddenly have a much bigger problem than your $10 (I'm British btw, so I'm trying to work with the dollar thing here). If I then accept the cable went up by $ this month, the phone bill was $10 more, the gas station put its prices up, so now that's $X amount more each week. So I have to micromanage everything when I do the bills. I break everything down. Every time something changes, I have to think about it.

Bear in mind, you asked why anyone would ever have a problem with their console wattage going up. This is why. This is life, and it's not just about me wanting to play video games in my free time. It's about constantly optimising everything so we get the most out of what we have. I hate wasting money, even when I have plenty. I get the feeling you can't relate to this though.
 

Ashes

Banned
The link says those standards are voluntary. Energy Star certification isn't required or mandated for a console. I can't think of a single serious gamer that would opt not to buy a next gen console because it's not Energy Star compliant. Hell, I don't expect 99% of gamers to even know which consoles are Energy Star compliant and which aren't.

It would be much more environmentally friendly to release a non-Energy Star compliant console with the power to satisfy console gamers into 2020 than to release a weak ass console that will force console gamers to constantly upgrade their PC just to keep up with the graphics despite not liking PC gaming anywhere near as much.

Rigby is under the belief that some California regulations being proposed right now will set the standards for next gen power consumption. I'm not sure where he's getting the Energy STAR standards = new CA standards, but I'm guessing he's just assuming the standards will be on a similar level. I also have no idea how strict at enforcing these new standards CA will be and whether or not they will ban products that don't meet them, but I'm pretty sure these standards would at least have a grace period with exemptions for consoles launching within the next couple of years so they would probably only apply to future revisions.

I'm by no means an expert in this area, but it just seems to me that his speculations on this topic are making assumptions without many facts behind them.

That's the Rigby way.
 
Stephen Colbert said:
The link says those standards are voluntary. Energy Star certification isn't required or mandated for a console. I can't think of a single serious gamer that would opt not to buy a next gen console because it's not Energy Star compliant. Hell, I don't expect 99% of gamers to even know which consoles are Energy Star compliant and which aren't.

It would be much more environmentally friendly to release a non-Energy Star compliant console with the power to satisfy console gamers into 2020 than to release a weak ass console that will force console gamers to constantly upgrade their PC just to keep up with the graphics despite not liking PC gaming anywhere near as much.

Killthee said:
Rigby is under the belief that some California regulations being proposed right now will set the standards for next gen power consumption. I'm not sure where he's getting the Energy STAR standards = new CA standards, but I'm guessing he's just assuming the standards will be on a similar level. I also have no idea how strict at enforcing these new standards CA will be and whether or not they will ban products that don't meet them, but I'm pretty sure these standards would at least have a grace period with exemptions for consoles launching within the next couple of years so they would probably only apply to future revisions.

I'm by no means an expert in this area, but it just seems to me that his speculations on this topic are making assumptions without many facts behind them.

That's the Rigby way.

http://www.appliance-standards.org/product/game-consoles said:
Although DOE currently has no plans to set standards for video game consoles, significant per-unit savings (around 80 kWh annually) could be achieved by implementing several simple measures outlined in a study conducted by NRDC. One of these measures - ensuring that the game system enters a low-power mode when not in use - would achieve the substantial majority of the potential savings. There is no known incremental cost to meet this standard, so savings would be seen by consumers immediately. DOE issued a request for information early in 2012 for this product. Game consoles are included in the California Energy Commission Phase 1 rulemaking with an expected rule due in 2013. The ASAP/ACEEE report, The Efficiency Boom, estimates savings of 8 TWh in 2035 and net present value savings of $5.3 billion.

Peak power while playing a game will not be regulated, so that is a moot point. What is being regulated, and needs to be supported, are the always-on power mode and streaming media modes, including server mode. The difference between a PS3 and a Kabini SoC in electricity costs at a menu screen is $7/month or $84/year; that's not playing a game, just an idle menu screen. Still not a huge amount, but governments like California look at millions of always-on Xbox 720s and PS4s and will regulate that and other power modes. TVs have already been regulated.
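A rough check of that $84/year idle figure, assuming an always-on console, ~10 cents/kWh, and a ~90W gap between an older PS3 at a menu and a low-power Kabini-class SoC (the gap is an assumption for illustration):

    # Yearly cost of extra idle draw if the console is left on all year.
    idle_gap_w = 90
    usd_per_kwh = 0.10

    kwh_per_year = idle_gap_w * 24 * 365 / 1000
    print(f"~{kwh_per_year:.0f} kWh/year, ~${kwh_per_year * usd_per_kwh:.0f}/year")
    # -> roughly 790 kWh and ~$79/year, in the same ballpark as the quoted $84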

Point two, and it's a big one: it appears that the future is handhelds and the efficiencies of handheld designs. Leakage goes up with smaller processes, so as we shrink to a new node we are not getting the power efficiency we used to get from smaller transistor junctions. Low-power bulk-process silicon and lower transistor voltages are where GloFlo and TSMC are going for the future: 20nm planar, then 14nm or 5nm 3D transistors.

The Xbox3 and PS4 will most likely be mobile SoC designs on low-power bulk silicon, not high-performance silicon. Low-power silicon cannot support high speeds unless the transistor voltage is raised, which is less efficient. High-performance silicon supports faster speeds at the same voltage, but faster is still less efficient, and when you push above 3 GHz the voltage has to increase even more. FD-SOI reduces leakage, so it can support both. There are a few things supporting low-power silicon:

1) Jaguar CPUs
2) The Thebe name fits AMD's Solar System naming convention, which is mobile
3) Samara <-- Jaguar 4C, GNB core, GCN2, 20nm, 8XXX-series GPU, being released in 2013. The GNB core is the key to the speculation about GCN2 (8XXX-series GPU) and 20nm. Thebe could be using Samara as a base design, not Kabini. We don't know; the GPU in Samara could be 9000 series.

TSMC and GloFlo cannot share designs and produce the same parts to meet next-generation volume unless at 20nm. Pad layout @ 20nm will be identical, with refreshes at 20nm planar and 14nm or 5nm. If Thebe is produced at 32 or 28nm then the next refresh will need a significant redesign, but at 20nm the wiring will be nearly identical, with only the 3D transistors changing.

http://cens.com/cens/html/en/news/news_inner_42228.html said:
TSMC began volume production of 28nm chips in October, 2010, with initial output of around 1,000 wafers a month. Fab 15 is the foundry giant's pivotal production site for 28nm chips, turning out over 10,000 wafers using the process in the second quarter this year. The monthly output of 52,000 wafers marked the unprecedented volume production speed at the company's giga-size foundry factories.

TSMC is estimated to begin pilot production of chips at 20nm nodes in the second half of 2013 and volume production of the chips in 2014.

TSMC will begin tooling the phase 4 and phase 5 production modules of Fab 15 sometime in December and put the two modules into volume production in the second quarter of 2013. Employees at the factory will increase to 2,400 in 2013 from current 1,800. This fab will take TSMC a total of NT$300 billion (US$10.3 billion) to complete in 2015.

See for support of the following table: http://www.neogaf.com/forum/showpost.php?p=45200368&postcount=3147

http://www.amdzone.com/phpbb3/viewtopic.php?f=532&t=139479&start=25 said:
GPU names from HWinfo patch info: http://www.hwinfo.com/History64.txt (Version 4.06)
Sun <-- Solar System (HD 8xxxM) series GCN2 M= mobile GPU version???
Neptune <-- Solar System (HD 8xxxM) series GCN2
Ibiza <-- Southern islands
Cozumel <-- Southern islands
Kauai <-- Southern Islands
Hainan <-- no info
Curacao <-- no info
Aruba <-- no info
Richland Devastator/Scrapper <-- Three Desktop APUs from 32nm to 28 nm NI to SI GPU
Thebe <-- PS4, GCN 2013 release (not confirmed) Thebe as a moon of Jupiter fits into the Mobile Solar system naming convention => GCN2 HD 8XXXM Mobile GPU? 20nm?
Kryptos <-- Xbox3, GCN 2013 release (not confirmed) Fits no known AMD naming convention!
Samara <-- Jaguar 4C, GNB Core GCN2 20nm 2013 release
Pennar <-- Jaguar 2C, GNB Core GCN2 20nm (probably a 2013 lower power version of Samara)

Now look at the following leaked Xbox 720 powerpoint chart:

[Slide 9 from the leaked Xbox 720 presentation]

As we investigate AMD SoCs, we find that it's possible to have a SoC at 50W that supports 10X PS3 and 300 FPS (if Wide IO or eDRAM is used) = 1.8 TF. Kabini is 35 watts and almost makes it.

Cost is the big gorilla here, not power while playing a game, if we are looking at 50-90 watts (the limit of low-performance silicon). If they use FD-SOI instead of low-power bulk then we could have higher clocks and a faster SoC. On SOI, BC could be supported with little additional cost.

Support for above speculation on 20nm

GNB core (Graphics North Bridge) is based on the AMD Fusion core technology. The GNB is a fusion of graphics processor, power optimizer, audio processor, south bridge and north bridge, which share a common interface with system memory. "GNB is targeted for 20nm technological library with GF foundaries".

"Involved with migration of Pennar database from TSMC to GF libraries." TSMC and GloFlo can not share designs and produce the same parts to meet next generation volume unless at 20nm because GloFlo is gate first at 28nm and will not be gate last same as TSMC until 20nm.

http://www.indeed.com/r/Rami-Dornala/e0704aad508659b2 said:
Graphic processor
AMD - Waltham, MA
September 2011 to Present
Project:1 GNB core SOC
Duration: Sept 2011 , till date
Location: AMD
Description:
GNB core (Graphics North Bridge) is based on the AMD fusion core technology, The GNB is a fusion of Graphic processor, power optimizer, audio processor, south bridge and north bridge which share a common interface with system memory.

Role: Tech Lead, Was responsible for Delivery of verification for Tapeout
Contribution:
1. Responsible for Functional verification of GNB. (Graphics North Bridge)
2. Integrated ACP IP into the GNB environment
3. Integrated ISP IP into the GNB environment.
4. Aware of BIA, IFRIT flows.
5. Responsible for SAMARA and PENNAR integration.
6. Involved in kabini coverage closure, involved in LSC for kabini
7. Involved in fc mpu integration.
8. ONION and GARLIC bus OVC understanding and GNB environment set up for samara database.
9. Involved in LSA for Samara and Pennar GNB's
10. Involved in setting up of Pennar database with GF libraries
9.Involved with migration of Pennar database from TSMC to GF libraries.

Team Size: 12
Technology used:
Verification environment is a hybrid mixture of System-C, SystemVerilog and C++ language.GNB is targeted for 20nm technological library with GF foundaries.
Project:2 G4Main SOC
"Involved with migration of Pennar database from TSMC to GF libraries." TSMC is apparently leading GloFlo in 20nm.

Add a little FPGA to the above.
 
I'll say this about power vs. cost.

If Sony offered me a 5TF console drawing 500W and sold at US$599, I would run out on day one to buy the thing. Says a lot about my hobby. (And before you ask, yes, I do have a good PC - an Asus G74sx - but there are things and games you get on a console that you can't get anywhere else.)

I know it doesn't work like this.

Speaking of release and sales, I don't get why they have to come during the holidays. They are always supply constrained, so if you're selling 1.5-2 million consoles in the first two months, you can release at any time of the year. If Q1 of 2014 is needed because of production issues, they wouldn't miss any sales because of this.

For personal reasons I would prefer a 2013 launch; I'll be in Los Angeles (I normally live in Brazil) in November and early December, so I could buy one if they were released in Q4 of 2013.
 

RaijinFY

Member
I'll say this about power vs. cost.

If Sony offered me a 5TF console drawing 500W and sold at US$599, I would run out on day one to buy the thing. Says a lot about my hobby. (And before you ask, yes, I do have a good PC - an Asus G74sx - but there are things and games you get on a console that you can't get anywhere else.)

I know it doesn't work like this.

Speaking of release and sales, I don't get why they have to come during the holidays. They are always supply constrained, so if you're selling 1.5-2 million consoles in the first two months, you can release at any time of the year. If Q1 of 2014 is needed because of production issues, they wouldn't miss any sales because of this.

For personal reasons I would prefer a 2013 launch; I'll be in Los Angeles (I normally live in Brazil) in November and early December, so I could buy one if they were released in Q4 of 2013.

Such a machine would be more expensive... much more expensive.
 
A post on SemiAccurate, in response to mine, that I find to be good speculation. Mistercteam and Misterxmedia are, I think, the same person and have also been banned from BY3D, but good speculation is good speculation. It's supported by cites and is logical. That doesn't mean it's true, but it is worthy of being discussed; he does bring up alternate theories for the Kryptos name that are as valid as or more valid than mine. Edit: the SemiAccurate post appeared on BY3D and was promptly deleted by the mods.

Key here is:

1) TSMC and GloFlo cannot share designs and produce the same parts to meet next-generation volume unless at 20nm. Pad layout @ 20nm will be identical, with refreshes at 20nm planar and 14nm or 5nm. If Thebe is produced at 32 or 28nm then the next refresh will need a significant redesign, but at 20nm the wiring will be nearly identical, with only the 3D transistors changing. AMD/GloFlo started implementing support for TSVs at 20nm 2 months after 28nm started taping out. First test wafers @ 20nm in 2008. No mention of GloFlo and TSVs @ 28nm. IBM had TSVs @ 32nm.

2) The Samara APU <-- Jaguar 4C, GNB core, GCN2, 20nm (8XXX-series GPU?) is being released in 2013. The GNB core is the key to the speculation about GCN2 (8XXX-series GPU) and 20nm. Thebe could be using Samara as a base design, not Kabini. We don't know; the GPU in Samara could be 9000 series (GCN 2.1).


Sea Islands will be 28nm-based and will mostly be EGCN (GCN 1.1).
The Volcanic Islands series will bring full HSA, context switching and full QoS.
Basically true GCN V2.0, 20nm-based.


VOLCANIC Island
AMD-Prepares-Sea-Volcanic-and-Pirate-Islands-GPU

the Volcanic Islands will be the first GPU family possible to manufacture in Common Platform Alliance as well as TSMC.

In other words, AMD will have alternatives to TSMC's GigaFab Hsinchu/Taichung: IBM East Fishkill, GlobalFoundries in New York and Dresden or Samsung in Austin. Thus, there should be no chip shortage crippling its marketing performance anymore.

The Volcanic Islands GPUs will be designed on the 20nm Gate-Last manufacturing process and will bring silicon-level integration. The discrete GPU will tightly collaborate with the graphics capabilities of the APU, and the CPU will be treated as "an integral part" as well.
Straight_Ahead_AMD_s_Sea_Islands_Volcanic_Islands_and_Pirates_Islands

Volcanic Islands GPUs, when accompanied by appropriate microprocessors and operating system, will support GPU compute context switch [GPU computes every single piece of application that it can] as well as GPU graphics pre-emption.

Look at the above quote carefully, then remember who has had low-yield news, with Fermi-like low yields while trying to fab at 3 factories (it is like trying to fab a 2014 GPU on 2012 processes).

Volcanic Islands
Key points:
- 20nm
- Full context switching, full HSA
- APU + GPU will be integral
- 1st to be fabbed at the CPA (Common Platform Alliance)


-----------------------------------------------------------------------------------
Now, Secret APU for Xbox next and PS4
My Observation
-----------------------------------------------------------------------------------

------------------------------------------------------------------------
------------------------------------------------------------------------
Thebe
-28nm-based, will be APU only, and will be based on Sea Islands
-Thebe, like the moon of Jupiter, is cold rather than hot, so basically this is the mobile-performance or low-cost version; Venus XTX and Sun XT are the fastest ones, hot rather than cold.
-It certainly does not meet the CPA (Common Platform Alliance) standard, so even if this is more of an off-the-shelf base (minor customization), Sony could get delayed not because they are waiting for some technological breakthrough but because they do not meet the CPA standard needed to fab efficiently at the 3 other factories if necessary.
-Will be very affordable

------------------------------------------------------------------------
------------------------------------------------------------------------

Kryptos

From the above quote related to Volcanic Islands:

The discrete GPU will tightly collaborate with the graphics capabilities of the APU, and the CPU will be treated as "an integral part" as well.

Volcanic Islands GPUs, when accompanied by appropriate microprocessors and operating system, will support GPU compute context switch [GPU computes every single piece of application that it can] as well as GPU graphics pre-emption.
-Kryptos means:
Hidden message
Secret
Volcanic (very hot): the Kryptos lava dome (spelled Cryptos, but taken from the Greek kryptos)
- A custom GPU and APU based on Volcanic Islands, a future GPU beyond the 8000 series,
so very high probability it will be 2 big chips
-will be APU and GPU
-GPU part will certainly be 20nm, a low-yield monster chip > fabbed at multiple factories to the CPA standard
-APU will be 22nm, GPU 20nm (final target)
-Fits with recent insiders:
sweetvar: "High priority, supercomputer architecture, etc."
aegis
dualpixel: "2-3 times the PS4"
lherre: ("a beast", etc.)
-will have fully HSA GCN V2.0

So I expect:
Xbox next targeted for 2013-2014; to compensate, MS will release an Xbox Surface or Xbox TV first.
PS4 or PS Omni will be targeted for 2013, unless there is a delay related to fab problems (non-CPA).

Possible but much more expensive. With both chips connected to an interposer, Thebe could be at a different node size and still be part of Kryptos, and I believe the second GPU is connected to the common memory bus, so it alone would need to be 9000 series... Thebe could be an APU with an 8000-series GPU as long as the memory interface is the same as the second 9000-series GPU's. If Thebe is part of Kryptos, then it's very important that it be at 20nm too, so that TSMC and all of the CPA can produce it.

Lots of guesses here so can anyone add to this?
 
Are we sure Microsoft will go the APU route?

APUs are not known for having monster multicore CPUs.
It's accepted by most that Xbox3 and PS4 will use Jaguar CPUs.....


http://www.rage3d.com/interviews/manju_hegde_state_of_hsa/ said:
The final mystery about the HSA Foundation is who will fill the final tile space on the web page, it's likely already decided with each member choosing when they announce their membership. I can think of a few candidates that might be interested in being part of the HSA, like Broadcom, IBM, Microsoft and Sony. Microsoft and Sony? Well, they both design their own hardware, in Microsoft's case have successfully worked with AMD in the past, and AMD is rumored to be the platform provider for both next gen consoles with custom IP designs. Will the next generation of Sony Playstation and XBOX be powered by AMD APUs, GCN architecture variant, and will they be Bulldozer family architecture - or Jaguar family?
That hint, plus "always on" and ATSC 2.0 XTV support, = Jaguar. We could have figured this out without Sweetvar26. Jaguar is a CPU package that needs to plug into an APU's Xbar switch. The advantage is the APU infrastructure, and providing discrete components for what an APU provides would be impractical. An APU is a given; the CPU package that plugs into that APU can be Bulldozer or Jaguar, or even the MPA CPU package in the Sony patent. (Low-power silicon rules out SPUs at 3.2 GHz unless they are produced on a different process and attached 2.5D or 3D to the AMD APU.)

The second GPU is speculative not the APU.
 

jaosobno

Member
I call bulls*it on this whole "X720 being 2-3 times stronger than PS4" thing. Consider what Tretton said:

Jack Tretton on PS4: "We've Never Been First, We've Never Been Cheapest, It's About Being Best"

Being the best and being 2-3 times weaker than your main competitor doesn't really make any sense. I expect PS4 and X720 to have 10-15% difference in power.

And X720 being 2-3 times more powerful would also mean that Microsoft would have to price it substantially higher than PS4, unless they are willing to suffer a meaningful loss per console sold.
 
I call bulls*it on this whole "X720 being 2-3 times stronger than PS4" thing. Consider what Tretton said:

Being the best and being 2-3 times weaker than your main competitor doesn't really make any sense. I expect PS4 and X720 to have a 10-15% difference in power.

And X720 being 2-3 times more powerful would also mean that Microsoft would have to price it substantially higher than PS4, unless they are willing to suffer a meaningful loss per console sold.
I don't disagree, but this is the only "best" argument I can come up with as well. If we believe sweetvar26, and he appears to be first and accurate, we have to give credence to his other statements cited in the SemiAccurate post. Sweetvar26 did not give any performance differences between the consoles; it could be something not related to a second GPU: "High priority, supercomputer architecture, etc." Also, mixed in with the older technical terms mentioned by sweetvar26 was Milos, a Volcanic Islands name.

Volcanic Islands was previously out of the running because it's 20nm and everyone is expecting next generation to be 28nm, with my cites only saying there is slightly more than a 50-50 chance that next generation might be 20nm. The Samara APU mentioned in the LinkedIn cite having a GNB, with the GNB targeted for 20nm and Samara shown for a 2013 release, changes the odds of 20nm appearing in 2013 rather than 2014, six months or less later.

Edit: The extra GPU might be for background serving of Xbox 720 games to handhelds, and a little of the extra GPU might be used for 720 games. That's the original vision, and much more expensive. PS4 may do the same, just not at the same time, and may use Nanse to serve media. PS4 would be cheaper, which means more sold and a larger market for the PSN.
 

i-Lo

Member
I wouldn't listen to either of them.

Pretty much this. However, if the XB3 is by some miracle 2-3x more powerful than the PS4, then in time Sony's third-party support will dry up, resulting in poor ports and/or more third-party exclusives for the XB3. Because Sony is no Nintendo, this will hurt their sales considerably.
 

Reiko

Banned
Pretty much this. However, if the XB3 is by some miracle 2-3x more powerful than the PS4, then in time Sony's third-party support will dry up, resulting in poor ports and/or more third-party exclusives for the XB3. Because Sony is no Nintendo, this will hurt their sales considerably.

If Sony's tools are PC-dev friendly, you would have to stoop really low to create a poor port like what was seen this gen.
 