vg247-PS4: new kits shipping now, AMD A10 used as base, final version next summer

1. You're ignoring that the 7970M is a full-spec (downclocked) desktop 7870.

2. There is an 8970M. It will be faster than the 7970M. It will be double the 8800M, just as the 7900M was double the 7800M (which has the same core count and specs as the 8800M).

The question is when and for how much?

7970M -> based on Pitcairn 7870 desktop version.

8970M -> based on OlandXT 8870 desktop version.

The problem is that the 8870 doesn't seem to be available until late 2013 - otherwise AMD would have shown the 8970M as well at their unveiling. Maybe this time it won't be based on an 8870 anyway - we don't know that. But time and cost speak against an 8970M. I personally wouldn't mind having a mobile GPU with a 75W TDP and 3+ TFlops because ... I just doubt it is that easy.
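For anyone checking these numbers, the usual back-of-the-envelope formula for GCN peak single-precision throughput is shaders × 2 FLOPs (one FMA per clock) × clock speed. Plugging in the 7970M's known specs (1280 shaders at 850 MHz):

```latex
\text{FLOPS}_{\text{peak}} = N_{\text{shaders}} \times 2 \times f_{\text{clock}}
= 1280 \times 2 \times 0.85\,\text{GHz} \approx 2.18\ \text{TFLOPS}
```

So a hypothetical 8970M at the same clock would need roughly 1792 shaders to clear 3 TFlops - which is exactly why that looks hard to square with a 75W budget.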
 
Why would you switch between them? Unless the APU one is for low power home media type stuff and shuts the discrete one down. Maybe it's not for augmentation or GPGPU?
 
The question is when and for how much?

7970M -> based on Pitcairn 7870 desktop version.

8970M -> based on OlandXT 8870 desktop version.

The problem is that the 8870 doesn't seem to be available until late 2013 - otherwise AMD would have shown the 8970M as well at their unveiling. Maybe this time it won't be based on an 8870 anyway - we don't know that. But time and cost speak against an 8970M. I personally wouldn't mind having a mobile GPU with a 75W TDP and 3+ TFlops because ... I just doubt it is that easy.

M's are still high-binned parts among their desktop counterparts. I wouldn't really bother looking in their general direction in terms of consoles. They are available in MUCH lower quantities than you'd need to launch a console.
 
You know, I've always wondered why NVIDIA hasn't tried to get back into the console market. I know they wouldn't NEED the business, as they're doing well in the standalone PC market, but it wouldn't hurt either.
 
M's are still high-binned parts among their desktop counterparts. I wouldn't really bother looking in their general direction in terms of consoles. They are available in MUCH lower quantities than you'd need to launch a console.

Totally agree with you. Especially for a console, you need a rather mature process with predictable supply lines, not just a promise from AMD or a foundry.
 
Someone tried to impersonate SuperDaE on the TeamXbox forums and posted bullshit X720 specs:
http://forum.teamxbox.com/showthread.php?t=681409&page=2
The real SuperDaE quickly confirmed that this crap [2GB of eDRAM :lol:, IBM 6-core CPU with 4-way SMT = 32 cores :rofl:] was posted from a fake account:
http://forum.beyond3d.com/showpost.php?p=1687758&postcount=16672



edit - And now for more exciting news: SCEA has posted a patent application for "DYNAMIC CONTEXT SWITCHING BETWEEN ARCHITECTURALLY DISTINCT GRAPHICS PROCESSORS":

http://appft.uspto.gov/netacgi/nph-...68".PGNR.&OS=DN/20120320068&RS=DN/20120320068
"Graphics processing in a computer graphics apparatus having architecturally dissimilar first and second graphics processing units (GPU) is disclosed. Graphics input is produced in a format having an architecture-neutral display list. One or more instructions in the architecture neutral display list are translated into GPU instructions in an architecture specific format for an active GPU of the first and second GPU."


It seems that they are aiming for an APU+GPU combination in the PS4. As for architecturally distinct graphics processors, they will most likely use 7xxx for the APU and 8xxx for the discrete GPU... but of course, who knows. Maybe they are just doing preemptive patenting for their patent war chest.

Forgive my ignorance but how is this different to nVidia's Optimus in laptops? For example, my laptop uses the integrated graphics for normal Windows type stuff but switches to the discrete for gaming or graphics-heavy software. Is this not the same sort of thing and if so, how can it be patented?
 
Forgive my ignorance but how is this different to nVidia's Optimus in laptops? For example, my laptop uses the integrated graphics for normal Windows type stuff but switches to the discrete for gaming or graphics-heavy software. Is this not the same sort of thing and if so, how can it be patented?
This patent goes back to 2009. The background section on context switching below, particularly paragraph [0006], might answer your question:

BACKGROUND OF INVENTION

[0003] Many computing devices utilize high-performance graphics processors to present high quality graphics. High performance graphics processors consume a great deal of power (electricity), and subsequently generate a great deal of heat. In portable computing devices, the designers of such devices must trade off market demands for graphics performance with the power consumption capabilities of the device (performance vs. battery life). Some laptop computers are beginning to solve this problem by introducing two GPUs in one laptop--one a low-performance, low-power consumption GPU and the other a high-performance, high-power consumption GPU--and letting the user decide which GPU to use.

[0004] Often, the two GPUs are architecturally dissimilar. By architecturally dissimilar, it is meant that the graphical input formatted for one GPU will not work with the other GPU. Such architectural dissimilarity may be due to the two GPUs having different instruction sets or different display list formats that are architecture specific.

[0005] Unfortunately, architecturally dissimilar GPUs are not capable of cooperating with one another in a manner that allows seamless context switching between them. Therefore a problem arises in computing devices that use two or more architecturally dissimilar GPUs in that in order to switch from one GPU to another the user must stop what they are doing, select a different GPU, and then reboot the device. This is somewhat awkward even with a laptop computer and considerably more awkward with hand-held portable computing devices such as mobile internet access devices, cellular telephones, hand-held gaming devices, and the like.

[0006] It would be desirable to allow the context switching to be hidden from the user and performed automatically in the background. Unfortunately, no solution is presently available that allows for dynamic, real-time context switching between architecturally distinct GPUs. The closest prior art is the Apple MacBook Pro, from Apple Computer of Cupertino, Calif., which contains two architecturally distinct GPUs but does not allow dynamic context switches between them. Another prior art solution is the Scalable Link Interface (SLI) architecture developed by nVidia Corporation of Santa Clara, Calif. This architecture lets a user run one or more GPUs in parallel, but only for the purpose of increasing performance, not to reduce power consumption. Also, this solution requires the two GPUs to be synchronized when the system is enabled, again requiring some amount of user intervention.
The patent was updated July 2012 and published three days after December 17, which was the embargo date on the AMD slide that was posted yesterday.
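For what it's worth, here's a toy sketch (assumptions entirely mine - the thresholds and names are invented) of the kind of automatic, user-invisible switching that [0006] says no current solution provides:

```python
# Toy sketch of the automatic switching [0006] describes: the system picks a
# GPU from runtime conditions, with no user intervention or reboot.
# Thresholds and names are invented for illustration.

LOW_POWER, HIGH_PERF = "integrated", "discrete"

def choose_gpu(gpu_load, on_battery):
    """Pick the active GPU from the current workload and power source."""
    if on_battery or gpu_load < 0.3:  # light load: favor power savings
        return LOW_POWER
    return HIGH_PERF                  # heavy load: favor performance

active = LOW_POWER
for load, batt in [(0.1, False), (0.8, False), (0.2, True)]:
    target = choose_gpu(load, batt)
    if target != active:
        # The patent's contribution: the neutral display list is retranslated
        # for the new GPU and the handover happens mid-session, hidden from
        # the user (unlike the MacBook Pro or SLI setups cited as prior art).
        active = target
    print(f"load={load}, battery={batt} -> {active}")
```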
 
Forgive my ignorance but how is this different to nVidia's Optimus in laptops? For example, my laptop uses the integrated graphics for normal Windows type stuff but switches to the discrete for gaming or graphics-heavy software. Is this not the same sort of thing and if so, how can it be patented?

Reading through it, it's for GPUs that are not only of different architectures, but also for the switching to be done automatically and in real time, which apparently none of the other devices can do.
 
I really think the setup for the beta kits will be / is:
An A10 with 384 SPUs (512 GFLOPS)
The 768-SPU 7850 that was leaked earlier (1.3 TFLOPS)

Matching the vgleaks rumor of a 4-core CPU and 1152 SPUs, also 1.8 TFLOPS.

The final kit will have the same SPU count but be 25-30% more efficient.

Compare that to 1.2-1.5 TFLOPS for Durango.
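Those numbers do roughly add up, if you assume the standard shaders × 2 × clock formula (the arithmetic and the implied clocks are mine, not from the leak):

```latex
\underbrace{384 \times 2 \times 0.667\,\text{GHz}}_{\approx\,0.512\ \text{TFLOPS (APU)}}
+ \underbrace{768 \times 2 \times 0.85\,\text{GHz}}_{\approx\,1.3\ \text{TFLOPS (discrete)}}
\approx 1.8\ \text{TFLOPS total}
```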

Where did you get 1.2-1.5 TFLOPS for Durango? I thought most people were thinking the 720 is going to be more powerful.
 
I take it you are referring to a sweetvar26 post that was deleted:

PS4:



Milos is probably a 20nm Volcanic Islands desktop GPU, so you have Southern Islands and Volcanic Islands at the same time.

The GNB is new with the Samara/Pennar (Jaguar) APU, and at 20nm the Forge database to produce it was ported from TSMC to GloFo.

New Starsha GNB 28nm TSMC
Milos

Southern Islands


What is Starsha, the APU that comes after Samara?
Is Milos a CPU?
 
M's are still high-binned parts among their desktop counterparts. I wouldn't really bother looking in their general direction in terms of consoles. They are available in MUCH lower quantities than you'd need to launch a console.
There is a reason 8000M GPUs are out now, and in 125 platforms that are using Windows 8. GloFo and TSMC have switched to LPM (Low Power Mobile) bulk silicon. There is no 28nm performance silicon in 2013. Richland is 32nm SOI and Kaveri was canceled for 2013.

Mobile is no longer binned from the performance silicon.
 
We were talking about that patent before Vita was unveiled...

If we must talk about it again, why would you put two architecturally distinct GPUs in a PS4 (i.e. ones with different shader models/instruction sets etc.)?
 
There is a reason 8000M GPUs are out now, and in 125 platforms that are using Windows 8. GloFo and TSMC have switched to LPM (Low Power Mobile) bulk silicon. There is no 28nm performance silicon in 2013. Richland is 32nm SOI and Kaveri was canceled for 2013.

Mobile is no longer binned from the performance silicon.

?

I thought we already went over this. AMD officially denied this.
 
New Starsha GNB 28nm TSMC
Milos

Southern Islands


What is Starsha, the APU that comes after Samara?
Is Milos a CPU?
I think Starsha is the name for the GNB, not an APU, and if correct it's at 28nm. Samara is 20nm, and Kabini uses a different north bridge. It's possible that the Starsha GNB is the core of Thebe at 28nm and will be modified, as it contains the north bridge that has the MMU and connects to memory.

http://www.neogaf.com/forum/showpost.php?p=45419442&postcount=403

http://www.indeed.com/r/Rami-Dornala/e0704aad508659b2 said:
Graphic processor
AMD - Waltham, MA
September 2011 to Present
Project:1 GNB core SOC
Duration: Sept 2011 , till date
Location: AMD
Description:
GNB core (Graphics North Bridge) is based on the AMD Fusion core technology. The GNB is a fusion of graphics processor, power optimizer, audio processor, south bridge and north bridge, which share a common interface with system memory.

Role: Tech Lead, Was responsible for Delivery of verification for Tapeout
Contribution:
1. Responsible for Functional verification of GNB. (Graphics North Bridge)
2. Integrated ACP IP into the GNB environment
3. Integrated ISP IP into the GNB environment.
4. Aware of BIA, IFRIT flows.
5. Responsible for SAMARA and PENNAR integration.
6. Involved in kabini coverage closure, involved in LSC for kabini
7. Involved in fc mpu integration.
8. ONION and GARLIC bus OVC understanding and GNB environment set up for samara database.
9. Involved in LSA for Samara and Pennar GNB's
10. Involved in setting up of Pennar database with GF libraries
11. Involved with migration of Pennar database from TSMC to GF libraries.

Team Size: 12
Technology used:
Verification environment is a hybrid mixture of SystemC, SystemVerilog and C++. GNB is targeted for a 20nm technology library with GF foundries.
 
?

I thought we already went over this. AMD officially denied this.
http://www.extremetech.com/computing/140870-amd-refutes-kaveri-cancelation-rumor-claims-big-cores-still-a-priority said:
Last week, we reported that AMD has delayed the tapeout of its next-generation APU, codenamed Kaveri, in order to improve the part’s performance and capability. This means we won’t see a follow-up to Piledriver until 2014, as it typically takes at least a year for a foundry to ramp a part to full production.

Now, rumors are flying that AMD has canceled Kaveri altogether and that it intends to terminate “big core” development in favor of Kabini-based hardware and new ARM parts. SemiAccurate is reporting that AMD killed its big-core design team in the last round of layoffs, effectively canceling both Steamroller (the CPU half of Kaveri) and Excavator.
The rumor that Kaveri was canceled (SemiAccurate) was denied. There is no Kaveri in 2013.

Ashes1396 said:
If Milos is the volcano, this could mean a graphics chip, as Volcanic Islands [is set to follow Sea Islands] in 2014.
VI is 20nm and likely on high-performance silicon, not LPM. If it's VI, then Sony would likely have to wait until 2014, as there is no high-performance silicon in 2013, at least on the AMD and GloFo roadmaps.
 
We were talking about that patent before Vita was unveiled...

If we must talk about it again, why would you put two architecturally distinct GPUs in a PS4 (i.e. ones with different shader models/instruction sets etc.)?

Maybe they'll include an ARM CPU with a GPU in the PS4 for Google TV / media playback / other OS functions, and introduce a new type of PSN game which can be played on both PS4 and PS Vita. That way they could use almost all of the power of the bigger CPU/GPU for games.
 
We were talking about that patent before Vita was unveiled...

If we must talk about it again, why would you put two architecturally distinct GPUs in a PS4 (i.e. ones with different shader models/instruction sets etc.)?
The patent describes multiple states: similar GPUs, dissimilar GPUs, and CPU + GPU, where the GPU would be turned off and the CPU would perform the GPU operations if possible. All to save power. This can apply to upcoming laptops or handhelds that Sony would be building, or it can apply to the PS4. The patent was updated July 2012 and published yesterday. The publish date is what I key on.

Why talk about it again?

1) Because XTV and RVU are going to require the PS4 and Xbox3 to be on when the TV is on, and the power used in those modes is going to be regulated.
2) Jaguar mobile CPUs are going to be used.
3) Likely a modified GNB out of Samara, which appears to be a later, ultra-efficient design compared to Kabini.

All to save power. Or you can take the tack that LPM silicon is all that's available in 2013, so they have to use mobile designs.
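Read that way, the patent's configurations are basically a small power-state table. A minimal sketch, assuming the three states described above (the mode names and the scenario mapping are my own invention):

```python
# Minimal sketch of the patent's render configurations as a power-state
# table. Mode names and the scenario mapping are invented for illustration.
from enum import Enum

class RenderMode(Enum):
    BOTH_GPUS    = "similar GPUs working together (max performance)"
    DISCRETE_OFF = "dissimilar GPUs: run only the low-power one"
    CPU_ONLY     = "GPU powered off, CPU performs the GPU operations"

def mode_for(scenario: str) -> RenderMode:
    """Pick the cheapest state that still covers the workload."""
    return {
        "game":       RenderMode.BOTH_GPUS,
        "media":      RenderMode.DISCRETE_OFF,  # XTV/RVU-style always-on TV duty
        "standby_ui": RenderMode.CPU_ONLY,      # regulated idle power
    }[scenario]

for s in ("game", "media", "standby_ui"):
    print(s, "->", mode_for(s).value)
```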
 
The patent describes multiple states: similar GPUs, dissimilar GPUs, and CPU + GPU, where the GPU would be turned off and the CPU would perform the GPU operations if possible. All to save power. This can apply to upcoming laptops or handhelds that Sony would be building, or it can apply to the PS4. The patent was updated July 2012 and published yesterday. The publish date is what I key on.

Why talk about it again? Because XTV and RVU are going to require the PS4 and Xbox3 to be on when the TV is on, and the power used in those modes is going to be regulated.

I hope you can turn it off; I don't want my PS4 to be on non-stop.
 
So, Sony has this patent, what does that mean for competitors? Or is it more specific than typical context switching?
EDIT: Oh, it's for dissimilar gpu architectures... Hmm..
 
I really think the setup for the beta kits will be / is:
An A10 with 384 SPUs (512 GFLOPS)
The 768-SPU 7850 that was leaked earlier (1.3 TFLOPS)

Matching the vgleaks rumor of a 4-core CPU and 1152 SPUs, also 1.8 TFLOPS.

The final kit will have the same SPU count but be 25-30% more efficient.

Compare that to 1.2-1.5 TFLOPS for Durango.

Mr. Elite- are you saying it will be approximately 2-2.3 TF?
 
Mr. Elite- are you saying it will be approximately 2-2.3 TF?

There are no SPUs in any kits.

edit - Oh, GPU SPUs. OK. But still a bit... not sure how to read Proelite's speculation per the below:

I think Proelite has been lowballing Durango to a certain extent and highballing the PS4 to a certain extent.

StevieP mentioned in an earlier post that he thought Proelite was intentionally muddying the waters a bit. And did I pick it up wrong, or does he work for one or the other of the companies in play here, or someone close to them?
 
superDaE just scoffed at the possibility of 4K resolution on Durango (Twitter). Seems like this device will not be future-proof.

How quickly do you think 4K is coming, in terms of mass penetration? I still have doubts it'll ever even reach general household use. Isn't it 80" & up before your eyes perceive a difference?
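Roughly, yes - that rule of thumb comes from visual acuity math (the calculation below is mine): a normal eye resolves about 1 arcminute, so 4K only pays off when a 1080p pixel subtends more than that. On an 80" 16:9 screen (width about 69.7"), the 1080p pixel pitch is about 69.7/1920 ≈ 0.036", which gives:

```latex
d_{\max} \approx \frac{p}{\tan(1')} \approx \frac{0.036''}{0.000291} \approx 125'' \approx 10\ \text{ft}
```

So only within about 10 feet of an 80" set can your eyes out-resolve 1080p; smaller screens at typical couch distances mostly can't show the difference.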
 
While I believe it too... Remember when Xbox 360 could only do up to 1080i?

Well, to be fair, the 360 itself was always capable of 1080p. The only reason it initially topped out at 1080i was that the original launch boxes didn't have HDMI, and the vast majority of TVs don't let you do 1080p over component, despite the fact that component is actually capable of it.
 
There are no SPUs in any kits.

edit - Oh, GPU SPUs. OK. But still a bit... not sure how to read Proelite's speculation per the below:

I think Proelite has been lowballing Durango to a certain extent and highballing the PS4 to a certain extent.

StevieP mentioned in an earlier post that he thought Proelite was intentionally muddying the waters a bit. And did I pick it up wrong, or does he work for one or the other of the companies in play here, or someone close to them?

I believe he works for Microsoft.
 
So the Xbox will get one GPU with eDRAM, and the PS4 will get "two" (APU+GPU)?
Wondering which one is more efficient?

It's natural to lowball your favorite company and highball the opposite one.
That way you'll eventually end up happy with your favorite and have more to laugh at with the opposite one.

For example - I still expect the Xbox to get 8-12 ARM cores & 8-12GB, and the PS4 to get 4-8 AMD cores with 4-8GB.
 
I really don't get the fuss over 4K. Outside of possibly PSN/Live Arcade, it has nothing to do with games.

Was there this much fuss over the PS2 doing 1080i?
 
I really don't get the fuss over 4K. Outside of possibly PSN/Live Arcade, it has nothing to do with games.

Was there this much fuss over the PS2 doing 1080i?

Very few people had HD CRT sets when that game came out. So I would believe no.

Well, to be fair, the 360 itself was always capable of 1080p. The only reason it initially topped out at 1080i was that the original launch boxes didn't have HDMI, and the vast majority of TVs don't let you do 1080p over component, despite the fact that component is actually capable of it.

Ah. I see.
 
The problem is that the 8870 doesn't seem to be available until late 2013 - otherwise AMD would have shown the 8970M as well at their unveiling.

Supposedly AMD is releasing the rest of the 8000 series in Q2 2013, which could still mean April, so that's not too late.

I have a feeling AMD wanted to push their new low-end/mainstream laptop products first, since that's where nVidia and Intel dominate more, iirc. The fact that AMD was still using their older 40nm architectures, and that the 8000 series uses just a cleaned-up version of GCN (GCN 1.1/2.0 won't be a major overhaul), gave them plenty of time to redo the low end first.

A GPU with a shader count similar to the 7800/8700 series, with (some of) the improvements GCN 1.1 will bring, seems very likely to end up in the PS4 at this point.
 
Well, to be fair, the 360 itself was always capable of 1080p. The only reason it initially topped out at 1080i was that the original launch boxes didn't have HDMI, and the vast majority of TVs don't let you do 1080p over component, despite the fact that component is actually capable of it.

I think I read recently that there wasn't a 1080p standard for HDMI at the point when the 360 released, too.
 
I believe he works for Microsoft.

If people knew that, I don't think he would be here hinting and winking :D
The only rumor we've got so far is that Microsoft is willing to invest more of their silicon budget in the CPU because they're aiming to create a Windows 8 media center where multitasking plays an important role. Sony might be investing more in pure graphical power with their GPU.

Btw, the APU+GPU solution makes much more sense, especially because it can give them much more flexibility with the memory system, whereas just an APU would considerably restrict their possibilities. The patent suggests that, depending on the application, different combinations of the two GPUs will be available (e.g. both of them for the most taxing applications, or just the APU's GPU for simpler games or maybe BC emulation :P).
 
Hey AMD, what card do you recommend pairing with the A10 APU?

6670

I guess that's because the graphics chip in the A10 is the AMD Radeon HD 7660D, which is a rebranded HD 6670, right?

edit: I love that they call them next-gen APUs. :P
 
I think it was to do with HDMI 1.3 but I could be totally mistaken.

I think that's for 3D.

The Xbox didn't have HDMI output when it launched.

[edit] No, it seems 3D is 1.4, not 1.3, which improved bitstreams. I think MS didn't want pre-1.3 HDMI (the 360 launched before 2006), hence why there wasn't HDMI in the first place.
 
VI is 20nm and likely on high-performance silicon, not LPM. If it's VI, then Sony would likely have to wait until 2014, as there is no high-performance silicon in 2013, at least on the AMD and GloFo roadmaps.

I wasn't going to post this link that somebody messaged me, but since you insist on going on about 20nm, I think it's worth pointing out that AMD are not in a rush to move past 28nm:

“It is getting tougher and tougher to get to new nodes. 28nm might be with us a little longer than people [think]. [It will be a while] before they jump into 20nm node or 16nm or 14nm,” said Devinder Kumar, the corporate controller and the interim chief financial officer, at Raymond James IT supply chain conference.

http://www.xbitlabs.com/news/other/...s_Technologies_for_Longer_Period_of_Time.html

You've been told by multiple people that 20nm is not realistic for these consoles at launch, so hopefully now you finally realize that your assumptions here are waaaaay off.

superDaE just scoffed at the possibility of 4K resolution on Durango (Twitter). Seems like this device will not be future-proof.

This isn't a serious post, is it?
 
Even if 4K TVs do penetrate the mainstream, will these systems really be able to do much with it? Or will it be like this gen and 1080p, where they are technically capable, but only a small handful of games actually use it, due to the desire for better-quality effects and higher polygon counts?

Hell, I'm not even convinced devs will shoot for full 1080p on many games next gen, so 4K seems mostly meaningless.
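For scale (simple arithmetic, mine): 4K is exactly four times the pixels of 1080p, so fill rate and per-pixel shading cost scale by roughly the same factor.

```latex
\frac{3840 \times 2160}{1920 \times 1080} = \frac{8{,}294{,}400}{2{,}073{,}600} = 4
```

If devs already trade resolution away for effects at ~2 megapixels, an ~8.3-megapixel target looks even less likely.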
 
Even if 4K TVs do penetrate the mainstream, will these systems really be able to do much with it? Or will it be like this gen and 1080p, where they are technically capable, but only a small handful of games actually use it, due to the desire for better-quality effects and higher polygon counts?

Hell, I'm not even convinced devs will shoot for full 1080p on many games next gen, so 4K seems mostly meaningless.

It would be used mainly for 4K Blu-ray movies.
1080p will be the most common resolution for games.
 
Even if 4K TVs do penetrate the mainstream, will these systems really be able to do much with it? Or will it be like this gen and 1080p, where they are technically capable, but only a small handful of games actually use it, due to the desire for better-quality effects and higher polygon counts?

Hell, I'm not even convinced devs will shoot for full 1080p on many games next gen, so 4K seems mostly meaningless.

I think it'll be more about being able to play 4K movies and basically being 4K players. Sony is trying to break into medical imaging, and they need 4K for that. It makes sense that they would add a similar chip to the PS4; it will help them push the technology. I doubt games will come out doing 4K images, although maybe we'll get a Wipeout or PixelJunk 4K game at the end of the gen that will blow our minds.

Also, isn't 4K picture viewing already supported with PS3?
 