Confirmation: PlayStation 3 will use an in-house GPU (proof inside, well you'll see)

Panajev2001a

GAF's Pleasant Genius
The successful candidate will develop a state-of-the-art shading language compiler for an advanced forth-generation graphics processing unit (GPU). With the assistance of other team members, the individual must be capable of designing and implementing the major components of the compiler backend.

To achieve this goal, the individual should have extensive recent experience with backend internal representations suitable for advanced code optimization. Detailed knowledge of modern code optimization techniques, register allocation, and code generation expertise is also required, as well as experience with programming language front ends, assemblers, linkers, and runtime libraries. Exposure to shading languages, such as nVidia CG, Microsoft HLSL, Brook, or StreamIt, and exposure to 3D graphics APIs, such as OpenGL and DirectX, is also desirable.

http://hotjobs.yahoo.com/jobs/CA/Foster-City/Technology/J900906BR;_ylt=AjoV4U.AaTskU1tcllsAlpyxQ6IX
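To unpack what "backend internal representations suitable for advanced code optimization" actually means in practice, here is a very rough, hypothetical sketch I put together (the opcodes, register limit, and everything else below are made up for illustration, not taken from any Sony toolchain): a backend typically holds shader code as a list of simple instructions over virtual registers, runs optimization passes over that list, and then maps the virtual registers onto the hardware's limited register set.

Code:
// Hypothetical sketch of a shader compiler backend IR -- purely illustrative,
// not based on any real Sony/PS3 toolchain.
#include <cstdio>
#include <vector>

enum class Op { Load, Mul, Add, Store };   // toy opcode set

struct Inst {
    Op  op;
    int dst;            // virtual register written
    int src0, src1;     // virtual registers read (-1 if unused)
};

// A deliberately trivial "register allocation" step: count how many virtual
// registers the shader touches and check it fits in the (made-up) hardware limit.
static bool fits_in_hw_registers(const std::vector<Inst>& code, int hw_regs) {
    int max_vreg = -1;
    for (const Inst& i : code) {
        if (i.dst  > max_vreg) max_vreg = i.dst;
        if (i.src0 > max_vreg) max_vreg = i.src0;
        if (i.src1 > max_vreg) max_vreg = i.src1;
    }
    return (max_vreg + 1) <= hw_regs;
}

int main() {
    // r2 = r0 * r1; r3 = r2 + r0 -- the kind of thing a pixel shader
    // (texel * vertex colour, plus a base term) lowers to.
    std::vector<Inst> shader = {
        { Op::Mul, 2, 0, 1 },
        { Op::Add, 3, 2, 0 },
    };
    std::printf("fits in 8 hw registers: %s\n",
                fits_in_hw_registers(shader, 8) ? "yes" : "no");
}

A real backend obviously does far more (instruction scheduling, proper graph-coloring or linear-scan allocation, peephole passes), which is exactly why they are hiring for it.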

This would not be needed if they relied on GPUs made by ATI, nVIDIA, IMGTech's PowerVR, or BitBoys, as all of those would come with a shading language compiler designed by the GPU provider: especially the big boys like ATI, nVIDIA, and IMG Technologies.


Good news:

Well, we know the GPU will have Shaders.

We know the shaders will not be trivial in length: otherwise there would be no need to step up from the ASM-level shaders of the DirectX 8.0 era to a High Level Shading Language.
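Just to illustrate what that step up buys you (both snippets below are generic DX8/DX9-era examples thrown in for comparison, nothing PS3-specific): a DirectX 8.0-era pixel shader is written instruction by instruction in assembly, while a high-level shading language expresses the same modulate as ordinary-looking code and leaves register juggling to the compiler. Once shaders run to hundreds of instructions, only the latter stays manageable.

Code:
// Generic illustration only -- a DX8-era ps.1.1 assembly shader vs. the same
// effect in HLSL-style high-level code. Nothing here is PS3-specific.
#include <cstdio>

// DirectX 8 era: hand-written assembly, one instruction per line.
const char* kAsmShader = R"(
    ps.1.1
    tex t0              ; sample the texture bound to stage 0
    mul r0, t0, v0      ; modulate texel by interpolated vertex colour
)";

// High-level shading language: the compiler picks registers and instructions.
const char* kHlslShader = R"(
    sampler2D diffuseMap;
    float4 main(float2 uv : TEXCOORD0, float4 vcol : COLOR0) : COLOR
    {
        return tex2D(diffuseMap, uv) * vcol;
    }
)";

int main() {
    std::printf("asm version:%s\nhigh-level version:%s\n",
                kAsmShader, kHlslShader);
}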


Bad news:

We still have no official PR or technical documentation giving more details on where vertex and pixel shading is done: whether it is split between the CPU and GPU, or done entirely on the GPU.

Is SCE going at it alone? Are they receiving help? Or are they the ones helping their partner (to keep costs down it would have to be either Toshiba or IBM rather than adding yet another partner to the PlayStation 3 project at this point... people on Beyond3D seemed to say it would have to be Toshiba).
 
Fourth generation? Would this not be third generation?

Also, shouldn't work on this be wrapping up soon if they're to get kits to developers soon(ish)?
 
Forth... forthcoming generation?

(PSP might be the third-generation GPU they have in mind)

This job offer does not mean work is not already under way... it means that they still want more bright people working on this and improving the technology's performance (development tools evolve over time).
 
Panajev2001a said:
Forth... forthcoming generation?

(PSP might be the third-generation GPU they have in mind)

This job offer does not mean work is not already under way... it means that they still want more bright people working on this and improving the technology's performance (development tools evolve over time).

True, was just thinking that after I read the "With the assistance of other team members" bit... there's probably already a team in place.

And I'm guessing that "forth-generation" is a misspelling, but PSP as the third generation would explain that.
 
Panajev2001a said:
http://hotjobs.yahoo.com/jobs/CA/Foster-City/Technology/J900906BR;_ylt=AjoV4U.AaTskU1tcllsAlpyxQ6IX

Is SCE going at it alone? Are they receiving help? Or are they the ones helping their partner (to keep costs down it would have to be either Toshiba or IBM rather than adding yet another partner to the PlayStation 3 project at this point... people on Beyond3D seemed to say it would have to be Toshiba).

It is Toshiba. My source was laughing when he told me this, because he said that ATI will work Toshiba if you compare the GPU between PS3 and Xbox2.
 
my guess

1. Voodoo Era GFX Cards
2. Geforce 1~2 Era GFX Cards
3. Geforce 3/DX9 Era GFX Cards
4. 4th Gen?

Perhaps the PS3 will feature above-DX9 capabilities? So, if this is true, does that mean that the next PlayStation will not be as far behind as the PS2 is?
 
WordofGod said:
It is Toshiba. My source was laughing when he told me this, because he said that ATI will work Toshiba if you compare the GPU between PS3 and Xbox2.
:lol

Nvidia must really be smarting by now. It's a complete shut-out! I love it.
 
Fafalada said:
Uh... what? ATI will work for Toshiba???? :\

I think he means they will "work" Toshiba, as in they will completely and totally own them. Of course, that would be a matter of opinion. There's no telling what Toshiba would do for this. Does anyone know what Toshiba's history in this field is? IMO, ATi is the best consumer-level graphics card engineering company out there right now.
 
Somebody on Beyond3D made an interesting point about this job being with SCEA, not SCEI, suggesting that perhaps Sony Japan didn't see the importance of having a high level shading language, but that Sony America decided to press ahead with it anyway because of developer pressure etc. I guess it's possible, but might it also be possible that SCEI is simply "outsourcing" that part of development to their American wing?
 
So... in a nutshell, Sony are at the point where they start to hire people to write compilers for their new custom chip, while, well, we all know how supposedly far ahead in the devkit game MS is...

This could spell a massive difference in launch lineups.
 
Somebody on Beyond3D made an interesting point about this job being with SCEA, not SCEI, suggesting that perhaps Sony Japan didn't see the importance of having a high level shading language, but that Sony America decided to press ahead with it anyway because of developer pressure etc. I guess it's possible, but might it also be possible that SCEI is simply "outsourcing" that part of development to their American wing?
The American wing is where the software technology development part of SCE is (at least right now).
 
If PS3 is gonna be a bitch to code for, the same way that devs said PS2 initially was, it will really help MS. MS will probably provide an excellent framework for devs; we all know that MS is all about DEVELOPERS DEVELOPERS DEVELOPERS!!
 
Nerevar said:
Does anyone know what Toshiba's history in this field is? IMO, ATi is the best consumer-level graphics card engineering company out there right now.

I believe Toshiba designed the graphics chip for the PS2, if that is any indication of how good they are at graphics.
 
This is indeed great news. Can't wait to see what features it will implement. OK, perhaps the GS wasn't so great feature-wise, but the PSP GPU has shown us a change of direction.
 
mr2mike said:
So... in a nutshell, Sony are at the point where they start to hire people to write compilers for their new custom chip, while, well, we all know how supposedly far ahead in the devkit game MS is...

This could spell a massive difference in launch lineups.

It doesn't necessarily mean they're *starting* to hire. I voiced similar concerns at the top of this thread, but a team could already exist - this person would join that team.
 
Bah, why does Sony have to reinvent the wheel? This in and of itself makes the learning curve for development on the PS3 the worst of the next generation. And noo... a tougher learning curve does not mean that the console keeps on getting better-looking games forever (as the case was made for the PS2); it just means that it takes longer for games to hit the peak of the machine, and fewer games hit it.

Standard technologies are there for a REASON. Bleck.

Edit: Standard technologies like OpenGL and DirectX have been in use for years and so are really stable with really rich feature sets. Anything brand new will be unproven and buggy for a good long while.
 
I believe Toshiba designed the graphics chip for the PS2, if that is any indication of how good they are at graphics.
No, Toshiba was behind the design of the Emotion Engine. The GS was a Sony design.

Azih said:
Bah why does Sony have to reinvent the wheel?
Who says they are (or aren't)? Or do you mean that simply not licensing someone else's technology is reinventing the wheel by default...?

Edit: Standard technologies like OpenGL and DirectX have been in use for years and so are really stable with really rich feature sets.
While we don't officially know squat about what they will use - every rumour so far has been suggesting OGL.
Though - personally I find pre-2.0 versions of OGL more convoluted than feature-rich, especially in the PC market where they have been turned into a million and one extension panels, rather than any kind of standardized API.
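For anyone who hasn't had the pleasure, this is roughly the extension dance I mean on PC OpenGL (illustrative C++ only; in a real program the extension string comes from glGetString(GL_EXTENSIONS) on a live context, and the sample string below is made up): every feature beyond the ancient core has to be checked for by name, often with vendor-specific fallbacks that behave differently.

Code:
// Illustrative only: the kind of per-extension checking pre-2.0 PC OpenGL
// pushes onto every engine. The extension string here is a made-up sample.
#include <cstdio>
#include <string>

static bool has_extension(const std::string& all, const char* name) {
    // Extension names are space-separated; pad to avoid substring matches
    // (e.g. "GL_ARB_texture" vs "GL_ARB_texture_env_combine").
    std::string padded = " " + all + " ";
    std::string needle = std::string(" ") + name + " ";
    return padded.find(needle) != std::string::npos;
}

int main() {
    std::string extensions =
        "GL_ARB_multitexture GL_ARB_vertex_program GL_ARB_fragment_program";

    // Every feature needs its own check, and often an NV_/ATI_ fallback path.
    std::printf("ARB_fragment_program: %s\n",
        has_extension(extensions, "GL_ARB_fragment_program") ? "yes" : "no");
    std::printf("ATI_fragment_shader:  %s\n",
        has_extension(extensions, "GL_ATI_fragment_shader") ? "yes" : "no");
}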
 
interesting. I always assumed PS3 would have an in-house GPU.


4th generation might be:

1. PS1 GPU
2. PS2 GS
3. GS2 (not used in products AFAIK)
4. PS3 GPU

or

1. PS1 GPU
2. PS2 GS
3. PSP GPU
4. PS3 GPU


this still does not rule out Nvidia making certain portions of the PS3 GPU, which will still be a Sony or STI or Sony-Toshiba designed GPU, and manufactured at STI fabs.
 
Edit: Standard technologies like OpenGL and DirectX have been in use for years and so are really stable with really rich feature sets. Anything brand new will be unproven and buggy for a good long while.
The PSP devkit uses a carbon copy of the OpenGL language, just Sony-'rebranded', so to say. The PS3 will likely do the same thing.
 
I'm just saying that
a state-of-the-art shading language compiler for an advanced forth-generation graphics processing unit
is completely unnecessary and is a massive undertaking that will be hard to use and be another thing that devs will have to grapple with initially.
 
PS3's GPU will be the ultimate decider as far as what a PS3 can paint onto our screens, regardless of how many flops the CPU can crunch. :)
 
Marconelly said:
PSP devkit uses a carbon copy of OpenGL language, it's just Sony 'rebranded', so to say. PS3 will likely do the same thing.


Some of the smartest moves they can make.
 
Azih said:
I'm just saying that is completely unnecessary and is a massive undertaking that will be hard to use and be another thing that devs will have to grapple with initially.

Azih, I don't know how well-versed you are in microprocessors and compilers, but saying "just use an existing standard" is an incorrect oversimplification. Each microprocessor has its own set of low-level assembly instructions it uses to run, and every processor is different. Cell is a completely different architecture from the standard x86 approach, so a new compiler is needed that will not only compile code to run on that architecture, but also run it well and in an optimized fashion.

So right away they need someone to write a new compiler for their architecture. This new compiler could be for existing languages, or they could just make up a new language that better exposes the power of their own architecture. Trust me when I say that the actual programming language used is irrelevant to an experienced programmer. I'm only in my third year of a software degree at university and I've already used upwards of 20 languages.
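To make the "every processor is different" point concrete, here is a toy sketch (the two target machines below are completely made up; a real Cell or GPU backend is vastly more involved, but the shape of the problem is the same): the very same expression has to be lowered to different instruction sequences depending on which target the backend was written for.

Code:
// Toy sketch of why a compiler backend must be retargeted per architecture.
// The two "targets" below are imaginary; this is just the shape of the problem.
#include <cstdio>
#include <string>

enum class Target { HasFusedMAdd, PlainMulAdd };

// Lower the expression d = a * b + c for the given target.
std::string lower_mul_add(Target t) {
    if (t == Target::HasFusedMAdd) {
        // An architecture with a fused multiply-add does it in one instruction.
        return "madd  r3, r0, r1, r2\n";
    }
    // A plainer ISA needs two instructions and an extra temporary register.
    return "mul   r4, r0, r1\n"
           "add   r3, r4, r2\n";
}

int main() {
    std::printf("target A:\n%s\ntarget B:\n%s",
                lower_mul_add(Target::HasFusedMAdd).c_str(),
                lower_mul_add(Target::PlainMulAdd).c_str());
}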
 
Azih said:
is completely unnecessary and is a massive undertaking that will be hard to use and be another thing that devs will have to grapple with initially.
So if I'm reading what you are saying right - you would rather have them force all developers to use low-level assembler instead of one of the graphics languages that just about everyone else is starting to use nowadays?

Why stop there? Why not also ditch the C/C++ compilers for the CPU - they are always unstable and buggy on new CPU architectures as well. Both MS and Sony should just give up on that, and let everyone work with stable and reliable assemblers for their new CPUs!

You know it's not like ATI's chip in Xenon is exactly just rehashing stuff either - it's using a completely new shader model which will need new compiler backends as well.
 
xexex said:
PS3's GPU will be the ultimate decider as far as what a PS3 can paint onto our screens, regardless of how many flops the CPU can crunch. :)

Heh, not really, if you gave a CPU enough power, you wouldn't need a GPU ;)
 
xexex said:
interesting. I always assumed PS3 would have an in-house GPU.


4th generation might be:

1. PS1 GPU
2. PS2 GS
3. GS2 (not used in products AFAIK)
4. PS3 GPU

or

1. PS1 GPU
2. PS2 GS
3. PSP GPU
4. PS3 GPU


this still does not rule out Nvidia making certain portions of the PS3 GPU, which will still be a Sony or STI or Sony-Toshiba designed GPU, and manufactured at STI fabs.

It does rule them out. If they had nVIDIA on board, they would have a BIG BIG help in terms of shading language support and shading compilers.
 
It does rule them out. If they had nVIDIA on board, they would have a BIG BIG help in terms of shading language support and shading compilers.
They could very well license Cg for their shading language, but that doesn't mean that NVidia will offer them support for writing compiler backends for an alien architecture.
 
Azih said:
I'm just saying that is completely unnecessary and is a massive undertaking that will be hard to use and be another thing that devs will have to grapple with initially.

No, Azih... people will probably be using something like GLSLang to write OpenGL shading code at the high level, and if they push the performance of the shading core as I think they will, the potential instruction count of the shading core will kind of push developers away from wanting to write all the shading code at the ASM level.

They probably already have a high-level shading language compiler, but what people like the one they are looking for will do is push the performance of the compiler's output to the level of hand-written ASM code for the graphics processor (or at least get reasonably close to it).
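If the high-level route really is something GLSLang-like, the application side would presumably look roughly like ordinary desktop OpenGL 2.0 shader compilation (the GL calls below are the standard desktop ones and purely illustrative; nobody outside Sony knows what the PS3 API will actually look like): you hand the driver high-level source and its compiler does the ASM-level work behind the scenes.

Code:
// Rough illustration of the "high level" path with ordinary desktop
// OpenGL 2.0 + GLSL. Nothing PS3-specific; it just shows the division of
// labour: the app hands over high-level source, the driver's compiler does
// the register allocation and ASM-level optimization.
// Assumes a current OpenGL 2.0 context already exists (created via GLUT/SDL/etc.).
#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>
#include <cstdio>

GLuint compile_fragment_shader(const char* source) {
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &source, nullptr);  // hand over high-level code
    glCompileShader(shader);                      // driver compiles to native ISA

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
        std::fprintf(stderr, "shader compile failed:\n%s\n", log);
    }
    return shader;
}

// Example GLSL source: modulate a texture sample by the vertex colour.
const char* kFragSrc =
    "uniform sampler2D diffuseMap;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(diffuseMap, gl_TexCoord[0].xy) * gl_Color;\n"
    "}\n";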
 
Phoenix said:
Some of the smartest moves they can make.


I agree. I like that Sony pushes the envelope with their technology, but it is a sign of smarts, not a sign of weakness, to embrace standards where it makes sense: OpenGL ES, MP3, MPEG-4 AVC, MPEG-4 AVC High Profile, etc...
 
In fact, designing a GPU is nothing special... Simplifying things a lot: you need a unit (shader) to process floats (pixels) and a place to put them (memory).
The Cell paradigm fits here without problems. The more distributed the computing model is, the better it will be in terms of performance for a GPU...
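Whether or not it is really that simple, the distributed intuition is easy to sketch (toy C++ below; this is not how Cell or the PS3 GPU actually partitions work, just the general idea): split the framebuffer into disjoint tiles and let independent workers shade their own tiles, with no shared state to fight over.

Code:
// Toy illustration of the "more distributed, the better" idea: shade a
// framebuffer by handing disjoint bands of rows to independent workers.
// Not how Cell or any PS3 GPU actually partitions work -- just the intuition.
#include <cstdio>
#include <thread>
#include <vector>

constexpr int kWidth = 256, kHeight = 256, kWorkers = 4;

// A trivial "pixel shader": a gradient based on screen position.
static float shade(int x, int y) {
    return (float(x) / kWidth) * (float(y) / kHeight);
}

int main() {
    std::vector<float> framebuffer(kWidth * kHeight, 0.0f);
    std::vector<std::thread> workers;

    // Each worker owns a horizontal band of rows: no sharing, no locking.
    for (int w = 0; w < kWorkers; ++w) {
        workers.emplace_back([&, w] {
            int y0 = w * kHeight / kWorkers;
            int y1 = (w + 1) * kHeight / kWorkers;
            for (int y = y0; y < y1; ++y)
                for (int x = 0; x < kWidth; ++x)
                    framebuffer[y * kWidth + x] = shade(x, y);
        });
    }
    for (auto& t : workers) t.join();

    std::printf("centre pixel = %f\n",
                framebuffer[(kHeight / 2) * kWidth + kWidth / 2]);
}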
 
Phoenix said:
Some of the smartest moves they can make.

Nintendo has been using an OpenGL-like API since the N64.

Sell your ....


edit

You know it's not like ATI's chip in Xenon is exactly just rehashing stuff either - it's using a completely new shader model which will need new compiler backends as well.

When was it announced that ATI's Xenon chip was using a new shader model?
Thanks for the tidbits, Faf.
 
I've been saying for years that Sony should have utilized the OpenGL API (going with DirectX might be, well, awkward?), so it's good to see that may be the end result with the PS3. Now to see if I can play around with a PS3 devkit somewhere along the line.
 
Panajev2001a said:
Hehe, not yet... I have never studied compiler optimization before... :( Well. I'll make my way into Sony one day... MAUAHAHHAHAHAHAH.

Pana, be a sellout and work for an investment bank. You want to be able to support your hobby AND feed yourself. :P
 
Millhouse said:
When was it announced that ATI's Xenon chip was using a new shader model?
It wasn't - but if you look at the leaked document, and the assumption everyone makes that it's an SM3.0+ part - that's obviously very different from existing ATI chipsets.

MrSingh said:
Pana, be a sellout and work for an investment bank. You want to be able to support your hobby AND feed yourself. :P
Bah, food is overrated anyway :)
 
Fafalada said:
It wasn't - but if you look at the leaked document, and the assumption everyone makes that it's an SM3.0+ part.

It's no Shader 4.0 though :( I really wish MS would hold off for a year and put DX10 in Xbox2. I was previously convinced that they would put DX10 in there, so I was a bit disappointed with this.
 
gofreak said:
It's no Shader 4.0 though :( I really wish MS would hold off for a year and put DX10 in Xbox2. I was previously convinced that they would put DX10 in there, so I was a bit disappointed with this.

DX10 might be about 2 years away.
 
It's no Shader 4.0 though :( I really wish MS would hold off for a year and put DX10 in Xbox2. I was previously convinced that they would put DX10 in there, so I was a bit disappointed with this.
Well, it's got at least some of the key features there, i.e. unification of shaders and certain things in regard to the GPU's memory I/O handling.

So to look at the most likely MIAs -
I'll go out on a limb here and argue that the addition of an integer ISA in SM4.0 is not a big deal - IMO that's something that truly comes into its own when you have full control over the memory flow in your units, and that doesn't seem to be happening anytime soon on the DX front.
As for topology processors and enhanced tessellation etc. - I don't know what the GPU will or won't have, but with the large amount of CPU power Xenon is supposed to have, the GPU doesn't exactly need that functionality.

To sum it up - unlike PC parts that sometimes get shafted mid-generation - the non-DX9 features in this GPU will actually get used. So I think that holding off another year just to get a few more checkboxes isn't really worth it, especially since their best chance is to be first to market this time.
 
MrSingh said:
Pana, be a sellout and work for an investment bank. You want to be able to support your hobby AND feed yourself. :P

No, no food is required when you get to step in front of Ken Kutaragi's office at least once a week in admiration.

I hear it rejuvenates you too.











KEEEEEEEEEEEEEENNNNNNNN ;).
 
MrSingh said:
Pana, be a sellout and work for an investment bank. You want to be able to support your hobby AND feed yourself. :P
I'm thinking about it; any guidelines? What posts do they offer engineers? Financial analyst?
 
If PS3 is gonna be a bitch to code for, the same way that devs said PS2 initially was, it will really help MS. MS will probably provide an excellent framework for devs; we all know that MS is all about DEVELOPERS DEVELOPERS DEVELOPERS!!

--------------------------------------------------------------------------------

No, it's all about USERBASE USERBASE USERBASE. And I'm not going to bet against Sony on that one just yet. If you buy it, they will come. The PS2 was supposed to be a cow to develop for, and that still got the developers.

Bah why does Sony have to reinvent the wheel

You're making a big assumption - that the current wheels are fine and deliver all that is needed from real-time, interactive 3D graphics. OpenGL was never really aimed at that, more at modelling. The PS2 has shown that Sony are not afraid to question the existing mantras, and I hope they do the same with the PS3.

Either they deliver in spades and change the way we think about video game graphics, or they fuck up royally and we all buy Xbox2s. We win either way.
 
I don't know if anyone has already pointed this out, but inside the newspaper that features the article about Cell there's actually another article about featured development "teams of the year". One of them is the Austin Sony/Toshiba/IBM team.

I don't know about IBM; they're okay, but they tend to solve problems with brute force and end up with solutions that are not as usable as they make them out to be. Anybody who's worked with (or tried to work with) DB2 knows what I'm talking about.

Sony Computer Entertainment, Toshiba, IBM
Cell processor


Team: More than 400 at peak. Key members included IBM Fellow James A. Kahle, Sony Computer Entertainment director Masakazu Suzuoki, Toshiba director Yoshio Masubuchi, and IBM director Kathy Papermaster, along with functional managers and technical leads Shigehiro Asano, Scott Clark, Michael Day, Sang Dhong, Sanjay Gupta, Paul Harvey, Hiroo Hayashi, Peter Hofstee, Charlie Johns, Atsushi Kameyama, John Keaty, Ted Maeurer, Mary Many, Lisa Maurice, Dac Pham, Robert Putney, David Shippy, Linda Van Grinsven, James Warnock, and Dieter Wendel.

Project Duration: March 2001 to present

Tools: IBM internal suite

Sites: Austin, Texas; Rochester, Minn.; Yorktown Heights, N.Y.; Raleigh, N.C.; Boeblingen, Germany; Burlington, Vt.

Biggest hurdle: Getting more than 400 engineers, working from sites around the world, to effectively build all of the design elements required for a brand new, highly customized chip.

Other excerpts from the article:

"As the Japanese developers arrived in Austin, it soon became apparent that, though they could read and write English, they struggled to speak English. The Sony and Toshiba management quickly hired an English teacher to visit the Design Center twice a week and teach conversational English. The IBM parenters also helped by ensuring all communication was done in two forms: written and verbal"

"Test hardware is now running in the lab and is going through extensive validation, while th eteam prepares for completion of the project"
 
It wasn't - but if you look at the leaked document, and the assumption everyone makes that it's an SM3.0+ part - that's obviously very different from existing ATI chipsets.

True, because existing ATI VPUs are SM2.0+ parts. I mean the ones that are on the market now, not counting the forthcoming R520.
 