
IGN Posts Up More Project Cafe Hardware Power Rumors

big_z

Member
antonz said:
How about people wake the fuck up and realize no system launching in 2012 or 2013 is going to be using 2012/2013 technology.

Even if they went with a 6990, it's a card that was developed in 2010. Oh my god, a 2013 console using 2010 tech, alert the presses!

didn't the 360 and ps3 use gpus that were only one generation behind what was available for pc at the time? even then they were custom made so they would support some of the newer effects.

nintendo is going way too far back in tech. they need to use higher end stuff and take a loss on hardware for the first year. sure they lose instant profits but they'll have a head start in userbase and won't get leapfrogged when the new xbox/ps4 arrive.

a shitty gimmick and last gen tech aren't going to solve things. they're only setting themselves up for the same problem the wii has but most likely with far less sales.
 

Mr_Brit

Banned
big_z said:
didn't the 360 and ps3 use gpus that were only one generation behind what was available for pc at the time? even then they were custom made so they would support some of the newer effects.

nintendo is going way too far back in tech. they need to use higher end stuff and take a loss on hardware for the first year. sure they lose instant profits but they'll have a head start in userbase and won't get leapfrogged when the new xbox/ps4 arrive.

a shitty gimmick and last gen tech aren't going to solve things. they're only setting themselves up for the same problem the wii has but most likely with far less sales.
The 360 used a GPU more advanced than anything at the time and the PS3 used a GPU as advanced as anything at the time (the PS3 was originally meant to launch in 2005).
 

wsippel

Banned
SneakyStephan said:
Fuck me, the 5770 is like 40 percent faster in games than a 4770, let go of that one arbitrary number that you seem to think equals and completely encompasses performance already.

Let me spell it out:
hd 4890 runs the crysis benchmark at 35-ish fps, draws about 190 W
hd 6850 runs the crysis benchmark at 45 fps, draws about 140 W

Also, if you increase the voltage given to your gpu to overclock, power consumption skyrockets relative to the performance gain you get from the slight extra clockspeed it allows.
Raw low level performance is a lot less arbitrary than random game benchmarks. ;)
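As a quick sanity check, here are the quoted ballpark figures turned into frames per watt (the fps and wattage numbers are the poster's own rough estimates, not measured benchmarks):

# Rough fps-per-watt comparison using the ballpark figures quoted above
# (the poster's own estimates, not official benchmark results).
cards = {
    "HD 4890": {"fps": 35, "watts": 190},
    "HD 6850": {"fps": 45, "watts": 140},
}

for name, d in cards.items():
    print(f"{name}: {d['fps'] / d['watts']:.3f} fps per watt")

# HD 4890: ~0.184 fps/W, HD 6850: ~0.321 fps/W -- roughly a 75% efficiency
# gain, even though the raw frame rates only differ by about 30%.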
 
Mr_Brit said:
The 360 used a GPU more advanced than anything at the time and the PS3 used a GPU as advanced as anything at the time (the PS3 was originally meant to launch in 2005).
This is nonsense and demonstrably untrue.
 

Kenka

Member
Mr_Brit said:
The 360 used a GPU more advanced than anything at the time and the PS3 used a GPU as advanced as anything at the time (the PS3 was originally meant to launch in 2005).

That is wrong. You're probably thinking of the PS2 era. The Graphics Synthesizer back then was still hot shit. But the PS3's GPU was outdated on Day 1.
 
wsippel said:
Raw low level performance is a lot less arbitrary than random game benchmarks. ;)

I couldn't give a damn about raw performance if it doesn't translate into the games I'm playing :p
Game benchmark results (synthetic benches and fuckup releases notwithstanding) are usually fairly even across the board.

A GTX 460 generally performs 20-40 percent better than a 4870, for example.

If I can get that extra performance at a lower power consumption then that is exactly what I'm looking for, now isn't it?

The 5870 and 6870, GTX 470/480 and GTX 570/580 fit the bill really well.
Ignore my mentioning the 4890 vs 6870 as they're on different manufacturing processes.
(lack of sleep)

If architecture optimisations didn't make a difference then why would they bother with them? Just pick a design that functions and then cram as many transistors on a die as you can, right?

If you can disprove the 6870 and GTX 480 results that people from Ars, AnandTech and Guru3D boasted about (or point out a flaw in the logic we were applying to them) then I'm more than willing to swallow my words and admit defeat, I'm not a petty man :p
 

Mr_Brit

Banned
elrechazao said:
This is nonsense and demonstrably untrue.
Kenka said:
That is wrong. You're probably thinking of the PS2 era. The Graphics Synthesizer back then was still hot shit. But the PS3's GPU was outdated on Day 1.

Tell me about a 2005 GPU which is more advanced than the Xenos then.
 
Mr_Brit said:
Tell me about a 2005 GPU which is more advanced than the Xenos then.

Depends what you mean by more advanced - are we talking performance or feature set?

You could argue that by virtue of its on-chip embedded 10MB of eDRAM there is no current GPU as advanced as the Xenos to this day, but clearly in terms of performance it was left for dust years ago by PC GPUs.

I had an X850 XTPE back in 2005, at the time of the 360's launch, that was more powerful than the Xenos.
 

jett

D-Member
I don't know if I would call this a generational leap. This is going to end up like an in-between-gens console similar to the Dreamcast, although the Wii2 is much less of an improvement than the DC was over the PS1.

Johnny2Bags said:
No witcher 2 cannot be ported to 360 in its current form...

CD Projekt has announced that console versions of W2 are coming. Controls were redesigned entirely in W2 to be more console-friendly, even....so yeah.
 

Minsc

Gold Member
Nirolak said:
Well, here are The Witcher 2's system specifications:


A tri-core semi-modern PowerPC clocked over 3.2 GHz should be comparable to Intel's first ever quadcore.

The graphics card would be equivalent via this rumor.

A 4850 (going from the impressions of the preview build) is about 50% slower than what is likely needed to run high/ultra. I wouldn't claim the recommended specs run even high settings so far, most impressions put it more towards a combination of medium/high, with a generation higher of cards (5850) needed to run high smoothly.
 
When you guys are mentioning RAM, are you all assuming it will be unified RAM like the 360, or more like the PS3 with set amounts for system and GPU?...

The way I see it, 1GB of system RAM with 512MB of dedicated GPU RAM would be more than enough for the sort of cost/performance they seem to be aiming for.

Although, if they want really nice 1080p visuals, 1GB of VRAM would be far better...as that would allow for plenty of anti-aliasing, which would have to be sacrificed with only 512MB.

Of course it is arguable, as to whether you need lots of AA @ 1080p anyway.....
 

Durante

Member
I just have to comment on the argument being made earlier that theoretical FLOPs are the only "sane" way to compare GPU architectures. And my comment is that it couldn't be more wrong. It's trivial to build a chip that gets 5x the FLOPs/Watt of any current GPU, but only reaches that throughput for, say, dense matrix/matrix multiplication.

It's ridiculous to try and reduce GPU performance to a single theoretical number.
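To put rough numbers on why, here's how those paper figures are derived, using spec-sheet numbers for two 2008-era cards that traded blows in actual games despite very different theoretical FLOPs (a sketch, not a benchmark):

# Theoretical peak = shader ALUs x ops per ALU per clock x clock (GHz).
def peak_gflops(alus, ops_per_clock, clock_ghz):
    return alus * ops_per_clock * clock_ghz

print("HD 4870:", peak_gflops(800, 2, 0.750), "GFLOPS")  # ~1200 on paper
print("GTX 260:", peak_gflops(192, 2, 1.242), "GFLOPS")  # ~477 (MAD only)

# On paper the 4870 has roughly 2.5x the throughput, yet game benchmarks of
# the day put the two cards in the same ballpark -- peak FLOPs only tell you
# what the ALUs could do if nothing else in the pipeline ever stalled.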
 

Door2Dawn

Banned
Meh. I don't see Nintendo going with such a beefy console. They'll launch a system that's just slightly above 360 level to get ports and that's it.

Nintendo isn't the company that pushes technology; those days are over.
 

Wazzim

Banned
Door2Dawn said:
Meh. I don't see Nintendo going with such a beefy console. They'll launch a system that's just slightly above 360 level to get ports and that's it.

Nintendo isn't the company that pushes technology; those days are over.
Having a 3-core processor and a 4850 isn't really pushing technology, you know. I think this is pretty plausible.
 
caligula13 said:
unbelievable! nintendo using "old" technology for "new" system. someone call the press!

That's a terrible idea. You only have to see how they got suckered into hyping up an overclocked Gamecube in a shiny new plastic white case, and went on to make billions selling crap hardware to the gullible non-gamers who thought looking like an asshole waving a funny little pointy thing was cool.
 

Nirolak

Mrgrgr
Minsc said:
A 4850 (going from the impressions of the preview build) is about 50% slower than what is likely needed to run high/ultra. I wouldn't claim the recommended specs run even high settings so far, most impressions put it more towards a combination of medium/high, with a generation higher of cards (5850) needed to run high smoothly.
Interesting.

Hmm, seems I should be more skeptical of RPS's assessments.

Thanks for the info.
 

neoanarch

Member
DeFiBkIlLeR said:
That's a terrible idea. You only have to see how they got suckered into hyping up an overclocked Gamecube in a shiny new plastic white case, and went on to make billions selling crap hardware to the gullible non-gamers who thought looking like an asshole waving a funny little pointy thing was cool.


caligula13 said:
unbelievable! nintendo using "old" technology for "new" system. someone call the press!


juniors.



Is anyone really, really expecting MS and Sony to price their successors at $500-$700 to get another 10x jump on current gen machines? I think 4-6x is realistic for $450 if the Wii2 is 2-3x in a year for $350-400.
 

Chaplain

Member
[image: DDR3 RAM price listing]


I imagine buying this RAM in millions would reduce the price significantly. So why would Nintendo only use 1 Gig of RAM when RAM is so cheap?
 

Minsc

Gold Member
Nirolak said:
Interesting.

Hmm, seems I should be more skeptical of RPS's assessments.

Thanks for the info.

We'll know for sure in two more days, but there have been two separate accounts of the preview build running on a 4870 (slightly faster than a 4850) with a dual core CPU, and it not being able to do high (medium at 720p or low at 1080p). Granted that's a dual core CPU, and most impressions of faster cards are with quad core CPUs, so it gets a little murky to guess, but in one of the previews the person left their dual core CPU alone, upgraded the 4870 to a GTX 460 and was able to run high well, or ultra with some choppiness, suggesting the dual core wasn't a total handicap.
 

mrklaw

MrArseFace
Comparing to a PC is nonsensical. If a PC with that spec could run The Witcher 2 in 1080p at max settings (no AA), a console with the same specs would shit all over it purely from the efficiency of a focused architecture.

Most current PC games are held back by current consoles - you get HD and AA but poly counts are similar.

Big jumps in console power will benefit PCs.

I have three main worries about Café:
- Nintendo won't leverage the tech properly. The move to shader-based GPUs puts PS3 and 360 devs at an advantage as they simply have more experience with it.

- 3rd parties will ignore it until PS4/720 come along. There might be some token HD ports but I can't see anyone leading new development on it until costs can be amortised across more than one console.

- The odd screen concept might suck up power and reduce what you get on the screen. A bit like how 3DS games would have more detail if they were 2D, with the extra horsepower going on creating the 3D. If devs have to set aside enough power to stream 4x800x500 screens then you are almost drawing two 1080p screens' worth of pixels (quick sum below).
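To put rough numbers on that last worry (assuming the ~800x500 controller-screen resolution above, which is itself just a guess):

# Pixel budget: main 1080p image plus four streamed controller screens
# at the ~800x500 resolution assumed above.
main_screen = 1920 * 1080        # ~2.07 million pixels
controllers = 4 * (800 * 500)    # ~1.60 million pixels
total = main_screen + controllers

print(total / main_screen)       # ~1.77 -> close to two 1080p frames' worth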
 

Jocchan

Ὁ μεμβερος -ου
Game Analyst said:
http://www.netaffilia.com/images/2011/05/13/34242721.jpg

I imagine buying this RAM in millions would reduce the price significantly. So why would Nintendo only use 1 Gig of RAM when RAM is so cheap?
Because they won't be using DDR3. Unless you weren't serious, and "RAM is cheap" is the new "this is how 3DS games will look like" video.
 

Wazzim

Banned
Game Analyst said:
[image: DDR3 RAM price listing]


I imagine buying this RAM in millions would reduce the price significantly. So why would Nintendo only use 1 Gig of RAM when RAM is so cheap?
*sigh*

Not this again. Console RAM is not the same as the sticks you put in your PC.
 

mrklaw

MrArseFace
In addition, I'm not sure current PC GPU tech is suited to fixed-resolution consoles. High-end GPUs have lots of fillrate needed for high-res monitors and multiple screens. What consoles need is more power per pixel, tuned to 1080p displays (rough numbers below). And ideally embedded framebuffers and AA support that doesn't eat RAM.

Take the Xenos model, add MLAA hardware and apply it to the most recent GPU tech that is affordable.
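Roughly what "power per pixel" means at a fixed 1080p60 output, sketched with the 4850's published ROP count and core clock (ballpark only):

# How much raw fillrate headroom a 4850-class GPU has per displayed pixel
# when the output resolution is fixed at 1080p at 60fps.
rops = 16
core_clock_mhz = 625
fillrate_mpix = rops * core_clock_mhz      # ~10,000 Mpixels/s peak

output_mpix = 1920 * 1080 * 60 / 1e6       # ~124 Mpixels/s actually displayed

print(fillrate_mpix / output_mpix)         # ~80x budget per displayed pixel
# i.e. lots of headroom for overdraw, AA and post-processing on each pixel,
# which is the kind of "power per pixel" a fixed 1080p target lets you spend.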
 
Here are the two most stupid notions in this thread:


A) Nintendo going with a slightly/significantly more capable version of the 360/PS3 = a macho, beefy next gen console

B) Nintendo going with a slightly/significantly more capable version of the 360/PS3 = a weak, one-generation-behind console compared to the next 360/PS4


Both are fucking stupid. This will be a system that will comfortably handle ports of next generation systems, albeit scaled down in effects and geometry to accommodate the lesser specs. What it will for sure not be is a repeat of the Wii situation, where porting a game to the Wii essentially meant creating a different game due to technology constraints.
 

KKRT00

Member
This rumor doesn't make sense; earlier ones mentioned that Cafe will be a little more powerful than the current generation, but a 4850 indicates that it could be as much as 4 times more powerful than the 360/PS3.

BTW, I don't get Nintendo. Why do they even consider R700, and especially the 4850 [an overclocked 4770 would be smarter]? R800 is made on 40nm instead of 55nm like R700, has a much more efficient TDP, is still in production and has a better instruction set [DX11, so Shader Model 5.0, OpenCL 1.1 and OpenGL 4.1 [3.1 in R700]] - the 5770 is a much better solution.

The 5770 not only would be faster than the 4850 [more ways to optimize and write code], but wouldn't be outdated next generation like the Wii is now.
 
KKRT00 said:
This rumor doesn't make sense; earlier ones mentioned that Cafe will be a little more powerful than the current generation, but a 4850 indicates that it could be as much as 4 times more powerful than the 360/PS3.

That's how system power rumors work: you're going to hear everything, from a system not really being much more powerful than what's currently on the market to one blowing everything out of the water. We saw an example of that recently with all the crazy rumors circulating around the PSP2's power.
 

DieH@rd

Banned
Game Analyst said:
[image: DDR3 RAM price listing]


I imagine buying this RAM in millions would reduce the price significantly. So why would Nintendo only use 1 Gig of RAM when RAM is so cheap?

I take your picture and raise you another one:
[image: DRAM bandwidth comparison chart]


PC ram is cheap cos it has crappy speeds.

The X360 uses 512MB of GDDR3 as a unified pool; the PS3 has 256MB of XDR for main RAM and 256MB of GDDR3 for video memory.
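For some context on those speeds, peak bandwidth is just bus width times transfer rate; a rough sketch with standard spec-sheet figures (give or take the exact clocks):

# Peak bandwidth = bus width in bytes x transfer rate.
def bandwidth_gbs(bus_bits, mega_transfers_per_s):
    return (bus_bits / 8) * mega_transfers_per_s / 1000

print("DDR3-1333 DIMM, 64-bit channel:", bandwidth_gbs(64, 1333), "GB/s")       # ~10.7
print("X360 GDDR3, 128-bit @ 1400 MT/s:", bandwidth_gbs(128, 1400), "GB/s")     # 22.4
print("PS3 XDR, 64-bit @ 3200 MT/s:", bandwidth_gbs(64, 3200), "GB/s")          # 25.6
print("HD 4870 GDDR5, 256-bit @ 3600 MT/s:", bandwidth_gbs(256, 3600), "GB/s")  # ~115

# Cheap desktop DDR3 is an order of magnitude slower than what a GPU
# (or a unified console memory pool) actually needs.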
 
KKRT00 said:
This rumor doesn't make sense; earlier ones mentioned that Cafe will be a little more powerful than the current generation, but a 4850 indicates that it could be as much as 4 times more powerful than the 360/PS3.

BTW, I don't get Nintendo. Why do they even consider R700, and especially the 4850 [an overclocked 4770 would be smarter]? R800 is made on 40nm instead of 55nm like R700, has a much more efficient TDP, is still in production and has a better instruction set [DX11, so Shader Model 5.0, OpenCL 1.1 and OpenGL 4.1 [3.1 in R700]] - the 5770 is a much better solution.

The 5770 not only would be faster than the 4850 [more ways to optimize and write code], but wouldn't be outdated next generation like the Wii is now.

This is a customised design. Why are you assuming later design improvements won't make it in?
 

chaosblade

Unconfirmed Member
Jocchan said:
Because they won't be using DDR3. Unless you weren't serious, and "RAM is cheap" is the new "this is how 3DS games will look like" video.
It may as well be; I think I've seen people post that in every Cafe thread so far.

Nintendo does like fast RAM, even if they tend to gimp the total amount. I don't think 1GB 512-bit GDDR5 unified system memory is out of the question. I'd be surprised to see them use more than that because this is Nintendo we are talking about, and less would essentially defeat the purpose of the rest of the hardware. And given the 360 comparisons, a small pool (16MB?) of eDRAM on the GPU might be thrown in too. Not sure what that adds to the price, but I know it's not cheap.

(Edit: And I guess Nintendo could go with XDR2, but isn't that a lot more expensive than GDDR5?)

KKRT00 said:
BTW, I don't get Nintendo. Why do they even consider R700, and especially the 4850 [an overclocked 4770 would be smarter]? R800 is made on 40nm instead of 55nm like R700, has a much more efficient TDP, is still in production and has a better instruction set [DX11, so Shader Model 5.0, OpenCL 1.1 and OpenGL 4.1 [3.1 in R700]] - the 5770 is a much better solution.

The 5770 not only would be faster than the 4850 [more ways to optimize and write code], but wouldn't be outdated next generation like the Wii is now.
You have to keep in mind that Nintendo isn't going to actually use 4850 chips; they are going to get a customized part based on R700 that is roughly equal to a 4850. It will probably be 40nm, and it could very well support OpenGL 4.1.

It's not unheard of; the 360 GPU had features that weren't available until the generation after the card it was based on.
 

Minsc

Gold Member
DieH@rd said:
PC ram is cheap cos it has crappy speeds.

But it doesn't have any significant effect on framerate, so why pay for the faster stuff? Serious question, as many people smarter than me told me to stay away from buying more expensive, faster RAM, since it wouldn't change my gaming framerates, and the few non-memory-vendor-sponsored benchmarks I found confirmed as much.
 

Mr_Brit

Banned
Minsc said:
A 4850 (going from the impressions of the preview build) is about 50% slower than what is likely needed to run high/ultra. I wouldn't claim the recommended specs run even high settings so far, most impressions put it more towards a combination of medium/high, with a generation higher of cards (5850) needed to run high smoothly.
RPS used a GTX 260 and got 20-30 FPS at 1080p on ultra. With optimisation, and assuming Nintendo don't cripple it with only 1GB of RAM, The Witcher 2 could easily run on this thing at a locked 30 FPS at 1080p ultra settings.

Minsc said:
But it doesn't have any significant effect on framerate, so why pay for the faster stuff? Serious question, as many people smarter than me told me to stay away from buying more expensive, faster RAM, since it wouldn't change my gaming framerates, and the few non-memory-vendor-sponsored benchmarks I found confirmed as much.
People are assuming Nintendo are going for a unified memory setup, so using DDR3 would hold the system back, unlike on PCs where even DDR2 is more than good enough in the majority of situations.
 

chaosblade

Unconfirmed Member
Minsc said:
But it doesn't have any significant effect on framerate, so why pay for the faster stuff? Serious question, as many people smarter than me told me to stay away from buying more expensive, faster RAM, since it wouldn't change my gaming framerates, and the few non-memory-vendor-sponsored benchmarks I found confirmed as much.
Not quite the same thing in a console though, especially in a case where the memory is unified and the GPU and CPU share the RAM (and I'd guess that the Cafe will have unified RAM; it just seems like a better setup for a console).

What makes a difference in your framerate is your GPU's VRAM; that's why it uses fast GDDR memory.
 

Durante

Member
Minsc said:
But it doesn't have any significant effect on framerate, so why pay for the faster stuff? Serious question, as many people smarter than me told me to stay away from buying more expensive, faster RAM, since it wouldn't change my gaming framerates, and the few non-memory-vendor-sponsored benchmarks I found confirmed as much.
The differences are that:

- on PC, buses are wider and more chips are used in parallel, so the individual chip's bandwidth doesn't have to be particularly high. On consoles, board complexity and the number of components are kept low, so you don't have the luxury of going parallel.
- consoles mostly use unified memory pools, which need to provide enough bandwidth for both the CPU and the GPU. High-end PC GPUs alone have memory bandwidths far in excess of what you'd ever see in a contemporary console (quick example below).
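A rough sketch of the "wider bus, more chips" point - the 384-bit numbers are GTX 580-class, while the 128-bit case is just a hypothetical console-style bus:

# GDDR5 chips have 32-bit interfaces; total bandwidth scales with how many
# you run in parallel on the board.
CHIP_BUS_BITS = 32
TRANSFER_MTS = 4008   # GTX 580-class effective data rate

def total_bw_gbs(num_chips):
    return num_chips * (CHIP_BUS_BITS / 8) * TRANSFER_MTS / 1000

print("12 chips (384-bit PC card):", total_bw_gbs(12), "GB/s")           # ~192
print(" 4 chips (128-bit console-style bus):", total_bw_gbs(4), "GB/s")  # ~64

# A console keeps chip count, routing and cost down, so the whole unified
# pool ends up far narrower than a high-end PC card's dedicated VRAM.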
 

legacyzero

Banned
Log4Girlz said:
This machine screams for 2 GB of RAM. Oh how it won't happen.
This would almost guarantee a buy from me. I would love to have a console that doesn't choke up.

Also, I hate how slow and chuggy the Xbox 360 dashboard runs while I'm playing a game (or any time, really). I'm sure more memory would help with something like this.

Impress me Nintendo! It's nice to know that they're aiming at the hardcore again, though.
 

Minsc

Gold Member
Mr_Brit said:
RPS used a GTX 260 and got 20-30 FPS at 1080p on ultra. With optimisation, and assuming Nintendo don't cripple it with only 1GB of RAM, The Witcher 2 could easily run on this thing at a locked 30 FPS at 1080p ultra settings.

It's the odd one out so far. I dunno, I guess you can't discount it, but there's a half dozen other examples that contradict it. Plus a GTX 260 is usually like 25%-30% faster than a 4850.

There's a good chance Ultra's framerate could be worse in the final version too, as it wasn't working properly in the beta build, and some options might not have been active at all, as the ultra vs high screenshots were very similar.

I'd say there's a 95% chance a GTX 260 will not run ultra at 60fps at 1080p and a 75% chance it won't at 30fps in the build that comes out in two days. Other reviews have commented that a quad core CPU and GTX 460 (20% faster than a 260) doesn't run Ultra perfectly.

Edit: It is likely a GTX 260 would run high acceptably with a good CPU, 20-30 fps. We have a recent reviewer using a GTS 450 and getting 15-20fps on high, and a GTX 260 is a bit faster, so ~30fps on a 260 on high seems reasonable, but ultra remains to be seen.
 
Annoying Old Party Man said:
Here are the two most stupid notions in this thread:


A) Nintendo going with a slightly/significantly more capable version of the 360/PS3 = a macho, beefy next gen console

B) Nintendo going with a slightly/significantly more capable version of the 360/PS3 = a weak, one-generation-behind console compared to the next 360/PS4


Both are fucking stupid. This will be a system that will comfortably handle ports of next generation systems, albeit scaled down in effects and geometry to accommodate the lesser specs. What it will for sure not be is a repeat of the Wii situation, where porting a game to the Wii essentially meant creating a different game due to technology constraints.

Next gen as in PS4 and 720, I'm assuming, right?
 

Luckyman

Banned
TekkenMaster said:
Looks like Cafe will be well set to compete graphically with PS4/720, even if they both release in 2014.

You do know this GPU will be four years old in 2012. If they put in just 1GB, it's really wishful to think this won't be left out when development moves to next gen.
 

Threi

notag
Game Analyst said:
[image: DDR3 RAM price listing]


I imagine buying this RAM in millions would reduce the price significantly. So why would Nintendo only use 1 Gig of RAM when RAM is so cheap?
Not to mention how much money Nintendo would save from all of those mail-in rebates!

:D
 