
6th gen hardware wars: GameCube vs Xbox OG vs PS2 vs Dreamcast

The Xbox's and Wii's GPUs are similar in what they can push out as well, but the Xbox has serious limitations in bandwidth and CPU-to-GPU communication. The latter can impact polygon-pushing performance, physics, etc. The GameCube had an FSB just as fast as its GPU, so it didn't share the Xbox's bottleneck.

Xbox memory is unified; what do you need to pass that requires huge BW?

CPU-to-GPU communication matters for polygons mainly when the CPU is the one feeding the geometry. If you use vertex buffer objects (that's what they're called in OpenGL), the GPU handles the geometry directly; the CPU doesn't have to send the geometry, it only tells the GPU what to do with it, and you don't need much BW for that. We can argue about which CPU is better, but the reality is that they don't have the same tasks.

Physics calculated on the CPU is similar: you compute an object's position and pass that variable to the vertex shader. The CPU handles the physics, yes, but the result is easy to pass and doesn't require much BW.
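To illustrate that point in modern desktop terms (plain OpenGL here, not the actual Xbox or GC APIs, so treat it purely as an analogy): the mesh is uploaded to GPU-visible memory once, and per frame the CPU only submits a small transform, so the per-frame CPU-to-GPU traffic is tiny compared to the geometry itself.

[CODE]
// Minimal OpenGL sketch (analogy only; console APIs differ).
// The geometry crosses the bus once; per frame only a 4x4 matrix does.
#include <GL/glew.h>

GLuint uploadMeshOnce(const float* vertices, GLsizeiptr bytes)
{
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, bytes, vertices, GL_STATIC_DRAW); // one-time upload
    return vbo;
}

void drawPerFrame(GLuint vbo, GLuint program, GLint mvpLocation,
                  const float* mvp, GLsizei vertexCount)
{
    glUseProgram(program);
    glUniformMatrix4fv(mvpLocation, 1, GL_FALSE, mvp); // the CPU-side physics result: 64 bytes
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), nullptr);
    glDrawArrays(GL_TRIANGLES, 0, vertexCount); // no vertex data travels over the bus here
}
[/CODE]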


The Xbox is the more powerful console over the Cube, but the GameCube had enough advantages that it could produce superior results in alpha transparencies, polygons and water/reflection tricks in a lot of games built from the ground up for the Cube.


The Xbox has 4 pixel pipes with 2 texture units each, while the GC has 4 pixel pipes but only 1 texture unit each.

On the GC the TEV handles texture combining; on the Xbox it's programmable, so it can do water more easily and with more complexity. You can do any given water effect on GC, PS2 or Xbox, but when it comes to shader processing the Xbox is better, unless you require an insane amount of multipass for the water (like the rain effect in MGS2's tanker chapter) or for particles, in which case the PS2 will be better.
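As a rough analogy for what a TEV-style combiner does (this is desktop OpenGL's fixed-function texture_env_combine path, not the GameCube's actual GX API): you configure each texture stage to pick from preset operations, whereas an Xbox-style pixel shader lets you write arbitrary per-pixel math instead.

[CODE]
// Fixed-function combiner analogy (OpenGL texture_env_combine, NOT GX):
// each stage selects from preset combine modes rather than running a program.
#include <GL/glew.h>

void setupTwoStageCombine(GLuint baseTex, GLuint lightmapTex)
{
    // Stage 0: base texture modulated by the interpolated vertex colour.
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, baseTex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_MODULATE);

    // Stage 1: add a lightmap/detail texture on top of the previous result.
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, lightmapTex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_ADD);

    // A programmable pixel shader would replace these preset stages with
    // arbitrary per-pixel code (e.g. ripple math for water on Xbox).
}
[/CODE]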

The Wii just takes the Cube's advantages, ditches its RAM-size and clock-speed weaknesses, and calls it a day.

Yes, but its memory is 64-bit compared to the Xbox's 128-bit; it's 3.8 GB/s vs 6.4 GB/s, so the Xbox still has a huge advantage.

And it still maintains the GC's embedded 3 MB: the texture cache (1 MB) and the frame/Z buffer (2 MB), plus the same pixel pipes and texture units. I don't know if Wii games take textures from RAM or if that path is only there for GC compatibility, but if it works like the GC then it can only take textures from this cache. Maybe they can update textures more often in that case, but I don't see that as an 'advantage' over the Xbox.
 

Alexios

Cores, shaders and BIOS oh my!
Battalion Wars looks pretty nice. Pretty meh water (much improved for the Wii sequel, alongside everything else), but Battlefield-scale maps with a decent number of units and assets that look nice whether you're flying high in a jet or taking things up close and personal with a tank or infantry.
 
Refresh my memory: was it only the 360 that had the gamma problem (black crush), or did the OG have it as well? I'm assuming it got fixed eventually?
The OG Xbox didn't have it, or at least I never noticed it / never heard of it.

The Xbox has 4 pixel pipes with 2 texture units each, while the GC has 4 pixel pipes but only 1 texture unit each. [...]

Yes, but its memory is 64-bit compared to the Xbox's 128-bit; it's 3.8 GB/s vs 6.4 GB/s, so the Xbox still has a huge advantage.

And it still maintains the GC's embedded 3 MB: the texture cache (1 MB) and the frame/Z buffer (2 MB), plus the same pixel pipes and texture units. I don't know if Wii games take textures from RAM or if that path is only there for GC compatibility, but if it works like the GC then it can only take textures from this cache. Maybe they can update textures more often in that case, but I don't see that as an 'advantage' over the Xbox.
Your 3.8 GB/s figure for the Wii's main bandwidth is a complete fabrication. I could give you 10 years and you wouldn't be able to prove that nonsense.

The Wii uses GDDR3, not the DDR found in the Xbox. We don't actually know exactly how fast the Wii's 64 MB pool is, but if we look at the Xbox 360's 128-bit pool of GDDR3, it's 22.4 GB/s.

The best possible scenario for the Wii would then be 11.2 GB/s on a 64-bit bus, but that's far from likely. The Wii almost certainly uses lower-speed chips, but my extremely conservative estimate for it would be 6.4 GB/s, putting it on par with the Xbox at minimum. But there is a strong possibility it is faster than that. There are no concrete figures either way.

Your description of the pixel pipelines is also somewhat misleading. The Xbox can only do 4 texture layers per pass, but given that it has 2 units, that means 8 textures per pass.

The GameCube and Wii can both do 2 texture layers per texture unit, also adding up to 8 textures per pass.

Please research thoroughly and don't take a defense-force attitude, or I won't bother further.
 

pawel86ck

Banned
That's what I'm getting at: I don't think Nintendo was being conservative with the tech demo. Unlike past Zelda tech demos, this wasn't representative of the visuals the Wii U could produce outside heavily scripted scenarios.

Are there any PS3 games with reflections as good-looking and clean? Genuinely curious.

I agree with most of your post except this premise.
Splinter Cell was developed around Xbox strengths, as that was the lead platform; both the PS2 and GC could produce better results (than the ports) if the game had been developed from the ground up around their hardware. The Xbox would still win, obviously.

Same with RE4: I think the PS2 could have done better if the game had been developed for it from the ground up around PS2 hardware instead of ported, or MGS2 on Xbox... See my point?
I don't know; the problem is the Splinter Cell games were built with specific Xbox hardware in mind from the start: shaders, shadow buffers, HDD + more RAM (because of that, the Xbox version had bigger and more detailed levels). If they had built the PS2 or GC version from the ground up I think they would have been able to build a better-looking game for sure, but they would have had to make a different game at the same time (remove dynamic lighting entirely and push more polygons instead, like in the Metal Gear Solid games, choose a different art direction and fixed camera, and maybe mask imperfections with filters).

[Splinter Cell comparison screenshots]

When it comes to Splinter Cell 3, my understanding is the PS2 port was built specifically for the PS2.

[Splinter Cell 3 screenshots (PCSX2 and Dolphin captures)]


IMO if people only played the PS2 or GC version and compared it to other GC and PS2 games, they would think Splinter Cell on PS2/GC looks good from that perspective. Very few games on PS2 and GC used dynamic shadows like Splinter Cell 1, or water and geo-texturing like SC3, so that was impressive.

PS2 screenshots
[Silent Hill 3 PS2 screenshots]

PC screenshots
[Silent Hill 3 PC screenshots]


Silent Hill 3 had dynamic shadows and was built with the PS2 in mind, and yet it looks more like Splinter Cell 1 on PS2 or GC.

[Luigi's Mansion screenshots]


Luigi's Mansion was built with GC hardware in mind. Of course it's a game for kids, but it has dynamic shadows; I don't know if there are other GC games with dynamic shadows.

Metal Gear Solid: The Twin Snakes, however, should be more similar to Splinter Cell.



[Metal Gear Solid: The Twin Snakes screenshots]


Why can't I see 20-30 million polygons here, ultra-sharp textures, and shader effects that could rival Xbox effects? This game was built with the GC in mind, so why can't it look much better than the GC version of Splinter Cell 1, not to mention match the Xbox version? Should we blame the developer for that?

I really think the Splinter Cell games were ported to PS2 and GC with good results, and the only reason these games looked so much worse than the Xbox version is that MS's console was more capable than people are willing to admit. Like polybius80 said, MS built an extremely capable console, but at the same time they lost an insane amount of money because of it. Funnily enough, I don't even know if MS ever made money with the Xbox brand, or why they keep it around, but I can thank them for it, because I have very good memories thanks to Xbox consoles.


Oh no, you see, Nintendo fanboys compare stuff, but only by their own standards.

The moment you do the same, it suddenly doesn't count!
It's funny: when RE4 is mentioned, Nintendo fans say it's proof the GC was superior to the PS2, but when games like Splinter Cell on Xbox are mentioned, suddenly we shouldn't talk about multiplatform games :messenger_tears_of_joy:.
 

SonGoku

Member
PS2 or GC version from the ground up I think they would have been able to build a better-looking game for sure, but they would have had to make a different game at the same time
I'm not saying they would look as good as the Xbox versions, but they would certainly look better than the ports, that's all.
Isn't MGS2 60fps, btw?
 

pawel86ck

Banned
I'm not saying they would look as good as the Xbox versions, but they would certainly look better than the ports, that's all.
Isn't MGS2 60fps, btw?
MGS2 is a 60fps game, but on PS2 you can only play it in 480i instead of 480p.

I've been playing PS2 games lately and 480p makes a huge difference, even on an emulator. Guys, is there some easy method to run all PS2 games at 480p?

480i
[PCSX2 screenshot, 480i]

480p
[PCSX2 screenshot, 480p]

And some more screenshots

[PCSX2 screenshots]


Tekken 4 on PS2 was really impressive back in 2001; the water especially looked very nice.
 
Your 3.8 GB/s figure for the Wii's main bandwidth is a complete fabrication. I could give you 10 years and you wouldn't be able to prove that nonsense.

The Wii uses GDDR3, not the DDR found in the Xbox. We don't actually know exactly how fast the Wii's 64 MB pool is, but if we look at the Xbox 360's 128-bit pool of GDDR3, it's 22.4 GB/s.

The best possible scenario for the Wii would then be 11.2 GB/s on a 64-bit bus, but that's far from likely. The Wii almost certainly uses lower-speed chips, but my extremely conservative estimate for it would be 6.4 GB/s, putting it on par with the Xbox at minimum. But there is a strong possibility it is faster than that. There are no concrete figures either way.

Wii:
243 MHz (memory speed) * 2 (double data rate) * 64 (memory bus width) / 8 (bits to bytes) = 3888 MB/s

Xbox 360:
700 MHz (memory speed) * 2 (double data rate) * 128 (memory bus width) / 8 (bits to bytes) = 22400 MB/s

I got the Wii speed from here.

If you think something is wrong, give numbers and then we can do the math.
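Just to spell out the formula being used here: peak theoretical bandwidth = base clock × transfers per clock × bus width / 8. A quick sketch of that arithmetic, plugging in the figures quoted in this thread plus the original Xbox's well-known 200 MHz DDR / 128-bit setup (the Wii value is the disputed one, not an official spec):

[CODE]
#include <cstdio>

// Peak theoretical bandwidth in MB/s:
// base clock (MHz) * transfers per clock * bus width (bits) / 8 bits per byte.
static double peakBandwidthMBs(double clockMHz, double transfersPerClock, double busBits)
{
    return clockMHz * transfersPerClock * busBits / 8.0;
}

int main()
{
    std::printf("Wii MEM2 (disputed figure): %.0f MB/s\n", peakBandwidthMBs(243.0, 2.0, 64.0));  // 3888
    std::printf("Xbox main memory          : %.0f MB/s\n", peakBandwidthMBs(200.0, 2.0, 128.0)); // 6400
    std::printf("Xbox 360 GDDR3            : %.0f MB/s\n", peakBandwidthMBs(700.0, 2.0, 128.0)); // 22400
    return 0;
}
[/CODE]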
 

Romulus

Member
It's funny: when RE4 is mentioned, Nintendo fans say it's proof the GC was superior to the PS2, but when games like Splinter Cell on Xbox are mentioned, suddenly we shouldn't talk about multiplatform games :messenger_tears_of_joy:.


:messenger_sunglasses:

Or any of the other hundreds of multiplatforms.

Not only that: the PS2 is possibly the most difficult platform to code for, yet the RE4 port is somehow treated as the poster child for all ports and an exact power metric for PS2 vs GameCube.
 
Wii:
243 MHz (memory speed) * 2 (double data rate) * 64 (memory bus width) / 8 (bits to bytes) = 3888 MB/s

If you think something is wrong, give numbers and then we can do the math.
243 MHz is the GPU speed '_'

And the front-side bus speed.
 

Doczu

Member
If you think something is wrong, give numbers and then we can do the math.

Man, I came here for console wars, not math classes 😐
So can we please get back to the correct way of classifying consoles?

If the GC was SSJ Trunks, then the Wii must be USSJ Trunks, right?
So is the Xbox base Cell, or Cell after absorbing Android 17?

Sorry guys, I just can't understand measures other than DBZ 🤷🏻‍♂️
 
Man, I came here for console wars, not math classes 😐 ... If the GC was SSJ Trunks, then the Wii must be USSJ Trunks, right? So is the Xbox base Cell, or Cell after absorbing Android 17?
Ps2 - ss goku
Cube - super vegeta
Xbox - perfect cell missing his arms
Wii - perfect cell
 

pawel86ck

Banned
Man, I came here for console wars, not math classes 😐 ... If the GC was SSJ Trunks, then the Wii must be USSJ Trunks, right? So is the Xbox base Cell, or Cell after absorbing Android 17?
War never changes and has no rules :messenger_winking:
 

They copied and pasted the GPU clock into their memory spec. I'm pretty sure they didn't make GDDR3 chips as slow as 3.8 GB/s!

But if they are correct and it's 486 MHz effective, that'd make it 7.9 GB/s, so they didn't even follow their own logic in the listing.

7.9 GB/s would actually make sense, though.
 
They copied and pasted the GPU clock into their memory spec. I'm pretty sure they didn't make GDDR3 chips as slow as 3.8 GB/s!

But if they are correct and it's 486 MHz effective, that'd make it 7.9 GB/s, so they didn't even follow their own logic in the listing.

7.9 GB/s would actually make sense, though.

No, the effective clock is the base clock multiplied by 2; you don't take the effective clock and multiply again.

It's not about slow chips, it's about the speeds they chose, and there are many reasons to keep speeds low, such as temperature.

They are not 'my numbers', by the way; despite Nintendo being secretive, these are simply Wii specs that anybody can find on Google.

GPU (Hollywood )
  • Operating speed of 243 MHz
  • 3 megabytes of embedded graphics memory
  • 24 megabytes of internal main memory
  • Internal main memory operates at 486 MHz
  • Maximum bandwidth between GPU and main memory: 3.9 gigabytes per second
  • The GPU of the Wii is identical to the GC's but it is on average 1.5X faster




I am not going to say that just because a page publishes specs they are right; there are many cases of wrong information on gaming sites, especially when it comes to specs. I don't blame you for not believing them, but if you don't agree with them you can provide sources to contradict them, or at least a good technical explanation of why they are wrong.
 
No, the effective clock is the base clock multiplied by 2; you don't take the effective clock and multiply again.

GPU (Hollywood)
  • Operating speed of 243 MHz
  • 3 megabytes of embedded graphics memory
  • 24 megabytes of internal main memory
  • Internal main memory operates at 486 MHz
  • Maximum bandwidth between GPU and main memory: 3.9 gigabytes per second
  • The GPU of the Wii is identical to the GC's but it is on average 1.5X faster
I see the problem.

The 1T-SRAM (24 MB) is actually the main memory on Wii, but a second pool, 64 MB of GDDR3 graphics memory, exists. The 1T-SRAM ran at 2.6 GB/s on the Cube... but it would make sense if it ran at 3.9 GB/s on Wii to match up with the GPU clocks. This is also why the FSB is 243 MHz.

In other words, this is the amount of bandwidth available to the CPU; the Xbox, by comparison, has a little over 1 GB/s for its CPU.

The GDDR3 graphics memory is not limited by the front-side bus; that's not how it works. Just as the Xbox GPU was not limited by its FSB (only its CPU was); FSBs and memory bandwidth just don't work that way.

And there is absolutely no way the GDDR3 in the Wii would be as slow as 3.9 GB/s on a 64-bit bus.
 

Oemenia

Banned
It's funny: when RE4 is mentioned, Nintendo fans say it's proof the GC was superior to the PS2, but when games like Splinter Cell on Xbox are mentioned, suddenly we shouldn't talk about multiplatform games :messenger_tears_of_joy:.
Like I said I'm just glad the internet was big back then. They only get brave with the 16-bit era because the manbabies have infested the internet with their BS.
 
This is also the explanation for when I bring up FSB speed limitations on the Xbox. The CPU makes draw calls for geometry, processes physics, runs AI scripts, etc. If the FSB is slower than the GPU clocks, you can hit bottlenecks.

Sure, there are workarounds in certain situations, but the CPU is never going to perform at 100% of its capabilities.

The Cube and Wii were excellently designed in this respect; no CPU bandwidth bottleneck.
 
As for what speed the Wii's GDDR3 runs at, as far as I know it's a complete unknown to this day. Again, a very conservative estimate is 6.4 GB/s.

Actually, I would like to see what kinds of GDDR3 chips were available for purchase in 2005 to see if we can find a potential minimum speed.
 

Oemenia

Banned
This is also the explanation for when I bring up FSB speed limitations on the Xbox. The CPU makes draw calls for geometry, processes physics, runs AI scripts, etc. If the FSB is slower than the GPU clocks, you can hit bottlenecks. [...]
If you stopped repeating the same BS, I would actually believe that you have some posters on ignore.
 
I see the problem.

The 1T-SRAM (24 MB) is actually the main memory on Wii, but a second pool, 64 MB of GDDR3 graphics memory, exists. The 1T-SRAM ran at 2.6 GB/s on the Cube... but it would make sense if it ran at 3.9 GB/s on Wii to match up with the GPU clocks. This is also why the FSB is 243 MHz. [...]

And there is absolutely no way the GDDR3 in the Wii would be as slow as 3.9 GB/s on a 64-bit bus.

Good, at least that's an explanation :). But embedded memory is usually faster than external memory; it doesn't make sense to have external memory faster than what is inside the chip (are the 24 MB for backward compatibility only?). The GC itself has 16 MB of extra memory that is very slow.

Let's see the other specs.

For the CPU specs:

CPU (Broadway)


  • Superscalar microprocessor with six execution units (floating-point unit, branching unit, system register unit, load/store unit, two integer units)
  • Operating speed of 729 MHz
  • Bus to main memory: 243 MHz, 64 bits (maximum bandwidth: 1.9 gigabytes/sec)
  • 32-kilobyte 8-way set-associative L1 data cache (can set up 16-kilobyte data scratch pad)
  • Onboard 256-kilobyte 2-way set-associative L2 integrated cache
  • Supports three L2 cache fetch modes: 32, 64, and 128-Byte
  • DMA unit (15-entry DMA request queue) used by 16-kilobyte data scratch pad
  • Write-gather buffer for writing graphics command lists to the graphics chip
We see the 243 MHz speed again, which multiplied directly by 64 bits gives that number; only the GPU uses 2x that speed.

curious...
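For what it's worth, the quoted figures are at least self-consistent if the CPU bus is counted at the base 243 MHz (single data rate) while only the GPU-to-main-memory path runs at the doubled 486 MHz:

[CODE]
CPU bus:  243 MHz * 64 bit / 8 = 1.944 GB/s   (the "1.9 GB/s" figure)
GPU path: 486 MHz * 64 bit / 8 = 3.888 GB/s   (the "3.9 GB/s" figure)
[/CODE]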
 

Vorg

Banned
Chozo is clearly an authority on console hardware. There's no arguing with him because he's always right.
 
The absolute slowest GDDR3 chips I found ran at 300 MHz. Calculated from
Good, at least that's an explanation :). But embedded memory is usually faster than external memory; it doesn't make sense to have external memory faster than what is inside the chip (are the 24 MB for backward compatibility only?). [...]

We see the 243 MHz speed again, which multiplied directly by 64 bits gives that number; only the GPU uses 2x that speed.

Curious...
The 1T-SRAM isn't embedded memory, it's off-die. Its main purpose was for BC and to keep latency low, yes.

Only the 3 MB of eDRAM is on-die.

Curiously, I found GDDR3 as slow as 300 MHz... which would put it at about 4.8 GB/s at an absolute minimum. Worst-case scenario.

However, between the 1T-SRAM and the GDDR3, even that lowball estimate would still give it more main memory bandwidth than the Xbox. Then the eDRAM also has its own speed.
 
The absolute slowest GDDR3 chips I found ran at 300 MHz. [...]

The 1T-SRAM isn't embedded memory, it's off-die. Its main purpose was for BC and to keep latency low, yes.

Only the 3 MB of eDRAM is on-die.

Yes, my mistake; it's in the package but not inside the die.




A better explanation from WiiBrew:

The GameCube has one 24MB bank of 1T SRAM that is used for all code and data, spread across two external chips; there is also a chip containing 16MB of ARAM, which could be used for storing data.

The Wii moves all 24MB of 1T-SRAM (referred to as MEM1) inside the Hollywood package, and adds an additional 64MB of GDDR3 RAM (MEM2).

the specs are


Nintendo Wii Hardware Summary

  • CPU: IBM PowerPC 729 MHz
  • GPU: ATI graphics 243 MHz
    • 3 MB embedded memory
  • NEC ARM9 243 MHz
  • 512 MB Internal Flash Memory (NAND)
  • 88 MB Main Memory
    • 24 MB "internal" 1T-SRAM
    • 64 MB "external" GDDR3 SDRAM
  • 12 cm optical drive
  • Wi-Fi 802.11b/g
    • Mitsumi DWM-W004
  • Bluetooth
  • 2x USB 2.0
  • SD/SDHC

Notice how main memory is subdivided: the GDDR3 and the 1T-SRAM are both "main memory". This may indicate that the speed is the same 3.9 GB/s whether it's MEM1 or MEM2, since it's the bandwidth "to main memory", not specifically to MEM1 or MEM2.


also

MEM1 is slightly faster than MEM2.

so 3.9 GB/s

or 3.9 GB/s for MEM1 and 3.888 GB/s for MEM2 :messenger_grinning_smiling:
 
A better explanation from WiiBrew: [...]

Notice how main memory is subdivided: the GDDR3 and the 1T-SRAM are both "main memory". This may indicate that the speed is the same 3.9 GB/s whether it's MEM1 or MEM2, since it's the bandwidth "to main memory", not specifically to MEM1 or MEM2.

So 3.9 GB/s, or 3.9 GB/s for MEM1 and 3.888 GB/s for MEM2 :messenger_grinning_smiling:
Well, if anyone knows, it's the homebrew guys. Kinda crazy if Nintendo went with GDDR3 that slow; talk about cheap, or over-engineered.

So that makes it 7.8 GB/s of total main memory bandwidth, plus the eDRAM.

Now I'm wondering if the eDRAM is 50% faster compared to the Cube as well?
 
The CPU can't; the GPU can.
GPU (Hollywood )
  • Operating speed of 243 MHz
  • 3 megabytes of embedded graphics memory
  • 24 megabytes of internal main memory
  • Internal main memory operates at 486 MHz
  • Maximum bandwidth between GPU and main memory: 3.9 gigabytes per second
  • The GPU of the Wii is identical to the GC's but it is on average 1.5X faster
 
GPU (Hollywood)
  • Operating speed of 243 MHz
  • 3 megabytes of embedded graphics memory
  • 24 megabytes of internal main memory
  • Internal main memory operates at 486 MHz
  • Maximum bandwidth between GPU and main memory: 3.9 gigabytes per second
  • The GPU of the Wii is identical to the GC's but it is on average 1.5X faster
That's your first link; what does WiiBrew say? Got a link to them?

Kinda crazy that Nintendo would limit the BW like that.

Although if the eDRAM is 50% faster too, that'd put it near 27 GB/s!
 
That's your first link; what does WiiBrew say? Got a link to them?

Not much; basically the same, with no mention of extra access from the GPU.

But there is no point in having more RAM access: the GPU still works through its dedicated texture cache and frame/Z buffer, and the extra speed mainly helps keep the texture cache fed.

Although if the eDRAM is 50% faster too, that'd put it near 27 GB/s!

Now it's closer to the PS2's 48 GB/s :messenger_grinning_smiling:

It can probably help. On PS2 there are techniques where they reuse the Z-buffer for filters when it isn't needed during the frame; they can probably do that more easily on Wii than on GC with 50% faster access, and it will also help with multipass techniques.
 

V4skunk

Banned
But where are these games with more polygons and better water reflection tricks compared to the Xbox? I have played many GC games, but I haven't seen games that show more polygons than Xbox games, and when it comes to water rendering, the pixel shaders in Xbox games were unmatched. For example Crimson Skies: not only did the water surface have sun reflections, it also had realistic ripples thanks to shaders, while in Rogue Leader the water looked OK up close but very bland from a distance.

Water in Crimson Skies looked amazing in motion
[Crimson Skies water screenshots]

Water in Rogue Leader
[Rogue Leader water screenshots]

And shaders in Xbox games were not only used for water rendering but also on many surfaces (especially in Splinter Cell 3, to give many materials realistic characteristics).

The Xbox could stream textures from the HDD, it had pixel and vertex shaders, and shadow buffers could be used for improved shadow rendering! These are all important features, and the TEV unit in the GC can't just replace all of them magically. Effects in most GC games looked pretty much like in PS2 games; effects in Xbox games, however, looked clearly different. If the TEV unit could replicate those effects, Xbox games like Splinter Cell wouldn't have been so extremely downgraded.
What?
Crimson Skies doesn't hold up compared to Rogue Squadron.


 

Romulus

Member
What?
Crimson Skies doesn't hold up compared to Rogue Squadron.




The water is better; that's what they're discussing. Rogue looks better to me too overall, but that's not saying much; I'm honestly not a big fan of Crimson's visuals compared to other top-tier Xbox games. I don't like the art either.
 
Not much; basically the same, with no mention of extra access from the GPU. [...]
I looked around on WiiBrew and couldn't find a mention of a 3.9 GB/s limit on Hollywood's main memory access.

Either way, I learned something today.
 

pawel86ck

Banned
What?
Crimson Skies doesn't hold up compared to Rogue Squadron.



Crimson Skies on the Xbox One X in 4K still looks good.

[Crimson Skies 4K screenshot]




Also keep in mind, Crimson Skies has huge, more complex levels compared to Rogue Leader.

In Rogue Leader you have either a 2D scenery background or small ground scenery.

[Rogue Leader screenshots]


IMO Factor 5 built small levels because that way they were able to improve graphics fidelity compared to other GC games.
 
MGS2 is a 60fps game, but on PS2 you can only play it in 480i instead of 480p.

I've been playing PS2 games lately and 480p makes a huge difference, even on an emulator. Guys, is there some easy method to run all PS2 games at 480p? [...]

Not an easy method, but there are a few that work depending on the game. MGS3, SH3, FF12 and many others can run in 480p with the Xploder boot disc, or from Xploder with the memory card soft-boot mod or whatever; I'm sleepy.

There are also mods, actually hex edits, you can do to the ISO to force it too, and true 16:9 as well.

MGS2 is field-rendered, so no luck for it and other old ones.

And IMO Splinter Cell on PS2 looks horrible compared to SH3 on PS2.
 
PS2 has MGS2 and Silent Hill 2.

But I got to give it to Gamecube.

It has the best Zelda, and the second and third best Zeldas are available on it too.

It has the best Mario game.

And many, many happy hours were spent in Nightfire, Monkey Ball and Melee.

 

Daniel Thomas MacInnes

GAF's Resident Saturn Omnibus
[Dreamcast photos]


While you children endlessly argue about hardware stats and polygon counts, I've been busy playing Sega Dreamcast and having fun. It's still a fantastic console for classic arcade games and innovation that only Sega could provide. I miss these guys.

The controller is slightly uncomfortable and the software library isn't as strong as Saturn, but there's no denying the wonderful awesomeness of Dreamcast. The fact that it still receives new indie games is just icing on the cake.
 
SS Goku pre RoSaT training? That scaling can't be right

It's more like this:
PS2 - Android 18
GC - Android 17
Xbox - Full power 1st form cell (after eating several cities worth of people)
Wii - Android 16
Okay -

Ps2 : super vegeta
GameCube : ss grade 4 goku (cell games)
Xbox : perfect cell
Wii : super perfect cell

N64 : ss4 gogeta
 

SonGoku

Member
Android 17 was like 5% stronger than 18
We never got hard numbers on the Androids' power gap; all we know is that 17 was powerful enough to beat 18. This is further evidenced in Super.
Akira also commented that 17 was the android with the most potential, afaik.

Goku vs Vegeta at their peak (Cell games) and Perfect Cell vs Super Perfect Cell are too big a gap, almost a generational jump.
 

K.N.W.

Member
I've been playing PS2 games lately and 480p makes a huge difference, even on an emulator. Guys, is there some easy method to run all PS2 games at 480p?
Yep, I know two methods I've found to work:

1) The harder method is hacking the ISO with a hex editor (more likely to work).

If you have a modded/soft-modded PS2, all you have to do is:
1) Make an ISO
2) Open the ISO with a hex editor
3) Change the value found at this link: ps2wide.net / 480p
4) Save the ISO, burn it or load it from USB or HDD
5) Set GSM/OpenLoader to display 480p/720p/1080i (so the game screen gets centered)

PS: The code works on any PS2 emulator and improves image quality :)

2) There's also a little program called "ps2force480p.exe", completely automatic, which patches games for 480p. You may try patching them with this method first: run the ISO in PCSX2 and check whether the top of the window says "Interlaced" or "Progressive". If it doesn't work, try the first method.
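For the hex-edit route, the actual offset/value pairs are game-specific and live in the linked table, so I won't repeat them here, but the patch itself is just overwriting a few bytes in (a copy of) the ISO. A rough sketch; the file name, offset and bytes below are placeholders you would replace with the real values for your game:

[CODE]
#include <cstdio>
#include <fstream>
#include <vector>

// Overwrite `bytes` at byte position `offset` inside an ISO image.
// Offset and byte values here are placeholders -- take the real ones for
// your game from the 480p table, and always work on a copy of the ISO.
bool patchIso(const char* isoPath, std::streamoff offset,
              const std::vector<unsigned char>& bytes)
{
    std::fstream iso(isoPath, std::ios::in | std::ios::out | std::ios::binary);
    if (!iso)
        return false;
    iso.seekp(offset);
    iso.write(reinterpret_cast<const char*>(bytes.data()),
              static_cast<std::streamsize>(bytes.size()));
    return static_cast<bool>(iso);
}

int main()
{
    const std::vector<unsigned char> patch = {0x00, 0x00}; // placeholder values
    if (!patchIso("game_copy.iso", 0x0 /* placeholder offset */, patch))
        std::fprintf(stderr, "patch failed\n");
    return 0;
}
[/CODE]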
 

pawel86ck

Banned
Yep, I know two methods I've found to work:

1) The harder method is hacking the ISO with a hex editor (more likely to work). [...]

2) There's also a little program called "ps2force480p.exe", completely automatic, which patches games for 480p. You may try patching them with this method first: run the ISO in PCSX2 and check whether the top of the window says "Interlaced" or "Progressive". If it doesn't work, try the first method.
Thanks, I have tried patching Silent Hill 3 and the PCSX2 window suggests the game started in progressive mode, but for some reason the game hangs very quickly. But that's the emulator; maybe a real PS2 will run the same patched ISO without problems. I have to buy a component cable for my PS2 first, and then I will test it.

BTW, PCSX2 emulation is FAR from perfect anyway. Performance in D3D or OpenGL is great, but many effects aren't rendered at all. It's very strange to play, for example, God of War 2 without shadows, bloom and fog. Only the software renderer can display all graphics effects, but then you can only run games at their native resolution. Also, if you use the software renderer the requirements are really high; even an i7 at 4.6 GHz and a 1080 Ti can't emulate many of these games in software at full speed. On top of that, thumbstick sensitivity is totally wrong: on real PS2 hardware I can aim in all FPS games with ease, but not in PCSX2.

The GC emulator is way better: all effects are rendered even at higher resolutions, performance is great most of the time, and most importantly you can adjust the thumbsticks and really enjoy playing these games. There are even texture packs for games like Metroid Prime and Resident Evil 2; RE2 looks like a remaster with those textures.
 

Fafalada

Fafracer forever
It can probably help. On PS2 there are techniques where they reuse the Z-buffer for filters when it isn't needed during the frame.
PS2 eDRAM was fast enough to allow use cases like writing into a buffer (frame, Z, whatever) while simultaneously (during the same render operation/pass) using that buffer as a texture input, and some shipping games used that. And that wasn't even the most exotic manipulation of memory.
Basically there was no need for 'render-to-texture' operations: you just aliased the addresses to the data you wanted. Unlike the other consoles, where you needed explicit and expensive writebacks (GC/Wii) or wait-and-flush sync points (Xbox).
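A heavily simplified sketch of the aliasing idea (the setter functions below are hypothetical stand-ins for writing the GS FRAME and TEX0 registers, not a real PS2 SDK API): the render target and a texture are both just base addresses into the same pool of GS memory, so pointing the texture base at the address you just rendered to lets you read the result back with no copy.

[CODE]
// Conceptual sketch only: setFrameBase/setTextureBase are hypothetical
// helpers standing in for GS register writes (FRAME.FBP / TEX0.TBP0).
struct GsState { unsigned framePtr = 0; unsigned texPtr = 0; };
static GsState gs;

static void setFrameBase(unsigned addr)   { gs.framePtr = addr; } // FRAME.FBP analogue
static void setTextureBase(unsigned addr) { gs.texPtr   = addr; } // TEX0.TBP0 analogue

void renderReflectionWithoutCopy()
{
    const unsigned scratch = 0x2300;  // placeholder address in GS local memory

    setFrameBase(scratch);            // draw the reflection pass into a scratch area
    // ... submit reflection geometry ...

    setTextureBase(scratch);          // alias: the same memory is now a texture input
    setFrameBase(0);                  // switch back to the visible frame buffer
    // ... draw the water surface sampling that "texture"; no copy, no sync point ...
}
[/CODE]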

This is also the explanation for when I bring up FSB speed limitations on the Xbox. The CPU makes draw calls for geometry, processes physics, runs AI scripts, etc. If the FSB is slower than the GPU clocks, you can hit bottlenecks.
These consoles had single-core CPUs; they only did one thing at any one time, so GPU-related operations weren't 'interrupted' by other workloads. For what it's worth, my understanding of the Xbox's CPU limitations is that they were primarily related to DirectX API inefficiencies: while draw calls are usually very cheap on consoles, the Xbox inherited API overhead that the other consoles did not contend with.
To actually hit FSB limits you'd need to manipulate push-buffers directly, but that would max out the GPU before the FSB (and some games did).

It's not like the GC/Wii didn't have their own CPU limitations: requiring software skinning was a big drain on performance in games with animated characters, and the GPU had some performance issues with clipping (on GC; not sure if that was fixed on the Wii), so the CPU was needed to help rendering performance at times too.
 

K.N.W.

Member
Thanks, I have tried patching Silent Hill 3 and the PCSX2 window suggests the game started in progressive mode, but for some reason the game hangs very quickly. But that's the emulator; maybe a real PS2 will run the same ISO. I have to buy a component cable for my PS2 first, and then I will test it.
Forgot to say, some games need the FMV-skip hack, found at the bottom of my link. In Silent Hill 3 you can skip the first FMV by pressing Start when the "this game is violent bla bla bla" message appears, but you'll need to skip 2 other videos later in the game, either by selecting skip FMV (mode 4) in OpenLoader or Options > Gamefix > Skip MPEG in PCSX2 :)

PS: Good choice; SH3 performance in 480p is as good as in the original mode ;)
 