
New Matt Casamassina rant up.

Monk said:
So you would rather play a game at 15-20fps at 1080p than a solid 60fps at 480p? But that is beside the point. If they do add HDTV support, they will make the games run at 60fps at HDTV resolution; that's how devs are. So it is a matter of: do you want the game to look better at 480p or not? If you do, then HDTV support isn't good. If you don't, then HDTV support is good. In either case you have an upside and a downside.

...

Do you honestly think the result of not having HD in Revolution is going to be that games look BETTER on the system? Because that's just something that should leave everyone here understandably speechless.
 
At 480p, most likely, unless devs add aniso and AA at 480p, because that is where the frame rate cost goes. I hope that is the case, but I didn't see devs doing that last gen. However, if they do do that, then the lack of HD is a mistake.

But most likely devs would want both of those even at HDTV resolutions, so it won't look better at 480p. Probably devs will give you the choice between a shaky 30fps at 1080p, a rock-solid 720p, or a rock-solid 60fps at 480p.

If they aim for 480p as their main target to be 60fps, then all is good for us with 480p. If they aim for 60fps at 1080p, then the 480p users are the ones getting screwed.
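
To put rough numbers on that trade-off (purely illustrative, and assuming per-frame GPU cost scales roughly with the number of pixels drawn):

```python
# Rough pixel-count comparison between the output modes discussed above.
# Assumes per-frame GPU cost scales roughly with pixels drawn, which is
# only a crude approximation of real fill-rate and bandwidth limits.
modes = {
    "480p (640x480)":    640 * 480,
    "720p (1280x720)":   1280 * 720,
    "1080p (1920x1080)": 1920 * 1080,
}

base = modes["480p (640x480)"]
for name, pixels in modes.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x the 480p load)")
```

By that crude measure, 720p is roughly 3x the pixel load of 480p and 1080p is roughly 7x, which is where the "shaky 30fps at 1080p versus rock-solid 60fps at 480p" trade-off comes from.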
 
Monk said:
Proof that resolution does have impact:

[Benchmark charts: doom3.png, AAdoom3.png, splintercellAVG.png, splintercellMIN.png]

http://www.anandtech.com/video/showdoc.aspx?i=2406

Resolution is certainly not "free".
Can you run those same tests at TV resolutions (or near them)? Let's see if there is much of an fps improvement at 800x600... 640x480... 320x240.
 
Those are tests from AnandTech, so no. But there would probably be no improvement at 640x480 because it would be "CPU limited". It really depends on how powerful the graphics card and CPU are versus the graphics and CPU requirements, respectively, of each game.

Basically, if it is CPU limited, going down in resolution isn't going to make a difference; if it is GPU limited, going down will.

If you want, I could do some tests on Morrowind at 640x480 versus 1240x1080 with my Radeon 9700 Pro.
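
A toy model of the CPU-limited versus GPU-limited point above (all timings invented purely for illustration; real games don't split this cleanly):

```python
# Toy frame-time model: a frame takes as long as the slower of the CPU work
# (roughly resolution-independent) and the GPU work (scales with pixel count).
# Every number here is made up just to illustrate the bottleneck argument.
def fps(cpu_ms, gpu_ms_per_mpixel, width, height):
    gpu_ms = gpu_ms_per_mpixel * (width * height) / 1_000_000
    frame_ms = max(cpu_ms, gpu_ms)  # whichever side is the bottleneck wins
    return 1000.0 / frame_ms

# GPU-limited case: lowering the resolution raises the frame rate a lot.
print(fps(cpu_ms=10, gpu_ms_per_mpixel=20, width=1600, height=1200))  # ~26 fps
print(fps(cpu_ms=10, gpu_ms_per_mpixel=20, width=640,  height=480))   # 100 fps

# CPU-limited case: lowering the resolution barely changes anything.
print(fps(cpu_ms=25, gpu_ms_per_mpixel=5, width=1600, height=1200))   # 40 fps
print(fps(cpu_ms=25, gpu_ms_per_mpixel=5, width=640,  height=480))    # 40 fps
```

So dropping to 640x480 only buys frame rate when the graphics card is the bottleneck, which is the point being made above.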
 
Amir0x said:
Do you honestly think the result of not having HD in Revolution is going to be that games look BETTER on the system? Because that's just something that should leave everyone here understandably speechless.

Well I'd say it depends on how much cost they want to save, and how the required performance/cost scales with the resolution (meaning a better GPU). Looking at the graphs posted you'd think it's a linear factor, but I'd say that depends entirely on your shaders.

What I mean to say is the following: if you take the X360 and make a game run at 60fps for a 480p display, you can definitely make it "prettier" than a game that has to run at 60fps at 720p (the added resolution "only" gives you more clarity). You use the "extra" fillrate on producing nicer-looking pixels instead of computing many more pixels.

Now say you need 3x the performance to get the "same" quality picture and frame rate at 720p. If you only have a 2x-performance GPU, you can't manage that. But you can make your 480p 60fps image "2x nicer" instead.

Now imagine that the cost of GPU performance doesn't scale linearly, i.e. 2x would cost you $50 extra, but 3x would cost $150. You could then save money by going with only 2x and focusing on 480p, while making the picture prettier.

Of course this assumes that you could actually use that extra power at 480p to make distinctly better shaders, ones that make the picture prettier than what you'd gain from the added clarity. For example (and this is totally made up to illustrate the point): at 720p your sub-surface scattering shader might become too expensive to use, so you get "opaque" skin, but with the added resolution you can make more (texture) detail (pores, spots, etc.) visible on the skin surface. At 480p those details could not be made out, but you have enough fillrate to run the expensive shader instead, and you can accurately see the light penetrating your character's fingers, ears, etc.

The question would be: which of those would you prefer, if you had to choose? (Again, I'm not saying the 360 can't do sub-surface scattering.)
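
For what it's worth, the "3x" figure above is roughly just the pixel math, and the 2x-versus-3x trade-off can be sketched the same way (all of these numbers are hypothetical, as in the post itself):

```python
# Sketch of the trade-off described above, with hypothetical numbers:
# 720p pushes about 3x the pixels of 480p, so matching the same per-pixel
# shader quality at 720p/60fps would need roughly a 3x GPU; a 2x GPU falls
# short at 720p but leaves a 2x per-pixel budget at 480p instead.
PIXELS_480P = 640 * 480      # 307,200
PIXELS_720P = 1280 * 720     # 921,600 (~3.0x)

def per_pixel_budget(gpu_multiplier, pixels):
    # Shader work available per pixel, relative to a 1x GPU rendering 480p.
    return gpu_multiplier * PIXELS_480P / pixels

print(per_pixel_budget(2.0, PIXELS_720P))  # ~0.67 -> 2x GPU at 720p: worse pixels
print(per_pixel_budget(3.0, PIXELS_720P))  # 1.0   -> needs the 3x GPU to break even
print(per_pixel_budget(2.0, PIXELS_480P))  # 2.0   -> 2x GPU at 480p: "2x nicer" pixels
```

Whether that extra per-pixel budget actually reads as "prettier" than the added clarity is exactly the question the post poses.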
 
Monk said:
Proof that resolution does have impact:

[Benchmark charts: doom3.png, AAdoom3.png, splintercellAVG.png, splintercellMIN.png]

http://www.anandtech.com/video/showdoc.aspx?i=2406

Resolution is certainly not "free".
However, no one would take ATI seriously if they then said, "As you can see, frame rate drops significantly at 1600x1200. Thus we are going to disable that resolution, so you can all have improved frame rates."

Monk said:
So you would rather play a game at 15-20fps at 1080p than a solid 60fps at 480p? But that is beside the point. If they do add HDTV support, they will make the games run at 60fps at HDTV resolution; that's how devs are.
I didn't see this becoming a problem on Xbox this generation. And considering how many developers didn't care about 60 frames per second this generation, I doubt they will care any more next gen. But if their main concern were making all games the highest possible resolution and frame rate at the expense of other things, all the better!
 
Does anyone know what kind of signal output is required to output 480p and 720p? If the actual signal output for both can be handled by the same hardware at no or minimal cost, and Nintendo is supporting 480p (therefore requiring that specific hardware), I doubt they would make it impossible for the other resolutions to be output.

I'm taking this more as them saying they will not support HD, as in they will not make games that are built around 720p+. They are not investing in a GPU powerful enough to give full effects and frame rate at 720p+, but rather one that gives excellent results at 480p (possibly with more effects than the competition's 720p-optimized GPUs). So their games would look comparable to the competition's running on an HDTV (minus the extra clarity), and indistinguishable on an SDTV, at a reduced cost.

So would anyone know about the signal output hardware?
 
sarusama said:
So would anyone know about the signal output hardware?
No one (at least no one who can talk about it freely) really knows anything about the internal hardware. However, I believe it was IGN who claimed the GCN's digital video out was on the Revolution mockup.
 
sarusama said:
Well I'd say it depends on how much cost they want to save, and how the required performance/cost scales with the resolution (meaning a better GPU). Looking at the graphs posted you'd think it's a linear factor, but I'd say that depends entirely on your shaders.

What I mean to say is the following: if you take the X360 and make a game run at 60fps for a 480p display, you can definitely make it "prettier" than a game that has to run at 60fps at 720p (the added resolution "only" gives you more clarity). You use the "extra" fillrate on producing nicer-looking pixels instead of computing many more pixels.

Now say you need 3x the performance to get the "same" quality picture and frame rate at 720p. If you only have a 2x-performance GPU, you can't manage that. But you can make your 480p 60fps image "2x nicer" instead.

Now imagine that the cost of GPU performance doesn't scale linearly, i.e. 2x would cost you $50 extra, but 3x would cost $150. You could then save money by going with only 2x and focusing on 480p, while making the picture prettier.

Of course this assumes that you could actually use that extra power at 480p to make distinctly better shaders, ones that make the picture prettier than what you'd gain from the added clarity. For example (and this is totally made up to illustrate the point): at 720p your sub-surface scattering shader might become too expensive to use, so you get "opaque" skin, but with the added resolution you can make more (texture) detail (pores, spots, etc.) visible on the skin surface. At 480p those details could not be made out, but you have enough fillrate to run the expensive shader instead, and you can accurately see the light penetrating your character's fingers, ears, etc.

The question would be: which of those would you prefer, if you had to choose? (Again, I'm not saying the 360 can't do sub-surface scattering.)
Wtf, dude? Like anyone's going to notice "sub-surface scattering" at such a pathetically low resolution. It doesn't get much better than RE4 at 640x480. Games will look like an "upgrade" at best, not a whole new generation in image quality.



Next-gen is gonna suck for Nintendo fans. :(
 
Pellham said:
While that's likely, lack of HD support is not gonna be the reason people skip over the Revolution.


It won't be the only one. But it will be the biggest reason. 30" Widescreen HDTV sets will be under $400 by the time the Revolution launches in NA. $300 a year after it launches. HDTV is coming, and Nintendo needs to back it. They still have time to change things.
 
Bacon said:
Yeah, it wasn't working before for some reason.
I noticed that happens a lot with the mailbag. Sometimes if I miss one I can't read it for a couple of days.
 
HDTV isn't the biggest problem for Nintendo, but it reflects badly on them, as Matt says. They need a bigger list of reasons to make people want to buy the console. Like online, which is something else they aren't bothering with.

With PS3 having a Blu-ray player, the stakes are bigger, as Revolution has no Blu-ray or HD DVD playback. Having a format means more people will be interested in watching HD. HD will gain a lot of ground quite soon, and most new TVs, in the living room at least, will be HD. :)
 
Gahiggidy said:
Wtf, dude? Like anyone's going to notice "sub-surface scattering" at such a pathetically low resolution. It doesn't get much better than RE4 at 640x480. Games will look like an "upgrade" at best, not a whole new generation in image quality.

Hmmm... I seeeee. That's why I first thought that RE4 was actually a live SD television broadcast.
 
Gahiggidy said:
Wtf, dude? Like anyone's going to notice "sub-surface scattering" at such a pathetically low resolution. It doesn't get much better than RE4 at 640x480. Games will look like an "upgrade" at best, not a whole new generation in image quality.

Hmm. So all those DVDs and TV programs I watch at SD resolutions (480i, not p) that show realistic humans - they actually just look like Tomb Raider 1?
 
I think Nintendo should have supported HD.

However, given that they are not, I think this issue is being blown way out of proportion.

1. For a valid SD/HD comparison, you shouldn't look at crappy/noisy SD broadcast TV and compare it with made-for-showroom HD demo reels. Rather, you should consider SD DVD output and compare it with the same content in HD. One way to do this is with the T2 Special Edition DVD, since it has the HD version you can play on your computer. It's true that if you compare good SD with practical HD, the difference is not necessarily large (though it really depends upon the content you're watching).

2. HD TVs will often have one HD input and multiple SD inputs. The HD input is going to be occupied by your DVD/Blu-ray/HD TiVo/HD tuner/X360/PS3/etc. box. This will leave an SD input free to plug in your Revolution. :-)

3. Don't forget that any decent HDTV will have nice SD->HD conversion, so it won't be the same as looking at an SD signal on a crappy SD set.

4. Let's see what the final product & price is before we say it's crap & boycott it. If it plays good games and only costs $150, what's the point of complaining?
 