What will next gen graphics look like?

And it won't happen next gen, because GPUs will be at least five times more powerful and better designed. I can agree that some games will have resolution drops or will be rendered in 33ms, but they probably won't go below 1280x1080. Also, on next gen GPUs the difference between 720p and 1080p won't be as noticeable in terms of performance as it is this gen.
It really doesn't matter how powerful and well designed next gen GPUs are.

More complex games will still tax the system more and lower resolutions/framerates will still free up resources.

Next gen games will obviously run at higher resolutions on average though, that's a given, and they will be more prepared than current gen machines to run complex games at those resolutions. It's part of the generational leap package.
 
We don't even have a top-end PC game. Maybe GTA4 with those mods can be classified as such, though I am not sure.

GTA 4 enhanced looks like GTA with a bunch of photographs randomly pasted everywhere. If you cherry-pick screencaps, it can look amazing. But then you see it in motion and it's like a MUGEN horror movie.

BF3 proves there are people out there (even at GAF!) who don't care about art direction; they just want photorealism and fuck everything else (according to them, the game becomes "ugly" if you lower the resolution, which is proof enough). GTA4 mods are the same idea taken to the extreme.
 
Those phrases read poorly when taken out of context. If you're going to play "catchphrase", why not trawl out all my posts to make it clearer what I was responding to?

My point was that your responses have either been dismissive or sad attempts at insults instead of actually considering an opinion other than your own. I never claimed any opinion was wrong; I just tried to suggest other ways to look at things.

Basically, when people drink the kool-aid it makes discussions like this difficult: they tend to go nowhere because people get defensive over petty things that are essentially nothing, or over things they understand little of.

Next gen games will obviously run at higher resolutions on average though, that's a given, and they will be more prepared than current gen machines to run complex games at those resolutions. It's part of the generational leap package.

Exactly, and as I've said before, it's likely that next gen will start off with higher resolutions and/or higher frame rates but even those will be scaled back when games become more complex.

BF3 proves there are people out there (even at GAF!) who don't care about art direction; they just want photorealism and fuck everything else (the game becomes "ugly" if you lower the resolution, which is proof enough). GTA4 mods are the same idea taken to the extreme.

I don't think it's fair to downplay the accomplishments made by DICE. Art is still involved, even when you're shooting for realism. I think there are plenty of beautiful games out there, realistic or not, but it's usually easier for the average person to gauge a game's graphical accomplishments when they are based on reality since there is a basis for that comparison. At least that's the way I see it.
 
It really doesn't matter how powerful and well designed next gen GPUs are.
That's not completely true. If a GPU is well designed, it takes a relatively smaller performance hit when outputting high-resolution graphics. Sub-HD games on the PS3 and 360 were mostly a consequence of the fact that certain graphical effects would not run properly otherwise. The size of the 360 framebuffer, as well as a lack of ROPs on the GPUs of both machines, is the main reason we're seeing sub-HD games, if I'm not mistaken.

Next gen GPUs will probably be designed to support many more rendering use cases, so they should be able to handle certain resolutions or framerates without restricting what a developer can do. Of course, high resolution will still be a large performance hit for any game, but there's a certain extent to which this can be mitigated by the GPU designers.

For example, one connected source in the Wii U thread has suggested that the Wii U GPU will feature 32MB of embedded memory, presumed to serve a similar function to the 10MB of EDRAM on the Xbox 360 GPU. A much larger pool of EDRAM on the GPU allows many more combinations of resolutions and post-processing features to be applied without tiling and the associated performance hits. From memory, 32MB of EDRAM would allow 1080p render targets with some light AA (?) without tiling.

A dev should back me up here since I'm not sure all of this is 100% correct.
 
I wonder how close we're getting to the point where leaps in processing power struggle to offer an obvious difference in image quality to the average Joe, or, better put, where a lot of the effects produced by raw processing power can be 'faked' on weaker systems, with the untrained eye struggling to tell the two apart.

Seeing people suggest that Battlefield 3 doesn't look significantly better than the console release, or that The Witcher 2 isn't leaps and bounds better looking than basically everything on consoles, makes me wonder if a lot of next gen games simply won't blow many people away because of how desensitised they've become to the highly optimised and artistically polished games of this generation.
I think a lot of this is that developers have gotten much better at hiding pre-rendered things within game-rendered scenes, which convinces a lot of players that the whole game looks better than it does.

The difference between PC and console is pretty intense even for Skyrim, and I have both the 360 and PC versions; they don't even look comparable to me.
 
It really doesn't matter how powerful and well designed next gen GPUs are.

More complex games will still tax the system more and lower resolutions/framerates will still free up resources.

Next gen games will obviously run at higher resolutions on average though, that's a given, and they will be more prepared than current gen machines to run complex games at those resolutions. It's part of the generational leap package.

Yes and no. 30fps is likely to be used often, because you can effectively make the graphics 'twice as pretty' compared to running at 60fps. But assuming games will use lower resolutions isn't so simple.

Games this generation seem to have settled on approximately 720p, occasionally dropping the resolution a little when needed. They don't vary wildly, and you don't see many games dropping a lot lower than others just to get a few extra effects onscreen.

I expect that to be the same next gen. A standard will be established as developers get their sea legs. I still think that'll be a nice and simple 1080p, with perhaps rare drops to 1280x1080 or something like that (if the console has a decent scaler). While extra effects are tempting, developers won't want to under-deliver on resolution compared to their peers (and competing software).
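As a rough illustration of the trade-offs being weighed here (just back-of-the-envelope arithmetic, not from the thread): the per-frame time budget at 30fps versus 60fps, and the raw pixel cost of a few of the resolutions mentioned above.

```python
# Frame-time budget and pixel counts behind the "twice as pretty" point.
# The resolutions are just the examples mentioned in the thread.

framerates_hz = [30, 60]
resolutions = {
    "1280x720": 1280 * 720,
    "1280x1080": 1280 * 1080,
    "1920x1080": 1920 * 1080,
}

for fps in framerates_hz:
    print(f"{fps} fps -> {1000.0 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame, 60 fps -> 16.7 ms per frame

base = resolutions["1280x720"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 720p pixel count)")
# 1920x1080 pushes 2.25x as many pixels per frame as 1280x720
```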
 
Doom 3 on PC with 2005 PC hardware:

1245_full.jpg


Killzone 3 on PS3 with 2005 hardware:

kz3_img001.jpg


We have to keep in mind Killzone 3 came out after several years of optimization on the hardware. Nevertheless, it was still on hardware from 2005. There is also compression on both images, but I couldn't find any cleaner. If anybody has any better-suited Doom 3 shots from hardware from 2004 or 2005, feel free to show me so I can replace this one.

Looking at past-generation comparisons with PCs from the same period tells me that consoles will never match the IQ shown in PC games; it hasn't happened in any of the past generations. In all other ways, or at least most other ways, it does seem like consoles eventually surpass it. However, by the time they do, there are usually games out on PC, running on newer hardware, that look better.

The whole Playstation 3 package is hardware from late 2006, not 2005.
 
I think development studios have reached a point with their technology and game design where they can't really make games look much better. Characters already look the part, and there's no real point in giving them more polygons because we're not going to notice it 90% of the time we're playing a game.

I think the biggest changes we're going to see next gen are procedurally generated and physics-based solutions for materials and objects in a scene: cloth and hair simulation on models, more deformable surfaces in the environment, etc. Things right now look and feel stiff in video games. A flag, for example, is just a plane with a normal map that sways in an unconvincing manner; the same goes for foliage in most games. This might not necessarily provide a better-looking video game, but it'll "feel" much, much better.

We'll also likely see more things in a scene. Things that were once simply textures, like many skyboxes, will become more dynamic. Tiled textures will be less apparent, and surfaces will convey their condition better.

Sore points in games now, things like pop-in and small or overly simple crowds, will hopefully be fixed, or at least get better AI.

Again, none of this stuff will make the game look particularly better, in comparisons or screenshots, but the feel of the games overall will be better.
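To make the flag example concrete, here's a minimal sketch of the kind of physics-driven cloth being described, using simple Verlet integration over a strip of point masses. It's purely illustrative and assumes nothing about any actual engine; a real implementation would use a 2D grid, wind forces, and collision.

```python
# Minimal Verlet cloth sketch: a pinned strip of point masses connected by
# distance constraints. The shape emerges from simulation each frame instead
# of being a pre-authored sway baked into a normal map.

GRAVITY = -9.8          # m/s^2
DT = 1.0 / 60.0         # one 60 Hz step
REST_LENGTH = 0.1       # spacing between neighbouring points

# Each particle: current position, previous position, and a pinned flag.
particles = [
    {"pos": [i * REST_LENGTH, 2.0], "prev": [i * REST_LENGTH, 2.0], "pinned": i == 0}
    for i in range(10)
]

def step():
    # Verlet integration: velocity is implicit in (pos - prev).
    for p in particles:
        if p["pinned"]:
            continue
        x, y = p["pos"]
        px, py = p["prev"]
        p["prev"] = [x, y]
        p["pos"] = [x + (x - px), y + (y - py) + GRAVITY * DT * DT]

    # Relax the distance constraints a few times so the strip holds together.
    for _ in range(4):
        for a, b in zip(particles, particles[1:]):
            ax, ay = a["pos"]
            bx, by = b["pos"]
            dx, dy = bx - ax, by - ay
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = 0.5 * (dist - REST_LENGTH) / dist
            if not a["pinned"]:
                a["pos"] = [ax + dx * corr, ay + dy * corr]
            if not b["pinned"]:
                b["pos"] = [bx - dx * corr, by - dy * corr]

for _ in range(120):  # simulate two seconds
    step()
print(particles[-1]["pos"])  # the free end has swung down under gravity
```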
 
That's not completely true. If a GPU is well designed, it takes a relatively smaller performance hit when outputting high-resolution graphics. Sub-HD games on the PS3 and 360 were mostly a consequence of the fact that certain graphical effects would not run properly otherwise. The size of the 360 framebuffer, as well as a lack of ROPs on the GPUs of both machines, is the main reason we're seeing sub-HD games, if I'm not mistaken.
Both Xenos and RSX have ROPs. The main reason we got so many sub-720p games this generation is the lack of GPU performance to drive these games at higher resolutions. The 360's memory pool has something to do with it (in the case of Halo 3/ODST at least), but mostly it's just trading resolution for performance.

Next gen GPUs will probably be designed to support many more rendering use cases, so they should be able to handle certain resolutions or framerates without restricting what a developer can do. Of course, high resolution will still be a large performance hit for any game, but there's a certain extent to which this can be mitigated by the GPU designers.
He's right when he says there will always be titles which choose to lower the resolution to get more performance for more complex effects. That's why nobody should expect 1080p to be a standard next generation.

But there is one interesting thing to note here: while going from 720p to something like 640p isn't THAT apparent to the general console gaming population on their 1080p TV sets (720p isn't the TV's native Full HD resolution either, so it's blurry anyway), going from native 1080p rendering to anything lower (even, say, 960p) will be clearly noticeable to anyone because of all the blur added by the TV's interpolation. And that may be a reason why we'll get fewer sub-1080p games next generation.

For example, one connected source in the Wii U thread has suggested that the Wii U GPU will feature 32MB of embedded memory, presumed to serve a similar function to the 10MB of EDRAM on the Xbox 360 GPU. A much larger pool of EDRAM on the GPU allows many more combinations of resolutions and post-processing features to be applied without tiling and the associated performance hits. From memory, 32MB of EDRAM would allow 1080p render targets with some light AA (?) without tiling.

A dev should back me up here since I'm not sure all of this is 100% correct.
Xenos doesn't have EDRAM in the strict sense; it has a memory companion chip which can produce MSAA at a relatively low cost, but that's not "true" EDRAM in the PS2 RS kind of sense.

If you want a 1080p rendering resolution, then you'll need at least two buffers of that size, both at 32-bit precision or higher. That's 4 bytes per pixel and 1920x1080x2 pixels in total = 16.6 MB of data. If you want some AA in there, you need to double or quadruple one of these buffers in size, meaning 24.9 or 41.5 MB of data. If you want high-quality HDR, then you'll need to render to a 64-bit precision back buffer, which means not 4 but 8 bytes per pixel. And if you want VSync, it's a good idea to use triple buffering, meaning you'll need three buffers of that size. If you're not using your EDRAM for the front buffer (the one which is displayed; the way it works on the 360), you'll probably be able to fit a 1080p 32-bit 4x MSAA back buffer in 32MB without tiling, which is quite good really. But considering today's rendering approaches (meaning off-screen render targets and deferred rendering in general), it's more likely that we'll see 1080p, 32 bits, 1x/2x MSAA as the standard for a system with 32MB of rendering memory ("EDRAM").
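The arithmetic above is easy to sanity-check. Here's a small sketch reproducing it under the same assumptions (4 bytes per pixel for 32-bit targets, 8 for a 64-bit HDR target, MSAA multiplying the size of the multisampled buffer, sizes in decimal megabytes):

```python
# Render-target sizes for 1920x1080 under the assumptions stated above.

WIDTH, HEIGHT = 1920, 1080
PIXELS = WIDTH * HEIGHT

def total_mb(buffers):
    """Total size in decimal MB for a list of (bytes_per_pixel, samples) buffers."""
    return sum(bpp * samples * PIXELS for bpp, samples in buffers) / 1_000_000

print(f"32-bit colour + depth, no AA:     {total_mb([(4, 1), (4, 1)]):.1f} MB")  # ~16.6
print(f"32-bit, one buffer at 2x MSAA:    {total_mb([(4, 1), (4, 2)]):.1f} MB")  # ~24.9
print(f"32-bit, one buffer at 4x MSAA:    {total_mb([(4, 1), (4, 4)]):.1f} MB")  # ~41.5
print(f"64-bit HDR colour + 32-bit depth: {total_mb([(8, 1), (4, 1)]):.1f} MB")  # ~24.9
```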
 
It's highly unlikely the next next gen is gonna be another closed box, and there's no way you can be sure what's going to happen one way or another. We're talking around 2020, man.

Apologies, I thought you were talking about the upcoming generation.

I think development studios have reached a point with their technology and game design where they can't really make games look much better. Characters already look the part, and there's no real point in giving them more polygons because we're not going to notice it 90% of the time we're playing a game.

While I agree that AI and world interaction will play a bigger part next gen, I also think we can do more with the models and such. There is a lot of room for improvement with today's rendering.

Both Xenos and RSX have ROPs. The main reason we got so many sub-720p games this generation is the lack of GPU performance to drive these games at higher resolutions. The 360's memory pool has something to do with it (in the case of Halo 3/ODST at least), but mostly it's just trading resolution for performance.


He's right when he says there will always be titles which choose to lower the resolution to get more performance for more complex effects. That's why nobody should expect 1080p to be a standard next generation.

But there is one interesting thing to note here: while going from 720p to something like 640p isn't THAT apparent to the general console gaming population on their 1080p TV sets (720p isn't the TV's native Full HD resolution either, so it's blurry anyway), going from native 1080p rendering to anything lower (even, say, 960p) will be clearly noticeable to anyone because of all the blur added by the TV's interpolation. And that may be a reason why we'll get fewer sub-1080p games next generation.


Xenos doesn't have EDRAM in the strict sense; it has a memory companion chip which can produce MSAA at a relatively low cost, but that's not "true" EDRAM in the PS2 RS kind of sense.

If you want a 1080p rendering resolution, then you'll need at least two buffers of that size, both at 32-bit precision or higher. That's 4 bytes per pixel and 1920x1080x2 pixels in total = 16.6 MB of data. If you want some AA in there, you need to double or quadruple one of these buffers in size, meaning 24.9 or 41.5 MB of data. If you want high-quality HDR, then you'll need to render to a 64-bit precision back buffer, which means not 4 but 8 bytes per pixel. And if you want VSync, it's a good idea to use triple buffering, meaning you'll need three buffers of that size. If you're not using your EDRAM for the front buffer (the one which is displayed; the way it works on the 360), you'll probably be able to fit a 1080p 32-bit 4x MSAA back buffer in 32MB without tiling, which is quite good really. But considering today's rendering approaches (meaning off-screen render targets and deferred rendering in general), it's more likely that we'll see 1080p, 32 bits, 1x/2x MSAA as the standard for a system with 32MB of rendering memory ("EDRAM").

lol wat?

ROPs are not the only reason for sub-HD in today's consoles. Memory also plays a large role, along with other factors of course.

Also, I don't know what you'd call it, but the 360 very much has EDRAM.
 
lol wat?

ROPs are not the only reason for sub-HD in today's consoles. Memory also plays a large role, along with other factors of course.

Also, I don't know what you'd call it, but the 360 very much has EDRAM.
Ever tried reading? The main reason for sub-720p in the current generation is developers wanting more performance for their creations. You can do a 1080p game on the PS3 or 360, but then you'll need to downgrade it graphically somewhere, to the point of it looking like an Xbox 1 title.

"EDRAM stands for "embedded DRAM", a capacitor-based dynamic random access memory integrated on the same die as an ASIC or processor." Xenos's memory die isn't integrated on the same die as processor thus it's not EDRAM. It has some processing capabilities (ROPs being one of them) but it's far from a proper GPU with EDRAM solution.
 
The RSX is based on the NV47, which is hardware from June 2005. But yes, the whole "package" is from 2006.

I know that. But that argument doesn't make sense. That's like saying The Dark Knight is a 2007 movie because most if not all shots were filmed that year and not 2008.

And while it is based on NV47, it certainly isn't the exact same. Killzone 3 runs on 2006 hardware, no matter how you twist it.

Only because Blu-ray kept getting delayed.

The CPU/GPU and the like were finished in 2005, so he has a point.

How can we be so sure not a single adjustment was made to the CPU/GPU architecture during that whole time period?
 
Ever tried reading? The main reason for sub-720p in the current generation is developers wanting more performance for their creations. You can do a 1080p game on the PS3 or 360, but then you'll need to downgrade it graphically somewhere, to the point of it looking like an Xbox 1 title.

I thought you were saying it was the only reason for sub-HD resolutions, sorry. Yes, the main reason for sub-HD resolution is to gain more performance, though the bottleneck will differ from game to game. Sometimes it's the CPU (extra geometry overhead from tiling on the 360, for example), sometimes it's the GPU, sometimes it's memory (fitting the frame buffer into memory), and sometimes it's a combination of these factors.

"EDRAM stands for "embedded DRAM", a capacitor-based dynamic random access memory integrated on the same die as an ASIC or processor." Xenos's memory die isn't integrated on the same die as processor thus it's not EDRAM. It has some processing capabilities (ROPs being one of them) but it's far from a proper GPU with EDRAM solution.

I see what you mean now, and in that sense you're correct. I still see it as eDRAM, though I agree it doesn't fit the strict definition.
 
If next gen gives us the equivalent of Crysis ultra/extreme settings (DX11) at 60fps with at least 4x AA and 8x AF at 1080p, I'd be happy.
Edit: or the Samaritan (or whatever it's called) demo at 720p, 30fps, 4-8x MSAA and 8x AF.
 
So all games just look like shit to you? What's been posted is literally the pinnacle of video game graphics. If you're not impressed by them, gaming must be a rather drab and visually unexciting experience for you.
You're putting words into my mouth. None of the games in this thread are shit at all; I'm just incredibly underwhelmed.

The gap between the sixth and seventh generations was a relatively big jump, and this one looks to be much, much smaller. I would personally rather see the companies wait a couple more years, but that won't happen.
 
And while it is based on NV47, it certainly isn't the exact same. Killzone 3 runs on 2006 hardware, no matter how you twist it.
Super Mario Galaxy is running on 2006 hardware too, but the Wii doesn't offer 2006 hardware performance.

The PS3 is a 2006 console only in name since most of its components are from 2005 and offer 2005 performance, which is what we are comparing here.
 
The measure of next-gen graphics is always how close we can make computer graphics look to real life.

such a shame

Not necessarily... how about improved IQ, more dynamic environments, etc.?
All the effects in the above vids can be used in a stylized manner too.
 
Super Mario Galaxy is running on 2006 hardware too, but the Wii doesn't offer 2006 hardware performance.

The PS3 is a 2006 console only in name since most of its components are from 2005 and offer 2005 performance, which is what we are comparing here.

To be fair, the Wii is more 2001 hardware than 2006 hardware.

Though I agree with you that the designs of both the PS3 and 360 were likely finalized in late 2004/early 2005.

At least give me something of this quality plus DX11 effects:
http://www.youtube.com/watch?v=7lnqXbj_6Qw&list=FLglms1mkfQ2CHuPivdM4nAQ&index=8&feature=plpp_video
(after the 1-minute mark in this vid)

Oh god no. I do not want overexposed games with washed-out whites because some people think it's realistic.
 
To be fair, the Wii is more 2001 hardware than 2006 hardware.

Though I agree with you that the designs of both the PS3 and 360 were likely finalized in late 2004/early 2005.



Oh god no. I do not want overexposed games with washed-out whites because some people think it's realistic.
While there are numerous effects going on, with regard to the lighting I just want super-high-quality lighting and shadowing.
 
To be fair, the Wii is more 2001 hardware than 2006 hardware.
Sort of agree, but it's still 2006 hardware.

In 2001 you couldn't have made it consume so little power and be so small.

It's just that it's using a really old architecture, and performance-wise it's extremely slow compared to what you could release in 2006 at an affordable cost/price.

But in the case of both the 360 and PS3, they are really 2005 consoles performance- and architecture-wise, no more, no less.
 
While there are numerous effects going on, with regard to the lighting I just want super-high-quality lighting and shadowing.

While I don't think there's high quality lighting and shadows in those vids, I do believe that'll be one of the points of attention next gen.

Sort of agree, but it's still 2006 hardware.

In 2001 you couldn't have made it consume so little power and be so small.

It's just that it's using a really old architecture, and performance-wise it's extremely slow compared to what you could release in 2006 at an affordable cost/price.

But in the case of both the 360 and PS3, they are really 2005 consoles performance- and architecture-wise, no more, no less.

OK, so you're looking at it from the point of view of the production process, while I'm looking at it from an architectural point of view? At least that's what I gather from your post.

If so, the only flaw I see with your point is that you're comparing the production process to the architecture design, which is kind of pointless IMO. Otherwise you would be comparing the most revised versions of today's consoles, which is 2010 IIRC for the 360 and 2011 for the PS3, to the 2006 design of the Wii to keep the comparison consistent.
 
Now we are finally seeing some examples of what next gen will look like, so I thought it was time to bump this thread again.

Agni's Philosophy:

http://www.youtube.com/watch?v=hbEcQCqCezg

Way beyond anything on a current high-end PC.

Serious question: if that is "way beyond anything on a current high-end PC" and it is running in real time (as it says at the start of the video), what is being used to run it? I honestly want to know.
 
That video just makes me sad. All that expensive tech and all those assets, and for what? To be tortured by horrible direction and writing. Do we honestly expect games built to look like that demo to be any more open or interesting than FFXIII was?
 
Serious question: if that is "way beyond anything on a current high-end PC" and it is running in real time (as it says at the start of the video), what is being used to run it? I honestly want to know.

Prepare to have your mind blown.

It's run on a PC.
 
It surely will not look like the tech demos. Didn't the Samaritan demo need 3 GTX 680s?

Things I really want to see are realistic cloth movement, dynamic and realistic weather, beautiful lightning, etc., but I doubt we will see the first one.

Also, a lot of developers are talking nonsense. Take Square Enix, for instance. They talk about how they need better machines, but if they really wanted to take advantage of a powerful machine they would do that on the PC. They just talk nonsense. The argument that people wouldn't buy the console versions is also not true, because people choose to play on console instead of PC and are aware that the PC will always have an advantage.
 
Square Enix develop all their games on the same PCs that we have. It's why Versus XIII has taken so long - they keep getting distracted by Steam sales.
 
It surely will not look like the tech demos. Didn't the Samaritan demo need 3 GTX 680s?

Things I really want to see are realistic cloth movement, dynamic and realistic weather, beautiful lightning, etc., but I doubt we will see the first one.

Mafia 2 has selective realistic cloth movement if you have PhysX enabled.

It looks pretty good as well, making character animations (even though they blend pretty well) kinda janky by comparison.

There are other examples of real-time cloth movement, and Agni's Philosophy also features that stuff.

I figure that real-time cloth movement will be selectively put on a lot of next gen characters, with designs exaggerated to emphasize the tech near the beginning of the gen (things like cloaks, hair, trench coats, skirts...), while stuff like t-shirts and jeans might continue to use the normal/texture map blending tech of Uncharted if they have 'cloth animation' at all.
 
Mafia 2 has selective realistic cloth movement if you have PhysX enabled.

It looks pretty good as well, making character animations (even though they blend pretty well) kinda janky by comparison.

There are other examples of real-time cloth movement, and Agni's Philosophy also features that stuff.

I figure that real-time cloth movement will be selectively put on a lot of next gen characters, with designs exaggerated to emphasize the tech near the beginning of the gen (things like cloaks, hair, trench coats, skirts...), while stuff like t-shirts and jeans might continue to use the normal/texture map blending tech of Uncharted if they have 'cloth animation' at all.

Hi,

But in the past we got PS2 tech demos and other demos that showed a lot of cool stuff, and in the end none of it was true. So I would take this with a pinch of salt.
 
Agni's Philosophy is cool, but am I the only one bothered by the way particles are used? Or rather, how the particles look. It's almost as if the particles have no filtering and appear too... "sharp".
 
I just love how some guys predict a 60 FPS standard, just like in the previous cycle XD
Hi,

But in the past we got PS2 tech demos and other demos that showed a lot of cool stuff, and in the end none of it was true. So I would take this with a pinch of salt.
Nope, it's something pretty likely to happen. Some engines already have the toolset for this; it's not present because current hardware is too weak to apply it heavily. Plus, it does not require a huge time investment from the animators since it's driven by a physical simulation. This is the type of thing that will impact visuals greatly, but most people aren't paying attention to it.
 
Agni's Philosophy is cool, but am I the only one bothered by the way particles are used? Or rather, how the particles look. It's almost as if the particles have no filtering and appear too... "sharp".

There wasn't any motion blur on the particles, right? That was the only flaw with the whole demonstration imo.
 
I just love how some guys predict a 60 FPS standard, just like in the previous cycle XD

This.

60fps will never be the standard. There was no reason it couldn't have been mandated in the PS2/GC/Xbox era, never mind on the current consoles. Developers will always choose graphics and effects over 60fps, and the same will be true next-gen.
 