Agni's Philosophy runs at 60FPS on a GTX 680, uses 1.8GB VRAM. Can next-gen run it?

i-Lo

Member
if next gen couldn't run it, then there would be no point in developing it...why would Square Enix gather all the resources to make an engine that only runs on PC? Yes, it will run at the highest spec on PC but next gen systems should be able to handle it.

Do you remember the earliest FF13 trailer, and how the individual models were downgraded (sharp knees) in the final retail product while still retaining the overall vibe and graphical fidelity quite well? I think it'll be the same here.
 
if next gen couldn't run it, then there would be no point in developing it...why would Square Enix gather all the resources to make an engine that only runs on PC? Yes, it will run at the highest spec on PC but next gen systems should be able to handle it.

One more thing: didn't someone from Square Enix confirm this demo will be shown running on a next-gen system at next E3?
 

zoukka

Member
Do you remember the earliest FF13 trailer, and how the individual models were downgraded (sharp knees) in the final retail product while still retaining the overall vibe and graphical fidelity quite well? I think it'll be the same here.

Of course. Models created for tech demos are way different than optimized final assets for actual videogames.
 

DJIzana

Member
Along with the survey that went with the tech demo when it was shown, I'd say that quote is a pretty clear indication that Final Fantasy XV/Agni's Philosophy will be shown at E3.

Yeah... good luck with that one. A) it took them a full YEAR to make a 3-minute tech demo, and B) Versus isn't out yet... fat chance you'll see that happening. Give it another couple of years.
 

i-Lo

Member
Yeah... good luck with that one. A) it took them a full YEAR to make a 3-minute tech demo, and B) Versus isn't out yet... fat chance you'll see that happening. Give it another couple of years.

Ah... this is a misconception. It took them a year to create the tools and bring them up to speed for next-gen development. The actual demo took around 3-4 months to make, iirc.
 
After what happened with Versus XIII, I doubt they will show the actual game... or maybe Versus XIII now runs on the Luminous engine and is a proper next-gen game... Shyamalan twist.jpeg
 

thuway

Member
@720p 30fps ... may be ;P

Let me make this a bit more realistic for you:

At sub-720p, average 27 fps, NO AA, baked lighting, and half the poly count. If Square Enix is pulling this off, or claiming to, I have to wonder what Naughty Dog, 343 Industries, Epic, Santa Monica, and Guerrilla are up to. The graphical kings are silent; this is very peculiar.
 
Ah... this is a misconception. It took them a year to create the tools and bring them up to speed for next-gen development. The actual demo took around 3-4 months to make, iirc.

Yeah, and now that quite a few assets have already been produced, I'm sure they could produce a vertical slice for E3. Man, if Square announced at E3 that Agni is Final Fantasy XV and that Hiroyuki Ito is directing the game, I'd forgive all the bullshit they've been pulling recently. And I'll shit my pants.
 

RoboPlato

I'd be in the dick
Let me make this a bit more realistic for you:

At sub-720p, average 27 fps, NO AA, baked lighting, and half the poly count. If Square Enix is pulling this off, or claiming to, I have to wonder what Naughty Dog, 343 Industries, Epic, Santa Monica, and Guerrilla are up to. The graphical kings are silent; this is very peculiar.

If this can run at 1080p/60fps on a 680 then I don't think 1080p/30fps will be out of the question for the graphics kings. They can't really talk at all since they're almost all first-party studios and are privy to more info than any other dev.
 

iavi

Member
Let me make this a bit more realistic for you:

At sub-720p, average 27 fps, NO AA, baked lighting, and half the poly count. If Square Enix is pulling this off, or claiming to, I have to wonder what Naughty Dog, 343 Industries, Epic, Santa Monica, and Guerrilla are up to. The graphical kings are silent; this is very peculiar.

Not really. 4/5 of those studios are first-party studios that can't give any indication to the next-gen consoles when the console makers haven't even announced them yet.

Epic has and continues to show off what they plan for the next gen.
 
People will get used to this level of graphics within the first few minutes and then get bored, because developers focus too much on making their games prettier rather than fun to play.

I would rather have the console manufacturers try something new, such as integrating a virtual reality headset or something that'll change the way we see or play games.
 

Durante

Member
Let me make this a bit more realistic for you:

At sub-720p, average 27 fps, NO AA, baked lighting, and half the poly count. If Square Enix is pulling this off, or claiming to, I have to wonder what Naughty Dog, 343 Industries, Epic, Santa Monica, and Guerrilla are up to. The graphical kings are silent; this is very peculiar.
If it runs with unoptimized assets and 8xMSAA in 1080p on a single 680 I hope that it wouldn't need to be downgraded quite so drastically for next-gen consoles.

Also, Square's A team are still graphics kings in my book, just really unproductive ones :p
 

Kagari

Crystal Bearer
If it runs with unoptimized assets and 8xMSAA in 1080p on a single 680 I hope that it wouldn't need to be downgraded quite so drastically for next-gen consoles.

Also, Square's A team are still graphics kings in my book, just really unproductive ones :p

But who is the A team?
 

UrbanRats

Member
If it runs with unoptimized assets and 8xMSAA in 1080p on a single 680 I hope that it wouldn't need to be downgraded quite so drastically for next-gen consoles.

Also, Square's A team are still graphics kings in my book, just really unproductive ones :p

Weren't people saying that a 680 was impossibly expensive for next gen consoles? That was a while ago, admittedly.
 

iavi

Member
If it runs with unoptimized assets and 8xMSAA in 1080p on a single 680 I hope that it wouldn't need to be downgraded quite so drastically for next-gen consoles.

Also, Square's A team are still graphics kings in my book, just really unproductive ones :p

Yep, and at 60fps too. A little optimization and 30fps (we all know it's not going to stay at 60 on consoles) and it might not look any worse at all.
 

Durante

Member
Yep, and at 60fps too. A little optimization and 30fps (we all know it's not going to stay at 60 on consoles) and it might not look any worse at all.
I do think that the IQ will be worse. If you excuse my PC gamer elitism for the moment, running 8xMSAA on a 1080p console game seems like casting pearls before swine.
 
People will get used to this level of graphics within the first few minutes and then get bored, because developers focus too much on making their games prettier rather than fun to play.

I would rather have the console manufacturers try something new, such as integrating a virtual reality headset or something that'll change the way we see or play games.

Or a really innovative tablet controller, right? It may be a shock to you, but you can have both good graphics and good gameplay. Current gen games already have very solid gameplay; imo they don't even have to innovate radically here to offer a really great experience with next-gen graphics. I would be totally happy with Uncharted / Batman / Mass Effect / Battlefield etc. with better graphics. Nothing boring about that.
 
People will get used to this level of graphics within the first few minutes and then get bored, because developers focus too much on making their games prettier rather than fun to play.

I would rather have the console manufacturers try something new, such as integrating a virtual reality headset or something that'll change the way we see or play games.
This is just... so weird to me.

I am constantly admiring the beautiful graphics in the games I play. I wouldn't even bother being a PC gamer if I didn't care about my games being as incredibly gorgeous as is humanly possible.

You don't "get used to it". Unless by "get used to it" you mean "get used to it being beautiful and not an aliased, blurry, pixelated mess", which is a matter of increasing standards more than desensitization.
Weren't people saying that a 680 was impossibly expensive for next gen consoles? That was a while ago, admittedly.
It is possible that direct-to-metal programming can make up the difference such that the next-gen consoles can match a 680 (but that may be wishful thinking).
 

i-Lo

Member
I'm usually fine with 4xMSAA, but i have low standards.

Perhaps next gen we'll get newer forms of AA which are more efficient while remaining as effective as MSAA or MLAA.

One thing is for certain, you definitely require AA for next gen.
 

thuway

Member
But who is the A team?

:lol

I'm usually fine with 4xMSAA, but i have low standards.

The best solution is: FXAA + 2X MSAA. If you were to go any higher you would be wasting resources. The image would be damn near clean as fuck.

I would rather they stick with the above configuration and work out how to get Shadows on Ultra quality. Shadows, I've found out through my PC gaming vices, are the second most important factor in a game's IQ.

Perhaps next gen we'll get newer forms of AA which are more efficient while remaining as effective as MSAA or MLAA.

One thing is for certain, you definitely require AA for next gen.
My good man, one would hope the GPUs in next-gen consoles are designed with architectural strengths for techniques such as tessellation, anti-aliasing, high-resolution shadows, HDAO, and AF. The eDRAM in the 360 gave 2x MSAA away for free.
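As a rough sanity check on the eDRAM point, here's the back-of-the-envelope arithmetic for the 360's 10MB of eDRAM. The 8 bytes/sample figure assumes 32-bit color plus 32-bit depth/stencil per sample, which is a common configuration but an assumption here, not a quoted spec:

```python
# Back-of-the-envelope: does a 720p render target fit in the 360's 10MB eDRAM?
# Assumes 4 bytes color + 4 bytes depth/stencil per sample.

def framebuffer_mb(width, height, msaa_samples, bytes_per_sample=8):
    """Size of a color+depth render target in mebibytes."""
    return width * height * msaa_samples * bytes_per_sample / (1024 ** 2)

EDRAM_MB = 10

print(framebuffer_mb(1280, 720, 1))  # ~7.03 MB -> fits
print(framebuffer_mb(1280, 720, 2))  # ~14.06 MB -> exceeds 10 MB
```

At 1x a 720p target fits comfortably, while at 2x MSAA it slightly exceeds 10MB, which is why the 360 used predicated tiling for full-resolution MSAA; the "free" part was the resolve bandwidth, not the capacity.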
 

Perkel

Banned
People still don't get what a low-level API is and what it can do with hardware.
Most of the graphics cards we use would be far more effective if they had a low-level API. Consoles are by nature built with low-level APIs in mind; that is why we see games like God of War 2 running on hardware that, compared spec-for-spec with a PC, shouldn't be able to render them.
Sure, a low-level API won't turn 512MB into 1GB, but the power will be better utilized. Also, because consoles aren't bound to the x86/x64 nature of a Windows OS, they can use CPUs that are simply better for the price.

I fully expect the new consoles to get some amazing-looking games for 1-2 years, and then within 1-2 years the PC will catch up and lead again. It has always been the case, and I can't see it being any different today, even with off-the-shelf parts.
 

RoboPlato

I'd be in the dick
The best solution is: FXAA + 2X MSAA. If you were to go any higher you would be wasting resources. The image would be damn near clean as fuck.

Hell, I think that a good SMAA solution would be enough for most games at 1080p with the distance that you sit from a TV. Cleaner image and better coverage than FXAA at very similar hardware cost. To me, SMAA looks pretty close to 2x MSAA and it would save on VRAM usage.
 

Nirolak

Mrgrgr
Let me make this a bit more realistic for you:

At sub-720p, average 27 fps, NO AA, baked lighting, and half the poly count. If Square Enix is pulling this off, or claiming to, I have to wonder what Naughty Dog, 343 Industries, Epic, Santa Monica, and Guerrilla are up to. The graphical kings are silent; this is very peculiar.

I went through their tech papers and honestly this is one of the most humble next-gen demos we've seen in terms of hardware requirements.

It's not even remotely unfeasible. Their environmental lighting model is even heavily based off of what Naughty Dog is doing on *current* consoles.
 

Biggzy

Member
I went through their tech papers and honestly this is one of the most humble next-gen demos we've seen in terms of hardware requirements.

It's not even remotely unfeasible. Their environmental lighting model is even heavily based off of what Naughty Dog is doing on *current* consoles.

Which is why myself and a lot of other people are very interested in this demo. Because, as you have said, nothing in it seems unfeasible from a hardware standpoint.
 

i-Lo

Member
My good man, one would hope the GPUs in next-gen consoles are designed with architectural strengths for techniques such as tessellation, anti-aliasing, high-resolution shadows, HDAO, and AF. The eDRAM in the 360 gave 2x MSAA away for free.

True. The reason AA is still suspect (for me), unlike most other things, is all the talk surrounding 1080p and how it brings cleaner, crisper visuals, which may negate the need to pay the performance penalty of additional AA.

I went through their tech papers and honestly this is one of the most humble next-gen demos we've seen in terms of hardware requirements.

It's not even remotely unfeasible. Their environmental lighting model is even heavily based off of what Naughty Dog is doing on *current* consoles.

By the power of Grayskull, if your claims come to fruition in time, then you shall be revered among the overly worried people such as myself. You're like the weed to ease our wretched nerves.

Btw, when you say it's not infeasible, does that include the 1080p resolution (not worried about the framerate being either 30 or 60)? Also, what sorcery have you conjured up to see these "next gen" demos and requirements? *Will call Uther Pendragon shall this creature fail to answer with some detail*
 

Nirolak

Mrgrgr
By the power of Grayskull, if your claims come to fruition in time, then you shall be revered among the overly worried people such as myself. You're like the weed to ease our wretched nerves.

Btw, when you say not infeasible does it include the 1080p resolution (not worried about the framerate being either 30 or 60)? Also, what sorcery have you conjured up to see these "next gen" demos and requirements *Will call Uther Pendragon shall this creature fail to answer with some detail*
For a requirements example:
-This runs at 60 fps, 1080p, and runs 8xMSAA + FXAA. You can find this in the article linked in the OP.
-The Unreal Engine 4 tech demo runs at 1080p (90% of the time), 30fps, and only uses FXAA.

These two demos were run on the exact same card, and given frame setup time, you generally have about three times the resources to work at 30 fps despite it being only half the framerate. MSAA is also a pretty expensive form of anti-aliasing, which implies they had large amounts of processing power left over even at 60 fps/1080p in the Luminous demo.
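To put rough numbers on the "about three times the resources" point, here's a sketch assuming a fixed 8 ms of per-frame setup cost; the 8 ms is purely an illustrative assumption, not a measured figure from either demo:

```python
# Usable render time per frame at a given framerate, after subtracting a
# fixed per-frame setup cost (culling, command submission, etc.).
def render_budget_ms(fps, setup_ms):
    return 1000.0 / fps - setup_ms

SETUP_MS = 8.0  # assumed fixed cost, for illustration only

b60 = render_budget_ms(60, SETUP_MS)  # 16.67 - 8 = ~8.7 ms
b30 = render_budget_ms(30, SETUP_MS)  # 33.33 - 8 = ~25.3 ms
print(b30 / b60)                      # ~2.9x usable time at half the framerate
```

So halving the framerate roughly triples the usable budget once fixed per-frame costs are accounted for, even though the raw frame time only doubles.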

A lot of what looks great in this demo, to the point many people prefer it over the Unreal Engine 4 showcase, is that Square Enix actually made really high end art assets that work well with their technology, whereas Epic just made a bunch of art assets that helped show off whatever features they were hoping to highlight. One of these was targeted at consumers (and their perception of Square Enix), while the other was mostly to show developers what UE4 does.
 
For a requirements example:
-This runs at 60 fps, 1080p, and runs 8xMSAA + FXAA. You can find this in the article linked in the OP.
-The Unreal Engine 4 tech demo runs at 1080p (90% of the time), 30fps, and only uses FXAA.

These two demos were run on the exact same card, and given frame setup time, you generally have about three times the resources to work at 30 fps despite it being only half the framerate. MSAA is also a pretty expensive form of anti-aliasing, which implies they had large amounts of processing power left over even at 60 fps/1080p in the Luminous demo.
Also, UE4 does everything in realtime as opposed to Agni's, which may also partly explain the performance gulf.
 
The best solution is: FXAA + 2X MSAA. If you were to go any higher you would be wasting resources. The image would be damn near clean as fuck.
True. The reason AA is still suspect (for me), unlike most other things, is all the talk surrounding 1080p and how it brings cleaner, crisper visuals, which may negate the need to pay the performance penalty of additional AA.
Hell, I think that a good SMAA solution would be enough for most games at 1080p with the distance that you sit from a TV. Cleaner image and better coverage than FXAA at very similar hardware cost. To me, SMAA looks pretty close to 2x MSAA and it would save on VRAM usage.
You guys are all clearly victims of the console quality compromise. No self-respecting PC gamer would ever consider post-AA alone or post-AA + MSAA to be "good enough".

But that's okay. As long as you guys are happy with it, that's all that matters. Those of us who actually care about having pristine, stable images even during motion will simply stick with our $2000 PCs.

[/PCelitistmode]
 

RoboPlato

I'd be in the dick
I went through their tech papers and honestly this is one of the most humble next-gen demos we've seen in terms of hardware requirements.

It's not even remotely unfeasible. Their environmental lighting model is even heavily based off of what Naughty Dog is doing on *current* consoles.
This is mind-blowing to me. I originally thought that Agni was going to be out of reach for next gen, and now it's seeming like it will be easily attainable. If S-E can create an engine like this, I can't wait until other engines get more optimized, and to see what other devs can do.
 

Nirolak

Mrgrgr
Also, UE4 does everything in realtime as opposed to Agni's, which may also partly explain the performance gulf.

Right, Agni uses a lighting engine similar to Geomerics (without the ability to turn it to realtime) in that they have lighting rendering in realtime on their development machines, giving a fully realtime workflow, but they have a tool called Raybundles that then takes this lighting and turns it into a baked lightmap that is loaded into the game when it is actually run.

Battlefield 3 does a very similar thing on the console versions.

Most of Square Enix's realtime lighting effects and alterations in this demo are only applied to the character models, which means that they're focusing their resources on things that really make a big difference to user perception, which is a great way to make their game seem much more impressive than it is. And hey, the user's impression is what really matters in the end as long as it doesn't negatively impact development or gameplay.
 

i-Lo

Member
For a requirements example:
-This runs at 60 fps, 1080p, and runs 8xMSAA + FXAA. You can find this in the article linked in the OP.
-The Unreal Engine 4 tech demo runs at 1080p (90% of the time), 30fps, and only uses FXAA.

These two demos were run on the exact same card, and given frame setup time, you generally have about three times the resources to work at 30 fps despite it being only half the framerate. MSAA is also a pretty expensive form of anti-aliasing, which implies they had large amounts of processing power left over even at 60 fps/1080p in the Luminous demo.

A lot of what looks great in this demo, to the point many people prefer it over the Unreal Engine 4 showcase, is that Square Enix actually made really high end art assets that work well with their technology, whereas Epic just made a bunch of art assets that helped show off whatever features they were hoping to highlight. One of these was targeted at consumers (and their perception of Square Enix), while the other was mostly to show developers what UE4 does.

Thanks for that pdf.

So the question is why there's a gap like this when the Agni demo looks even more stunning. There are a few reasons I can think of:

  • Apex clothing (and perhaps other real time physics simulation)
  • All lighting is done real time (Luminous engine may still rely on pre-baking like current gen)

EDIT: Horse Armour... damn you... for beating me to the lighting factor...
 

RoboPlato

I'd be in the dick
You guys are all clearly victims of the console quality compromise. No self-respecting PC gamer would ever consider post-AA alone or post-AA + MSAA to be "good enough".

But that's okay. As long as you guys are happy with it, that's all that matters. Those of us who actually care about having pristine, stable images even during motion will simply stick with our $2000 PCs.

[/PCelitistmode]
Well, we are talking about performance in a console setting and are still hoping for an improvement in that area over the current gen.

Also, aren't you the one that posts the impossibly blurry GW2 shots in the PC thread? If you are it's hilarious that you're mocking others' preferences in IQ.
 
I hope sony goes with 2GB of GDDR5 and 6 GB of slower RAM.

Seems like to run something like this, fast RAM is a requirement, but you don't need a massive amount of it.
 
Thanks for that pdf.

So the question is why there's a gap like this when the Agni demo looks even more stunning. There are a few reasons I can think of:

  • Apex clothing (and perhaps other real time physics simulation)
  • All lighting is done real time (Luminous engine may still rely on pre-baking like current gen)
If you read the PDF you'll see that they're doing some very impressive stuff that isn't being done in Agni's, such as more than a million rendered and directly/indirectly lit particles on screen, realtime voxel lighting, and impressive realtime AO, amongst other effects.
 

Nirolak

Mrgrgr
Thanks for that pdf.

So the question is why there's a gap like this when the Agni demo looks even more stunning. There are a few reasons I can think of:

  • Apex clothing (and perhaps other real time physics simulation)
  • All lighting is done real time (Luminous engine may still rely on pre-baking like current gen)

EDIT: Horse Armour... damn you... for beating me to the lighting factor...

Luminous has pretty great cloth modeling judging by their former videos. SVOGI (UE4's lighting model) is incredibly expensive, but it can be really cool.

Unreal Engine 4 also seems to be at a somewhat less advanced state of development than Luminous in terms of visuals. Epic seems to have focused tremendously on building their tools first, whereas Square Enix focused on converting and rendering assets from a CG production in their engine; Luminous still isn't usable to start full-scale development of a game, whereas UE4 is already being used by a lot of studios.
 

i-Lo

Member
I hope sony goes with 2GB of GDDR5 and 6 GB of slower RAM.

Seems like to run something like this, fast RAM is a requirement, but you don't need a massive amount of it.

I think that's short-sighted for a console that needs to remain relevant and developer-friendly for the next 6 years. A unified memory structure is more like it: 4-8 GB of DDR3 (or DDR4 if we're lucky) with a healthy dose of eDRAM should give developers enough flexibility for asset management.
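The GDDR5-vs-DDR3 trade-off in this sub-thread is really about peak bandwidth. Here's a rough comparison; the transfer rates and bus widths below are illustrative assumptions for hardware of this era, not any console's actual specs:

```python
# Peak theoretical memory bandwidth in GB/s:
# effective transfer rate (GT/s) * bus width (bits) / 8 bits per byte.
def bandwidth_gbs(transfer_rate_gt, bus_width_bits):
    return transfer_rate_gt * bus_width_bits / 8

gddr5 = bandwidth_gbs(5.5, 256)  # 5.5 GT/s GDDR5 on a 256-bit bus -> 176.0 GB/s
ddr3  = bandwidth_gbs(1.6, 128)  # DDR3-1600 on a 128-bit bus      -> 25.6 GB/s
print(gddr5, ddr3)
```

That's roughly a 7x gap in peak bandwidth, which is why split pools (a small fast pool plus a large slow one) and eDRAM keep coming up in these speculation threads.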
 
Luminous has pretty great cloth modeling judging by their former videos. SVOGI (UE4's lighting model) is incredibly expensive, but it can be really cool.

Unreal Engine 4 also seems to be at a somewhat less advanced state of development than Luminous in terms of visuals. Epic seems to have focused tremendously on building their tools first, whereas Square Enix focused on converting and rendering assets from a CG production in their engine; Luminous still isn't usable to start full-scale development of a game, whereas UE4 is already being used by a lot of studios.
*Come on son.gif*. When do they anticipate that the first games using this engine will enter production?
 
Well, we are talking about performance in a console setting and are still hoping for an improvement in that area over the current gen.

Also, aren't you the one that posts the impossibly blurry GW2 shots in the PC thread? If you are it's hilarious that you're mocking others' preferences in IQ.
My images got better.

And blurring is still >>> shimmering, temporally aliased images (which is what you get when you have nothing but post-AA).

Personally, I'm hoping we see universal usage of SMAA 2x/4x in next-gen games. I'd say TXAA as well, but seeing as how all the new consoles are running with AMD GPUs, well...
 