Crytek Says It's Getting Increasingly Difficult to Wow People With Graphics

yeah, there's never gonna be a leap like PS1/N64 era to PS2/Xbox/GC era again where everybody and their mom can see the improvement and be impressed. It's been about refinement since.
 
Back in 2005 it would have been.

Why didn't they ever make something like Dark Souls or Demon's Souls? Too ambitious? If they're going to build on a fantastic idea like the Souls series, then it won't be all about graphics. You'll get enough to be pleased as a player.

I think that's half the reason Godzilla on PS3 or EDF look the way they do. They're making games that wouldn't normally get made, and they're not focusing on graphics. That doesn't mean it wasn't about the budget, but they're making the game because it was a good idea to begin with. Maybe people aren't amazed because graphics were the reason they started gaming, and they want to be pleased constantly from then on.

There are a ton of open-world RPGs like Two Worlds and Risen that looked good graphically, but I couldn't enjoy them just because they looked good. I'd bet there's a large group out there who play those games based on graphics. I think games can get dated, and that decreases some of their enjoyment, but I don't feel that's the case all the time. There are still quite a few games made with less graphical polish that still immerse the player, or there's something you can take away from them. If a game just sits there with pretty graphics, it can lack every other aspect of what makes a great game.
I'm really not sure what you're saying. I'm not talking about how fun the game will be. Trust me, I'm not a graphics whore who can't enjoy a game because it doesn't have the best graphics ever. Far from it.
 
yeah, there's never gonna be a leap like PS1/N64 era to PS2/Xbox/GC era again where everybody and their mom can see the improvement and be impressed. It's been about refinement since.

There might be with AI. I've noticed how enemies will just barrel out of somewhere. I think games will be made to look complete rather than relying on spawn zones and locked-off locations. I think games will become larger. We still don't have games where the 6th floor is throwing a party while we're jumping out the window onto a bus that has its own story going on, all at the same time. I mean fully acting characters that aren't just a single phrase.

God knows Watch Dogs isn't it.

I'm really not sure what you're saying. I'm not talking about how fun the game will be. Trust me, I'm not a graphics whore who can't enjoy a game because it doesn't have the best graphics ever. Far from it.

What I'm getting at is that they finally made a game like the Souls games, but where is it going in the long run? Was the game only good because it came out during this specific time, on this set of consoles?

In 20 years, will the Souls games be considered our indie games, from a technical standpoint?

When I saw Demon's Souls in person and on websites from other importers, I saw some things that weren't that impressive in terms of large-scale technical performance, but it was something no one else was doing. They knew what they were doing, and they did it right. I see your point. I think they increased the graphics because the game became a cult hit. I remember getting the game from NCSX and being blown away by dragons sitting on a cliff that would attack you if you got too close. They didn't look like Skyrim dragons either. They're not going back to that look because people would get mad. Some might have just played Dark Souls 2, and if Bloodborne didn't have some technical advances, that same fan base might not be interested (possibly), or they'd complain a lot.
 
Original Crysis wasn't shallow though.


What bothers me a bit more is that he is blaming GPUs for not being able to do 60FPS on 4K screens. There are two things wrong with that statement.

1) GPUs can do that now; you just have to manage the scope of your graphics. Ironically, if they did that in the first place, they wouldn't be in this situation where it's harder to wow people with graphics.
2) The GPUs aren't the problem; the display cables are. They can only carry so much information, and until DP 1.3 and HDMI 2.0 are widely adopted, games simply can't do 60 FPS on 4K TVs or monitor screens.
Obviously a principal 3D rendering programmer knows his way around a GPU. I'm pretty sure he knows that, theoretically, you can render at whatever resolution/frame rate you want as long as you tailor what you render.

He's obviously commenting on rendering high-fidelity graphics at those specs. What you say about the cables isn't wrong, but you run into issues far before the data even arrives at that point. One of the main issues right now is internal memory bandwidth, as he pointed out. This is why advancements like HBM are eagerly anticipated.
 
Really dude? I get opinions and all that, but...

It does; it's technically far superior. I think only the cloth-physics tech on main characters is more precise in The Order. Everything else is inferior, and that's not even considering the scale of both games.

What people actually aren't realizing is that SC hasn't even started to look amazing.
I think many people will be truly mind-blown next year when RSI releases the trailer for Squadron 42.
This gen will be much different from last gen, because of the lower-spec hardware and Star Citizen's impact so early in the generation. And I hope this will lead to a shorter generation.

---
What bothers me a bit more is that he is blaming GPUs for not being able to do 60FPS on 4K screens. There are two things wrong with that statement.
If you read the original interview, you would notice that he was specifically talking about Ryse running at 60fps in 4K on a single GPU,
not the general idea of running games at 60Hz in 4K.
 
I believe we won't see another major leap in graphics until 3D stacked memory / TSV / HBM on GPUs is implemented, but that should become a reality in the next 1-2 years.

Check out this prediction from nearly a decade ago about how much bandwidth might be needed for future games:

By Scott M. Fulton JULY 7, 2005

"The bandwidth requirements of game platforms and graphical applications have been growing exponentially," Steven Woo, Rambus' senior principal engineer at Rambus, told Tom's Hardware Guide. "About every five or six years, it goes up by a factor of 10. PlayStation 3, for example, will have a memory bandwidth capability of 50 GByte per second." If this trend continues, projected Woo, a theoretical 2010 model "PlayStation 4" could require ten times the memory bandwidth as next year's PlayStation 3. A statistical projection made in 2004 by NVIDIA's Vice President of Technical Marketing, Tony Tamasi - cited by Woo - anticipates that a top-of-the-line 3D game could conceivably require memory bandwidth of 3 TByte per second.

http://www.tomshardware.com/news/xdr2-quintuple-memory-data-transfer-speeds-2007,1152.html
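To put Woo's "factor of 10 every five or six years" trend in concrete terms, here's a rough back-of-the-envelope projection. The function and the 5.5-year period are my own illustration of the quoted claim, not anything from the article:

```python
def projected_bandwidth_gb_s(base_gb_s, years_out, period_years=5.5):
    """Project memory bandwidth assuming a 10x increase every ~5.5 years,
    per the Rambus trend quoted above (an assumed trend, not a law)."""
    return base_gb_s * 10 ** (years_out / period_years)

ps3_bw = 50  # GB/s, the PS3 figure quoted in the article

# One period out: the article's hypothetical 2010-era "PlayStation 4"
print(projected_bandwidth_gb_s(ps3_bw, 5.5))   # 500.0 GB/s

# Two periods out: in the same ballpark as Tamasi's 3 TB/s projection
print(projected_bandwidth_gb_s(ps3_bw, 11))    # 5000.0 GB/s (5 TB/s)
```

Loose as it is, the trendline lands surprisingly close to the stacked-memory bandwidth targets that came later.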

Fast forward to GTC 2013, where Nvidia announced Volta:


Volta will arrive after Maxwell, and will provide GPUs with an insane amount of memory bandwidth. Volta-based GPUs will provide up to 1TB per second of bandwidth, made capable by stacking the DRAM on top of the GPU itself, with a silica substrate between them. Then, cutting a hole through the silicon and connecting each layer provides the ability for this insane level of bandwidth. Something Huang has said has the ability to shift "all of the data from a full Blu-Ray disc through the chip in 1/50th of a second."

http://www.tweaktown.com/news/29204...iding-the-gpu-1tb-sec-of-bandwidth/index.html



NVIDIA is targeting a 1TB/sec bandwidth rate for Volta, which to put things in perspective is over 3x what GeForce GTX Titan currently achieves with its 384bit, 6Gbps/pin memory bus (288GB/sec). This would imply that Volta is shooting for something along the lines of a 1024bit bus operating at 8Gbps/pin, or possibly an even larger 2048bit bus operating at 4Gbps/pin. Volta is still years off, but this at least gives us an idea of what NVIDIA needs to achieve to hit their 1TB/sec target.



http://www.anandtech.com/show/6846/nvidia-updates-gpu-roadmap-announces-volta-family-for-beyond-2014
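The bus-width arithmetic in that AnandTech quote is easy to verify: peak bandwidth is just bus width times per-pin rate, divided by eight to convert bits to bytes. A quick sketch (the function name is mine; the configurations are the ones the quote lists):

```python
def memory_bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gb/s) / 8."""
    return bus_width_bits * gbps_per_pin / 8

print(memory_bandwidth_gb_s(384, 6))    # 288.0 GB/s: GTX Titan's 384-bit, 6 Gbps/pin bus
print(memory_bandwidth_gb_s(1024, 8))   # 1024.0 GB/s: one possible Volta configuration
print(memory_bandwidth_gb_s(2048, 4))   # 1024.0 GB/s: the wider, slower alternative
```

Both hypothetical Volta configurations land on the same ~1 TB/s figure, which is the whole point of stacking DRAM on the GPU: the substrate makes very wide buses practical.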


More recently, Nvidia pushed Volta back and announced Pascal instead. That, or Volta became Pascal. In any case, Pascal should be 2016-2017, and obviously NV will evolve the architecture further in 2018-2019. If Volta really is that different from Pascal, then it'll be Volta by the end of the decade.

Nvidia has an even more advanced architecture beyond Volta / Pascal: the Einstein architecture, and they say almost nothing about it. Einstein is thought to be the GPU architecture for the U.S. 'Echelon Project' exascale / ExaFlop supercomputer, originally due in 2018, now more like 2020.

I don't wanna toot Nvidia's horn, because AMD has several generations of GPU architectures in the pipeline as well, and those will be more relevant to future next-gen consoles. AMD has been very aggressive in working on 3D stacked memory also.


Keep in mind, this is just the bandwidth aspect of graphics, not the shading / compute side, which will also have to increase obviously. Yet if the bandwidth isn't there, there's little point in having far greater compute and pixel performance.
 
That's the thing, it's not, but it still looks a lot better than Dark Souls 2 on PC. That's just the way things progress, with PCs mainly getting ports of console games, whilst the consoles have more exclusives that make better use of their specific hardware.
Talk about missing the point. It is indicative of a bigger issue plaguing PCs, holding them back from making better use of the more powerful hardware afforded to them: console ports themselves. This isn't as much of an issue on consoles, as they have so many first-party (and third-party) exclusives that can optimise better due to the closed nature of the platform. It's kind of a shame that Crysis 3 is still the main benchmark for PC graphical prowess when PC hardware has evolved a good degree since the game's release.
Ummm, I make plenty of use of my two GTX 670s without games being made specifically for the hardware. Throwing in stuff like HBAO+, tessellation (so far exclusive to PC ports), and especially good anti-aliasing makes a huge difference in my book.

It IS unfortunate that we'll never get to play those first-party exclusives and really wow people with how incredible they can look with actual good image quality and the like, I'll say that much.

What impresses me most about graphics in any sense is design. Art. Without the design and the art you don't have anything to look at anyway.

Everything from textures to geometry, shaders, IQ, and framerate merely exists to support the art. That's something a lot of game creators struggling to replicate reality seem to forget, making bland photorealistic games that look ugly just a few years later, when technology improves again.

With the right art, any piece of hardware can produce a pleasing image. In fact, at this point, the hardware for polygon based visuals is so advanced that the main benefit from more powerful hardware is to art styles that are generally realistic - and tipped relatively far towards the "photographic" end of the scale. The other current advantage is games with large and open world designs, since scale does impact granular quality.
This is a fair point, but I would argue that we still need someone pushing those boundaries even if the benefits are not readily apparent to the layman and/or the game ages poorly for it, as the research is quite useful for allowing other developers with perhaps stronger art direction to improve their technical prowess at minimal cost to themselves.

Bloodborne, for example, looks better than Dark Souls 2 on PC largely because it has far, far superior art and visual direction. Though in a purely technical sense, it does seem to use a more advanced lighting system than DS2 (more like the lighting in the DS2 vertical slice).
I would argue against this. In fact, this is completely and demonstrably false; Bloodborne has far more complex geometry, more advanced shaders, higher-resolution textures, and is generally MUCH more demanding on a technical level than DS2 ever was. Yes, some of it may be down to the art direction, but I would argue that the game's visual appeal would be diminished dramatically if they had tried to make it on the previous generation of consoles.

If it's getting harder to wow and boggle people with raw hardware, it's because those naughty diminishing returns mean that people are used to decent IQ, higher resolutions, 60fps (on PC), and good textures. As has been said many times the dawn of accelerated 3D visuals can't happen twice. The biggest wow moment was at the beginning, and each subsequent wow has been smaller.
Yes, diminishing returns are real. We know this already. However....
I disagree.

There's at least one more HUGE jump to be made in graphics, which will majorly wow people.
Yep. Until we reach the point where we can have per-triangle collision detection, physically modeled terrain and object destruction and deformation, fully modeled hair and fur as standard, and real-time, noise-free path tracing, we still have a long way to go.
 
He's obviously commenting on rendering high fidelity graphics at those specs.

...

One of the main issues at this point is internal memory bandwidth, as he pointed out. This is why advancements like HBM are eagerly anticipated.
Yes, it was obvious, but he was still making the overarching point that wowing people with graphics is getting much harder. His attempt to put it in that context is a red herring, because graphical techniques at a lower framerate will always be easier to manage than that ludicrously high benchmark; he should've made the point in the context of 1080p at 30 FPS, because that is the standard benchmark the industry targets.

If you have to bump the frame rate and resolution to double the standard, then trying to wow people is the least of your concerns. You'd be more worried about the graphics looking like a step back from the lower-resolution, lower-framerate version.

As a side note, I disagree to an extent because of what we've seen of Capcom's Deep Down. It remains to be seen what performance problems will come up across the whole game, but it is still mind-bogglingly impressive graphically.

You make a good point about the bottlenecks that can come up elsewhere, though I still see it as unfair to pin these problems on the GPU. It's a misrepresentation of what is actually going on.

1: It can't be done with decent fidelity.
2: Doubt it, PC gamers can already play in 4K.

1. That is all subjective.
2. You are confusing 30Hz with 60Hz at 4K.

HDMI 1.4 has roughly 8 Gb/s of usable bandwidth.
DP 1.2 has roughly 17 Gb/s of usable bandwidth.

Those are effective data rates; line-coding overhead has already taken its cut of the raw link rate.

4K at 60Hz with 24-bit color needs roughly 12 Gb/s for the pixel data alone, and blanking intervals push the real requirement higher still, so HDMI 1.4 can't carry it at all and even DP 1.2 only just manages that resolution at that refresh rate.
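For anyone who wants to sanity-check that arithmetic, here's a quick sketch. The function name is mine; the figure is just width x height x refresh x bits per pixel, ignoring blanking and line-coding overhead, which is why real link requirements come out higher:

```python
def uncompressed_video_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel-data rate for an uncompressed video signal, in Gb/s.

    Ignores blanking intervals and 8b/10b line coding, both of which
    push the actual link-bandwidth requirement noticeably higher.
    """
    return width * height * fps * bits_per_pixel / 1e9

print(uncompressed_video_gbps(3840, 2160, 60))  # ~11.94 Gb/s: beyond HDMI 1.4
print(uncompressed_video_gbps(3840, 2160, 30))  # ~5.97 Gb/s: why 4K30 over HDMI 1.4 works
```

Halving the refresh rate halves the data rate, which is exactly why 4K30 was reachable on older cables while 4K60 was not.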
 
Yes, it was obvious, but he was still making the overarching point that wowing people with graphics is getting much harder. His attempt to put it in that context is a red herring, because graphical techniques at a lower framerate will always be easier to manage than that ludicrously high benchmark; he should've made the point in the context of 1080p at 30 FPS, because that is the standard benchmark the industry targets.

If you have to bump the frame rate and resolution to double the standard, then trying to wow people is the least of your concerns. You'd be more worried about the graphics looking like a step back from the lower-resolution, lower-framerate version.

As a side note, I disagree to an extent because of what we've seen of Capcom's Deep Down. It remains to be seen what performance problems will come up across the whole game, but it is still mind-bogglingly impressive graphically.

You make a good point about the bottlenecks that can come up elsewhere, though I still see it as unfair to pin these problems on the GPU. It's a misrepresentation of what is actually going on.



1. That is all subjective.
2. You are confusing 30Hz with 60Hz at 4K.

HDMI 1.4 has roughly 8 Gb/s of usable bandwidth.
DP 1.2 has roughly 17 Gb/s of usable bandwidth.

Those are effective data rates; line-coding overhead has already taken its cut of the raw link rate.

4K at 60Hz with 24-bit color needs roughly 12 Gb/s for the pixel data alone, and blanking intervals push the real requirement higher still, so HDMI 1.4 can't carry it at all and even DP 1.2 only just manages that resolution at that refresh rate.

You should probably read the OG interview. I thought his point was perfectly coherent and easy to follow. Certainly no red herring to speak of.

Those are two separate questions.
This one is about Ryse running in 4K:
DSOGaming: Ryse: Son of Rome was timed exclusive to Xbox One and ran at 900p and 30fps. The PC version, on the other hand, will support 4K and 60fps. Do you believe that current-gen consoles (Xbox One and PS4) are far behind high-end PCs? And did Xbox One limit the things you could achieve with Ryse? Would the game look more beautiful and more impressive if it was developed as a PC exclusive title as the original Crysis was back in the days?

Nicolas Schulz: The current generation of high-end GPUs is unfortunately still far from being able to reach 60 FPS at 4K resolution. Please keep in mind that with 4K versus 1080p, you have four times the amount of pixels that need to be shaded. This is very quickly saturating the available bandwidth. The consoles are clearly behind high-spec GPUs in terms of raw horsepower, however on the positive side, they share the same modern architecture which enables a wealth of interesting optimization techniques. Due to the console differences we had to work a bit harder on the final optimizations but I’m happy that we never had to sacrifice any visual quality.


DSOGaming: The first Crysis game was a title with unbelievable (for its time) visuals and physics. And while Crysis 3 looks incredible, it did not feel as ground-breaking – visually – as the original part. Will you ever create a similar game that will push the graphical boundaries to new heights or do you feel that something like what Crysis achieved back in 2007 is not possible today?

Nicolas Schulz: I think with its advances in material quality, lighting and the quality of facial animations, Ryse is extending the boundaries of realtime graphics quite a bit again. Generally though, as opposed to the times of the original Crysis, we as an industry have reached a quality level now where it is getting increasingly more difficult to really wow people. That said, there’s still enough areas to explore and we will definitely keep pushing the boundaries as much as possible.


The quote about wowing people is in response to making another game with the impact of Crysis, which is obviously far harder these days.

He said nothing about it being impossible to keep advancing graphics, and I don't know why he would have to, looking at Ryse.
 