> We're close if you don't zoom in and analyse too much. Urban environments are still way off due to the complexity, but natural environments are very good looking.

Interesting. I'd completely flip that. Mirror's Edge's urban environments looked mind-blowing back in 2008, and they've only gotten better. I think manmade structures are easier to mimic in a game because they're relatively less complex. TLOU Part II's and RDR2's outdoor scenes are beautiful, but they're nowhere near real, imo. Can't think of a game where the outdoors look better.
> I'm not someone who plays for 'the graphics', but games now are rendered better than CG in modern movies.

I dunno, man. There are lots of examples of CG in movies people didn't even realize were CG. This is an old article, so I bet there are even better and more recent examples, but if you like movie trivia, it's pretty interesting:
> People always say it's really close, but a few years later everyone agrees it really wasn't. Been like that for decades. I remember reading a review for some basketball game on PS1 back in the 90s where the reviewer stated the graphics were so realistic that some of his family members thought they were watching an actual game of basketball on TV.

I mean, can you blame them?!
> Framerate importance is incredibly overblown. 60fps games look MORE fake than 30-40fps, just like 60fps movies look like soap opera garbage, so they stopped making them. The human eye doesn't see 60fps in reality, so when we see it in media our brain thinks "fake". Lighting and material representation, and that material's interaction with that lighting, is the Holy Grail of photorealism. We're 10-15 years away.

Oh, you're one of those people who have shit eyes and can't see the difference between 30/60/144/240fps. It doesn't become more fake; it's actually the opposite. What you're seeing is more real, which is why you can tell, for example, when actors are on a fake set. If they had shows in 240fps it'd be even more apparent.
> I think frames per second is a huge factor that is often overlooked when talking about realism. If a game looks really good but has 30fps it will feel substantially less real than if you had 500fps. Smooth movement in addition to great graphics gets us closer to real life.

Not so sure about that. I'd say the tradeoff is far worse for so little gain. If you see a 24fps video recording on TV, does it look any less real? The massive number of graphical settings you would need to lower or forgo to reach something like 500fps just isn't worth it for such a small gain in realism. Not even TV considers it worth it.
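To put rough numbers on that tradeoff (a back-of-the-envelope sketch, not from the thread): a renderer's per-frame time budget is just the reciprocal of the target framerate, so chasing 500fps shrinks the budget from ~33ms at 30fps down to 2ms, which is why so many settings would have to go.

```python
# Per-frame render time budget (milliseconds) for a target framerate.
# Illustrative arithmetic only; the fps targets come from the discussion above.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (24, 30, 60, 500):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 500 fps leaves only 2.0 ms to render each frame, vs 33.3 ms at 30 fps.
```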
> I think frames per second is a huge factor that is often overlooked when talking about realism. If a game looks really good but has 30fps it will feel substantially less real than if you had 500fps. Smooth movement in addition to great graphics gets us closer to real life.

I have just the monitor for you: "A PC monitor with a 500 Hz refresh rate is coming from Asus" (arstechnica.com)
> I think frames per second is a huge factor that is often overlooked when talking about realism. If a game looks really good but has 30fps it will feel substantially less real than if you had 500fps. Smooth movement in addition to great graphics gets us closer to real life.

You think it's overlooked...
> Current? We've reached that point years ago.

I'd bang those triplets
> You couldn't tell me which of the below images is CGI and which is a photo:

Super easy to tell: the left pic is real, the right one is fake/CGI. Just look at those dead/doll eyes.
> I mean, can you blame them?!

No, I have no problem believing that actually happened and that they were serious. I saw it myself when 3D games first started popping up in the 90s. People who weren't into video games and associated the term with shit like Pong and Frogger legit thought they weren't watching games when they saw 3D gameplay, because in their mind that wasn't what games looked like.
> You couldn't tell me which of the below images is CGI and which is a photo:

My money is on both being CGI lit by different HDRIs. Everything is literally in the exact same place, down to the tiniest hair. That shit doesn't happen in real life.
> Super easy to tell: the left pic is real, the right one is fake/CGI. Just look at those dead/doll eyes.

Close but no cigar.
> I wouldn't say offline is miles away. It's gotten pretty freakin' close. But games are far, far from reality.

But the concept isn't different: a couple of tech guys decide what's closer to reality based on the choices they make.
> But the concept isn't different: a couple of tech guys decide what's closer to reality based on the choices they make.

The concept is different, because offline doesn't have the restrictions of a game.
> The concept is different, because offline doesn't have the restrictions of a game.

Then who made these tools?
> Then who made these tools?

"Same tools, different results"
Close but no cigar.
Both are CGI
> It's clear that significant progress has been made in terms of approaching reality with video game graphics, but I'm curious where people think the "upper limit" is, and where we are in relation to it. If realism is one end goal of video game graphics, where do you think we are right now on a scale from 1-100?
>
> For the sake of conceptualizing and showing the trajectory of graphics towards realism, let's say Battlezone (1980) is the baseline (a "1"):
>
> In keeping with the tank theme, about twenty years later, there was Battletanx in 1998:
>
> And another ~20 years later, there was Battlefield 2042 (2021):
>
> And here's a shot of an actual tank firing:
>
> Yes, a picture of a tank is not reality, but hopefully you see what I'm getting at.
>
> I want to say we're at about 85%. But I remember standing in Walmart watching Ocarina of Time on the N64 and saying to the kid next to me something like: "Nothing will ever look better than this." I was proven wrong on how much video game graphics would progress then, and I hope I'm proven wrong again.

In screenshots we are basically indistinguishable.
> 61-70. Most games still have that cartoony look. In terms of reality we only got FS2020 and the Matrix demo.

I wouldn't call the Matrix demo that realistic; the simulated green filter keeps it from looking like real life, since real life... doesn't have a green filter over everything.
> Framerate importance is incredibly overblown. 60fps games look MORE fake than 30-40fps, just like 60fps movies look like soap opera garbage, so they stopped making them. The human eye doesn't see 60fps in reality, so when we see it in media our brain thinks "fake". Lighting and material representation, and that material's interaction with that lighting, is the Holy Grail of photorealism. We're 10-15 years away.

Ur joking, right?
> The human eye doesn't see 60fps in reality, so when we see it in media our brain thinks "fake".

I didn't think there were still people who use that in a non-ironic way. And no, people perceive movies at over 24 frames as fake because they are used to 24fps movies, while video games have always had higher framerates (since input timing is important). So it is a mental association, not a physical reality of human vision.
> That's especially rapid when compared with the accepted 100 milliseconds that appears in earlier studies. Thirteen milliseconds translates into about 75 frames per second.

Seems to me like 60fps in video games is more natural and closer to what we can actually see. I think that should be pretty obvious too, judging by how real life looks: exceptionally smooth compared to any image on a screen displayed below 100Hz.
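As a sanity check on the figure quoted above (a quick sketch, not from the cited study): converting a per-image perception time in milliseconds to an equivalent framerate is just 1000 divided by the millisecond value.

```python
# Convert a per-frame display time in milliseconds to frames per second.
# The 13 ms and 100 ms values are the ones quoted in the comment above.
def ms_to_fps(ms: float) -> float:
    return 1000.0 / ms

print(round(ms_to_fps(13)))   # 77 -> roughly the "about 75 fps" in the quote
print(round(ms_to_fps(100)))  # 10 -> the older 100 ms estimate
```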
> Close but no cigar. Both are CGI.

Then it proves my point: CGI can already look photorealistic; now we just need to patiently wait 20-25 years to achieve it in gameplay/real time xD
> Then it proves my point: CGI can already look photorealistic; now we just need to patiently wait 20-25 years to achieve it in gameplay/real time xD

What was your point exactly in regards to my post?
Did you even read my comment? The article reaffirms what I said. We can perceive information at over 60fps (16.6ms).
> You mean close to something filmed, right? Because to actual real life it's not close at all. In fact it's really far from it.

Yes. The thing is that if you get a movie at 60fps, besides looking somewhat cheap (by cheap, I mean home-movie style), any CGI-generated image will look more "digital", but this is purely due to conditioning. If cinema had traditionally been recorded at 60fps, this would not be the case.
> Did you even read my comment? The article reaffirms what I said. We can perceive information at over 60fps (16.6ms).

The thing is that in the case of cinema, our eyes and brains are used to the 24 frames per second cadence; this is a fact. TV shows, news, and other live content are recorded at 30fps (our brain perceives it as lower-quality content because we associate that rate with that "quality"). And video games (usually between 30 and 60fps) we perceive as digital or "fake" for the same reason: pure association.

There are thousands of studies on this, and it is the reason (besides the cost of editing) why cinema has remained "loyal" to 24fps, and why experiments like "The Hobbit" at 48fps failed and were not liked.
> Good points, but a 60fps N64 game looks like crap.

Certainly a hell of a lot more playable than the original 20fps version, though...