
Is 60 FPS killing overall graphical fidelity?

This 30fps hatred and these hot takes are pure hyperbole. 30fps is fine. I'm glad I'm not as sensitive as these people.
With the ability to switch between performance and fidelity modes in most PS5 games, it's super easy to do a comparison, and… I'm hard pressed to see any advantages to fidelity mode, but I instantly hate 30FPS mode. The ability to play 30FPS PS4 titles at 60FPS is a godsend too.

Cutting the frame rate in half to get marginal IQ gains is silly to me. Games still look amazing in performance mode and actually, you know, play better too, since they're games and gameplay is the point.
 
If you turn or move your head fast, the whole environment will blur, so your point is moot.
Also, devs are focusing more on per-object motion blur. There's no way around this. Motion blur is a natural effect.
Again: motion blur from the movement itself is natural. Fake camera/object blur injected through artificial effects in games is not. That's why different games get different results from their motion blur implementations.
 
With the ability to switch between performance and fidelity modes in most PS5 games, it's super easy to do a comparison, and… I'm hard pressed to see any advantages to fidelity mode, but I instantly hate 30FPS mode. The ability to play 30FPS PS4 titles at 60FPS is a godsend too.

Cutting the frame rate in half to get marginal IQ gains is silly to me. Games still look amazing in performance mode and actually, you know, play better too, since they're games and gameplay is the point.
DF sees the benefit in fidelity mode, and I'm more inclined to trust professionals than hyperbolic forum users.
Again: motion blur from the movement itself is natural. Fake camera/object blur injected through artificial effects in games is not. That's why different games get different results from their motion blur implementations.
And it's why it's a technique that is being improved and getting better over time.
 
If you turn or move your head fast, the whole environment will blur, so your point is moot.
Also, devs are focusing more on per-object motion blur. There's no way around this. Motion blur is a natural effect.

Moving your head has no effect on your vision. You can't smoothly turn your eyes; they will always snap onto something and keep it in focus even if you turn your head.

This also happens while playing a game. Your eyes focus on a single point on the screen, meaning the "blur" you have IRL when snapping your view onto something will also happen when you do the same in a game.

Additional motion blur added by the game is therefore unnatural and not in any way reflective of the real world.

It is mimicking movies, not real life.
If you follow an object IRL with your eyes, it will be perfectly sharp.
And just because you look at a screen, that doesn't magically "turn off" any motion blurring your eyes would have naturally, you know...

Per-object blur is only used for dramatic effect, to make movement seem more pronounced.
 
Absolutely not.
Wave a hand in front of your face, or look out the side window of a moving car... MOTION BLUR.
Vision is analog, not a collection of still frames. At 30 or 60 fps you have big gaps of time in between frames... so you need to cheat and blur frames together to make motion look correct.
If you multiply that and reach 240 or 480 or whatever (Blur Busters says 1000Hz is the vanishing point), then there are enough frames of data to fill in the missing information. You don't need motion blur.
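The arithmetic behind that "fill the gaps between frames" point can be sketched in a few lines (the pan speed here is an assumed, illustrative number, not a measurement from any game):

```python
# How far a panning object jumps between consecutive frames.
# The gap shrinks as the refresh rate rises; past a high enough rate
# the per-frame step is so small that no artificial blur is needed
# to smooth it over.

PAN_SPEED_PX_PER_S = 1000.0  # assumed pan speed: ~half a 1080p screen width per second

def frame_step_px(fps: int, speed: float = PAN_SPEED_PX_PER_S) -> float:
    """Pixels the object moves between two consecutive frames."""
    return speed / fps

for fps in (30, 60, 240, 1000):
    print(f"{fps:>4} fps -> {frame_step_px(fps):6.1f} px jump per frame")
```

At 30fps the object teleports ~33 px every frame, which reads as judder unless blur smears the gap; at 1000Hz the step is a single pixel, which is the "vanishing point" idea.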

The only thing I can agree on is that Uncharted 4 was maybe incorrectly tuned, but they retuned it for the remaster.
Sekiro? That is LCD blur in your screenshot. That game doesn't have camera motion blur if I remember correctly, or it is very limited.

Look at John from DF. He plays on an OLED with BFI and still uses motion blur. Motion blur is an important and needed aspect. If you move your eyes or head fast, you also see it in reality.

I am not going to argue with you anymore. I don't know what you are trying to achieve here. Motion blur is correct and should be used unless we have 400+fps/Hz panels with perfect motion handling.
You're completely missing the point here.

Motion blur in real life and motion blur in video games are completely different things. The motion blur in video games is artificial. It's put there by the developer. That's why your argument "Well you can't see it when it's running as a video" is completely false. The screenshot shows a perfect capture of what is being rendered at that moment and the blur is being rendered in the actual footage.

This is also not an argument against motion blur as a whole. The problem is that 30 FPS games require extremely aggressive and extensive motion blur to keep the game playable at all.

Here is a good video that shows you what I mean:


It's about Uncharted 4, which allowed motion blur to be disabled in a later update. Notice that with motion blur turned off, the game becomes extremely choppy. That's why aggressive motion blur is needed in 30fps games, and also why motion blur is much less of an issue at higher frame rates.
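One rough way to see why 30fps blur has to be so much heavier: film-style motion blur integrates movement over the time the virtual shutter is open, and a common convention is a 180-degree shutter (open for half of each frame). The pan speed and shutter fraction below are illustrative assumptions, not values from any specific game:

```python
# Length of the blur streak one frame accumulates while panning.
# Halving the frame rate doubles the exposure time, and therefore
# doubles the streak length, which is why 30fps blur looks so much
# more aggressive than 60fps blur.

def blur_streak_px(fps: int, pan_speed_px_s: float, shutter: float = 0.5) -> float:
    """Streak length in pixels for one frame (shutter=0.5 is a 180-degree shutter)."""
    exposure_s = shutter / fps  # time the virtual shutter stays open
    return pan_speed_px_s * exposure_s

print(blur_streak_px(30, 1000.0))  # roughly a 16.7 px streak
print(blur_streak_px(60, 1000.0))  # roughly an 8.3 px streak
```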
 
You're completely missing the point here.

Motion blur in real life and motion blur in video games are completely different things. The motion blur in video games is artificial. It's put there by the developer. That's why your argument "Well you can't see it when it's running as a video" is completely false. The screenshot shows a perfect capture of what is being rendered at that moment and the blur is being rendered in the actual footage.

This is also not an argument against motion blur as a whole. The problem is that 30 FPS games require extremely aggressive and extensive motion blur to keep the game playable at all.

Here is a good video that shows you what I mean:


It's about Uncharted 4, which allowed motion blur to be disabled in a later update. Notice that with motion blur turned off, the game becomes extremely choppy. That's why aggressive motion blur is needed in 30fps games, and also why motion blur is much less of an issue at higher frame rates.

Exactly what I thought... he could only show the motion blur by taking a screenshot.
While he is moving the camera in real time, you don't see it.

He doesn't even know if it is blurring or not when rotating the camera lol
 
DF sees the benefit in fidelity mode, and I'm more inclined to trust professionals than hyperbolic forum users.

And it's why it's a technique that is being improved and getting better over time.

I wouldn't necessarily trust DF. Not because they're not experts (they are), but because they're obsessive about technology above all else.

Some of the claims I've heard them make border on the ridiculous. John and Alex both said they could never play Ratchet on the normal performance mode because the RT is such a fundamental part of the experience. That's just crazy to me. Most casual observers wouldn't even notice the difference. I'm WAY nerdier about graphics than most gamers and even I only noticed the RT in the odd scene here and there. The idea that it's somehow integral to the visuals is laughable.

But I'm sure it is for THEM, because they're super fucking nerdy about pointless shit, way more than even most hardcore gamers.

How much this is affecting John's choice of modes in HFW I'm not sure, and from what I've seen of the 60fps mode it certainly looks noticeably worse, but then there's very little quality footage to judge from. I wouldn't be that surprised to discover that when actually playing it it's not much of a downgrade.
 
Never go full blind!
His tests are very similar to mine here.
You can't see it until you take the screenshot, and wow, there is a lot of blur in that frame.

And I'm not blind in any way... my corrective lenses for astigmatism put me at 105% of average normal vision.
Before that, without glasses, I saw a blurry copy of every object.

Something like the left image (the first one I found on Google).

[image: astigmatism-main-banner.jpg]


Today I see without glasses like the right one.

Don't you guys have astigmatism, so you see all that blur when rotating the camera?
 
DF sees the benefit in fidelity mode, and I'm more inclined to trust professionals than hyperbolic forum users.
Fidelity mode is no doubt awesome for screenshots and YouTube shares. The ability to zoom in, see each individual blade of grass, and count each pixel is great for online discourse. But actually opting to play the game at 30fps over 60fps is very questionable. Hyperbole or not, you are getting a worse gaming experience at 30.
 
To an extent, yes; people's obsession with it has held games back and will keep doing so.
And it doesn't stop there, because now 60fps isn't good enough for some.
 
His tests are very similar to mine here.
You can't see it until you take the screenshot, and wow, there is a lot of blur in that frame.

And I'm not blind in any way... my corrective lenses for astigmatism put me at 105% of average normal vision.
Before that, without glasses, I saw a blurry copy of every object.
You can't have the same perception because of the faster movement, but the same blur is still there. You would see the same thing if your eyes were a high-speed camera capturing the images.
 
You can't have the same perception because of the faster movement, but the same blur is still there. You would see the same thing if your eyes were a high-speed camera capturing the images.
Yes, frame to frame it is probably there... if I take a screenshot it is there.
The point is... I can't see it while playing the game, or even while doing tests like in the video.
It is not noticeable to me.

I will try running the tests with my wife.
Even so, after you point it out and start looking for it, you end up finding it, like in the Horizon gif I posted... on a normal day (not looking for it) I'd say I didn't notice any blur at all in that GIF.

And I guess the majority of gamers are like me... not like you guys who notice that blur and get annoyed by it. Never at any event talking about games and tech has that subject come up... nobody around me has ever complained about that blur.

It is something new to me after over 30 years of playing games.
 
DF sees the benefit in fidelity mode, and I'm more inclined to trust professionals than hyperbolic forum users.
I don't think "I'm hard pressed to see any advantage" is hyperbolic or discounts what a professional can see. My point is that one attribute is instantly noticeable and affects my experience, while the other seems hardly noticeable. Maybe I'll change my mind once I see Horizon on my TV, but thus far the negligible advantage of fidelity mode hardly makes cutting the FPS in half worth it.
 
It'll never die, so you better buy a beefy PC or make your peace with it. I know you're trying to look superior with the bolded part, but honestly, it makes you look sad and petty. 🤣
Sad and petty? I thought it was funny. 🤷‍♂️

And I will be building a PC. A 4060 Ti is the goal once it releases. I'm less inclined right now with everything being 60fps on current-gen systems. If things go back to 30fps, I'll buy a prebuilt immediately if I can't get my hands on a GPU.
 
Yea guys, no idea what this blur thing is you guys are talking about.
[image: XRI6dlt.jpg]
[image: EnWd94B.jpg]
[image: jXtY900.png]
[image: Wa4jXG1.jpg]
Funny, because most of the time motion blur can be turned off, and guess what... motion blur is still there in performance modes too. Spin the camera while the game is running at 60fps and take a screenshot at the same time. Show me how clear it looks.
 
Looks like a photo of a screen to me.
The last one is indeed a photo of a screen.
You can even see the border of the TV/monitor.

Edit - I just checked, and you were talking about that Sekiro pic... look at the TV border... there is no way for that to be an internal screenshot... it is an off-screen camera/phone capture... you can even see the sun (or another light source) reflected on the screen.

[image: Wa4jXG1.jpg]
 
Every game I've played on PS5/XSX, I could barely notice the difference between the picture quality in framerate/resolution modes.

The difference in how the game feels though, that's night and day.
 
Every game I've played on PS5/XSX, I could barely notice the difference between the picture quality in framerate/resolution modes.

The difference in how the game feels though, that's night and day.
I'm having the opposite issue since I moved from a 1080p TV to a 4K TV in mid-2021... before that it was all good.
Returnal was gorgeous on the 1080p TV and now it isn't anymore on a 4K CX TV.

Horizon's Performance mode is really bad... I mean worse than the usual Sony first-party performance modes... the CBR used by Guerrilla is not helping their case... it looks like another game at 4K30... people here on GAF even created a topic using Performance mode to compare with the PS4 version, saying it is barely an upgrade (I agree).

4K30 is the mode where the game looks next-gen.
 
I'm having the opposite issue since I moved from a 1080p TV to a 4K TV in mid-2021... before that it was all good.
Returnal was gorgeous on the 1080p TV and now it isn't anymore on a 4K CX TV.

Horizon's Performance mode is really bad... I mean worse than the usual Sony first-party performance modes... the CBR used by Guerrilla is not helping their case... it looks like another game at 4K30.
Were you downsampling on the 1080p TV?
 
Were you downsampling on the 1080p TV?
I believe the PS5 does that automatically, no?
I mean, when running games rendered at 4K, the PS5 automatically downsamples to 1080p.

That is not the case for Returnal... it is native 1080p... so no downsampling involved on 1080p TVs, but upscaling on 4K TVs.

My PS5 was set to 1080p output before... it didn't let me choose 4K output until I plugged it into my new 4K TV.
 
Not gonna say anything groundbreaking, but thanks to options we can now choose whatever we prefer, even on consoles. You want better visuals and don't mind clunky movement? Go with 30fps. Or if you are a normal human being who doesn't give a damn about the "cinematic experience" of a 24-30fps presentation, simply go with the 60fps mode. Current-gen consoles, even though vastly more powerful than PS4/Xbox One, are still closed boxes, so something always has to give, be it resolution, fps, or graphical fidelity. No two ways about it.
 
How can graphics be killing graphics? Framerate is part of the equation.

Last gen, consoles were bottlenecked by the CPU, so even PS4 Pro and Xbox One X with their massively upgrading GPU's and memory bandwidth were still stuck with 30fps because of the Jaguar CPUs. This gen however, both MS and Sony went with beefy CPU's, so that's no longer an issue and won't be for a good while.
 
Not gonna say anything groundbreaking, but thanks to options we can now choose whatever we prefer, even on consoles. You want better visuals and don't mind clunky movement? Go with 30fps. Or if you are a normal human being who doesn't give a damn about the "cinematic experience" of a 24-30fps presentation, simply go with the 60fps mode. Current-gen consoles, even though vastly more powerful than PS4/Xbox One, are still closed boxes, so something always has to give, be it resolution, fps, or graphical fidelity. No two ways about it.
What defines "normal"? Because I'm sure most gamers play at 30fps instead of 60fps on consoles.
That is the normal.

Now on PC I rarely see anybody playing at 30fps... always 60fps or higher... so on PC, 60fps is the normal.
 
I believe the PS5 does that automatically, no?
I mean, when running games rendered at 4K, the PS5 automatically downsamples to 1080p.

That is not the case for Returnal... it is native 1080p... so no downsampling involved on 1080p TVs, but upscaling on 4K TVs.

My PS5 was set to 1080p output before... it didn't let me choose 4K output until I plugged it into my new 4K TV.
Shit, I forgot Returnal runs at a pretty low resolution. It feels fantastic and the particle effects are great, but it does look muddy on my 4K TV.
 
This thread is funny to me. 60 FPS is the bare minimum a game should run at. By this time consoles should be at 120 FPS standard.

The real problem is this obsession with photorealism. Every console game has to look "omg so real" and most of them aren't even games but rather interactive tech demos. Mario Odyssey running at 60 FPS on Switch is much more impressive to me than some 30 FPS "cinematic" Sony game where you are sluggishly walking through environments and triggering animations.
 
What defines "normal"? Because I'm sure most gamers play at 30fps instead of 60fps on consoles.
That is the normal.

Now on PC I rarely see anybody playing at 30fps... always 60fps or higher... so on PC, 60fps is the normal.
The problem is PS5 and XSX were setting expectations too high with boosting last gen games to 60 FPS. It's going to be pretty jarring going from a 60 FPS Horizon on PS5 to a 30 FPS Horizon 2. Does Horizon 2 look miles better? Sure but the play will feel sluggish until you readjust.

There are a ton of console gamers who have 'seen the light' as to why 60 FPS is superior but now will have to settle back into 30 FPS as true next gen games start coming out.
 
Shit, I forgot Returnal runs at a pretty low resolution. It feels fantastic and the particle effects are great, but it does look muddy on my 4K TV.
Maybe I should not use it as an example if I had not experienced it on a 1080p TV before :unsure:
But now I know how it looks on a native display and upscaled to 4K.
 
The problem is PS5 and XSX were setting expectations too high with boosting last gen games to 60 FPS. It's going to be pretty jarring going from a 60 FPS Horizon on PS5 to a 30 FPS Horizon 2. Does Horizon 2 look miles better? Sure but the play will feel sluggish until you readjust.

There are a ton of console gamers who have 'seen the light' as to why 60 FPS is superior but now will have to settle back into 30 FPS as true next gen games start coming out.
Yeah... there is a hardware limitation here.
Even accounting for all the advances in GPU architecture, you will still be close to the same IQ if you move from a 30fps Pro/XB1X game to a 60fps PS5/Series X sequel.
The improvements won't have that wow next-gen factor because, well, frame rate is really expensive and you need to compromise.

After all, in GPU terms PS5 is around 2x PS4 Pro and Series X is around 2x XB1X... you will use almost all of that GPU power improvement going from 30fps to 60fps.

120fps is another issue imo... I don't think current console hardware is enough to accomplish it unless you drop your IQ to PS3 levels... and the fact that it creates an unfair playing field in PvP multiplayer is bad.
I think we should leave 120fps for later generations.
Now 30fps vs 60fps is a good discussion... perhaps single-player at 30fps and PvP at 60fps is a good trade-off for each side... but for a fair playing field everybody on console should have the same frame rate in PvP (60fps is possible if you don't combine it with old-generation pools).
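The "2x the GPU all goes into 2x the frame rate" point is just division: what matters for image quality is the GPU budget per frame. A sketch using commonly quoted, rounded TFLOPS figures (an illustrative simplification, since real games aren't purely compute-bound):

```python
# Per-frame GPU budget = total throughput / frames rendered per second.
# Doubling both the GPU and the frame rate leaves the per-frame budget
# (and so, roughly, the achievable image quality) about where it was.

def per_frame_budget(tflops: float, fps: int) -> float:
    """GPU work available per frame, in teraFLOPs of compute per frame."""
    return tflops / fps

ps4_pro_30 = per_frame_budget(4.2, 30)   # PS4 Pro class GPU at 30fps
ps5_60 = per_frame_budget(10.3, 60)      # PS5 class GPU at 60fps

print(f"PS4 Pro @ 30fps: {ps4_pro_30:.3f} TFLOP per frame")
print(f"PS5     @ 60fps: {ps5_60:.3f} TFLOP per frame")
```

The two per-frame budgets land within roughly 25% of each other, which is why a 60fps cross-gen sequel struggles to look like a generational leap.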
 
No reason for a game not to have 60fps at this point. Hardware is there. Unless you're just really obsessed with a bunch of forum nerds analyzing your screenshots.

30fps is never worth the graphical gains in 2022.
 
No reason for a game not to have 60fps at this point. Hardware is there. Unless you're just really obsessed with a bunch of forum nerds analyzing your screenshots.

30fps is unplayable.
It's quite obvious the hardware is not there. Neither platform holder accounted for native 4K resolution on PS5 or XSX. Without a hardware-based AI solution for resolution, there will always have to be a compromise.

Hopefully Cerny smartens up for the PS5 Pro and builds that in.
 
The problem is PS5 and XSX were setting expectations too high with boosting last gen games to 60 FPS. It's going to be pretty jarring going from a 60 FPS Horizon on PS5 to a 30 FPS Horizon 2. Does Horizon 2 look miles better? Sure but the play will feel sluggish until you readjust.

There are a ton of console gamers who have 'seen the light' as to why 60 FPS is superior but now will have to settle back into 30 FPS as true next gen games start coming out.
Yup, you need to readjust to 30fps

Console gamers had "seen the light" way before this generation. Last gen had quite a lot of 60fps games. Anyone who played Doom last gen probably thought... I can't go back to 30fps. Then TLOU2 came out and they played it anyway and loved it (most people who didn't like it disliked the story, not the graphics).

Not one single person complained that TLOU2 was 30fps on last gen. They thought it was beautiful and one of the best looking games of the generation. Same thing with RDR2. I didn't once hear anyone complain about the frame rates in those games because the graphics quality made it worth it.

Why is this generation any different? If the graphics look a lot better at 30fps (like Horizon Forbidden West or Guardians of the Galaxy) then yeah, you either play it at 30fps and enjoy it, or you play it at 60fps and deal with the lower resolution and blurriness.
 
Yup, you need to readjust to 30fps

Console gamers have "seen the light" way before this generation. Last gen had quite a lot of 60fps games. If anyone played Doom on last gen probably thought... I can't go back to 30fps. Then TLOU2 came out and they played it anyway and loved it (most people who didn't like it, didn't like it for the story, not the graphics).

Not one single person complained that TLOU2 was 30fps on last gen. They thought it was beautiful and one of the best looking games of the generation. Same thing with RDR2. I didn't once hear anyone complain about the frame rates in those games because the graphics quality made it worth it.

Why is this generation any different? If the graphics look a lot better at 30fps (like Horizon Forbidden West or Guardians of the Galaxy) then yeah, you either play it at 30fps and enjoy it, or you play it at 60fps and deal with the lower resolution and blurriness.
Just to add a bit...

Most people who buy a new console buy it for better graphics... you can argue whether that is right or not, but that is the console market.
You buy a new-generation console expecting a big jump in graphics quality.

You see that jump at 30fps.
But the 60fps move killed that, and that is where the claims about the generation not delivering come from.
People on GAF are enthusiasts (a lot play on PC too) who prioritize frame rate, but that is not the norm in the console market... consumers want better graphics, and they really don't notice or care about higher frame rates.

My wife will be more impressed by a stunning-looking game at 30fps than by playing it at 60fps... she loved TLOU and TLOU2.

If you really have issues with frame rate, then your best place is the PC Master Race... it always was and will continue to be the place where you can achieve what you want.
 
Yeah... there is a hardware limitation here.
Even accounting for all the advances in GPU architecture, you will still be close to the same IQ if you move from a 30fps Pro/XB1X game to a 60fps PS5/Series X sequel.
The improvements won't have that wow next-gen factor because, well, frame rate is really expensive and you need to compromise.

After all, in GPU terms PS5 is around 2x PS4 Pro and Series X is around 2x XB1X... you will use almost all of that GPU power improvement going from 30fps to 60fps.

120fps is another issue imo... I don't think current console hardware is enough to accomplish it unless you drop your IQ to PS3 levels... and the fact that it creates an unfair playing field in PvP multiplayer is bad.
I think we should leave 120fps for later generations.
Now 30fps vs 60fps is a good discussion... perhaps single-player at 30fps and PvP at 60fps is a good trade-off for each side... but for a fair playing field everybody on console should have the same frame rate in PvP (60fps is possible if you don't combine it with old-generation pools).
Not to mention, in order for a game to run at a LOCKED 60fps, it constantly needs enough headroom to technically run HIGHER than 60fps, so vsync can cap it at 60. So it requires even more power to stay at a stable 60fps.
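A toy simulation of that headroom argument: under double-buffered vsync at 60Hz, any frame that misses the ~16.7 ms deadline is held until the next vblank and shown for a whole extra refresh. A GPU that merely averages 16.7 ms per frame, with a little variance, lands well below 60fps (the render times below are made-up illustrative values):

```python
import math

VSYNC_INTERVAL_MS = 1000 / 60  # one 60 Hz refresh: ~16.7 ms

def displayed_interval(render_ms: float) -> float:
    """How long the frame stays on screen: rounded UP to whole vsync intervals."""
    return math.ceil(render_ms / VSYNC_INTERVAL_MS) * VSYNC_INTERVAL_MS

# 60 frames whose render times average exactly 16.7 ms but wobble +/- 2 ms;
# half of them miss the deadline and slip to the next vblank:
renders = [14.7, 18.7] * 30
avg_ms = sum(displayed_interval(t) for t in renders) / len(renders)
print(f"average displayed frame rate: {1000 / avg_ms:.1f} fps")  # 40.0, not 60
```

This is why locked-60 modes target render times comfortably under the vsync interval; the same logic applies to a locked 30 with a 33.3 ms deadline.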
 
What defines "normal"? Because I'm sure most gamers play at 30fps instead of 60fps on consoles.
That is the normal.

Now on PC I rarely see anybody playing at 30fps... always 60fps or higher... so on PC, 60fps is the normal.
Normal should be what 60Hz screens + 60fps games gave us back before 3D games.
 
No reason for a game not to have 60fps at this point. Hardware is there. Unless you're just really obsessed with a bunch of forum nerds analyzing your screenshots.

30fps is never worth the graphical gains in 2022.


I mean, if you want your games to look like The Matrix UE5 demo, then you will eventually have to go down to 30fps at some point. But for now 60fps is fine, and it can't be compared to the situation last gen, where the Jaguar CPU bottleneck prevented even the Xbox One X from achieving 60fps.
 
Funny, because most of the time motion blur can be turned off, and guess what... motion blur is still there in performance modes too. Spin the camera while the game is running at 60fps and take a screenshot at the same time. Show me how clear it looks.
In PC games you can turn it off in a lot of cases. On consoles, absolutely not. Very few games offer the option to turn it off. Performance mode typically runs at 60 fps or higher with far less aggressive motion blur.
 
Most people who buy a new console buy it for better graphics... you can argue whether that is right or not, but that is the console market.
You buy a new-generation console expecting a big jump in graphics quality.
The majority on consoles play Fortnite, CoD, GTAV (a PS360 game), and FIFA. So it's not about graphics or performance, but about the new stuff on the new hardware...
 
Yup, you need to readjust to 30fps

Console gamers had "seen the light" way before this generation. Last gen had quite a lot of 60fps games. Anyone who played Doom last gen probably thought... I can't go back to 30fps. Then TLOU2 came out and they played it anyway and loved it (most people who didn't like it disliked the story, not the graphics).

Not one single person complained that TLOU2 was 30fps on last gen. They thought it was beautiful and one of the best looking games of the generation. Same thing with RDR2. I didn't once hear anyone complain about the frame rates in those games because the graphics quality made it worth it.

Why is this generation any different? If the graphics look a lot better at 30fps (like Horizon Forbidden West or Guardians of the Galaxy) then yeah, you either play it at 30fps and enjoy it, or you play it at 60fps and deal with the lower resolution and blurriness.

Last gen had 60fps games in specific genres like FPS and fighting games. It was really just the usual suspects. I think people are less likely to notice the difference when those games are limited to specific genres: they'll probably tend to just attribute the smoothness of a COD to it being a shooter rather than the frame rate per se.

Virtually all big cinematic games were 30fps last gen. People didn't complain because they just accepted that was the situation - whether you knew about the weak CPUs or not, it was easy to infer that 60fps just wasn't a viable option.

This generation is different because the decent CPUs actually make it a relatively easy concession just by lowering resolution and settings. It's basically the first time since we moved to 3d visuals that this has been the case.

Just to add a bit...

Most people who buy a new console buy it for better graphics... you can argue whether that is right or not, but that is the console market.
You buy a new-generation console expecting a big jump in graphics quality.

You see that jump at 30fps.
But the 60fps move killed that, and that is where the claims about the generation not delivering come from.
People on GAF are enthusiasts (a lot play on PC too) who prioritize frame rate, but that is not the norm in the console market... consumers want better graphics, and they really don't notice or care about higher frame rates.

My wife will be more impressed by a stunning-looking game at 30fps than by playing it at 60fps... she loved TLOU and TLOU2.

If you really have issues with frame rate, then your best place is the PC Master Race... it always was and will continue to be the place where you can achieve what you want.

I would actually love to do a blind test of non-gamers to see what they think looks better. I strongly suspect it's a fallacy that the preference for fidelity at the expense of everything else is some kind of casual position. It looks much more like a graphics nerd's prerogative to me.

Obviously there's a matter of degree: I'm sure a casual player would prefer, say, Uncharted 4 at 30fps over Uncharted 3 at 60fps. But Uncharted 4 at 1440p60 vs 4K30? I bet the vast majority would think 60 looks better.

So I'm not sure why 60fps should be the preserve of "PC Master Race". Who's to say that 4k and RT and ultra settings aren't the real nerdy features that people should have to buy a $3k PC to indulge in?

Not to mention, in order for a game to run at a LOCKED 60fps, it constantly needs enough headroom to technically run HIGHER than 60fps, so vsync can cap it at 60. So it requires even more power to stay at a stable 60fps.

Doesn't the same apply to locked 30fps modes??
 
The majority on consoles play Fortnite, CoD, GTAV (a PS360 game), and FIFA. So it's not about graphics or performance, but about the new stuff on the new hardware...
My cousin plays these games, and he tells everybody in his school classroom how Fortnite looks way better on PS5 than on PS4.
He wants a PS5 just because of that now.

Ask him about the frame rate? lol
He is fine with whatever the game runs at.
 