Bayonetta out on Steam (4K, dual audio), $19.99, PC launch trailer, dev diary

You do realize that the image actually lacks detail when no AA solution is applied, and that there is no benefit to such an image whatsoever, right?
The absolute pursuit of "sharpness" is something that I will probably never understand in the gaming community, and it might come from a misunderstanding of the aliasing problem, especially when so many players seem fond of ReShade presets using LumaSharpen while it actually does more harm than good.

Aliasing (and not sharpness) is a loss of information.

In the case of real-time computer graphics, this loss of information mostly happens during the rasterization step of the graphics pipeline. During this step, we go from representing our world/scene using vectors (hence a 3D space representation) to representing it using matrices (hence a 2D representation) so that our screen, being a matrix itself, can display it.

The big deal with this step is that we go from a continuous problem to a discrete one, which means that, in order to display your scene on a screen/matrix that has a fixed size (hence a finite problem), you WILL lose information, and there is nothing you can do about that fact.
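
The same continuous-to-discrete loss can be sketched in one dimension (my own toy illustration, not from the post): a signal sampled below its Nyquist rate produces samples that are identical to those of a lower-frequency signal, so the detail is not just lost, it is misrepresented.

```python
import math

# Sample a 55 Hz sine at only 60 samples/s (Nyquist would need > 110).
# The samples are indistinguishable from those of a -5 Hz sine:
# the high frequency "folds" down, i.e. it aliases.
fs = 60.0          # sampling rate
f_high = 55.0      # true signal frequency
f_alias = -5.0     # frequency it aliases to (55 - 60)

for n in range(60):
    t = n / fs
    high = math.sin(2 * math.pi * f_high * t)
    alias = math.sin(2 * math.pi * f_alias * t)
    assert abs(high - alias) < 1e-9  # identical after sampling
```

The jagged edges and shimmering pixels in a game are the 2D version of the same fold-down.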

This is why, in order to mitigate this issue, we need anti-aliasing. As you might already know, there are many types of AA solutions, and many of them come with something along the lines of a slight blur or ghosting artifacts, for example. While we can say that these are problems, the benefits easily outweigh the drawbacks.

It should also be noted that pushing resolution higher and higher is not a solution to the aliasing problem (not that it is an actually solvable problem in the first place anyway).
It does hide it better, but it does not actually address it, which explains why, in most cases, SGSSAA is a better solution than DSR, which does not necessarily help with certain types of aliasing.

In the end, your image without AA applied is not sharp, it is just very badly sampled.

This is very informative. Thank you.

When I say "sharpness" I'm primarily referring to texture clarity. Anti-aliasing solutions, particularly SGSSAA with poor negative LOD sampling, blur the overall image. The aliasing issue is solved, but texture detail is lost over distance in the same way that FXAA and other post-processing solutions mangle image clarity.

Given this can impact the clarity of not just textures, but shaders applied to said textures, I find the blur detrimental to the overall presentation. Though I loathe aliasing, I have an equal if not greater distaste for loss of texture clarity. One of the benefits I enjoy when playing at high resolutions is not the better-hidden aliasing, but the sharper texture clarity. Not artificial sharpening like with ReShade, but clarity of display pixels relative to resolution.

Is there an MSAA solution for Bayonetta? I'd probably prefer that to SGSSAA as it likely won't have the IQ blur.
 
...The Wonderful 101 is a Nintendo IP, developed by Platinum games.

Ah. That means unless Platinum Games buys the IP back from Nintendo - which they won't - the game will never see the light of day on the PC, and we can all forget about a 4K version of the game.

Don't like saying this but... Cemu anyone?


God I wish they were able to put the sequel on Steam as well.
Even though Saur and I do not like the sequel too much (it doesn't have the same combat quality as the first one, as it's more restricted, somewhat broken, and the genius aspect of what makes the first game awesome has been removed), I wish the game were available on Steam too. It sucks so much, for without a Wii U (and Cemu) there is really no other way of playing the game at all. That's a painful pill to swallow, I'd have to say, and one we cannot do anything about. And no, we cannot blame Platinum Games at all, and we cannot blame Nintendo either. There is only one company to blame, only one to point all fingers at, and that company is... SEGA.
 
This is very informative. Thank you.

When I say "sharpness" I'm primarily referring to texture clarity. Anti-aliasing solutions, particularly SGSSAA with poor negative LOD sampling, blur the overall image. The aliasing issue is solved, but texture detail is lost over distance in the same way that FXAA and other post-processing solutions mangle image clarity.

Given this can impact the clarity of not just textures, but shaders applied to said textures, I find the blur detrimental to the overall presentation. Though I loathe aliasing, I have an equal if not greater distaste for loss of texture clarity. One of the benefits I enjoy when playing at high resolutions is not the better-hidden aliasing, but the sharper texture clarity. Not artificial sharpening like with ReShade, but clarity of display pixels relative to resolution.

Is there an MSAA solution for Bayonetta? I'd probably prefer that to SGSSAA as it likely won't have the IQ blur.

The fact is that while SGSSAA will result in a very subtle and bearable blur on textures, you still eliminate most of the texture aliasing present in your AA-free screenshot, such as the strong dots on the stone of the stairs and the pillars, which mostly appear because of lighting "conflicts" between the normal and specular maps, and which are even more visible in motion. In this case, the textures are not sharp, they are just badly sampled and filtered.

It reminds me of a test I did some time ago. I showed two textured objects to some friends who are not particularly knowledgeable in computer graphics: one with a nearest filter, and the other with a linear + anisotropic filter applied to the textures.
It turned out most of them preferred the former because "it looks sharper and more detailed". While it may look so, it is not, and that is not a matter of taste or preference but a matter of fact.
A badly filtered texture is not a sharp texture. And this is a problem that cannot be solved using higher texture resolutions, quite the contrary actually. Using high-resolution normal and roughness textures in a physically based rendering pipeline without any efficient texture/specular anti-aliasing solution is absolute nightmare fuel for the artists.
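
That test is easy to reproduce numerically. Here is a hedged sketch of my own (using numpy rather than a real renderer): minify a pixel-level checkerboard, whose true average brightness is 0.5, with nearest sampling versus a 2x2 box average.

```python
import numpy as np

# A pixel-level checkerboard: the highest-frequency texture possible.
tex = np.indices((8, 8)).sum(axis=0) % 2   # 0/1 alternating, mean = 0.5

# "Nearest" minification: keep every other texel. The result is a solid
# color whose value depends entirely on sampling phase -- here, all zeros.
# High contrast per pixel, but the signal is simply wrong (aliased).
nearest = tex[::2, ::2].astype(float)

# Box-filtered minification: average each 2x2 block. The result is a flat
# 0.5, the correct mean brightness, even though it "looks" blander.
box = tex.reshape(4, 2, 4, 2).mean(axis=(1, 3))

assert nearest.mean() == 0.0   # phase-dependent, not "detail"
assert (box == 0.5).all()      # correct average signal
```

The nearest version keeps hard contrast that reads as "sharpness", but shift the sampling phase by one texel and it flips to solid white: none of that contrast is real detail.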

Regarding MSAA, I think I saw, some pages ago, people applying it using Nvidia Inspector. The in-game MSAA x2 is supposed to work too.
But given its nature, MSAA will only work on the geometry of the scene, and will do next to nothing for texture, alpha or specular aliasing, for example. And there is a lot of the latter in your image.
 
So...

You can just turn on super sampling in the nvidia control panel and it works. No bits or finagling, just straight up SSAA.

7RPoXhY.png



If it looks a bit *over* sharp it's because I forgot to reset my LOD settings after disabling SGSSAA.
 
Funny to see how SSAA by itself is useless against shader/transparency aliasing, as you can see on Bayonetta's geometry when that purple effect and smoke appear around her, on the second screenshot especially.
 
The fact is that while SGSSAA will result in a very subtle and bearable blur on textures, you still eliminate most of the texture aliasing present in your AA-free screenshot, such as the strong dots on the stone of the stairs and the pillars, which mostly appear because of lighting "conflicts" between the normal and specular maps, and which are even more visible in motion. In this case, the textures are not sharp, they are just badly sampled and filtered.

It reminds me of a test I did some time ago. I showed two textured objects to some friends who are not particularly knowledgeable in computer graphics: one with a nearest filter, and the other with a linear + anisotropic filter applied to the textures.
It turned out most of them preferred the former because "it looks sharper and more detailed". While it may look so, it is not, and that is not a matter of taste or preference but a matter of fact.
A badly filtered texture is not a sharp texture. And this is a problem that cannot be solved using higher texture resolutions, quite the contrary actually. Using high-resolution normal and roughness textures in a physically based rendering pipeline without any efficient texture/specular anti-aliasing solution is absolute nightmare fuel for the artists.

Regarding MSAA, I think I saw, some pages ago, people applying it using Nvidia Inspector. The in-game MSAA x2 is supposed to work too.
But given its nature, MSAA will only work on the geometry of the scene, and will do next to nothing for texture, alpha or specular aliasing, for example. And there is a lot of the latter in your image.

I find your posts informative and I welcome that, but the reality is that, regardless of objectivity, I personally do not always prefer what appears to me to be excessive texture blur, and anti-aliasing solutions that blur the IQ can range from tolerable to awful. If your technical eye can see improvement and appreciate it where I cannot, so be it. Personally, I think the SGSSAA blur in Bayonetta is noticeable and unfortunate, and the game would look better with clarity improvements.

I'll try MSAA and see if I find the unsolved aliasing on alphas and speculars a welcome tradeoff for IQ clarity.
 
I find your posts informative and I welcome that, but the reality is that, regardless of objectivity, I personally do not always prefer what appears to me to be excessive texture blur, and anti-aliasing solutions that blur the IQ can range from tolerable to awful. If your technical eye can see improvement and appreciate it where I cannot, so be it. Personally, I think the SGSSAA blur in Bayonetta is noticeable and unfortunate, and the game would look better with clarity improvements.

I'll try MSAA and see if I find the unsolved aliasing on alphas and speculars a welcome tradeoff for IQ clarity.

Don't misunderstand me, I perfectly understand that you don't like a certain result and prefer another; at that point we are talking about tastes, and I have no say about yours.
What I was trying to suggest with my previous post is that you seem to confuse IQ clarity (IQ is not that easy to define since it depends on so many parameters, and I am not sure about the "clarity" part) with aliased textures.

The reason is that, for most people, aliasing on textures implies the presence of pixels that stand out a lot (and for no good reason) and are mistaken for details, which they are not. On the other hand, a properly anti-aliased and filtered texture can appear bland or flat.
The same thing happens with screen color accuracy: many people find the image of a correctly calibrated screen very bland (aka my whole family), and usually prefer "vibrant" (ewww) colors despite them being completely incorrect.

An anti-aliasing solution that tries to address virtually every type of aliasing, such as SGSSAA, or TAA on the post-processing side, without any drawbacks such as a slight blur or ghosting (which we can try to minimize, of course) is extremely difficult, if not impossible, to achieve. At least in real time.
 
Hmm. Not super crazy about the 4x SGSSAA quality blur reduction via LOD bias, but I'll probably use it anyway.

b194on6.png

Honestly, I prefer the results from a straight up 4x DSR factor.

You do realize that the image actually lacks detail when no AA solution is applied, and that there is no benefit to such an image whatsoever, right?
The absolute pursuit of "sharpness" is something that I will probably never understand in the gaming community, and it might come from a misunderstanding of the aliasing problem, especially when so many players seem fond of ReShade presets using LumaSharpen while it actually does more harm than good.

Aliasing (and not sharpness) is a loss of information.

In the case of real-time computer graphics, this loss of information mostly happens during the rasterization step of the graphics pipeline. During this step, we go from representing our world/scene using vectors (hence a 3D space representation) to representing it using matrices (hence a 2D representation) so that our screen, being a matrix itself, can display it.

The big deal with this step is that we go from a continuous problem to a discrete one, which means that, in order to display your scene on a screen/matrix that has a fixed size (hence a finite problem), you WILL lose information, and there is nothing you can do about that fact.

This is why, in order to mitigate this issue, we need anti-aliasing. As you might already know, there are many types of AA solutions, and many of them come with something along the lines of a slight blur or ghosting artifacts, for example. While we can say that these are problems, the benefits easily outweigh the drawbacks.

It should also be noted that pushing resolution higher and higher is not a solution to the aliasing problem (not that it is an actually solvable problem in the first place anyway).
It does hide it better, but it does not actually address it, which explains why, in most cases, SGSSAA is a better solution than DSR, which does not necessarily help with certain types of aliasing.

In the end, your image without AA applied is not sharp, it is just very badly sampled.
Yup, this is a wonderful explanation, and I agree that SGSSAA is much more effective, I just don't like it over DSR in this particular instance.
 
Sorry if this has been asked before, but what are the steps to downsample this from 4k to 1080p? I got a 1070 and 6700k if this helps.
 
Sorry if this has been asked before, but what are the steps to downsample this from 4k to 1080p? I got a 1070 and 6700k if this helps.

Open the Nvidia control panel. Go to 3D settings, then the Global tab.

Go to DSR and check all the boxes.

Apply.

Now, in game, select a 4K resolution.

Also, you can adjust how soft or sharp that supersampling is by going back to the Global tab and, under DSR, changing the smoothness slider from 30% to wherever you prefer. 0% gives more shimmering, but a "sharper" image.
 
Is anyone else noticing a bit of hitching (frame pacing issues?) in the game? I'm on a 1080, so I can't imagine why this would be happening.

It's slight, but I notice it here and there when panning the camera.
 
Is anyone else noticing a bit of hitching (frame pacing issues?) in the game? I'm on a 1080, so I can't imagine why this would be happening.

It's slight, but I notice it here and there when panning the camera.

Yeah, I had some last night, but I honestly just figured it was because I had my PLEX server open streaming some 1080p shit to my TV while playing.
 
Do you have to set negative LOD manually when using SGSSAA? I tried it with driver-controlled LOD and the picture was really, really blurry.
 
You do realize that the image actually lacks detail when no AA solution is applied, and that there is no benefit to such an image whatsoever, right?
The absolute pursuit of "sharpness" is something that I will probably never understand in the gaming community, and it might come from a misunderstanding of the aliasing problem, especially when so many players seem fond of ReShade presets using LumaSharpen while it actually does more harm than good.

Aliasing (and not sharpness) is a loss of information.

In the case of real-time computer graphics, this loss of information mostly happens during the rasterization step of the graphics pipeline. During this step, we go from representing our world/scene using vectors (hence a 3D space representation) to representing it using matrices (hence a 2D representation) so that our screen, being a matrix itself, can display it.

The big deal with this step is that we go from a continuous problem to a discrete one, which means that, in order to display your scene on a screen/matrix that has a fixed size (hence a finite problem), you WILL lose information, and there is nothing you can do about that fact.

This is why, in order to mitigate this issue, we need anti-aliasing. As you might already know, there are many types of AA solutions, and many of them come with something along the lines of a slight blur or ghosting artifacts, for example. While we can say that these are problems, the benefits easily outweigh the drawbacks.

It should also be noted that pushing resolution higher and higher is not a solution to the aliasing problem (not that it is an actually solvable problem in the first place anyway).
It does hide it better, but it does not actually address it, which explains why, in most cases, SGSSAA is a better solution than DSR, which does not necessarily help with certain types of aliasing.

In the end, your image without AA applied is not sharp, it is just very badly sampled.
That's a great explanation, cool article you linked there too!
:P
 
god damn this game is so fun

it's the first Kamiya directed game on Steam too

Where the fuck is Okami HD Capcom (with 60fps or uncapped framerate)
 
Yeah, I had some last night, but I honestly just figured it was because I had my PLEX server open streaming some 1080p shit to my TV while playing.


I just double-checked Resident Evil 7, Automata, Witcher 3, and Dark Souls 3, and they're all fine on my system. I guess this is a port of a 7-year-old game, so weird little issues like this are bound to creep up.
 
Is anyone else noticing a bit of hitching (frame pacing issues?) in the game? I'm on a 1080, so I can't imagine why this would be happening.

It's slight, but I notice it here and there when panning the camera.

Not sure if this 100% gets rid of it, but switching from Borderless Windowed to Fullscreen with the in-game Vsync option did wonders.
 
Is anyone else noticing a bit of hitching (frame pacing issues?) in the game? I'm on a 1080, so I can't imagine why this would be happening.

It's slight, but I notice it here and there when panning the camera.

There are definite frame pacing issues of some sort, yes. I initially thought I was over-pushing my 970 with too much downsampling, but when I finally used a hardware monitor, I saw the game was barely using my hardware. Now I just force myself to try to ignore it.
 
The frame pacing issue arises from the 59 fps cap combined with the game refreshing at 60 Hz in fullscreen mode; running in borderless or windowed mode with a 60 Hz desktop refresh fixes it.

Metal Gear Rising had a similar issue, but its fix is not working here. Basically, you have to force the game to render at 60 Hz. A custom 59 Hz resolution is not working either; it seems the frame rate cap is 59.9 or some such.
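
The arithmetic behind that hitch can be sketched like this (my own illustration of the mechanism, not from the post): a 59 fps cap on a 60 Hz display means each frame takes slightly longer than one refresh, and the accumulated slip forces one refresh per second to repeat the previous frame.

```python
# A 59 fps cap on a 60 Hz display: each rendered frame outlasts one
# refresh by a sliver, so presentation slips until a refresh has no new
# frame and repeats the old one -- a visible hitch, once per second.
refresh_hz, cap_fps = 60, 59
refresh_ms = 1000 / refresh_hz    # ~16.667 ms per refresh
frame_ms = 1000 / cap_fps         # ~16.949 ms per rendered frame
drift_ms = frame_ms - refresh_ms  # ~0.282 ms of slip per frame

# After 59 frames the slip totals exactly one refresh interval.
assert abs(drift_ms * cap_fps - refresh_ms) < 1e-9

repeats_per_second = refresh_hz - cap_fps
assert repeats_per_second == 1    # one duplicated frame every second
```

This is also why matching the cap and the refresh rate (for example a true 59 Hz mode, if the cap really were 59.0) would eliminate the stutter.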
 
The frame pacing issue arises from the 59 fps cap combined with the game refreshing at 60 Hz in fullscreen mode; running in borderless or windowed mode with a 60 Hz desktop refresh fixes it.

Metal Gear Rising had a similar issue, but its fix is not working here. Basically, you have to force the game to render at 60 Hz. A custom 59 Hz resolution is not working either; it seems the frame rate cap is 59.9 or some such.

Doesn't running in borderless introduce input lag though? I'll try it right now and see if my game still hitches here and there.
 
So I forced AA for Bayonetta in Radeon Settings.
I chose 4xEQ Adaptive Multisampling. I was having FPS drops with Supersampling at all three levels (2x, 4x and 8x), and there are also some drops with 8x Adaptive Multisampling; with 4xEQ it should be fine.

I also changed Texture Filtering Quality to High and the Tessellation level to 64x, but I am not even sure there is any visual difference. Still, I am pretty happy with the AA results.

No AA:


4xEQ MSAA:



My only problem now is the lack of colors. The game just looks too washed out sometimes.
 
I'm on the Burning City chapter and nothing I do seems to make this part run well. Even on the low preset, my 970 gets super stuttery, which is an absolute killer in this kind of game.
 
So I forced AA for Bayonetta in Radeon Settings.
I chose 4xEQ Adaptive Multisampling. I was having FPS drops with Supersampling at all three levels (2x, 4x and 8x), and there are also some drops with 8x Adaptive Multisampling; with 4xEQ it should be fine.

I also changed Texture Filtering Quality to High and the Tessellation level to 64x, but I am not even sure there is any visual difference. Still, I am pretty happy with the AA results.

No AA:
4xEQ MSAA:

My only problem now is the lack of colors. The game just looks too washed out sometimes.

Actually, I think there might be some difference in the textures:

qocg3yh.png
CWNacTo.png


8QOrEtZ.png
gljOky2.png


Or maybe it's just me??
 
Doesn't running in borderless introduce input lag though? I'll try it right now and see if my game still hitches here and there.

I believe there is a workaround for this on W8, but yes, it does on W10, because all windows are triple-buffered first.

Fortunately the input lag is extremely small, so it might be worth trying regardless.
 
Well, I found the best way to play:

Use a batch script to change the resolution to 4K 60 Hz (for DSR), and run the game at 4K in a borderless window. Then another batch command changes the resolution back to native, after you quit the game of course. Oh, and VSYNC on in the game, otherwise the frame pacing is all over the place.

Played a bit and it ran fine: no frame pacing hitches or the like, and the DSR benefits.
 
AMD FX 8320, 1366x768 (the "default" for my monitor). CPU is old, I know, but things had been fine up to this point.

The game evidently doesn't like AMD CPUs very well, so I would peg that as the issue, especially as the Burning City is one of the more intensive levels of the game owing to the effects; it similarly bashed performance back on consoles.
 
Those quick and aggressive claw monkeys in chapter V used to give me so much trouble back in the day. They used to be a real roadblock for me.

Just now I defeated both of them by bomb-dodging via the rosary without taking a single hit, only listening to their sound cues.

So incredibly satisfying. This is my personal magic moment of the day. This really scratches that itch of learning a mechanics system that Monster Hunter and Dark Souls gave me.
 
Because of you gafers hyping me up with screenshots, I had to finally finish Bayonetta! I only have the Wii U version, and it was so amazing!
Now I'm playing Bayonetta 2, and I hope it will also somehow release on Steam, even if that is nearly impossible, because it is such a shame that this amazing game isn't played by more gamers.
 
The frame pacing issue arises from the 59 fps cap combined with the game refreshing at 60 Hz in fullscreen mode; running in borderless or windowed mode with a 60 Hz desktop refresh fixes it.

Metal Gear Rising had a similar issue, but its fix is not working here. Basically, you have to force the game to render at 60 Hz. A custom 59 Hz resolution is not working either; it seems the frame rate cap is 59.9 or some such.

Good to know, thanks!

Unfortunately I personally don't want to put up with the input lag from those fixes, but at least I understand now what's going on.
 
The game evidently doesn't like AMD CPUs very well, so I would peg that as the issue, especially as the Burning City is one of the more intensive levels of the game owing to the effects; it similarly bashed performance back on consoles.

I remember it on the 360; I want to say the Wii U version played a bit better. Nothing nearly as bad as what I'm experiencing with this.
 
The frame pacing issue arises from the 59 fps cap combined with the game refreshing at 60 Hz in fullscreen mode; running in borderless or windowed mode with a 60 Hz desktop refresh fixes it.

Metal Gear Rising had a similar issue, but its fix is not working here. Basically, you have to force the game to render at 60 Hz. A custom 59 Hz resolution is not working either; it seems the frame rate cap is 59.9 or some such.


Switching to borderless windowed fixed the frame pacing issues for me, thank you!

They were slight but it was still bugging me.
 
Isn't the input lag with triple buffering lower than with 'normal' Vsync?

This shit is confusing me to no end.

My understanding is that if the game is holding a stable framerate at 60, triple buffering in DirectX will increase input lag by a frame. Apparently this isn't true of OpenGL. Not really familiar with the specifics to be honest.
 
Shelving this until I figure out the deal with the framerate. Chapter VII was nearly unplayable -- got a gold due to the sheer choppiness of things making dodging a total guess -- and Chapter VIII actually is unplayable.
I know that it doesn't necessarily help you, but I just got up to Chapter IX, and since fixing the issue with HPET on my R7-1700X/GTX 1070 system, it hasn't dropped a frame at 4587x1920 resolution. (1.78x 3440x1440 DSR)
I tested Chapter VIII on my i5-2500K/GTX 960 system too, and at 1080p it was mostly running at 60, with a handful of drops to ~55 FPS which seemed CPU-related not GPU.
The game performs better with Windows 10's Game Mode disabled on that system.

I have an i7. The AMD-related fix did not work.
For what it's worth, the HPET issue is not AMD-specific, it's just likely to have been enabled on Ryzen systems because AMD's Ryzen Master software used to require it.
Intel systems can be using HPET too, it's just far less likely unless you have enabled it yourself for some reason. Sometimes I see "performance guides" that recommend it.
Either the game doesn't like AMD GPUs, or your i7-920 is just too old to keep it locked to 60, assuming that you've actually confirmed that HPET is not being used. Enabling HPET killed performance on my i5-2500K system too.

I've also seen unusually low performance when system monitoring tools like Process Explorer were running, or when Afterburner was recording a video, despite the game never dropping below 60 even if I lock it to a single core (two threads) on the 1700X system. So if you have anything like that running, you might want to disable it.
I haven't had a chance to investigate those results, but it seems like there's maybe something in the game/engine where, if your CPU can't keep the performance above a certain level, framerate just drops off a cliff instead of only dropping a few frames like most other games would.

How did you do the Chapter 5 Alfheim where you have to stay up for a minute without the whip? I didn't try for particularly long but it seemed pretty hopeless when I did. I think the most I got was 20 seconds.
Fake edit: LOL I'M STUPID YOU GET THE WHIP IN CHAPTER 3 AND I MISSED THE LP.
Huh, I must have missed the whip too. I just beat that challenge by using certain combos in the air that you could then jump from, and from jumping on top of the enemies.
I assumed that they wouldn't have put a challenge in the game that I wouldn't be able to beat without upgrades/items.

Honestly, I prefer the results from a straight up 4x DSR factor.
A lot of people confuse texture aliasing for "detail".
It's probably why NVIDIA never bothered to fix the Negative LOD Bias Clamp.
Trying to play older games that need it is a complete mess on modern GPUs.

Can someone come out with a cheat save file?
I don't wanna go through this shit again lol
I just checked with CheatEngine, and the game uses the exact value to store the number of Halos that you have (some games try to obfuscate it), so it's trivial to lock it to 99999999 if you want to.
Just be careful when installing CheatEngine, as I think it has some crapware bundled with it that you have to opt out of now.

If you've never used it before:
  1. Select Bayonetta.exe from the process list.
  2. Scan for your current value of Halos
  3. Change that number in the game. Collect more, buy an item in the shop etc.
  4. Scan again for the updated value.
  5. This should give you a list of five or so memory addresses. Changing one of these - it was the top one in my case - should set the number of Halos that you have.
I'm playing through the game normally as it's my first time, but CheatEngine is a great way to get around bad game design that forces you to waste hours farming for materials in games where that's not a challenge, just a time sink.
Be sure not to leave it running or try to use it in online games though.
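
The scan-and-filter loop in those steps can be sketched as a toy model (my own illustration — a plain dict standing in for process memory, not any real CheatEngine API):

```python
# Toy model of the scan/filter workflow: "memory" maps address -> value.
memory = {0x1000: 230, 0x1004: 9999, 0x1008: 230, 0x100C: 7, 0x1010: 230}

def scan(mem, value, candidates=None):
    """Return addresses holding `value`, optionally restricted to prior hits."""
    addrs = candidates if candidates is not None else mem.keys()
    return [a for a in addrs if mem[a] == value]

hits = scan(memory, 230)        # first scan: current Halo count
assert len(hits) == 3           # several false positives share the value

memory[0x1008] = 200            # spend 30 Halos; only the real counter moves

hits = scan(memory, 200, hits)  # rescan only among the previous hits
assert hits == [0x1008]         # the real address remains

memory[hits[0]] = 99999999      # writing to it "locks" the in-game value
```

Each change-and-rescan round intersects the candidate set with the new matches, which is why a couple of rounds usually narrows thousands of addresses down to one.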

The frame pacing issue arises from the 59 fps cap combined with the game refreshing at 60 Hz in fullscreen mode; running in borderless or windowed mode with a 60 Hz desktop refresh fixes it.
Metal Gear Rising had a similar issue, but its fix is not working here. Basically, you have to force the game to render at 60 Hz. A custom 59 Hz resolution is not working either; it seems the frame rate cap is 59.9 or some such.
What I had to do with previous Platinum ports was run them in borderless windowed mode, then switch my desktop to 59Hz and enable V-Sync.
Fullscreen mode in Platinum games switches the display to 60Hz for some reason, and their framerate cap appears to be something less than that (possibly 59.94) which causes them to tear constantly if you have V-Sync disabled, or stutter constantly with it on.
Even with a G-Sync monitor, forcing it to run in "60Hz G-Sync" rather than "100Hz+ G-Sync" can cause it to tear depending on how you have G-Sync configured.

Metal Gear Solid V had the same problem too.
With MGS, editing the config file to unlock the framerate, and then enabling V-Sync to cap it to 60 fixed the stuttering problems that I had.

Now I need to figure out how to get DSR working with a borderless window; gonna try the GeDoSaTo tool.
You have to manually set the desktop resolution to match the game if you want to use DSR in windowed mode. Very few games change the desktop resolution in borderless mode.

Seems like ReShade completely disables SGSSAA settings in NVIDIA Inspector. Has anyone found a workaround?
Unfortunately, that's not specific to Bayonetta. It happens in every game that I've tried it with so far. ReShade just seems to conflict with SGSSAA.
 
Always been that way. The game has some really cool designs and nice VFX work and stuff but it always felt weirdly muted-looking. Especially compared to DMC4.

I agree Bayo 1 is very muted in a lot of its levels due to the questionable sepia filter, but DMC4 wasn't exactly much better. Each of its environments was very homogeneous: the castle was gray, the ice sections were blue, the fight with Berial was red, etc. I'd also say it, like Bayo, had fairly washed-out lighting without any real contrast.
 