I wish pCars was out already. I'm still stuck with a junior account since I forgot to upgrade in time, so now I have very few cars to play around with, and screenshotting is no fun because of it.
Dirt 3 is such a beautiful game, especially at high resolution. It was one of the first games I tried with downsampling.
Those EW shots are amazing. Makes no sense to me how they thought that initial aspect ratio was any kind of a good idea.
They wanted it to feel narrow and claustrophobic. Makes perfect sense. (It also saves rendering resources, since they were clearly struggling.)
Would've been nice if it was actually 2.35:1, like 1920x800, as well.
Do you know if it's possible to apply some AA filter outside of the game?
The game is OpenGL (I'm assuming, anyway, unless they changed that part of id Tech 5 too), so unfortunately nothing is possible beyond driver downsampling + in-game post-AA.
Oddly enough, it's DX11
Pinball FX2 - South Park tables 1440x2560
Oddly enough, it's DX11
Great, just another DX11 game that badly needs good AA, and one we could theoretically get it for if Nvidia gave us the equivalent DX11 functions to use SGSSAA.
At least since it's DX11, there may be a solution if that happens someday. But not any time soon.
DSR, when it's working, will have to suffice.
helifax said: Gave it a look..
Apparently it's an OGL engine using a DX11 context to draw...
I dunno who's playing games here (either Nvidia or the dev), but WHY THE FCK would you render an OpenGL context inside a DX11 one????
So unless we can make the engine render in OpenGL (the proper way), I can't fix anything....
Btw CM mode is not working....
Looking in the dev console I see OGL calls...
My wrapper is detected on launch, but that's it.... it switches to DX11 after that....
OGL renderer:
OGL in DX11 emulated renderer (Lazy port?!?!?!)
Now I wish I hadn't bought this....
helifax said: ForgottenProdigy said: I don't get it, so it's using both DX11 and OpenGL to render the game? Aren't they completely different APIs? But then, if it's using DX11, can't 3DMigoto work on it?
It's an OpenGL engine that uses DX11 render targets. Possibly the same way my wrapper works, using some sort of interoperability layer. Basically all the draw calls and the internal format are OpenGL; just the back buffers + swap mechanism are done in DirectX 11. You can share framebuffers between OGL and DX9-11.
I haven't seen an engine behave like this...
I tried with all the OGL debuggers out there... none of them picks anything up... so something is very, very weird with this game....
TBH, telling me that some weird cinematic 21:9 aspect ratio and a 30fps cap are HOW THEY INTENDED IT is bullshit..... now we get a hybrid engine as well....
3DMigoto won't work since the draw calls are not DX. My wrapper won't work since it's not drawing into the OGL buffers.... but rather DX11...
TBH it deserves the bad reviews it's getting on Steam....
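For anyone wondering what "sharing framebuffers between OGL and DX9-11" can look like in practice, here is a minimal sketch assuming the mechanism is NVIDIA's WGL_NV_DX_interop2 extension. That is an assumption about this engine, not a confirmed detail, and the function and variable names below are illustrative only:

```cpp
// Sketch: letting an OpenGL renderer draw into a D3D11 texture via WGL_NV_DX_interop2.
// Assumes a GL context, a D3D11 device and a D3D11 color texture already exist,
// and that glTex/glFbo were created with glGenTextures/glGenFramebuffers.
#include <windows.h>
#include <d3d11.h>
#include <GL/glew.h>      // core FBO entry points (any GL loader works)
#include <GL/wglext.h>    // WGL_NV_DX_interop2 prototypes and tokens

static PFNWGLDXOPENDEVICENVPROC     pwglDXOpenDeviceNV;
static PFNWGLDXREGISTEROBJECTNVPROC pwglDXRegisterObjectNV;
static PFNWGLDXLOCKOBJECTSNVPROC    pwglDXLockObjectsNV;
static PFNWGLDXUNLOCKOBJECTSNVPROC  pwglDXUnlockObjectsNV;

static void LoadInteropEntryPoints()   // call once after the GL context is current
{
    pwglDXOpenDeviceNV     = (PFNWGLDXOPENDEVICENVPROC)wglGetProcAddress("wglDXOpenDeviceNV");
    pwglDXRegisterObjectNV = (PFNWGLDXREGISTEROBJECTNVPROC)wglGetProcAddress("wglDXRegisterObjectNV");
    pwglDXLockObjectsNV    = (PFNWGLDXLOCKOBJECTSNVPROC)wglGetProcAddress("wglDXLockObjectsNV");
    pwglDXUnlockObjectsNV  = (PFNWGLDXUNLOCKOBJECTSNVPROC)wglGetProcAddress("wglDXUnlockObjectsNV");
}

// Hypothetical per-frame path: every OpenGL draw lands in a GL texture that is
// really the D3D11 texture the game later presents through its DXGI swap chain.
void DrawFrameThroughInterop(ID3D11Device* d3dDevice, ID3D11Texture2D* d3dColorTarget,
                             GLuint glTex, GLuint glFbo)
{
    static HANDLE interopDevice = pwglDXOpenDeviceNV(d3dDevice);           // once at startup
    static HANDLE interopTex    = pwglDXRegisterObjectNV(interopDevice,    // once per target
                                      d3dColorTarget, glTex, GL_TEXTURE_2D,
                                      WGL_ACCESS_READ_WRITE_NV);

    // GL may only touch the shared object while it is locked.
    pwglDXLockObjectsNV(interopDevice, 1, &interopTex);

    glBindFramebuffer(GL_FRAMEBUFFER, glFbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, glTex, 0);
    // ... the engine's normal OpenGL draw calls go here ...
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    pwglDXUnlockObjectsNV(interopDevice, 1, &interopTex);

    // Back on the D3D11 side the game presents the swap chain as usual, which is
    // why the OpenGL debuggers see nothing useful while the final frame belongs to DX11.
}
```

If the game really does something like this, it would explain the symptoms helifax describes: the GL side only ever touches a texture alias, while presentation still goes through the D3D11 swap chain.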
bo3b said: Wow, really weird. Strange approach on their part.
I sort of see what's happening though: the game itself was written using OpenGL, and its shaders compile at runtime as usual, which is why helifax's wrapper can see them.
Then they use a DX device as their target, probably because that's what runs on next-gen consoles. As far as I know there is no OpenGL layer/driver on next-gen consoles.
Edit: actually looks like PS4 is native OpenGL, Xbone is DX11. So this is maybe the layer to handle that stupidity so it works on both consoles.
In either case, since it's got a DX11 target as the destination, 3Dmigoto can see it and wrap it. By that time the shaders are binary, and 3Dmigoto only works on binary anyway, so it can see them and turn them back into HLSL. So for this game we effectively have an OpenGL to HLSL converter.
As far as the 3D not kicking in goes, that's part of automatic mode and could be all sorts of things. Since the render target is DX11, that's where we'll need a fix to get it running.
Are there games where profiles affect the cut-scenes? Make them 3D? If so, that profile might work here.
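To illustrate bo3b's point that the shaders are already plain binaries by the time they reach the DX11 device, here is a rough sketch of the interception idea. This is not 3Dmigoto's actual code (a real wrapper replaces the d3d11.dll entry points and decompiles all the way back to HLSL); the sketch just hooks one creation call and dumps a disassembly listing with D3DDisassemble, and the DevicePixelShaderHook name is made up for the example:

```cpp
// Sketch: a proxy around ID3D11Device that sees compiled shader bytecode at
// CreatePixelShader() time. Whatever language the engine authored its shaders in,
// what arrives here is DX bytecode, so a wrapper can inspect or substitute it.
#include <d3d11.h>
#include <d3dcompiler.h>   // D3DDisassemble; link with d3dcompiler.lib
#include <cstdio>

struct DevicePixelShaderHook            // hypothetical wrapper object
{
    ID3D11Device* real = nullptr;       // the actual device the game created

    HRESULT CreatePixelShader(const void* bytecode, SIZE_T length,
                              ID3D11ClassLinkage* linkage, ID3D11PixelShader** outShader)
    {
        // Dump a human-readable listing of the bytecode before passing it on.
        ID3DBlob* listing = nullptr;
        if (SUCCEEDED(D3DDisassemble(bytecode, length, 0, nullptr, &listing)))
        {
            std::fwrite(listing->GetBufferPointer(), 1, listing->GetBufferSize(), stdout);
            listing->Release();
        }
        // Forward to the real device so the game keeps rendering normally.
        return real->CreatePixelShader(bytecode, length, linkage, outShader);
    }
};
```

A real fix tool goes further and decompiles that bytecode into editable HLSL before recompiling it, which is what makes the "OpenGL to HLSL converter" remark apt.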