Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.
Yikes 😂😂 you take this console war shit deeply, huh? And if we're being honest, Ninja Theory's HB2 looked better than the tech demo 🤷🏾‍♂️

Who said it didn't look as good or better? But what hardware was it running on? You know there are professional PCs that make a 2080 Ti look like a toy, which are used to produce CGI offline, just like the latest Far Cry 6 trailer. Is it running in real time or pre-rendered? It's 21:9 aspect ratio at 24fps, and Digital Foundry confirmed it's not running in real time (they work closely with Microsoft).

This is a playable demo running on PS5 in real time; they don't compare. Correct me if you have proof that the Hellblade 2 trailer was running in real time on Xbox Series X and I'll gladly acknowledge it. If you're empty-handed, and you will be, don't try to make such smart posts.
 
How dare you forget '12 teraflops' and 'Velocity Architecture'? There's still plenty of space there; what do you need the rating and game name for?

I think I have it covered now...

 
There are a few hours left till the end of the day; he could still be right. :messenger_winking:

And pigs might fly; cats too, if they get near my boot.
Let me be exact: I am talking about FORUM cats here. The vermin that live in these pages.

Come at me, PUSSIES. YEP, I can get away with saying that. :messenger_beaming:

PS: £399 or bust, let the dream come true. Hype.
 
Btw, does anyone know what happened to xacto?

He was one of my favourite cats on this forum. I did threaten to run him over with a car in an attempt to use up some of his nine lives; now he's MIA. Whoops.

Also, I'm stuck with fat cat Garfield, not a good replacement. Lazy thing.
 
That 'Optimised for XSX' badge was damage control at its finest.

Wrong thread.

Any backlash against that guy yet?

He's got away with it on a technicality, because worldwide people are saying pages are going up. Lucky dude.
Still reckon it drops next week, before the Xbox show.

Then probably an early August show.
 
Who said it didn't look as good or better? But what hardware was it running on? You know there are professional PCs that make a 2080 Ti look like a toy, which are used to produce CGI offline, just like the latest Far Cry 6 trailer. Is it running in real time or pre-rendered? It's 21:9 aspect ratio at 24fps, and Digital Foundry confirmed it's not running in real time (they work closely with Microsoft).

This is a playable demo running on PS5 in real time; they don't compare. Correct me if you have proof that the Hellblade 2 trailer was running in real time on Xbox Series X and I'll gladly acknowledge it. If you're empty-handed, and you will be, don't try to make such smart posts.

I don't think the HB2 footage was real time. The Senua model has far greater fidelity than the Project Mara model.

You'd expect Project Mara to have equal fidelity with HB2, considering the scope of the game.

 
I don't think the HB2 footage was real time. The Senua model has far greater fidelity than the Project Mara model.

You'd expect Project Mara to have equal fidelity with HB2, considering the scope of the game.


Just like the latest Far Cry 6 CGI trailer. I think we'll reach that in 7-10 years.

 
I don't understand why Ubisoft doesn't just start up an animation studio. They're better at CGI story teases than at showing real gameplay.
They should really start one. I think they would kill it.
 
Just like the latest Far Cry 6 CGI trailer. I think we'll reach that in 7-10 years.


This is actually pretty much at TLOU2 level, so I think we'll get there sooner, and the next ND game will probably surpass it.

Also, just to say, maybe an unpopular opinion, but Gustavo is my actual favourite character in the Breaking Bad series :lollipop_sunglasses:
 
This is actually pretty much at TLOU2 level, so I think we'll get there sooner, and the next ND game will probably surpass it.

Also, just to say, maybe an unpopular opinion, but Gustavo is my actual favourite character in the Breaking Bad series :lollipop_sunglasses:

He's a very interesting character in the best series in history (undebatable). :messenger_sunglasses:
 
This is actually pretty much at TLOU2 level, so I think we'll get there sooner, and the next ND game will probably surpass it.

Also, just to say, maybe an unpopular opinion, but Gustavo is my actual favourite character in the Breaking Bad series :lollipop_sunglasses:

No it's not lol. It really isn't. TLOU2 doesn't touch this or HB2.

Yes, Gus was the best.
 
That's the next step. I believe by the mid-gen refreshes we'll get at least 50% of the way there, as static environments already achieve photorealism much more easily.
There will always be a new level to aim for... that's why streaming will never be a long-term thing either; you can't compress, transfer, and decompress all that information efficiently.
 
And pigs might fly; cats too, if they get near my boot.
Let me be exact: I am talking about FORUM cats here. The vermin that live in these pages.

Come at me, PUSSIES. YEP, I can get away with saying that. :messenger_beaming:

PS: £399 or bust, let the dream come true. Hype.
YOU CUT THAT OUT, PONYO!!! :messenger_squinting_tongue:
 
And again Xbox fans compare the UE5 demo to something not in real time, not even running on XSX? Come back here when you REALLY have something to discuss.
I said what I said and I meant what I meant 🤷🏾‍♂️
I invite you to come back and quote me in 10 days, after the July event. If I'm wrong, I'm wrong, but I stand by my confidence in Ninja Theory.
 
But that is not logical, because the CPUs run at 3.5/3.6 GHz, which is well above the GPU's 2.23 GHz, and let's face it, Zen 2 PC parts already boost way over 4 GHz. A FinFET is the same either way; it doesn't know whether it's in the CPU or GPU part of the APU.

So it's not a transistor speed or leakage issue. It's likely just as Cerny said: the way the GPU logic works means you can clock higher, but the logic doesn't keep up, so it's not worth it. Performance stops scaling.

Cerny also said the 3.5 GHz CPU and 2.23 GHz GPU were equally easy to cool (I can't recall the exact words, but it was to that effect); that is the important point (unless of course Cerny is lying again lol).
I haven't checked this thread for a few days, so I'm not sure if your discussion above is still relevant. I must be at least 10 pages behind, so I'm posting this blind :lollipop_sunglasses:

There are likely a few factors at play in why the GPU can't be clocked as high as the CPU. Gonna 'try' explaining some seemingly unrelated stuff first, cos I believe it'll be relevant. This info is for anybody interested in some 'simple' basics. Apologies for being a smartarse :messenger_beaming: but if it irritates sircaw then it's worth it :messenger_horns:

Every conductor has resistance that converts some of the current flow to heat, and in return the heat increases the resistance of the conductor. Alternating current also gets 'impeded', as the current generates a magnetic field at right angles to the flow, which (long story short) induces an EMF opposing the change, pushing back on the changing current. So the higher the frequency, the higher the impedance; the longer the conductor, the more resistance and impedance, and the more heat wasted, further increasing the resistance.

For these reasons the industry is packing everything in as close as possible: smaller distances mean less resistance and impedance, therefore less heat generated/wasted, so you can run much higher frequencies with less power (power being the product of current and voltage). There's a limit on how small they can go before they start hitting up against the laws of physics; get too small and quantum tunnelling will cause a shitload of problems.

Transistors in modern digital electronics are MOSFETs (metal-oxide-semiconductor field-effect transistors), commonly connected in a complementary pair called CMOS (one FET for pulling the voltage high, '1', and the other for pulling it low, '0'); the refined FinFET works on a similar principle. FETs configured this way only draw power (and generate heat) when switching states. The state change is NOT instant: because of 'parasitic capacitance', switching takes time to settle and contributes to the 'propagation delay'.
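To put a rough number on that 'only draws power when switching' point, here's a back-of-the-envelope sketch of the standard CMOS dynamic power relation P ≈ α·C·V²·f. Every value below (activity factor, capacitance, voltage, clock) is an invented round figure for illustration, not real console silicon data.

```python
# Back-of-the-envelope CMOS dynamic (switching) power: P ≈ alpha * C * V^2 * f.
# alpha = fraction of gates switching per cycle, C = total switched capacitance.
# All numbers below are invented round figures, purely illustrative.

def dynamic_power(alpha, cap_farads, volts, freq_hz):
    """Dynamic power (watts) dissipated by switching CMOS logic."""
    return alpha * cap_farads * volts ** 2 * freq_hz

# Same logic, same clock: dropping the supply voltage 20% cuts power ~36%,
# because power scales with the SQUARE of the voltage.
p_full = dynamic_power(0.1, 1e-9, 1.0, 2.23e9)
p_low  = dynamic_power(0.1, 1e-9, 0.8, 2.23e9)
print(f"{p_full:.3f} W vs {p_low:.3f} W")
```

The V² term is why packing things closer (less capacitance) and shaving the voltage pays off so much more than just lowering the clock.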

The FETs are arranged as logic gates (AND, OR, NOT, XOR, etc.) and flip-flops (a flip-flop is a latch that can be SET, RESET, or toggled).
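As a toy illustration of that latch idea (nothing console-specific), here's a minimal Python model of an SR latch built from two cross-coupled NOR gates; the settling loop stands in for the real propagation delay in the feedback path.

```python
# Toy gate-level model of an SR (set/reset) latch: two cross-coupled NOR gates.
# In silicon the feedback loop settles over real propagation-delay time;
# here we just iterate a few passes until the outputs stop changing.

def nor(a, b):
    return int(not (a or b))

def sr_latch(s, r, q, qn):
    """Apply inputs S/R to a latch currently holding (q, qn); return new state."""
    for _ in range(4):                      # let the feedback loop settle
        q, qn = nor(r, qn), nor(s, q)
    return q, qn

q, qn = sr_latch(1, 0, 0, 1)   # SET   -> Q goes high
q, qn = sr_latch(0, 1, q, qn)  # RESET -> Q goes low
q, qn = sr_latch(0, 0, q, qn)  # HOLD  -> state is remembered
print(q, qn)
```

With both inputs low the cross-coupled feedback simply holds the last state, which is exactly what makes it a 1-bit memory.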

CPU/GPU registers, caches, and SRAM are made up of flip-flops as memory. DRAM is different: each bit is made from 1 FET and 1 capacitor. The capacitor is the memory element that retains the electrical charge (voltage), and it needs regular refreshing due to leakage. DRAM has much higher memory density than SRAM, which needs 6 FETs to make up its 1-bit memory cell. DRAM is slower mainly because of the heavily multiplexed interface needed to access substantially more memory.

The CPU/GPU ALUs, control units, memory controllers, and all the parts that make up the compute unit cores etc. are made from combinations of logic gates and flip-flops. As mentioned, the FinFETs inside the GPU and CPU are the same. There are two reasons I can think of why the CPU can be clocked higher than the GPU:

1) This one is gonna be obvious. The average number of transistors (FETs) getting flipped (switched) concurrently will be far greater on the GPU than on the CPU, because of the sheer amount of parallel work the GPU does. The CPU has 8 general-purpose cores running in parallel, whereas the GPU has 64 'cores' per compute unit, so in total about 2,300 to 3,300 specialised cores all running in parallel (if ever fully utilised). That means the GPU would run way hotter than the CPU at the same clock!

2) This one is the reason the GPU logic cannot keep up at higher frequencies. The highest clock rate is limited by the maximum 'propagation delay' of the longest chain of 'combinational logic' in the GPU. Propagation delay is the time it takes for a logic gate's output to settle after its inputs change (not instant, as explained above), and these delays add up across every gate in series on the way to the final result, in structures like full adders or multiplexers. I'd infer that the ALUs in GPU rendering pipelines have very deep logic chains, a lot longer than the ones in CPUs; hence Cerny's statement that the 'GPU can be clocked higher but the logic won't keep up'.
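A quick sketch of that clock-vs-critical-path relationship: f_max = 1 / (sum of gate delays on the longest path between flip-flops). The per-gate delays and path depths below are made-up round numbers (real process figures are proprietary), just to show how a deeper chain caps the clock.

```python
# Max stable clock is set by the critical path: the longest chain of
# combinational logic between two flip-flops. f_max = 1 / total path delay.
# Per-gate delays below are invented round numbers, not real process data.

def max_clock_ghz(gate_delays_ps):
    """f_max in GHz given per-gate propagation delays (picoseconds) in series."""
    return 1000.0 / sum(gate_delays_ps)

shallow_path = [50] * 5    # a shallow CPU-style stage: 5 gates deep
deep_path    = [50] * 12   # a deeper GPU-style chain: 12 gates deep

print(f"{max_clock_ghz(shallow_path):.2f} GHz vs {max_clock_ghz(deep_path):.2f} GHz")
```

Same transistors, same per-gate delay, yet the deeper chain tops out at well under half the clock, which lines up with the CPU-vs-GPU gap being about logic depth rather than the FinFETs themselves.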

I think the words Cerny used to explain that the GPU and CPU can be cooled equally easily were 'thermal equilibrium'?
 
7nm is still 70 angstroms, so still thick in quantum terms, and remember the critical dimension for a FinFET is the width of the gate. Semi is always improving the dielectric materials (high-k) and the performance of the thinnest gates.

The biggest factor is one of tolerance: nothing actually looks like the FinFET model, they are not perfect "blocks"; depositions and etches are all curves and have a +/- to consider. That's where EUV comes in, which has been mentioned for RDNA2, and probably an improved ALD process as well.

Good luck to anyone getting TSMC to spill the beans about 3-7 nm and the EUV/gate specs of the transistors though; that's semi secret sauce.

One for Saturday mornings :messenger_beaming:
Woah, are we talking the same language lol (joking, I get what you're saying, just about). Looks like somebody got their electrical engineering degree, propa champion (y)
 
Yikes 😂😂 you take this console war shit deeply, huh? And if we're being honest, Ninja Theory's HB2 looked better than the tech demo 🤷🏾‍♂️
Actually no, Hellblade 2 (except that character model) looked worse than the Unreal 5 demo: the textures, the amount of geometry, the lighting, and the overall scenery are just worse. That demo was next-level stuff!

Now I know when you read this comment you'll be saying "WTF?!" and marking me as a Sony licker, but when you take a deep breath and think about it, it makes Hellblade 2 look feasible graphically, and you should be happy about that.
 
Just like the latest Far Cry 6 CGI trailer. I think we'll reach that in 7-10 years.

That's the next step. I believe by the mid-gen refreshes we'll get at least 50% of the way there, as static environments already achieve photorealism much more easily.
Nah, both next-gen consoles (PS5 and Series X) will be able to pull this off; they're both capable enough. Can you imagine what Naughty Dog's gonna be able to do at the end of the PS5's lifecycle?
 
Har-har-har-har old chap... in my British accent 🤓

Wait till you hear what a Black Country Brummie accent sounds like; our accents are extremely unique... Yes, really! Dan-of-orion, you tell him how highly our accents are regarded. Opens so many doors when we go to job interviews... doors to the Exit! :pie_roffles:
 