
Next-Gen PS5 & XSX |OT| Console tEch threaD


Mod of War: Remastered

Ω
Staff Member
Folks, stop posting random gaming news and opinions in this thread. It's about the tech in these consoles, which includes tools, SDKs, APIs, etc., not random sales stats, last-gen game awards, what some random Twitter clown said to stir up console wars, or software reviews.

We have other threads for those, so take the discussions there.

Thank you.
A friendly reminder.

Take random tweets about games and the like to their respective OTs in the Community section.
 
yo Cerny's SSD + "9 TFLOPs of overclocked RDNA1" looks fucking incredible right now.

The whole level changes in like 2 seconds when she swings her hammer at that purple rock. Next-gen I/O in action.
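For scale, a quick back-of-envelope sketch of what a two-second swap could stream, using the PS5's publicly quoted 5.5 GB/s raw and roughly 8-9 GB/s typical compressed SSD figures (the compressed rate varies by content, so treat these as rough numbers):

# Back-of-envelope: data moved during a 2-second level swap on PS5,
# using the publicly quoted SSD figures (compressed rate is content-dependent).
RAW_GBPS = 5.5         # rated raw read throughput
COMPRESSED_GBPS = 8.5  # rough midpoint of the 8-9 GB/s "typical compressed" figure
SWAP_SECONDS = 2.0

print(f"raw:        {RAW_GBPS * SWAP_SECONDS:.0f} GB")         # ~11 GB
print(f"compressed: {COMPRESSED_GBPS * SWAP_SECONDS:.0f} GB")  # ~17 GB
# Either number is enough to replace most of the RAM a game can use,
# which is why a whole level can be swapped without a loading screen.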


kC1Twvq.png

UbyqSJv.gif
Nice spot!
 

SlimySnake

Flashless at the Golden Globes
A friendly reminder.

Take random tweets about games and the like to their respective OTs in the Community section.
BTW, is posting gifs of next-gen games showcasing fancy visuals allowed? We're finally seeing what these next-gen specs can do when it comes to tech in games.

Problem is, device-agnostic tools by nature mean more is abstracted than needed, leading to less-than-optimal results. Jack of all trades, master of none.

If devs focus on XSS, then XSX will just be an upscaled version of that game with a few sliders jacked up, nothing more, nothing less.

To get the best out of something, it needs your entire focus so you can push things, with no need to be concerned about how something else will cope.
I was afraid that devs would port up from Series S, but looking at how willing they are to let the Series S versions drop all the way down to 720p, I really don't think they give a shit. BF6 will be a good first indicator. If they target 1440p 60 fps with the PS5 version, I can see them dropping all the way down to 900p for the XSS version. If it's like Ratchet, which is native 4K 30 fps, then 1080p-1200p is easily doable with some downgrades here and there.
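To put those targets in perspective, here's the raw pixel math (the 1200p width is an approximation for 16:9; the rest are standard resolutions):

# Pixel counts for the resolution targets discussed above.
resolutions = {
    "4K":    (3840, 2160),
    "1440p": (2560, 1440),
    "1200p": (2133, 1200),  # approximate 16:9 width
    "1080p": (1920, 1080),
    "900p":  (1600, 900),
    "720p":  (1280, 720),
}
base = 3840 * 2160
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>5}: {px / 1e6:5.2f} MP  ({px / base:6.1%} of 4K)")
# 900p is ~17% of 4K's pixel count, comfortably under the XSS's ~33% compute
# ratio versus XSX (4 TF vs 12.1 TF), so drops like that are plausible.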
 
More work on more platforms can't be easier, no matter how good the development environment is. It would be easier for Xbox devs if there were no XSS at all.
The XSS gives a wider audience access to the platform. It wasn't designed to make developers happy; it was designed for gamers. The industry is better off having more people engaged than fewer. I'm glad MS made the XSS, just like I'm glad Nvidia made more than just the 3090. Not everyone needs the high-end device. The good thing is the XSS has the exact same feature set as the higher-end device, so there are ways to optimize games for the box to overcome its challenges.
 

Vagos48

Member
The XSS gives a wider audience access to the platform. It wasn't designed to make developers happy; it was designed for gamers. The industry is better off having more people engaged than fewer. I'm glad MS made the XSS, just like I'm glad Nvidia made more than just the 3090. Not everyone needs the high-end device. The good thing is the XSS has the exact same feature set as the higher-end device, so there are ways to optimize games for the box to overcome its challenges.
Please read the specs again; it does NOT have the same feature set.
 
The XSS gives a wider audience access to the platform. It wasn't designed to make developers happy; it was designed for gamers. The industry is better off having more people engaged than fewer. I'm glad MS made the XSS, just like I'm glad Nvidia made more than just the 3090. Not everyone needs the high-end device. The good thing is the XSS has the exact same feature set as the higher-end device, so there are ways to optimize games for the box to overcome its challenges.
As if a $400 console wouldn't be affordable to the mass market in 2021. I don't even get why you pulled the 3090 into the discussion; its gaming performance is pretty much the same as the 3080's for about double the price (MSRP, of course).

BTW, I thought Xbox had this xCloud thing for casuals, what's up with that?
 
Except for PC.

PC is not a platform with a fixed hardware configuration. For development, it's a totally different paradigm.

On PC, you build the game then it's up to the user to make sure they have the hardware to run the game acceptably (which itself is a variable target based on the user's own preferred rendering settings).

Totally different from consoles.
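A minimal sketch of the kind of scalability table a PC build exposes (the preset names and values here are entirely hypothetical), where the user rather than the developer picks the target:

# Hypothetical PC scalability presets: the user matches the game to their
# hardware. On console, one of these rows is effectively baked in per SKU.
PRESETS = {
    "low":    {"resolution": (1280, 720),  "shadows": "low",  "rt": False},
    "medium": {"resolution": (1920, 1080), "shadows": "med",  "rt": False},
    "ultra":  {"resolution": (3840, 2160), "shadows": "high", "rt": True},
}

def apply_preset(name: str) -> dict:
    """Look up render settings for a user-chosen preset."""
    return PRESETS[name]

print(apply_preset("medium"))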
 

IntentionalPun

Ask me about my wife's perfect butthole
As if a $400 console wouldn't be affordable to the mass market in 2021. I don't even get why you pulled the 3090 into the discussion; its gaming performance is pretty much the same as the 3080's for about double the price (MSRP, of course).

BTW, I thought Xbox had this xCloud thing for casuals, what's up with that?
The XSS almost surely exists largely because MS would have had developers creating a lower-target version for xCloud anyway.
 
Well, the ultimate goal is to eventually skip the MoCap and data cleanup altogether.
As a (very) loose analogy to motion matching: start by capturing a large database of real data and then fill the gaps with the trained models. The theory is that eventually, with enough data, models become sophisticated enough to even self-improve etc., but ultimately, you can generate new 'captures' without going through the manual iteration of capture etc.
Of course, the above has a good chance of working on gameplay movesets, and not really for performance-capture stuff, but the former is the more pressing problem to solve anyway.

Thanks for the insight, Fafalada.

Sounds utterly wild. Five years ago I wouldn't have believed it possible. But seeing how far AI and ML/DL have progressed, I believe it for sure.

As a field of study, ML/DL is like freaking black magic with the stuff it's able to achieve today.
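For context on the motion-matching analogy above: at its core it's a nearest-neighbour search over a big capture database. Here's a toy sketch (made-up feature vectors, brute-force search) of the part a trained generative model would eventually replace:

import numpy as np

# Toy motion-matching core: each captured animation frame is summarised as a
# feature vector (pose + desired trajectory); at runtime, pick the frame in
# the mocap database that best matches the character's current state.
rng = np.random.default_rng(0)
database = rng.normal(size=(100_000, 24))  # 100k captured frames, 24 features each

def best_match(query: np.ndarray) -> int:
    """Brute-force nearest neighbour; real engines use acceleration structures."""
    return int(np.argmin(np.linalg.norm(database - query, axis=1)))

query = rng.normal(size=24)  # current pose + stick-input trajectory
print("best frame index:", best_match(query))
# The idea in the post above: train a model on this database so new
# 'captures' can be synthesised instead of recorded and hand-cleaned.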
 
The XSS almost surely exists largely because MS would have had developers creating a lower-target version for xCloud anyway.

Nah. For the datacentre, the cost equation and ROI for a subscription-based streaming service are totally different from those for a discrete boxed console product.

A few hundred dollars' difference per server node for an exclusively XSX-based xCloud server setup would be considered chump change. The up-front installation costs would be amortized over a longer period and paid for multiple times over by the subscription-based business model over the course of the service life.

So the argument that the XSS was necessitated by xCloud needing a lower hardware spec simply isn't true at all.

The XSS exists for the reasons MS have already explained, i.e. they wanted a lower barrier to entry for next-gen, because launching at $500 with the X1 didn't help them against the $400 PS4, and, in Spencer's own words, they didn't want to "get caught out of position on price or performance" a second time.
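The amortisation argument, with purely illustrative numbers (every figure below is made up for the sake of the sketch):

# Illustrative only: extra per-node hardware cost versus subscription revenue
# earned by that node over its service life.
extra_hw_cost = 300    # hypothetical cost delta for an XSX-class node, $
service_years = 5
streams_per_node = 4   # hypothetical concurrent streams a node can serve
monthly_sub = 15       # hypothetical $/user/month

lifetime_revenue = streams_per_node * monthly_sub * 12 * service_years
print(f"${lifetime_revenue:,} revenue over {service_years} years "
      f"vs ${extra_hw_cost} extra hardware")  # $3,600 vs $300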
 

IntentionalPun

Ask me about my wife's perfect butthole
Nah. For the datacentre, the cost equation and ROI for a subscription-based streaming service are totally different from those for a discrete boxed console product.

A few hundred dollars' difference per server node for an exclusively XSX-based xCloud server setup would be considered chump change. The up-front installation costs would be amortized over a longer period and paid for multiple times over by the subscription-based business model over the course of the service life.

So the argument that the XSS was necessitated by xCloud needing a lower hardware spec simply isn't true at all.

It's not installation cost that matters... that's like saying "why optimize any workload by 50%, the cost doesn't matter."

Honestly just going to have to seriously disagree w/ you here. The cost of operation isn't quite half, but it's close. If the same server power can host 80 users instead of 50, and those 80 users won't notice a difference because they are streaming to a cell phone, you use the configuration that can host 80 users.

Hosting gaming is like the single most expensive per-user cloud concept ever thought up, and 12 TF per user is pretty insane compared to anything else done in the cloud. They can also potentially charge more for 4K vs 1080p.
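The density argument in numbers (all figures hypothetical; only the 50-vs-80 ratio comes from the post above):

# Hypothetical per-stream cost for the same rack at two hosting densities.
rack_monthly_cost = 10_000  # made-up fully loaded cost: power, space, cooling

for streams in (50, 80):    # lower- vs higher-density config, per the post above
    print(f"{streams} streams -> ${rack_monthly_cost / streams:,.2f}/stream/month")
# 80 streams on the same hardware cuts per-user cost by 37.5% versus 50.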
 

reksveks

Member
PC is not a platform with a fixed hardware configuration. For development, it's a totally different paradigm.

On PC, you build the game then it's up to the user to make sure they have the hardware to run the game acceptably (which itself is a variable target based on the user's own preferred rendering settings).

Totally different from consoles.
On PC, you build the game with a range of specs in mind. That's the point: if you are building the game for Xbox, then 90% of the time you are building it for PC too. You always have a min spec in mind when you design your game. That hasn't changed with the XSS.

There is a difference between fixed hardware targets and variable ones; I know that, and I know it's easier to optimize for a fixed one, but that optimization typically happens a lot later than game design.
 
It's not installation cost that matters... that's like saying "why optimize any workload by 50%, the cost doesn't matter."

Honestly just going to have to seriously disagree w/ you here. The cost of operation isn't quite half, but it's close. If the same server power can host 80 users instead of 50, and those 80 users won't notice a difference because they are streaming to a cell phone, you use the configuration that can host 80 users.

Hosting gaming is like the single most expensive per-user cloud concept ever thought up, and 12 TF per user is pretty insane compared to anything else done in the cloud. They can also potentially charge more for 4K vs 1080p.

A general data centre server loaded with EPYC CPUs and banks and banks of RAM would pull more power per node than an XSX-exclusive game server rack. So I don't see the power consumption argument as valid.

Also, on a per-user basis, offline VFX rendering for movies and scientific computing are VASTLY more expensive than gaming. It's not even close.

On PC, you build the game with a range of specs in mind. That's the point: if you are building the game for Xbox, then 90% of the time you are building it for PC too. You always have a min spec in mind when you design your game. That hasn't changed with the XSS.

There is a difference between fixed hardware targets and variable ones; I know that, and I know it's easier to optimize for a fixed one, but that optimization typically happens a lot later than game design.

On PC you build to min specs that you get the luxury of defining entirely yourself. They're completely arbitrary and you can more or less select what specs you like.

Again, it's totally different from consoles, which are discrete platforms with discrete hardware configurations.
 

IntentionalPun

Ask me about my wife's perfect butthole
A general data centre server loaded with EPYC CPUs and banks and banks of RAM would pull more power per node than an XSX-exclusive game server rack. So I don't see the power consumption argument as valid.

Also, on a per-user basis, offline VFX rendering for movies and scientific computing are VASTLY more expensive than gaming. It's not even close.

The output of those is meant to be used by thousands or millions of people. A small team of VFX renderers produces a game or a film meant to be consumed by many people. I didn't word that correctly, but I've talked about this before. There aren't meant to be many millions of people rendering VFX in the cloud. Same with scientific computing, or any ML done. There might be one VFX artist on the other end, or one programmer working on an ML project meant to optimize a company's shipping routines, but there isn't just one user of the output data.

Comparable consumer-oriented things like web servers aren't even close; neither is streaming video (as a game render is doing everything live streaming video is doing... and also rendering a game). A single region's worth of game streaming, at the scale MS thinks it will reach, will easily eclipse their worldwide GPU usage for VFX rendering, would be my guess. My dev VMs in Azure, a "business use that has one user on the other end", use a fraction of the power of a game-rendering VM. I use one maybe 20 hours a week, what a high-end gamer would do, and it costs my company ~$75 a month. That's the revenue MS generates for one business user using a 2 vCPU machine with 16 GB of RAM and zero GPU.

I'm not talking about "power consumption" either... I don't get why you keep sort of over-simplifying but also over-complicating it at the same time. I'm talking about the number of users you can host on any given set of components: power consumption, heat output, space, everything. If one web server can host 1,000 users and another can host 2,000, which do you choose?

It's like.. the entire point of virtualization lol
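Putting the dev-VM comparison into $/hour terms (the $75/month and ~20 h/week figures are from the post above; the $15 subscription price is an assumption):

# Revenue per hour of actual use: CPU-only dev VM vs a game-streaming sub.
dev_vm_monthly = 75.0        # 2 vCPU / 16 GB, no GPU (figure from the post)
sub_monthly = 15.0           # assumed streaming subscription price
hours_per_month = 20 * 4.33  # ~20 h/week usage in both cases

print(f"dev VM:    ${dev_vm_monthly / hours_per_month:.2f}/hour, no GPU needed")
print(f"streaming: ${sub_monthly / hours_per_month:.2f}/hour, needs a GPU slice")
# ~5x less revenue per hour while tying up far more expensive hardware.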
 
As if a $400 console wouldn't be affordable to the mass market in 2021. I don't even get why you pulled the 3090 into the discussion; its gaming performance is pretty much the same as the 3080's for about double the price (MSRP, of course).

BTW, I thought Xbox had this xCloud thing for casuals, what's up with that?
Nvidia is offering a range of devices for a range of customers. MS is doing no different. Not everyone wants a 3090 or can afford a 3090. Would games look better if the 3090 was the only option? Sure. They might look better, but only a tiny fraction of the market would be able to play those games. More expensive devices contract the market, not expand it.

I'm glad $400 is nothing to you, but I have a feeling you wouldn't be willing to purchase consoles for everyone who couldn't afford one. Thankfully MS put out a product that more people can afford, making your potentially generous offer unnecessary.
 

jroc74

Phone reception is more important to me than human rights
On PC you build to min specs that you get the luxury of defining entirely yourself. They're completely arbitrary and you can more or less select what specs you like.

Again, it's totally different from consoles which are discrete platforms with discrete hardware configurations.

Exactly.

PC games' minimum specs can vary from game to game. Hell, they can vary from the first game to the sequel.

Anyone that's into PC gaming would know this.
 

farmerboy

Member
We already saw what it's capable of in Demon's Souls. Also, the PS5 SSD is doing wonders.

Anyway, it looks like loading is damn great. Loading a whole damn world in... 1 sec. And actually, I've noticed that the world is actually loaded when Ratchet hits the stone, immediately before the flash animation, and then the new world shows up.

EDk80g6.gif

We'll probably never know the true speed of the I/O in-game, as there is still the artistic side to consider. A developer may choose to add a fade-in or fade-out simply because they like it.
 
After seeing the new Ratchet and Clank trailer, Phil Spencer feels better about what he's got to show this summer. SlimySnake

throwback to 2020, but for real I hope Hellblade 2 shows some impressive next-gen tech on the MS side

let’s see them 12TF singing

I think Rare would be the ones to produce a cartoony-looking next-gen game. They seem to have some experience making games like that.
 

SlimySnake

Flashless at the Golden Globes
After seeing the new Ratchet and Clank trailer, Phil Spencer feels better about what he's got to show this summer. SlimySnake

throwback to 2020, but for real I hope Hellblade 2 shows some impressive next-gen tech on the MS side

let’s see them 12TF singing
I think if they are smart, like they were with the pixel budget of the Hellblade 2 teaser, then they can definitely do it. Alex found that the black bars and 24 fps pretty much made it a 1620p game: only 4.9 million pixels rendered instead of the full-fat 8.2 million pixels Insomniac is rendering in Ratchet. Not to mention Insomniac is using ray-traced reflections, and Hellblade 2 wasn't.
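The pixel math in that comparison, spelled out (frame-rate scaling included, since the teaser ran at 24 fps):

# Pixel throughput: Hellblade 2 teaser (per DF's estimate) vs Ratchet's 4K30.
full_4k = 3840 * 2160        # ~8.3 MP per frame
teaser_pixels = 4.9e6        # DF's estimate for the letterboxed 1620p teaser

per_frame_ratio = teaser_pixels / full_4k
per_second_ratio = (full_4k * 30) / (teaser_pixels * 24)
print(f"per frame:  {per_frame_ratio:.0%} of a 4K frame")           # ~59%
print(f"per second: {per_second_ratio:.1f}x fewer pixels than 4K30")  # ~2.1x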

Photorealism is very easy to achieve nowadays without ray tracing in Unreal Engine 4 thanks to Megascans. Especially if you are just rendering vast, empty environments instead of filling them up with hundreds of moving cars and multiple neon lights and reflections like we see in the gif below.

MUeIFG3.gif


This gif reminds me of that Cyberpunk intro that was heavily downgraded even on PCs. Just so much going on. I really hope it's not just window dressing, and that we can fly around this environment and maybe run into these cars and cause some crashes.

Funny story. I flew to London back in September 2013 and timed it so that I could attend the Eurogamer Expo. GG devs had a panel going on, and in the Q&A session, I mustered up the courage to ask if the Killzone Shadow Fall city was fully explorable or if it was on rails. The guy let out a sigh, but admitted it wasn't open world and that it's not that easy to implement. lol

I really hope that with the power of next gen, Insomniac lets us fly around some of these levels and crash into those cars. Next gen, imo, should be all about enhanced interactivity.
 

James Sawyer Ford

Gold Member
I think if they are smart, like they were with the pixel budget of the Hellblade 2 teaser, then they can definitely do it. Alex found that the black bars and 24 fps pretty much made it a 1620p game: only 4.9 million pixels rendered instead of the full-fat 8.2 million pixels Insomniac is rendering in Ratchet. Not to mention Insomniac is using ray-traced reflections, and Hellblade 2 wasn't.

Photorealism is very easy to achieve nowadays without ray tracing in Unreal Engine 4 thanks to Megascans. Especially if you are just rendering vast, empty environments instead of filling them up with hundreds of moving cars and multiple neon lights and reflections like we see in the gif below.

MUeIFG3.gif


This gif reminds me of that Cyberpunk intro that was heavily downgraded even on PCs. Just so much going on. I really hope it's not just window dressing, and that we can fly around this environment and maybe run into these cars and cause some crashes.

Funny story. I flew to London back in September 2013 and timed it so that I could attend the Eurogamer Expo. GG devs had a panel going on, and in the Q&A session, I mustered up the courage to ask if the Killzone Shadow Fall city was fully explorable or if it was on rails. The guy let out a sigh, but admitted it wasn't open world and that it's not that easy to implement. lol

I really hope that with the power of next gen, Insomniac lets us fly around some of these levels and crash into those cars. Next gen, imo, should be all about enhanced interactivity.

Ratchet isn't open world, so I wouldn't expect you to be able to explore all of that

However, I think that level of fidelity is absolutely possible in an open-world game thanks to the SSD, GPU, etc.

Ratchet looks almost as impressive as the UE5 demo
 

Rea

Member
Ratchet isn't open world, so I wouldn't expect you to be able to explore all of that

However, I think that level of fidelity is absolutely possible in an open-world game thanks to the SSD, GPU, etc.

Ratchet looks almost as impressive as the UE5 demo
The PS4 Ratchet remake has some levels which are semi-open-world, and you can explore them. I believe this new Ratchet: Rift Apart will also have semi-open-world levels.
 

Bill O'Rights

Seldom posts. Always delivers.
Staff Member
Folks, you have a choice of how to take this thread forward. Either it's a catch-all for any type of games news, tweets, memes, etc., and the thread can be closed and a new one started in 'Communities', or renamed into the 'general console news/info/random tweet thread'. The other option is to stay on topic with what this thread is actually meant to be about. While we've used soft touches up to now, like deletions, notifications, and edits, the overhead for one thread is now too much, and people will start to earn sizeable reply bans in order to course-correct and keep the thread useful for those following.


Thanks.
 

sircaw

Banned
Folks, you have a choice of how to take this thread forward. Either it's a catch-all for any type of games news, tweets, memes, etc., and the thread can be closed and a new one started in 'Communities', or renamed into the 'general console news/info/random tweet thread'. The other option is to stay on topic with what this thread is actually meant to be about. While we've used soft touches up to now, like deletions, notifications, and edits, the overhead for one thread is now too much, and people will start to earn sizeable reply bans in order to course-correct and keep the thread useful for those following.


Thanks.
Just out of curiosity, are you getting a lot of complaints from users about this thread, or is this strict enforcement down to you guys?
 

roops67

Member
We already saw what it's capable of in Demon's Souls. Also, the PS5 SSD is doing wonders.

Anyway, it looks like loading is damn great. Loading a whole damn world in... 1 sec. And actually, I've noticed that the world is actually loaded when Ratchet hits the stone, immediately before the flash animation, and then the new world shows up.

EDk80g6.gif
On this occasion it's not loading up a whole new world. It's like the same place in an alternate dimension; you can see mostly the same objects in the same places, but with a different look. But yeah, it's damn fast nonetheless.
 

sircaw

Banned
Well, well, looky what the cat dragged in. Have you had a good trip backpacking around the world? :lollipop_wink_tongue:

FOR YOUR INFORMATION, no stinky furball drags me around.
Was really sick for ages, but all well now. Good to see you, m8.

Happy to be back here as well, missed the forums and the banter.

And, it's a great time to be back with some really good games hitting the market soon.

Still don't own a bloody PS5, but I did upgrade my PS4 Pro with an SSD after it decided to stop working. It's given it a new burst of life. :messenger_grinning:

Playing a game called Styx atm; once I'm done with that I'm on to Days Gone, really looking forward to it.

I better stop with the penpal log before I get banned. Damn you, Bill O'Rights, you party pooper. :lollipop_disappointed:
 

ZywyPL

Banned
Folks, you have a choice of how to take this thread forward. Either it's a catch-all for any type of games news, tweets, memes, etc., and the thread can be closed and a new one started in 'Communities', or renamed into the 'general console news/info/random tweet thread'. The other option is to stay on topic with what this thread is actually meant to be about. While we've used soft touches up to now, like deletions, notifications, and edits, the overhead for one thread is now too much, and people will start to earn sizeable reply bans in order to course-correct and keep the thread useful for those following.


Thanks.

I think closing this one and creating a new "general next-gen discussion" thread would be the most suitable solution, as that's what this thread essentially turned into a long time ago.
 