Terrible. I hope these are placeholders.
That's what mid-gen refresh is for!
Honestly, atm ray tracing is a waste of power on consoles for not a massive payoff. In a few years, sure – and if MS/Sony can do a form of DLSS 2.0, then maybe ray tracing will be worth it.
Potential box art of XSX games from Amazon
This "optimized for" seems like a bad marketing, it means other games that doesn't have the logo aren't "optimized for" which will be looked negative from people, consumer need to think in his brain that every game is optimized for his console.
Hope Sony won't follow this imo not that good marketing move.
Big BRUHHHH, if a cross-gen title can't do 60fps without RT we are screwed
Such a big disappointment.
Like, even a 2080 Ti ain't doing proper Ray Tracing very well. I'll be surprised as hell if any console hardware is more capable even with all of the advantages it has of being a fixed target to develop for.
Should be the case; I can't imagine it actually looks like that. I'd also think it's quite early for box art, no?
I don't know, man, but they mentioned that a few titles are 60fps, and AC Valhalla wasn't one of them.
I can't see that being true, and 1080p@60fps is just a joke. I haven't seen anything impressive so far. And if AC were running full RT, then 30fps would make sense, but there's no reference to that. Or it could be that the Windows-based API isn't in good shape.
No, it's not, because you're thinking within current-gen games and specs.
"Optimized For" is a really bad turn of phrase in my opinion. Makes it sound like they didn't even test the shit to run on lower-end hardware. Maybe it's just me.
Should've just said, "Better on Series X" or "Series X Enhanced"
I don't think it works that way. Otherwise, what would stop Sony from calling their system a 560 GB/s system rather than a 448 GB/s one?
The number of lanes (i.e. the bus width) connected to the chips is what matters. RAM has a certain transfer rate per pin, and both consoles are using 14 Gbps parts. GDDR6 chips can theoretically reach 16 Gbps, but that would most likely be pushing things too hard for a console.
Each chip has a fixed number of lanes that can be connected to it, so the more chips you have, the more lanes you get. The whole reason MS went with their weird configuration was to increase the number of chips so they could have more lanes, i.e. a wider bus. If they could have achieved the same bandwidth with 2GB chips, they would have done it, since that is cheaper in every way.
The only way to achieve higher bandwidth with the same chips is to raise the data rate above 14 Gbps. The PS5, for example, would reach 512 GB/s with its 8x 2GB chips if they were 16 Gbps parts instead of 14 Gbps.
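For anyone who wants to check the arithmetic, here's a minimal Python sketch. The chip counts and data rates are the ones quoted above; the 32-data-pins-per-chip figure is the standard GDDR6 configuration:

```python
# Peak GDDR6 bandwidth: total bus width (bits) x per-pin rate (Gbps) / 8 bits per byte.
def peak_bandwidth_gb_s(chips: int, pins_per_chip: int, rate_gbps: float) -> float:
    bus_width_bits = chips * pins_per_chip  # more chips -> more lanes -> wider bus
    return bus_width_bits * rate_gbps / 8

print(peak_bandwidth_gb_s(8, 32, 14))   # PS5: 256-bit bus at 14 Gbps -> 448.0 GB/s
print(peak_bandwidth_gb_s(8, 32, 16))   # same 8 chips at 16 Gbps -> 512.0 GB/s
print(peak_bandwidth_gb_s(10, 32, 14))  # XSX: 320-bit bus at 14 Gbps -> 560.0 GB/s
```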
This "optimized for" seems like a bad marketing, it means other games that doesn't have the logo aren't "optimized for" which will be looked negative from people, consumer need to think in his brain that every game is optimized for his console.
Hope Sony won't follow this imo not that good marketing move.
How many times more powerful is the PlayStation 4 than the GameCube, which ran Resident Evil 1 at 30 fps?
I never get tired of posting this, but unfortunately it's up to the devs. We'll probably get more 60fps games, or games that already run at 60 but are more stable this time.
I agree, and that "optimized" logo is ugly as sin and very off-putting.
What if the games run like shit, though, and need patching? Doesn't that contradict the term 'optimized'?
60fps is a reality on a simple GTX 1660 (medium/very high, 1440p):
30fps is so 2013
Especially with last-gen ports like the Uncharted collection, God of War 3 and TLoU all running at a locked 60fps (or very close; I could not tell if there were drops).
Sky is blue, from what I heard.
I do; I meant that I can't tell whether TLoU has drops or not (I play on the Pro at 4K).
Kingdom Hearts 2 is by far my favorite, because I played it on PS2 and those games really show the difference. I have God of War 3, TLoU and the Uncharted collection, so those who played them last gen can feel the difference.
I've had a look into the physical hardware side of things and I'm still not convinced – but I'm less inclined to fully rule out your claim after looking extensively at the AMD Infinity Fabric material, the Zen 2 floor plan, and the shot of the exposed XSX APU – all of which is very inconclusive.
Conventionally, a CPU core will try to offload its outer cache (L3 in this case) through the cache-attached memory controller to the pool of memory it is always scheduled to read from and write to. That pool is also the one with the lowest latency, because of wiring length – in this case, the 6GB. It is done this way to avoid data-starving or blocking the cores when the outer cache cannot be offloaded or filled (which then cascades the copying through the cache hierarchy). And the idea of data-starving and blocking the CPU and GPU at once just to avoid an asynchronous copy from the 6GB to the 10GB doesn't sound good for parallel processing, latency, or utilisation (IMHO) – especially when the L3 cache is typically used far more for data destined for the 6GB, and that cache data would be stuck waiting for the portion destined for the 10GB to get the memory controller before it could offload to, or be filled from, the 6GB.
(AFAIK) Conventionally, the memory controllers are initialized by the low-level BIOS/UEFI system prior to bootstrapping – hence why the Raspberry Pi and laptops with iGPUs fix the memory split before bootstrapping – and in that initialization the controller(s) are set up to enqueue commands from the outer CPU cache. So it is certainly not likely that a to-the-metal approach would let developers change how data gets to each memory pool.
But with all that said, the Infinity Fabric design of Zen 2 and Vega (which I assume the XSX uses, or the early Infinity Architecture, given the old-style 60 CU GPU of the XSX) looks like it might be able to decouple the memory controllers (for all chiplets) behind the Infinity Fabric, making CPU memory access to the 10GB trivial. But the split bandwidths and different bus widths suggest something has not been decoupled for that optimal setup. A normal Zen 2 has two 64-bit memory controller units (MCUs), one per L3 cache – and one L3 cache per 4C/8T module – and we know the XSX has a 192-bit bus width on the Zen 2 side, so it has 3 MCUs (and logically three L3 caches). RDNA2 AMD GPUs, meanwhile, will have either a 256-bit bus (4 MCUs, 2 per side) or a 384-bit bus (6 MCUs, 3 per side). Neither of those setups fits the XSX's 5 MCUs.
If the XSX's ten 32-bit chips are interfaced with 3 MCUs in the Zen 2 and another 5 MCUs in the GPU, then what you are claiming seems highly plausible, as the MCUs would likely manage the complexity. However, if the memory is connected to 3 MCUs in the Zen 2 and only 2 MCUs live in the GPU, then for latency reasons alone I would expect the CPU to pass all data to the GPU by copying to the 6GB and then to the 10GB. The picture of the exposed XSX chip has 5 bright silver units on its north edge – a cluster of 3 in the middle and one further out on each side. I think those could be the MCUs, because they sit before the black moat that I assume is the Infinity Fabric wiring out to all 10 GDDR6 modules, and that matches the Zen 2 style of design (AFAIK).
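Restating the counting in that post as arithmetic – this just encodes the post's own premise of one 64-bit MCU per 64 bits of bus width; nothing here is confirmed hardware detail:

```python
MCU_WIDTH_BITS = 64  # the post's premise: each MCU drives a 64-bit slice of the bus

def mcus_for(bus_width_bits: int) -> float:
    return bus_width_bits / MCU_WIDTH_BITS

print(mcus_for(192))      # 3.0 -> the 192-bit Zen 2 side, hence three L3-attached MCUs
print(mcus_for(256))      # 4.0 -> a typical 256-bit RDNA2 part
print(mcus_for(384))      # 6.0 -> a typical 384-bit RDNA2 part
print(mcus_for(10 * 32))  # 5.0 -> XSX's full 320-bit bus: fits neither GPU layout
```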
AC: Valhalla could easily be 60 fps on the next-gen consoles, even at a lower resolution... but it won't be.
And you know why?
Not because the developers are lazy.
Not because the new consoles aren't powerful enough.
But because 60 fps is a PROMISE.
A promise that all the next AC titles will also be 60 fps.
Because once you go 60 fps in a cyclic series, there's no turning back.
And it's not an easy promise, especially for open-world games.
Can you imagine if, after so many years, another Call of Duty only went for 30 fps? What a flame war it would cause across all media.
Now, go back to the year 2013, when AC: Black Flag came out.
That cross-gen title could also have run on PS4/XO at 60 frames. Easily.
But it didn't, for a simple reason.
Because Ubisoft already knew at that time that with the more ambitious projects they were planning for AC: Unity and later titles like AC: Origins/Odyssey, they wouldn't be able to deliver 60 fps for the current generation.
Can you imagine how much AC: Unity, and later AC: Origins/Odyssey, would have had to be downgraded if they had to deliver 60 fps on the current generation?
Yeah. And it would even have hurt sales if those games were graphically much worse than competing titles aiming for 30 fps.
And Ubisoft knows very well that it doesn't matter that they now have enough power to make AC: Valhalla run at 60 fps on next-gen,
because they also know that a 60 fps target would severely limit the ambitions of truly next-gen Assassin's Creeds.
So that's why I think Ubisoft (and many other companies) don't want to close the 30 fps door: it lets them deliver much more ambitious things in the future, and the promise that every sequel will always be 60 fps may simply never be kept.
You obviously never played the last-gen ports released on the PS4 (TLoU, Uncharted, God of War 3... among others)... all 60fps, and none of them prevented the next installment from targeting 30fps (Uncharted 4 even changed it during development).
Big BRUHHHH, if a cross-gen title can't do 60fps without RT we are screwed
Such a big disappointment.
It's that they were marketed and hyped poorly; and presented as something they were not.
They might, but I don't. Give me locked 60 FPS and anything over 1440p upscaled, and they can do whatever else they want.
At 4K I don't notice the difference between native, 1600p, checkerboard, or a temporal solution on the Pro, and I bet most others won't either.
But frame rate is glaring: 30 vs 60.
Ray tracing and teraflops are this gen's buzzwords.
Ray tracing is the holy grail of graphics.
In the real world, the tech is still in beta form and isn't a game changer yet. I'm sure it'll speed up dev time in the future, since there's no baked lighting, but we're still years away from that (imo).
Digital Foundry has a great video on why AC wasn't 60fps, it was because of the anaemic CPUs on the PS4 and Xone. It's that simple.
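The frame-time arithmetic behind that point is worth spelling out. A small sketch – the ~20 ms simulation cost is a made-up illustrative figure, not a measured one:

```python
# Going from 30 to 60 fps halves the time the CPU has for each frame.
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame for all CPU work")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms. If AC's simulation (AI, crowds, physics,
# draw submission) costs, say, ~20 ms on Jaguar cores, 60 fps is unreachable
# there no matter how strong the GPU is.
```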
What did you do?? Fuck, that is freaking me out! Shit, I need some alcohol to get that image out of my head.
Me when I realize, after 1870+ pages of tech talk, that we end up with a 30fps cross-gen title
NOPE.
Let me say this: those who are happy with just a simple logo reveal and a picture of a controller have no right to criticize the number of games we saw. Yes, the games are not for everyone, but it beats the shit out of a logo and a controller.
Just my opinion.
For sure this is it, but it completely makes a mockery of, and undermines, the 'Optimized for Xbox Series X' badge. Or worse, the whole thing is a lie...
The PS5's SSD is far superior to the XSX's, far superior. But the accelerators/software to remove bottlenecks are 50/50 at this point. Until we know the full details of XVA, we can only say the PS5 will have much higher throughput.
We told you guys that the Xbox Series X doesn't have any "secret sauce" in its GPU, or any "secret sauce" in its SSD architecture; the PS5 has everything the Xbox has in terms of GPU, with maybe some difference at the API level.
That's it.
PS5's SSD DESTROYS XSX's SSD!! No comparison, anyone who says otherwise is a Phil Spencer taint lover.
I don't think any company will sell above $499, based on the lessons from the PS3. Phil has already said the price will be competitive.
MS has already shown their hand: they won't go until July. Sony can't afford to wait that long, even if that was their plan.
I think they believe MS is well over $499 and want to see what they will price the Series X at. Lockhart is a red herring imo; I don't think it launches on day 1. I think MS threw it out there to make Sony think twice before launching at $499 and to downgrade their specs, which, as we see from the RAM bandwidth downgrade, actually kinda worked.
Sony has no choice but to reveal the console and price in June. They kinda screwed themselves by going with a smaller die, and now they simply can't launch at the same price point as the Series X. But if the Series X is $549 or $599, then they can go with a $500 PS5.
If the PS5 were $399, Sony would have zero reservations, but I think that expensive SSD is the reason Sony is going over $399. The question is how much smaller that die really is, especially with the repurposed 3D-audio CU, the I/O processors, and the coherency engines. If it's around 300mm², they will likely save $20 compared to the Xbox – $40 if prices have doubled from last gen. How much cheaper is 448 GB/s RAM compared to the 560 MS is going with? 25% more bandwidth for 25% more cost? So $25, if the PS5's RAM is $100?
But again, MS was smart and only ordered 10 GB of ultra-fast RAM, with the other 6 GB slower, so I bet it cost them the same as Sony. Best-case scenario for Sony: their 825 GB SSD costs the same as MS's 1 TB SSD, their RAM costs $25 less, and their chip is around $40 less. Their cooling solution is inexpensive, like Cerny said, so that's likely another $10-15 in savings. That's almost $80 in savings, which could help them launch $50 cheaper.
Worst-case scenario: their SSD is more expensive, their die size is effectively the same, their RAM costs are the same, and the cooling solution costs only a fraction less than MS's vapor-chamber cooling. Then they have a console with 20% fewer teraflops and the same BOM as MS. They would have to launch at $500 and hope for the best.
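Tallying the guesses above in one place – every figure here is the speculation from this post, not real BOM data:

```python
# Best-case PS5 savings vs. Series X, using only the numbers guessed above.
savings = {
    "smaller die": 40.0,          # ~300 mm^2, assuming wafer prices doubled since last gen
    "448 vs 560 GB/s RAM": 25.0,  # ~25% less bandwidth for ~25% less cost
    "cheaper cooling": 12.5,      # midpoint of the $10-15 guess
    "SSD": 0.0,                   # best case: 825 GB costs the same as MS's 1 TB
}
print(f"best case: ~${sum(savings.values()):.0f} saved")  # ~$78, the post's "almost $80"
```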
Did you expect XSX to have features not on High end PCs?
The type of RT these consoles will support is likely pretty "low tier". We'll get basic support in most games, with less visually demanding games like Minecraft showing off higher tiers of RT, as we've seen.
I see where you're coming from, but a cross-gen title should easily do 60fps, because in 1 or 2 years we will see many games with RT, which is very demanding. It seems 60fps in open-world games is just a dream after all.
Yes, it's absurd that they haven't clarified why Valhalla is not 4K/60. Maybe they want to push it higher? Or is it going to be 30? It's weird, because hitting 4K/60 should have been easy considering this is a cross-gen game.
Man, how do you deflect this onto others? First you deflect MS's blunder, then you deflect your own triggered status onto others.
Lol, I forgot how triggered NeoGAF can get.
I'm sorry, guys, I forgot that in this realm the controllers and logos are what's most important.
I think it's shitty to downplay what some of these indie devs are working on as trash. There are some very good games on show, but I guess most of you only want to see exclusive AAA games.
I will go discuss the games in other forums where discussion is civil and doesn't boil down to "BuT ThEy HAvE MoRE LiKEs oN InSTAgRAm AnD THE MEaninG BeHInD thE LoGO is VeRy SeNTiMEntal sLoLOL AaA GamES ARe THe OnLY GaMeS WORth TAlKinG AboUT".
Well, heh!... I expected Guerrilla to crush toes... SWWS will lead the pack once again next gen... Those guys always juggle the best-graphics crown between them all gen anyway.
Yes please. I await them wholeheartedly.
List of current third-party studios working on XSX games
Can't wait to see some proper next-gen gameplay footage. Today wasn't it.
I don't like this analogy, personally. I watched a lot of tech YouTube channels and did research on several websites when I was starting AC Odyssey on my PC, and almost all of them agree that Very High settings in that game are just a waste of resources, not worth the return in graphical fidelity. I think this shows that lazy development and poor optimization can lead to horribly wasted resources. Ubisoft has a bad rep in the PC community for lazy optimization, from my experience researching the best settings for my setup. Funnily enough, a lot of the time people just say you'd be better off playing on consoles, since they had a smoother experience with less headache. IMHO even the cheapest TVs and monitors nowadays are 60 Hz; game companies should set the bar there and develop around maintaining it, and the new CPUs in the PS5 and XSX should help achieve that easily compared to that joke of a Jaguar.
Per Anandtech's testing, at 4K/Very High (what people want from next gen) a 2080 (a decent comp for the XSX) can only manage upper-mid 40s. For a game two years newer and even more demanding, the XSX will either have to tone down settings or live with 30. Even a 2080 Ti fails to hit a perfect 4K/60, so one would assume a 3070 will be necessary for AC: Valhalla at 4K/60, at minimum. So 30 FPS is only "so 2013" if you live at 1440p – which is what's truly "so 2013".
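The scaling implied by that quote, assuming frame rate scales roughly linearly with GPU throughput (a crude approximation; the mid-40s figure is the one cited above):

```python
measured_fps = 45  # ~ a 2080 at 4K/Very High in AC Odyssey, per the quoted testing
target_fps = 60
print(f"needs ~{target_fps / measured_fps:.2f}x the GPU throughput")  # ~1.33x
# So roughly a third more GPU than a 2080 just for Odyssey at 4K/60, before a
# newer and heavier game like Valhalla is considered.
```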