What's that?
Hiya, the CU size is likely the same on both consoles, plus or minus not a lot. XSX saved die space mainly by cramming more CUs into 4 shader arrays, with more stuff shared per CU, is an easy way to think about it.
Yes, still just rumors, but MLiD and RGT(?) stated from industry insiders that concessions were made for MS to fit (squeeze) 54 CUs onto the XSX APU. They ripped out the circuitry for the CUs (or individual shader cores) to alter/tune their frequencies around the base clock; this probably also included the 'pervasive fine-grain clock gating' that AMD had shown in their RDNA2 talk. These deleted features are important for efficiency and for keeping the GPU within its TDP when trying to work flat out, which may partly explain why the XSX is underperforming
... where the heck is geordiemp when we need him
The LOD thing interests me; when UE5 was demonstrated on the PS5, they stated there was no more need for LODs.
UE5 stated no more manual authoring of LODs. And presumably continuous LOD refinement.
The former doesn't say anything about the existence of LODs (i.e. we're all still using texture LODs, and will continue to do so as long as texture sampling is used, including in UE5 - but virtually no one has manually generated them in decades).
The latter just means LOD transitions are smooth - it does not mean eliminating discrete LODs (see textures again). While some computational models exist where the lines are a bit blurrier, even for continuous refinement there are almost always some discrete data points represented internally, especially for anything that has to operate in realtime.
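For readers less familiar with texture LODs: mip levels are exactly the kind of discrete LODs being described here, and the hardware blends between two neighbouring levels so the transition looks continuous even though the underlying data points are discrete. A minimal Python sketch of the standard trilinear mip selection; the function name and the toy footprint number are illustrative assumptions, not any particular engine's code:

```python
import math

def mip_level(texels_per_pixel: float, num_levels: int):
    """Pick the two discrete mip levels to blend for a given screen footprint.

    texels_per_pixel: how many base-level texels one screen pixel covers
    (derived from UV derivatives in a real renderer).
    Returns (lower_level, upper_level, blend_factor) for trilinear filtering.
    """
    # Continuous LOD value: log2 of the footprint, clamped to the mip chain.
    lod = max(0.0, min(num_levels - 1, math.log2(max(texels_per_pixel, 1e-6))))
    lo = int(math.floor(lod))            # discrete level below
    hi = min(lo + 1, num_levels - 1)     # discrete level above
    return lo, hi, lod - lo              # blend weight makes the transition look smooth

# Example: a pixel covering ~5.3 texels of a 1024x1024 texture (11 mip levels)
print(mip_level(5.3, 11))  # -> (2, 3, ~0.41): two discrete levels, smoothly blended
```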
Market cap is nonsense in terms of value.
You can change the "value" by 100bln with just 100 shares and 1mln of trading.
130bln market cap - Sony (1,400 subsidiaries, several dozen facilities, 30% world music share, the second biggest movie/picture entertainment business, extensive real estate, 50% world share in CMOS sensors and the No. 1 share in OLED viewfinders, etc.).
30bln market cap - some company that has one brick game.
Sony assets:
List of assets owned by Sony - Wikipedia
I never understand the market cap phenomenon on the web.
Like Elon Musk being the latest "richest man". He's not the richest. He only owns some percentage of stock that skyrocketed with the help of computers. If he wanted to sell those shares, their value would drop 80%. Tesla's market cap is nonsense because it is bigger than Toyota + VW (Audi, Lamborghini, Skoda, Bentley, Ducati) + Mercedes-Benz, and Tesla has no profits and much lower revenue (30 times lower than those companies)
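For what it's worth, the arithmetic behind this complaint is simple: market cap is just the last traded price multiplied by every share outstanding, so a small trade at a new price "revalues" all the shares that never changed hands. A toy illustration in Python; all numbers are made up:

```python
def market_cap(last_price: float, shares_outstanding: float) -> float:
    # Market cap "revalues" every share at the price of the most recent trade.
    return last_price * shares_outstanding

shares = 1_000_000_000             # 1 billion shares outstanding (hypothetical)
print(market_cap(100.0, shares))   # 100bn when the last trade was at 100/share
print(market_cap(200.0, shares))   # 200bn after trades at 200/share,
                                   # even if only a handful of shares actually traded
```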
PS3 was far from premium in multiplat performance, especially in the first wave. The mess around the specs of that console is something else. Kutaragi had completely lost his mind at the time. Can you imagine a PS3 without a "proper" GPU, as was initially intended? According to him, a derivative of the PS2 GPU was enough.
The jump from PS2 to PS3 was huge, and the PS3 was a premium product for its time. This time the console specs are good, but they are not premium; they played it safe with the specs from a price point of view.
Logically this is correct. However, my illogical mind would buy one in a heartbeat even if there was minimal improvement.
I agree. I don't know why people are discussing the idea of a PS5 Pro when we are only just beginning to see what the PS5 is capable of.
Right now developers are still in the Evolution phase.
It will be a year or two before we start to see developers change their code from Evolution to Revolution.
So, I don't see a PS5 Pro happening. Especially when the PS5 is more than capable of lasting up to 6 years.
It's better to use techniques like Checkerboard Rendering than to go out and spend millions to develop a Pro model.
The technology isn't powerful enough for a Pro model to make a huge difference anyway.
PS6 6-7 years from now makes more sense.
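Since checkerboard rendering came up above: the core idea is to shade only half of the pixels each frame in a checkerboard pattern and fill the other half from the previous frame (real implementations add motion-vector reprojection, ID buffers and edge-aware filtering). A minimal NumPy sketch of just the pattern-and-fill step, purely illustrative and not any shipping implementation:

```python
import numpy as np

def checkerboard_mask(h: int, w: int, frame: int) -> np.ndarray:
    # Alternate which half of the pixels is shaded each frame.
    yy, xx = np.mgrid[0:h, 0:w]
    return ((yy + xx + frame) % 2) == 0

def checkerboard_frame(shade, prev: np.ndarray, frame: int) -> np.ndarray:
    """Shade half the pixels this frame; reuse the previous frame for the rest."""
    h, w = prev.shape
    mask = checkerboard_mask(h, w, frame)
    out = prev.copy()              # start from last frame's result
    out[mask] = shade(mask)        # shade only the checkerboard half
    return out

# Toy usage: "shading" is just a constant per frame.
prev = np.zeros((4, 4))
frame1 = checkerboard_frame(lambda m: 1.0, prev, frame=0)
frame2 = checkerboard_frame(lambda m: 2.0, frame1, frame=1)
print(frame2)  # alternating 1s and 2s: half newly shaded pixels, half reused
```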
Man, you are talking about a site that can't discuss a weird PS5 fps issue without calling out the "fewer TFLOPS" narrative (the last COD), but when the Series X is involved it's always "why, why, why" from them. They just look at the high spec numbers and nothing more, as an average person would do.
once again, Tom and Alex are baffled by the performance drops in the XSX version. Alex literally says they don't know why Bungie would allow these drops. Like are you fucking kidding me? Bungie isn't allowing shit. The XSX simply isn't able to run the game at that framerate at all times. Imagine if Alex reviewed a GPU where the GPU isn't able to do locked native 4K 60 fps in a benchmark test and then blamed the benchmark instead of the GPU for the poor performance.
This is getting ridiculous tbh. This is like the 15th game they have reviewed which has performance problems on the XSX, and while I am open to the tools conspiracy theory or excuse or whatever you want to call it, I think it's fair to say that at the moment the XSX isn't as capable as the PS5. There is no need to appear surprised or shocked or make excuses or blame developers. Very bizarre.
You are probably correct, but...
UE5 stated no more manual authoring of LODs. And presumably continuous LOD refinement.
The former doesn't say anything about the existence of LODs (i.e. we're all still using texture LODs, and will continue to do so as long as texture sampling is used, including in UE5 - but virtually no one has manually generated them in decades).
The latter just means LOD transitions are smooth - it does not mean eliminating discrete LODs (see textures again). While some computational models exist where the lines are a bit blurrier, even for continuous refinement there are almost always some discrete data points represented internally, especially for anything that has to operate in realtime.
Dry humor perhaps lol. I don't get them either.
Man, these videos are so painfully unfunny. I just don't get it.
It's fair to compare the PS5 and Xbox, as they are, or are at least reported to be, the same tech. But people only look at teraflops, which are pretty meaningless for games, and nothing else, so they add 2+2 and get 5.
Is the XBSX really the most powerful? That is the real question.
Mark Cerny said: "This continuous improvement in AMD technology means it's dangerous to rely on teraflops as an absolute indicator of performance."
Maybe the Flops and CUs in the PS5 are more powerful than in the XBSX.
Or the PS5 is just better engineered.
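To put numbers on the teraflops talk: the headline figure is just CUs × 64 shader lanes × 2 FP32 ops per clock × clock speed, which is why it says nothing about caches, geometry throughput, or how often the GPU actually sustains that clock. A small Python calculation using the publicly stated configurations (the helper name is mine):

```python
def fp32_tflops(cus: int, clock_ghz: float,
                lanes_per_cu: int = 64, ops_per_clock: int = 2) -> float:
    # Theoretical peak only: every lane doing a fused multiply-add every cycle.
    return cus * lanes_per_cu * ops_per_clock * clock_ghz / 1000.0

print(fp32_tflops(36, 2.23))   # PS5:  ~10.28 TFLOPS at its (variable) max clock
print(fp32_tflops(52, 1.825))  # XSX:  ~12.15 TFLOPS at its fixed clock
```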
What's that?
PS3 was far from premium in multiplat performance, especially in the first wave. The mess around the specs of that console is something else. Kutaragi had completely lost his mind at the time. Can you imagine a PS3 without a "proper" GPU, as was initially intended? According to him, a derivative of the PS2 GPU was enough.
To be clear I said nothing about that - it seems obvious it's using a virtualized asset in that demo.
I know you are a Big Shot here, and know much more than I do. But according to what I've watched already from Atom View and UE5, it's a frame budget system instead, with one asset version.
It's entirely possible they encode their geometry into 2D arrays (there was some patent collateral around it that appeared related - I recall reading on it around the time the demo broke), and there's been prior research on the subject. But that doesn't really say much about the performance characteristics of real-time sampling from that data set.
What if they've found a way to encode a model as a 2D signal at each unique quaternion position - (for each channel/characteristic) into an equation? They could then take the four fragment corner positions of the pixel they want to render, and then do integrations against those equations to get the channel value for each pixel.
Yikes, sorry to hear that friend. Best of luck in your recovery.
Ladies and gents, please let me give you all some sound advice...
Don't get Malaria... or Typhoid.
They suck ass.
I currently have both. Plus an additional bacterial infection, you know, for good measure.
Sorry I know this is OT, but I just need to vent a bit.
Ladies and gents, please let me give you all some sound advice...
Don't get Malaria... or Typhoid.
They suck ass.
I currently have both. Plus an additional bacterial infection, you know, for good measure.
Sorry I know this is OT, but I just need to vent a bit.
Isn't one of MLID's whiteboard topics "AMD vs MS"? I'm really curious about that, regardless of how much of it turns out to be true.
Yes, still just rumors, but MLiD and RGT(?) stated from industry insiders that concessions were made for MS to fit (squeeze) 54 CUs onto the XSX APU. They ripped out the circuitry for the CUs (or individual shader cores) to alter/tune their frequencies around the base clock; this probably also included the 'pervasive fine-grain clock gating' that AMD had shown in their RDNA2 talk. These deleted features are important for efficiency and for keeping the GPU within its TDP when trying to work flat out, which may partly explain why the XSX is underperforming
... where the heck is geordiemp when we need him
What made you think Xbox made a mistake? Why couldn't it be because the Playstation team did a good job?
It seems you assumed incorrectly that Microsoft is better at making consoles than Sony, when Sony is the actual hardware company. At some point you need to stop trying to find blame and give credit to the engineers at PlayStation who did their job well.
Yes, the Xbox team is "proud", but they were never basing their pride on anything. You need to re-evaluate the capabilities of both sides and change your assumptions.
Actually that's false; LODs are still there. The big difference this gen is that they'll be invisible to the human eye. How?
By drawing and rendering polygons at the micro level (breaking down triangles in an asset to the size of a pixel and then making changes to LOD in real time as you're walking through the level, depending on how close to the object you are, and you won't even notice it happening).
This is where PS5’s I/O solution and cache coherency play a MAJOR role.
Like NX Gamer said, if you were to run this demo on a PC right now with the exact same micro-polygon density as the PS5, you would need something like 40GB of VRAM. But because of the ultra-high-speed SSD, the process is virtualized; pretty much what MLiD said in one of his videos, the PS5's 825GB of storage effectively behaves like an extension of RAM thanks to how fast the storage architecture is.
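A rough way to see why per-object (or per-cluster) LOD selection can keep triangles around one pixel without the player noticing: pick the coarsest level whose triangle edges still project to roughly a pixel at the current distance. This is a toy Python sketch, not Epic's actual Nanite algorithm; the function names, the simple pinhole projection and the "each LOD doubles the edge length" rule are assumptions for illustration:

```python
import math

def projected_pixels(edge_len_m: float, distance_m: float,
                     fov_deg: float = 90.0, screen_width_px: int = 3840) -> float:
    # Pinhole projection: how many pixels a triangle edge covers at this distance.
    focal_px = (screen_width_px / 2) / math.tan(math.radians(fov_deg) / 2)
    return edge_len_m * focal_px / distance_m

def pick_lod(base_edge_m: float, distance_m: float, num_lods: int) -> int:
    """Choose the coarsest LOD whose edges still project to <= ~1 pixel.

    Each successive LOD is assumed to double edge length (quarter the triangles).
    """
    for lod in range(num_lods - 1, -1, -1):      # try coarsest first
        edge = base_edge_m * (2 ** lod)
        if projected_pixels(edge, distance_m) <= 1.0:
            return lod
    return 0                                     # fall back to full detail up close

# 1 mm source triangles, 8 LODs: full detail up close, coarser with distance.
for d in (1, 10, 100, 1000):
    print(d, "m ->", "LOD", pick_lod(0.001, d, 8))
```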
That's really rough, going through that hell! How did you even manage to catch it in this day and age? Do you live somewhere where it's prevalent? Sorry to pry, don't answer if you don't want to. Keep your spirits up, get well real soon!!
Ladies and gents, please let me give you all some sound advice...
Don't get Malaria... or Typhoid.
They suck ass.
I currently have both. Plus an additional bacterial infection, you know, for good measure.
Sorry I know this is OT, but I just need to vent a bit.
Damn. How do you even get typhoid?
Ladies and gents, please let me give you all some sound advice...
Don't get Malaria... or Typhoid.
They suck ass.
I currently have both. Plus an additional bacterial infection, you know, for good measure.
Sorry I know this is OT, but I just need to vent a bit.
Mark Cerny is certainly the best at creating the best console possible at the cheapest price possible. We need to keep the price in mind, because I still see some ignorant people comparing consoles to the best PC graphics cards with their super high prices, which is a nonsensical way of thinking.
Mark Cerny is definitely a Wizard
Haven't both VGTech and that Spanish youtuber said the PS5 also has better AA?
I haven't watched the video, but I will take a wild guess and say that it was Alex that said the Xbox doesn't seem to use DRS?
So this "sharper" image might just be his bad vision™, which strangely always adds benefits to the Xbox here and there, which is weird because otherwise it would happen with other consoles as well.
That being said, coming from a guy who can't recognize raytracing in video games, is it that surprising?
Yes. DF right now is absolute trash; they're constantly making mistakes or missing details.
Haven't both VGTech and that Spanish youtuber said the PS5 also has better AA?
Maybe; honestly, I don't follow those comparison videos really closely... I'm too busy playing.
Haven't both VGTech and that Spanish youtuber said the PS5 also has better AA?
They need to cover for it a bit to avoid offending their sponsors.
Yes. DF right now is absolute trash; they're constantly making mistakes or missing details.
"Spanish youtuber"Hasn't both VGTech and that spanish youtuber both said PS5 also has better AA?
When you have multiple outlets corroborating similar results, it's a stronger argument than authority.
"Spanish youtuber"
so you doubt DF but not this Spanish youtuber? And of course PS5Tech, who just says anything that comes to his head without showing any proof
"Spanish youtuber"
so you doubt DF but not this Spanish youtuber? And of course PS5Tech, who just says anything that comes to his head without showing any proof
Just look at what they missed in Dirt 5; the only thing they spotted was the dramatically reduced quality settings in the 120Hz mode.
When you have multiple outlets corroborating similar results, it's a stronger argument than authority.
Two different corroborating sources; that's better than trusting the word of people who cannot even see the differences in quality mode on Dirt 5. They don't portray themselves as amateurs, but their analysis certainly is.
"Spanish youtuber"
so you doubt DF but not this Spanish youtuber? And of course PS5Tech, who just says anything that comes to his head without showing any proof
Mark Cerny is definitely a Wizard
Getting a little personal there, buddy. Not hard to understand that a demo, as in demonstration, was 5 months away from release.
It was a demo, not a final game. Not hard to understand that what we saw will bear no resemblance to what we see next. So it's pretty dumb to compare an unfinished game to a finished game. But go ahead and look stupid
DirectX 12 Ultimate: How is it Different from DirectX 12?
DirectX 12 Ultimate is an incremental upgrade over the existing DirectX 12 (tier 1.1). Its core advantage is cross-platform support: Both the next-gen Xbox Series X as well as the latest PC games will leverage it. This not only simplifies cross-platform porting but also makes it easier for developers to optimize their games for the latest hardware.
Nothing showed up.
For everyone that believes "Tools" are the answer, please read this:
Pressed the wrong combo of buttons obviously, instead of pasting, it made the post lol
Nothing showed up.
Anyway, the tools are the ones talking about tools.
But if it was equations they were recovering from disk, and regenerating the data on the fly by supplying coordinates (as integration limits for each viewport pixel), then the performance would be constant IMO, because the reconstruction (integrations) would be one per viewport pixel, per channel/characteristic. So even if the source asset was created with 20 million polygons in ZBrush, the Atom View data representation - in my hypothetical solution - would just be 360 (deg) x 180 (deg) x model channel count equations, and they'd only need to retrieve the equation sets that match the specific orientation/quaternion (180 around x and 360 around y) of the model and reconstruct at each viewport pixel by integrating between limits for each pixel...
It's entirely possible they encode their geometry into 2D arrays (there was some patent collateral around it that appeared related - I recall reading on it around the time the demo broke), and there's been prior research on the subject. But that doesn't really say much about the performance characteristics of real-time sampling from that data set.
And without getting very explicit about implementation details it's hard to say much more. Given they showed detail scaling from something around 1 pixel/mm² all the way to 2-4km viewing distance in that demo - you'd be hard pressed to argue sampling everything from the top level would not be detrimental to performance (whether constantly, or just degrading the farther things get from the camera).
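For what the "2D arrays" idea looks like in practice, the published research on "geometry images" stores vertex positions in a regular 2D grid, so a surface point can be reconstructed by bilinearly sampling that image at a (u, v) coordinate, exactly like a texture fetch. A tiny illustrative Python sketch of that idea only, not a claim about what Epic or Atom View actually do; all names here are mine:

```python
import numpy as np

def sample_geometry_image(geo: np.ndarray, u: float, v: float) -> np.ndarray:
    """Bilinearly sample a geometry image (H x W x 3 positions) at (u, v) in [0, 1]."""
    h, w, _ = geo.shape
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = geo[y0, x0] * (1 - fx) + geo[y0, x1] * fx
    bot = geo[y1, x0] * (1 - fx) + geo[y1, x1] * fx
    return top * (1 - fy) + bot * fy     # a surface position, recovered like a texel

# Toy 4x4 geometry image: a flat patch in the XY plane.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 4), np.linspace(0, 1, 4)), axis=-1)
geo = np.concatenate([grid, np.zeros((4, 4, 1))], axis=-1)
print(sample_geometry_image(geo, 0.5, 0.25))  # interpolated position on the patch
```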
That does not mean they've solved the XSX performance stack. Obviously the tools are the same for devs; the issue is that MS has to tune the stack in the background. I have my doubts that they are ready: they were late, far behind Sony, and on top of that they have to improve the performance stack themselves, incrementally. I was in development around the X1X when it was announced, and it was wild; we at Warhorse Studios received nightly builds just so the game didn't run like shit. That was when they had to take into account the different memory setup on the X1X (in comparison to the eSRAM on the X1S); a lot of things were buggy and underperforming. So I am not sure I can trust MS not to have fucked it up again.
For everyone that believes "Tools" are the answer, please read this:
What is the Difference Between DirectX 11 and DirectX 12 | Hardware Times
The way people talk in this thread is as though it's completely different from what they've been using in the Xbox SDK since 2015. It's not new, it's not completely foreign to devs as has been made out previously, it's not something they "need to get a handle on" or something they need to learn; they know it, it does what it does, and that's it.
Expecting something good to come from better tools is not really going to happen, I'm sorry to say.
DirectX is fantastic for BC; the code doesn't care one shit what hardware it's running on as long as it has all the same features (this is the "Forward Compatibility" tag line from the previous gen), which is why the Series X can run previous-gen things at higher res and with better quality without any fuss whatsoever. It also means there is a layer taking performance away from gaming at all times. That layer's overhead only gets smaller when other APIs - Mantle initially, and latterly Vulkan - get huge wins over DX by reducing it, which forces MS to update the API.
But the GDK being "new" is a total fallacy; it's a renamed and updated SDK and won't make much difference, if any.
It has already been said and written so many times that variable frequencies are not there to let values float, nor because it is impossible to run stable clocks, but in order to maximize the peak of the hardware's capabilities and the efficiency of each component when necessary. Learn the technical part, please, because the Xbox will have even more problems: its SoC also has a power limit, but there is no smart power management like in the PS5. The bottleneck will be exactly at the power limit when the computational loads on the chip are unusually high. That's why Cerny emphasized the need to move away from the old paradigm, due to its low efficiency, to the new one.
There is a good chance it's winning these cross-gen comparisons because the CPU being pared back isn't affecting performance, because these cross-gen games barely even use the CPU, as most PC benchmarks can show us. Once next-gen games with heavy physics, massive NPC counts and destruction arrive, we might start to see issues if the variable clocks impact the CPU too much.
So could the Sony Bravia XR be their own version of VRS, and could the cognitive algorithms in the Bravia XR processor also be implemented in, say, the PS5's Tempest Engine?
You nailed it, well said.
It has already been said and written so many times that variable frequencies are not there to let values float, nor because it is impossible to run stable clocks, but in order to maximize the peak of the hardware's capabilities and the efficiency of each component when necessary. Learn the technical part, please, because the Xbox will have even more problems: its SoC also has a power limit, but there is no smart power management like in the PS5. The bottleneck will be exactly at the power limit when the computational loads on the chip are unusually high. That's why Cerny emphasized the need to move away from the old paradigm, due to its low efficiency, to the new one.
Think twice before you write another cool story about issues of variable clocks.
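To make the "smart power management" point concrete: the PS5 approach as Cerny described it is deterministic, with clocks set from a model of the workload's power draw rather than from temperature, so every console behaves identically for the same activity. A toy Python sketch of that idea; the function name, the 200 W figure and the cube-law frequency/power relation are assumptions for illustration, not Sony's actual model:

```python
def pick_clock(activity: float, power_budget_w: float,
               max_clock_ghz: float = 2.23, power_at_max_w: float = 200.0) -> float:
    """Deterministic clock selection from a workload power model (toy version).

    activity: 0..1 estimate of how much of the chip the current workload exercises.
    """
    predicted_w = power_at_max_w * activity        # power if we held the max clock
    if predicted_w <= power_budget_w:
        return max_clock_ghz                       # typical load: stay at max clock
    # Heavy load: drop frequency just enough to fit the budget (cube-law toy model,
    # since power falls much faster than frequency when voltage drops with it).
    scale = (power_budget_w / predicted_w) ** (1 / 3)
    return max_clock_ghz * scale

print(pick_clock(0.6, 180.0))   # typical load: holds 2.23 GHz
print(pick_clock(1.0, 180.0))   # pathological "power virus" load: ~2.15 GHz
```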
Easy: making an X-ray is a costly activity, so whoever has done it wants to cash out. And I can't fault them for trying to get paid for the effort.
Sony didn't stop people from ripping out the APU. Those people could have done this by now.
As for me, I would just be happy actually having a PS5. No way would I sacrifice it for that.
You keep going back to the well on this one. What we saw couldn't even be classified as an "Alpha". It was a gameplay demo, and not even shown on actual hardware! There was no indication it was even using the most up-to-date assets they had (if you read those supposed developer leaks, they claim team members held off making check-ins and the build was broken daily). Clearly not an alpha state, would you agree?
Getting a little personal there, buddy. Not hard to understand that a demo, as in demonstration, was 5 months away from release.
You actually think what we saw in July would have been enough time to polish it and have it ready for Xbox Series X launch and still be considered a title that showcases next gen?
Obviously you know the answer already. Hence why there was such a backlash from gamers. Delaying it was the best outcome for everyone who wants to play it.
The only dumb thing was the decision to Demo Halo Infinite in July, expecting gamers to NOT be disappointed with the state the game was in. Hopefully it has improved significantly since.