3liteDragon
Member
Is this reputable analysis, or should we wait for DF?
> Is this reputable analysis, or should we wait for DF?
Probably best to wait for DF’s analysis.
Idk shit about specs tbh. But my friend is coping right now after failing to get a PS5, saying that his PC build based on this video is superior to the PS5 in specs. Is that true?
He said it's "way better"
> GTX 1660 is pretty shit, so no.
Can you like elaborate some more so I can explain that to him?
> It's flexible enough, though it's pretty stiff.
I’ll give it a shot anyway. I hate those rigid cables, but I’ll take what I can get right now, as long as it works. Thanks.
> Can you like elaborate some more so I can explain that to him?
The 1660 is a GPU about two generations behind; it has no support for RT and it's basically the bottom of the barrel. I think you need at least a 2060 to even have a chance of competing with the PS5 or Series X.
> The 1660 is a GPU about two generations behind; it has no support for RT and it's basically the bottom of the barrel. I think you need at least a 2060 to even have a chance of competing with the PS5 or Series X.
He also said "I have AMD Ryzen 5 with a 1060 so I’m guessing high-ultra" for Cyberpunk, and "200+ FPS on PC vs what on console? 60-120fps".
Was just having a little go.
Weirdly enough, both the Series X and RTX 3000 cards don't work with the newer 8K receivers, because they output 4K 120fps HDR in 4:4:4 chroma... you get a black screen.
The PS5 is fine because it outputs in 4:2:2 chroma.
This guy is saying it's a bug in the HDMI chipset made by Panasonic.
Has the PS Blog always stated VRR was coming in a future update? I know we all expected it, but I didn't know if Sony had actually had that in the blog all this time lol
> Can you like elaborate some more so I can explain that to him?
It's not that powerful; it's around an X1X in horsepower, but that PC has a better CPU than the X1X. So most likely games will run at higher FPS, but with lower details and lower resolution. Say you put this GPU in the new consoles: it would run at, say, 1440p without ray tracing and all that nice stuff, with lower AA and lower details. You basically cannot get a GPU at that price point which is powerful enough to beat the XSX or PS5...
I think it is weirder that the HDMI bandwidth on the PS5 is limited to 32 Gbps (or is this FUD?) and will be raised in an update (?).
It looks like there are a lot of issues with HDMI 2.1. Even the XH90, the TV Sony marketed for the PS5, needs updates to fully support all 2.1 features.
Is HDMI 2.1 very taxing? Could someone test some XSX games over HDMI 2.0 and check if it somehow slows down the console (checking for tearing and FPS drops)? It is strange that only MS managed to implement it "correctly" in their console and not Sony.
> It's not that powerful; it's around an X1X in horsepower, but that PC has a better CPU than the X1X. So most likely games will run at higher FPS, but with lower details and lower resolution.
Thanks. So will he run Cyberpunk at "high-ultra" settings? Prob not?
> Thanks. So will he run Cyberpunk at "high-ultra" settings? Prob not?
Well, in the PC you posted, it would be more like medium at best, at some modest resolution, like 1080p, or 1440p on low. That's my estimate; I would say those cards don't even have DLSS, that ML upsampling.
> Shall I inform him, or will you?
Go ahead and tell me.
When it's time for the commercials and Sony has people in ads acting like the touch sensors in the DualSense have changed their lives, the Xbox Series X can't really counter that.
> He also said "I have AMD Ryzen 5 with a 1060 so I’m guessing high-ultra" for Cyberpunk, and "200+ FPS on PC vs what on console? 60-120fps".
Nah lol
> Well, in the PC you posted, it would be more like medium at best, at some modest resolution, like 1080p, or 1440p on low.
He just told me this:
> I think it is weirder that the HDMI bandwidth on the PS5 is limited to 32 Gbps (or is this FUD?) and will be raised in an update (?).
The PS5 limits the bandwidth to 4K 120fps HDR 4:2:2 12-bit, which is around 32 Gbps.
It will increase when the patch for 8K 60fps goes live.
With the current PS5 output, it works with all receivers without issues.
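That 32 Gbps figure actually checks out if you do the arithmetic. A back-of-envelope sketch (the 4400x2250 total raster for 4K120 and HDMI 2.1's 16b/18b FRL coding overhead are my assumptions from the published specs, not something stated in this thread):

```python
# Rough HDMI 2.1 bandwidth estimate - back-of-envelope, not official numbers.
# The standard 4K120 video timing uses a 4400x2250 total raster (incl. blanking).
pixel_clock = 4400 * 2250 * 120  # pixels per second, ~1.188 GHz

def gbps(bits_per_pixel):
    """Link rate in Gbit/s, including FRL's 16b/18b coding overhead."""
    return pixel_clock * bits_per_pixel * 18 / 16 / 1e9

# Bits per pixel = bit depth x average samples per pixel after subsampling:
# 4:4:4 keeps 3 samples/pixel, 4:2:2 averages 2, 4:2:0 averages 1.5.
print(f"4K120 12-bit 4:2:2: {gbps(12 * 2):.1f} Gbps")    # ~32 Gbps (PS5 cap)
print(f"4K120 10-bit 4:4:4: {gbps(10 * 3):.1f} Gbps")    # ~40 Gbps
print(f"4K120  8-bit 4:2:0: {gbps(8 * 1.5):.1f} Gbps")
```

By the same math, 4K120 at 10-bit 4:4:4 needs roughly 40 Gbps, which would not fit under a 32 Gbps cap; that lines up with the PS5 dropping to 4:2:2.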
Yep... Sony's marketing, ever since they started talking about the PS5, has been better than MS's... MS didn't know what to do and focused on giving out a lot of tech data with big numbers for their base, forgetting that consoles are about how you play games.
But I don’t believe people are faking the DualSense features... it really does surprise you.
I definitely need to read into the whole chroma, 4:4:4, HDR and bit-rate topic...
For a normal consumer this is all too complicated.
I thought HDMI 2.1 was one standard: just pick your preferred resolution and Hz and start playing. Not something with billions of different options, which don't even work correctly depending on the console and TV...
And theoretically pick 12-bit. But AFAIK there are currently no 12-bit TV panels, so it doesn't make sense to use it?? What is the bit number for, and is it connected to the 4:4:4? Because that's 12 xD and 4:2:2 is 8? ... So 8-bit?
From what I understand, the best color is 4:4:4.
You can see the triggers in action in this video.
> He just told me this: "In what aspect" is the PS5 more powerful hardware? "Basically all games on console you can’t configure your settings it’s all capped. Only thing I am iffy about is 4K but if" our one friend's "pc can run 4k our specs are basically the same." This is our one friend's PC he's referring to, the $1000 one (Dell link: www.dell.com).
From your friend's answers it's pretty clear that he's either like 15 years old or completely new to PC gaming and only gets his info from memes. Yes, PCs are superior to consoles; yes, you can always modify your settings on PC as much as you like, and you can have high performance and visuals at the same time (instead of choosing between a performance mode and a quality mode). BUT, right now, at this moment, you absolutely can NOT build a PC for $500 that's going to get you better results than a PS5 or a Series X, and I'm saying that with a 2070S and a 3800X. In two years you'll be able to build a better PC for that price, but all the ones you linked are quite frankly bad builds. If you want a fighting chance you need to start with at least the 2000 series, since those at least support DLSS.
> He just told me this: "In what aspect" is the PS5 more powerful hardware? "Basically all games on console you can’t configure your settings it’s all capped. Only thing I am iffy about is 4K but if" our one friend's "pc can run 4k our specs are basically the same." This is our one friend's PC he's referring to, the $1000 one (Dell link: www.dell.com).
That $1000 one is about the power of an Xbox One S (well, maybe two, but it still sucks). Sure, it has potential if you swap the GPU for another $500+ card, and sure, you can probably run CS:GO at 4K, but more demanding games? It's not going to happen.
RGT leaked this a month or two ago, as did a few others and I'm pretty sure MLiD got it from him (as much as he likes to deny it).
Although RGT didn't mention who was developing it, because he "could've got his sources in a lot of trouble". He did mention a few interesting details: that it's also coming to PC, and that it'll "blow minds once we see it".
EDIT: he also mentioned that Bloodborne is getting a remaster for the PS5
Timestamped link
> Do you think they capped it temporarily until the fixes are in, so it works on all 2.1 equipment? Since we know the XSX currently doesn’t work with those receivers that the PS5 does. Also, did Vincent say you don’t get 4:4:4 with 4K/60 on the XSX? Why is that? Or am I misunderstanding?
I don’t know.
> I definitely need to read into the whole chroma, 4:4:4, HDR and bit-rate topic... For a normal consumer this is all too complicated.
Remember when we used RGB full and limited? Well, RGB full is chroma 4:4:4... it has the full range of colors.
> And theoretically pick 12-bit. But AFAIK there are currently no 12-bit TV panels, so it doesn't make sense to use it?? What is the bit number for, and is it connected to the 4:4:4?
You will see more difference from the chroma than from the bits... and they are independent settings: you can have 4:2:2 with 8, 10 or 12 bits, and the same for 4:4:4.
> You will see more difference from the chroma than from the bits... and they are independent settings: you can have 4:2:2 with 8, 10 or 12 bits, and the same for 4:4:4.
Going from 10-bit to 12-bit supposedly removes any banding that you might see in gradients; you can see it in sky shots sometimes. TVs have gotten better at blending it.
So it is not directly related to the chroma.
Bits: the depth of the color, i.e. how many shades the panel can display... 12 bits allows values from 000000000000 to 111111111111... that means 4096 levels of each primary color, and a total of 68,719,476,736 different colors. You won't see a big difference here, because going from 8 to 10 to 12 bits only changes the number of tones of the same color... instead of 15 yellows you have 18, in a not-literal example... 8-bit and 10-bit already give you a lot of colors.
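Those numbers are easy to verify; just illustrating the arithmetic in the post above:

```python
# Bit depth -> levels per primary channel -> total R/G/B color combinations.
for bits in (8, 10, 12):
    levels = 2 ** bits        # 12 bits: values 0..4095, i.e. 4096 steps
    colors = levels ** 3      # independent R, G and B channels
    print(f"{bits}-bit: {levels:>5} levels/channel, {colors:,} colors")
```

For 12-bit that gives 4096 levels per channel and 68,719,476,736 total colors, matching the figures above.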
Chroma: basically a compression of the colors... RGB full uses all the colors, like chroma 4:4:4, but chroma 4:2:2 and 4:2:0 compress the colors, so some colors are lost in the process and can't be shown.
Maybe this is the best example:
You can see you lose some colors after the compression... and that decreases the bandwidth used a lot.
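To put a number on that saving: chroma subsampling keeps luma (brightness) at full resolution but stores fewer Cb/Cr (color) samples per 2x2 block of pixels. A sketch using the standard sample counts (my illustration, not something from the thread):

```python
# Y'CbCr samples stored per 2x2 pixel block: always 4 luma samples,
# plus a varying number of chroma (Cb+Cr) samples per subsampling mode.
chroma_per_block = {"4:4:4": 8, "4:2:2": 4, "4:2:0": 2}
for mode, chroma in chroma_per_block.items():
    total = 4 + chroma  # luma samples + chroma samples
    print(f"{mode}: {total:>2} samples per block ({total / 12:.0%} of 4:4:4)")
```

So 4:2:2 carries about two thirds of the data of 4:4:4 at the same bit depth, and 4:2:0 half, which is where the bandwidth headroom comes from.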