Same, but I need to go through grad school entrance exams too. :/
Damn.
It may have been better for you not to see this lol
;(
I'd be really surprised to see a better-looking open world game of similar scale. Smaller scale stuff, sure.
Looks fantastic, E3 is off to a great start!
Pretty confident this WON'T be the best-looking game at the show though -- the character models still leave a lot to be desired, but the environments look great.
But they don't double the effective power of the hardware. Consider the 750 Ti, a $149 GPU, which runs Titanfall better than the Xbox One does.
Sounds cool: The Witcher 3: Wild Hunt Will Feature An Entire Underwater World For Exploration: http://www.dsogaming.com/news/the-w...e-an-entire-underwater-world-for-exploration/
Very simplistically, an XB1 has ~1.2 GPU TFLOPS while a 750 Ti sits at ~1.4. The fact that it outperforms the XB1 significantly would seem to support what I was saying, just like most multiplatform measurements: you can get more out of the (anemic) console CPUs than you would expect, but GPU-wise? No huge difference in utilization.
I'm not saying they double the power of the hardware. I'm saying they have access to all of it. The "50%" I threw out was an exaggerated hypothetical.
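As a sanity check on those TFLOPS figures: peak single-precision throughput is just shader count × clock × 2 (one fused multiply-add per lane per clock). A minimal sketch using the commonly cited spec numbers (800 MHz for the XB1 GPU, which is where the ~1.2 figure comes from; the shipped clock was later bumped to 853 MHz):

```python
# Back-of-the-envelope peak FP32 throughput: shaders * clock * 2 (FMA = 2 ops).
# Spec numbers below are the commonly cited ones, not measured values.
def tflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000  # GFLOPS -> TFLOPS

print(f"XB1    (768 SPs   @ 0.800 GHz): ~{tflops(768, 0.800):.2f} TFLOPS")  # ~1.23
print(f"750 Ti (640 cores @ 1.085 GHz): ~{tflops(640, 1.085):.2f} TFLOPS")  # ~1.39
```

That lines up with the ~1.2 vs ~1.4 quoted above; peak FLOPS says nothing about fillrate or bandwidth, of course.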
And I'm fairly sure the 750 Ti is significantly more powerful than the Xbox One GPU (raw power, that is). Not sure exactly what's in the Xbox One though, so I can't really be certain.
API differences don't really have that much of an impact on GPU performance; they matter more for CPU usage. You can easily notice this in most (almost all) multiplatform titles: when paired with a decent CPU, console-equivalent GPUs generally perform not far off the consoles at all (at equivalent settings).
I understand why CDPR had to delay the game until next February, but it sucks. This summer would have been perfect.
You do realize I'm not comparing it to a 780 Ti, right? To think its power is equivalent to that of a 7850 is already wrong. The performance improvements from low-level APIs can account for the difference between a 7850 and a 7870; I'd even go as far as to say a 7970, but that's probably too much. Again, you can't make comparisons based off specs. I'm not talking about HUGE performance differences, by the way, so I'm not sure where you're getting "magic" from. I'm just saying it can't be compared in the way you guys have done.
He still looks pretty grizzled. I think some stuff in the trailer is from flashbacks, so that might account for him looking a bit younger.
No, a low-level API is not going to give a less powerful GPU more fillrate or more ALU power!
What the APIs do is help with CPU usage (which the new consoles really need).
That is completely incorrect.
The PS4's GPU is not based on Pitcairn; it is based on the second revision of GCN (and thus has a slightly better tessellator and more compute queues), and it does not have the same core config as the 7870 or 7850 either.
The reason people say it is close to a 7850 is that it is the closest AMD GPU in terms of pixel fillrate, texel fillrate, floating-point performance and bandwidth.
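To put rough numbers on that (from the commonly reported specs, so treat this as a sketch rather than measurement): pixel fillrate is ROPs × clock, texel fillrate is TMUs × clock, FP32 is shaders × clock × 2.

```python
# Rough peak rates from commonly reported specs (not measurements):
# pixel fillrate = ROPs * clock, texel fillrate = TMUs * clock,
# FP32 = shaders * clock * 2 (FMA).
specs = {
    # name: (shaders, TMUs, ROPs, clock in MHz)
    "PS4 GPU": (1152, 72, 32, 800),
    "HD 7850": (1024, 64, 32, 860),
}
for name, (sp, tmu, rop, mhz) in specs.items():
    ghz = mhz / 1000
    print(f"{name}: {rop * ghz:5.1f} GP/s, {tmu * ghz:5.1f} GT/s, "
          f"{sp * ghz * 2 / 1000:.2f} TFLOPS")
```

The two trade blows within roughly 5-10% on every metric (PS4: 25.6 GP/s, 57.6 GT/s, 1.84 TFLOPS; 7850: 27.5 GP/s, 55.0 GT/s, 1.76 TFLOPS), and bandwidth is similarly close (176 GB/s GDDR5 vs ~154 GB/s), which is why the 7850 is the usual point of comparison.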
I've never played any games in this series but that trailer has gotten me interested. Does it look like you need to play Witcher 2 before this? I can't, because I don't have a gaming PC or 360.
Keep believing that, but a PC with a 7850 can run Crysis 2 maxed at 1080p/60fps (I had one), which looks better than Ryse on a coded-to-the-metal platform that supposedly has a GPU as powerful as a 7770 or something like that. While low-level access does improve performance, it will not make a PS4 run a game like BF4 at 1440p/60fps with 2xMSAA.
Note: console GPUs and PC GPUs can't be compared in the way you think.
To just go by specs is not sufficient. What most people miss is that consoles have their own low-level APIs, different from the inefficient DirectX API that PCs traditionally use. Where the PC API may bottleneck your setup by 50%, console APIs work "closer to the metal" and let the system extract near 100% of the possible performance from the hardware.
It's seriously hard to make a direct comparison, all the more so when the abundant power on PC means developers spend less time optimizing. Look at the PS3, for example: 8-year-old tech handling the visuals of GTA V shows you how far these GPUs can go.
Console APIs are simply very efficient.
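The CPU side of that claim is easy to ballpark with a toy model: per-frame submission cost is roughly draw calls × cost per call. The microsecond costs below are hypothetical illustration values, not benchmarks of any real driver:

```python
# Toy model of per-frame draw-call submission cost on the CPU.
# The per-call microsecond costs are hypothetical illustration values,
# not benchmarks of any real driver or engine.
def submission_ms(draw_calls, us_per_call):
    return draw_calls * us_per_call / 1000

draws = 3000  # a plausible draw-call count for a big open-world frame
for api, cost_us in [("high-overhead PC API      ", 10.0),
                     ("close-to-metal console API", 1.0)]:
    ms = submission_ms(draws, cost_us)
    print(f"{api}: {ms:4.1f} ms of a 16.7 ms (60 fps) frame budget")
```

Under those made-up numbers the high-overhead path eats the whole 60 fps budget on submission alone, while the low-overhead path barely registers; that's why thin APIs show up as CPU headroom rather than extra GPU fillrate.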
it will not make a PS4 run a game like BF4 in 1440p 60fps with 2xMSAA.
Don't remember saying that.
HOLY FUCKING SHIT THIS TRAILER
AND CHARLES DANCE TOO?!!!
I hope my PS4 can run this.
Not hopeless at all. Witcher 2 was great on the dusty old 360; 3 will be great on PS4.
I was pretty sure I was going to buy this game.
Now enter all these announcements.
SO MUCH goodness for pre-order on GOG, including that sweet loyalty discount (I have The Witcher 2 in my backlog). This is definitely pre-order done right.
Then I look at my laptop, which I play on; it's good enough to run BioShock Infinite on High. I know my laptop was better than the 360/PS3, which is why I went PC for multiplatform games these past two years or so.
Then I see my PS4, which I'm happy with.
Hold me GAF, I know this game is the PC master race's baby, but I know I cannot afford a proper PC to run this as intended... so I'm hoping they do a good job with the PS4 port.
Am I hopeless?
I've just been looking up details on the books. Can anyone who has read them tell me if I've got this right?
The Last Wish and Sword of Destiny are short story compilations that act as introductions which should be read first.
Then there's (in order) Blood of Elves, Time of Contempt, Baptism of Fire, Swallow's Tower and Lady of the Lake that act as the connected "saga" set after the short story collections.
Sword of Destiny and the final two novels have yet to be translated into English but there are apparently fan translations online that are very good. Then when I'm done, the games take place after all of the books, correct?
It has been modified with certain features likely to be seen (and currently seen in Hawaii) in newer revisions of GCN, but I would liken the PS4's GPU more to a modified Pitcairn than to something new. Like the HD 7850/R7 265, it's based on a full 20-compute-unit, Pitcairn-like structure with several of its compute units disabled due to yield issues. The 7850/265 have 16 of the 20 units active, the PS4 18; however, the PS4's GPU seems to be clocked at 800 MHz, placing it below any out-of-the-box 7850/265, let alone a mildly overclocked one. This offsets the small compute-unit advantage the PS4's GPU has and makes it practically comparable to a reference 7850/265 in computational ability, with, of course, the more advanced tessellator (?) and the 8 ACEs (currently only seen in one other chip, the so-called "GCN 1.1" Hawaii).
I would still consider it Pitcairn-based, though, certainly. The core architecture and per-clock performance of the compute units seem largely identical to the existing Pitcairns.
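A quick sketch of that clock-vs-CU offset, assuming identical GCN compute units so shader throughput scales as active CUs × clock (configs are the commonly reported ones):

```python
# Relative shader throughput ~ active compute units * clock,
# assuming identical GCN CUs. Configs are the commonly reported ones.
base = 18 * 800  # PS4: 18 CUs at 800 MHz
configs = {
    "PS4 (18 CU @ 800 MHz)":        (18, 800),
    "HD 7850 (16 CU @ 860 MHz)":    (16, 860),
    "HD 7850 OC (16 CU @ 900 MHz)": (16, 900),
}
for name, (cus, mhz) in configs.items():
    print(f"{name}: {cus * mhz / base:.2f}x the PS4's shader throughput")
```

A reference 7850 lands at ~0.96x and a mild 900 MHz overclock at exactly 1.00x, which is the "practically comparable" point above in numbers.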
Man the character detail is just stunning
Is it just me, or did the gameplay portions look better than the CGI scenes?
Triss' face doesn't impress me at all. Geralt looks good except for the hair, as usual.
This angle isn't very flattering; she looked great in the... other scene of the trailer. There's a gif in this thread somewhere.
I've got a GTX 780 (not that old) and I already feel I need to upgrade for this =/