
The Witcher 3 Wild Hunt - E3 2014 Trailer - The Sword Of Destiny

Durante

Member
Looks fantastic, E3 is off to a great start!

Pretty confident this WON'T be the best looking game at the show though -- the character models still leave a lot to be desired, but the environments look great.
I'd be really surprised to see a better-looking open world game of similar scale. Smaller scale stuff, sure.
 

Derp

Member
But they don't double the effective power of the hardware. Consider a 750 Ti, a $149 GPU, which runs Titanfall better than the Xbox One.

I'm not saying they double the power of the hardware. I'm saying they have access to all of it. The "50%" I threw out was an exaggerated hypothetical.

And I'm fairly sure the 750 Ti is significantly more powerful than the Xbox One GPU (in raw power, that is). Not sure what's in the Xbox One though, so I can't really be certain.
 

CheesecakeRecipe

Stormy Grey
Sounds cool: The Witcher 3: Wild Hunt Will Feature An Entire Underwater World For Exploration: http://www.dsogaming.com/news/the-w...e-an-entire-underwater-world-for-exploration/



Soon.
 
I've never played any games in this series but that trailer has gotten me interested. Does it look like you need to play Witcher 2 before this? I can't, because I don't have a gaming PC or 360.
 

Durante

Member
I'm not saying they double the power of the hardware. I'm saying they have access to all of it. The "50%" I threw out was an exaggerated hypothetical.

And I'm fairly sure the 750 Ti is significantly more powerful than the Xbox One GPU (in raw power, that is). Not sure what's in the Xbox One though, so I can't really be certain.
Very simplistically, an XB1 has ~1.2 GPU TFLOPS while a 750 Ti sits at ~1.4. The fact that it outperforms the XB1 significantly would seem to support what I was saying, just like most multiplat measurements: you can get more out of the (anemic) console CPUs than you would expect, but GPU-wise? No huge difference in utilization.
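For reference, the back-of-the-envelope math behind those numbers (the shader counts and clocks below are the commonly reported specs, so treat this as a rough sketch rather than a benchmark):

```python
# Peak FP32 throughput = shaders * clock * 2 (one fused multiply-add per cycle).
# These are theoretical peaks from commonly reported specs, not measurements.

def tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * clock_mhz * 1e6 * 2 / 1e12

print(f"XB1    (768 shaders @ 800 MHz):  {tflops(768, 800):.2f} TFLOPS")   # ~1.23
print(f"XB1    (768 shaders @ 853 MHz):  {tflops(768, 853):.2f} TFLOPS")   # ~1.31 after the upclock
print(f"750 Ti (640 shaders @ 1085 MHz): {tflops(640, 1085):.2f} TFLOPS")  # ~1.39 at boost clock
```

The ~1.2 figure matches the originally announced 800 MHz clock; the shipped console clocks slightly higher, but either way the 750 Ti has the edge on paper.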
 

RoadHazard

Gold Member
API differences don't really have that much of an impact on GPU performance, more on CPU usage. You can easily notice this in most (almost all) multiplatform titles. When paired with a decent CPU, the performance of console-equivalent GPUs is generally not far off consoles at all (at equivalent settings).

I'd say this depends on whether the dev in question takes advantage of the low-level graphics API available (which of course requires more effort) or sticks to the higher-level DirectX-like API (talking specifically about the PS4 here; not sure if the XBO has two different API levels like that). And with most multiplats the answer is probably the latter. So yeah, there you won't see all the machine really has to offer. That's probably mostly up to Sony's internal studios, as usual. But we know some third-party devs have already made rather significant console-specific low-level optimizations (unless I'm mistaken, the Tomb Raider devs talked about this), so let's see what CDPR comes up with.
 

TheD

The Detective
You do realize I'm not comparing it to a 780 Ti, right? To think that its power is equivalent to that of a 7850 is already wrong. The performance improvements from low-level APIs can account for the performance difference between a 7850 and a 7870. I'd even go as far as to say a 7970, but that's probably too much. Again, you can't make comparisons based off specs. I'm not talking about HUGE performance differences, by the way, so I'm not sure where you're getting "magic" from. I'm just saying it can't be compared in the way that you guys have done.

No, a low-level API is not going to give a less powerful GPU more fillrate or more ALU power!

What the APIs do is help with CPU usage (which the new consoles really need).
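That fixed-hardware point is easy to put in numbers. A quick sketch using the commonly quoted ROP counts and clocks (assumed figures, not measurements); no API changes these theoretical ceilings, only how close you get to them:

```python
# Theoretical pixel fillrate = ROPs * core clock. An API can improve utilization
# of this peak, but it cannot raise the peak itself.

def pixel_fillrate_gpix(rops: int, clock_mhz: float) -> float:
    return rops * clock_mhz * 1e6 / 1e9

print(f"PS4  (32 ROPs @ 800 MHz): {pixel_fillrate_gpix(32, 800):.1f} Gpix/s")  # 25.6
print(f"7850 (32 ROPs @ 860 MHz): {pixel_fillrate_gpix(32, 860):.1f} Gpix/s")  # 27.5
```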
 

LiquidMetal14

hide your water-based mammals


He still looks pretty grizzled. I think some stuff in the trailer is from flashbacks, so that might account for him looking a bit younger.

There is dithering on three strands of shadowed forehead hairs and a strange absence of Havok physics on Geralt's nose hairs. Next gen rewinding!
 

Serandur

Member
That is completely incorrect.

The PS4's GPU is not based on Pitcairn; it is based on the second version of GCN (and thus has a bit better tessellator and more compute queues), and it does not have the same core config as the 7870 or 7850.

The reason people say it is close to a 7850 is that it is the closest AMD GPU in terms of pixel fillrate, texel fillrate, floating-point performance, and bandwidth.

It has been modified with certain features likely to be seen in newer revisions of GCN (and currently seen in Hawaii), but I would liken the PS4's GPU more to a modified Pitcairn than to something new. Like the HD 7850/R7 265, it's based on a full 20-compute-unit, Pitcairn-like structure with several of its compute units disabled due to yield issues. The 7850/265 have 16 of the 20 units active, the PS4 has 18; however, the PS4's GPU seems to be clocked at 800 MHz, placing it below any out-of-the-box 7850/265, let alone a mildly overclocked one. This offsets the small compute-unit advantage the PS4's GPU has and makes it practically comparable to a reference 7850/265 in computational ability, with, of course, the more advanced tessellator (?) and the 8 ACEs (currently only seen in one other chip, the so-called "GCN 1.1" Hawaii).

I would still consider it Pitcairn-based, though, certainly. The core architecture and per-clock performance of the compute units seem largely identical to the existing Pitcairns.
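To put that CU-count-versus-clock trade-off in numbers (GCN has 64 shaders per CU and does 2 flops per shader per cycle; clocks are the commonly reported ones, so this is approximate):

```python
# GCN compute throughput: CUs * 64 shaders * clock * 2 flops (FMA) per cycle.
def gcn_tflops(cus: int, clock_mhz: float) -> float:
    return cus * 64 * clock_mhz * 1e6 * 2 / 1e12

print(f"PS4     (18 CUs @ 800 MHz): {gcn_tflops(18, 800):.2f} TFLOPS")  # ~1.84
print(f"HD 7850 (16 CUs @ 860 MHz): {gcn_tflops(16, 860):.2f} TFLOPS")  # ~1.76
```

The extra CUs and the lower clock nearly cancel out, which is why the two end up so close.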
 

POPEGTR

Neo Member
I've never played any games in this series but that trailer has gotten me interested. Does it look like you need to play Witcher 2 before this? I can't, because I don't have a gaming PC or 360.

Nah, even W1 and W2 don't explain everything about the story. They've promised to explain everything to new players in W3.
 
Note, console GPUs and PC GPUs can't be compared in the way you think.
To just go by specs is not sufficient. What most people miss is that consoles have their own low-level APIs, different from the inefficient DirectX API that PCs traditionally use. Where the PC API may bottleneck your setup by 50%, console APIs work "closer to the metal," letting the system extract close to 100% of the possible performance from the hardware.

It's seriously hard to make a direct comparison, especially when the available power on the PC means developers spend less time optimizing. Look at the PS3, for example: eight-year-old tech handling the visuals of GTA V shows you how far these GPUs can go.

Console APIs are simply very efficient.
Keep believing that, but a PC with a 7850 can run Crysis 2 maxed at 1080p/60 FPS (I had one), and it looks better than Ryse on a coded-to-the-metal platform that supposedly has a GPU as powerful as a 7770 or something like that. While low-level access does improve performance, it will not make a PS4 run a game like BF4 at 1440p/60 FPS with 2x MSAA.
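For a sense of scale on that last claim (taking BF4's widely reported 900p PS4 resolution as the baseline):

```python
# Raw pixel counts per frame; 2x MSAA roughly doubles the samples to resolve.
px_900p  = 1600 * 900    # 1,440,000 px (BF4's reported PS4 resolution)
px_1440p = 2560 * 1440   # 3,686,400 px

ratio = px_1440p / px_900p
print(f"1440p is {ratio:.1f}x the pixels of 900p")           # ~2.6x
print(f"with 2x MSAA, roughly {2 * ratio:.1f}x the samples") # ~5.1x
```

No API efficiency gain plausibly buys a 2.5-5x jump in pixel throughput.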
 

POPEGTR

Neo Member
They got me. Sh*t, I ordered it on GOG, and I just realized I'd planned on playing it on PS4.

The system requirements must be like a freaking Titan-Z.
 

Fjordson

Member
Wow. Looks amazing. Best looking game I've seen in a while.

I was pretty sure I was going to buy this game.

Now enter all these announcements.

SO MUCH goodness for pre-ordering on GOG, including that sweet loyalty discount (I have The Witcher 2 in my backlog). This is definitely a pre-order done right.

Then I look at my laptop, which is what I play on; it's good enough to run BioShock Infinite on High. I know my laptop was better than the 360/PS3, which is why I went PC for multi-platform games for the past two years or so.

Then I see my PS4, which I'm happy with.

Hold me, GAF. I know the PC master race treats this game as its baby, but I can't afford a proper PC to run it as intended... so I'm hoping they do a good job with the PS4 port.

Am I hopeless?
Not hopeless at all. Witcher 2 was great on the dusty old 360; 3 will be great on PS4.
 

NAPK1NS

Member
This is a graphical stunner.

I'm still withholding my excitement. The first two games were marred by technical fumbles at launch, and the open world doesn't reassure me that the polish won't be spread thin. It's easy to look at the visuals and lose your mind, but I'm going to wait for post-launch impressions before I sky-rocket.
 

RoadHazard

Gold Member
Keep believing that, but a PC with a 7850 can run Crysis 2 maxed at 1080p/60 FPS (I had one), and it looks better than Ryse on a coded-to-the-metal platform that supposedly has a GPU as powerful as a 7770 or something like that. While low-level access does improve performance, it will not make a PS4 run a game like BF4 at 1440p/60 FPS with 2x MSAA.

It's worth noting that, AFAIK, the XBO doesn't currently have a low-level graphics API quite like the PS4 does. The PS4 has two different APIs - one that's pretty much on the same abstraction level as DirectX (this is what I imagine most multiplat devs are using, as it significantly reduces the work they have to put in), and one that provides lower-level GPU access for some serious hardware-specific optimizations. The XBO just has DirectX AFAIK, although the situation there should improve with DX12.
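As a purely illustrative sketch of why the abstraction level matters for CPU cost (all of the names below are made up; neither console's real API looks like this), the difference is essentially "validate and translate every draw call" versus "replay commands you built once":

```python
# Hypothetical illustration only: per-draw driver overhead vs. a prebuilt
# command buffer. No real graphics API is being quoted here.
import time

def validate(state):              # stand-in for per-draw state validation
    return all(v is not None for v in state.values())

def translate(state):             # stand-in for per-draw command generation
    return b"\x00" * 8

class HighLevelAPI:
    """DirectX-style: the driver does CPU work for every single draw call."""
    def draw(self, state):
        if validate(state):
            _ = translate(state)

class LowLevelAPI:
    """Close-to-the-metal style: the game pre-builds its own command buffer."""
    def __init__(self, prebuilt: bytes):
        self.cmdbuf = prebuilt    # built once, at load time or offline
    def submit(self):
        return len(self.cmdbuf)   # per-frame cost is basically a handoff

state = {"shader": "pbr", "blend": "opaque"}
hl, ll = HighLevelAPI(), LowLevelAPI(b"\x00" * 80_000)

t0 = time.perf_counter()
for _ in range(10_000):           # 10k draws through the "driver" path
    hl.draw(state)
t1 = time.perf_counter()
ll.submit()                       # one handoff covering the same 10k draws
t2 = time.perf_counter()
print(f"per-draw path: {t1 - t0:.4f}s  prebuilt path: {t2 - t1:.6f}s")
```

The GPU-side work is identical either way, which is why the gains show up mostly on the CPU, as noted above.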
 

RedSwirl

Junior Member
I've just been looking up details on the books. Can anyone who has read them tell me if I've got this right?

The Last Wish and Sword of Destiny are short-story collections that act as introductions and should be read first.

Then there are (in order) Blood of Elves, Time of Contempt, Baptism of Fire, Swallow's Tower, and Lady of the Lake, which form the connected "saga" set after the short-story collections.

Sword of Destiny and the final two novels have yet to be translated into English, but there are apparently very good fan translations online. Then, when I'm done, the games take place after all of the books, correct?

Pretty much. Baptism of Fire is already out in the UK and should be coming out in the US at the end of this month. Don't know why they passed over Sword of Destiny. It's pretty important in terms of introducing Ciri.
 

TheD

The Detective
It has been modified with certain features likely to be seen in newer revisions of GCN (and currently seen in Hawaii), but I would liken the PS4's GPU more to a modified Pitcairn than to something new. Like the HD 7850/R7 265, it's based on a full 20-compute-unit, Pitcairn-like structure with several of its compute units disabled due to yield issues. The 7850/265 have 16 of the 20 units active, the PS4 has 18; however, the PS4's GPU seems to be clocked at 800 MHz, placing it below any out-of-the-box 7850/265, let alone a mildly overclocked one. This offsets the small compute-unit advantage the PS4's GPU has and makes it practically comparable to a reference 7850/265 in computational ability, with, of course, the more advanced tessellator (?) and the 8 ACEs (currently only seen in one other chip, the so-called "GCN 1.1" Hawaii).

I would still consider it Pitcairn-based, though, certainly. The core architecture and per-clock performance of the compute units seem largely identical to the existing Pitcairns.

Pitcairn is a GPU core based on the first-gen GCN microarchitecture; just because the raw PS4 chip has as many CUs as Pitcairn does not make it the same core.
 