Witcher 3: Official Gameplay Trailer

rashbeep

Banned
I think it'll do better than that.
Looking at previous titles, GTX 770 does better than a ~20 FPS increase at PS4 settings. And that's assuming PS4 will be running W3 at a solid 30 FPS.

This. I haven't really been impressed with the consoles when it comes to multiplatform releases. Not to mention this is CDPR's first game on a Sony platform.

So to answer Serick, you have a very powerful CPU and a great GPU. Get the PC version.
 

PFD

Member
I had to.

[Witcher 3 gif]

This is gonna be me every day until May 19th
 

Kezen

Banned
This. I haven't really been impressed with the consoles when it comes to multiplatform releases. Not to mention this is CDPR's first game on a Sony platform.

So to answer Serick, you have a very powerful CPU and a great GPU. Get the PC version.

I feel compelled to add that initially, I was responding to someone with a 670, not a 770.
 

UnrealEck

Member
Oh, I think I should say this too about the GTX 770, being an owner of one (which is away being repaired and taking way too fucking long :p ).
I think it came at a bad time. It cost me £315 new in June 2013 and came with 'just' 2GB of memory. The GPU in it is still quite good. The 2GB, though, is a bit of a pain for those who want to go SLI with it. The 4GB card will hold up better for those who bought it, but it cost them a good bit more, and now, just a year or so later, we've got the 970 with a standard 4GB.
It's a card I kinda wish I hadn't bought, but I'm not too bummed about it since I did get my use out of it.

I think the 700 series in general was a bit late. The 780 Ti is holding up really well against the 970 and 980 but it still has that 3GB memory instead of 4GB. But the cost was pretty substantial for the Ti.

Long story short, I think the 700 series' value dropped like a rock really quickly.
 

Serick

Married Member
That's your personal interpretation, don't put words into my mouth.
20fps is not anywhere near my definition of "a world of a difference" (my actual words). But to each his own.
And I would like to stipulate that this is merely my speculation, I do not pretend to know the future.

"Don't expect a world of difference"
"I would recommend the PS4 version" - At the 670 poster
"20fps is not anywhere near my definition of "a world of a difference" (my actual words)."

"I didn't say marginal"

Okay.

What hardware are we talking about? PC is a mosaic.

Already listed the hardware I am talking about twice now.

It seems I was not crazy:
http://www.techspot.com/review/608-hitman-absolution-performance-benchmarks/page3.html
770 is a rebadged 680.
Same case for Ryse.
http://gamegpu.ru/images/remote/htt...PU-Action-Ryse_Son_of_Rome-test-Ryse_1920.jpg
The 280X, which is supposedly AMD's equivalent of the 770, achieves impressive results. My 970 very rarely touches a 290 in this game, yet the former is supposedly more powerful.

When you said 280, I thought we were talking about a GTX 280. Need to clarify we're adding another variable in. We had been comparing nvidia cards to the consoles, not PC radeon to PC nvidia.

Also, what your link shows me is that not only does the 770 run Ryse at higher FPS than the Xbox One, but with MUCH higher IQ... So I'm not sure what you're trying to show here when we're talking about PC hardware vs. the consoles. If anything, you just illustrated my point that PCs outperform consoles?


I know people with superior hardware compared to consoles (8800 and Core 2 Duo) can't play games like Crysis 3 at all, because the support for those cards has been dropped, hence my:

I won't promise you 2011 hardware will be supported in 2019, either by ISV or AMD.

The GTX 770 wasn't released in 2011.

Your Crysis comparison also doesn't hold, for several reasons. First, the 360 and PS3 aren't mobile PCs in a box; the XO and PS4 are. Second, the console versions of Crysis 3 were severely gimped to run on the consoles vs. the PC. You are illustrating the lack of work/support put into the PC version of a game released 2 years ago, not a power gap caused by optimization of 360/PS3 APIs.

I feel compelled to add that initially, I was responding to someone with a 670, not a 770.

Yet you were responding to and quoting me and I never stated I had a 670.
 

Kezen

Banned
"Don't expect a world of difference"
"I would recommend the PS4 version" - At the 670 poster
"20fps is not anywhere near my definition of "a world of a difference" (my actual words)."

"I didn't say marginal"

Okay.
Indeed. Different words mean different things.
I'm not playing semantics here; those words convey different ideas. "Don't expect a world of a difference" does not mean your 770 is "marginally" more powerful.

When you said 280, I thought we were talking about a GTX 280. Need to clarify we're adding another variable in. We had been comparing nvidia cards to the consoles, not PC radeon to PC nvidia.
I was giving an example of a "weaker" card outperforming a stronger one. Unless that has changed, consoles are AMD GCN powered, so my comparison stands. If a PS4 someday outperforms a 670, I won't be surprised.

Also, what your link shows me is that not only does the 770 run Ryse at higher FPS than the Xbox One, but with MUCH higher IQ... So I'm not sure what you're trying to show here when we're talking about PC hardware vs. the consoles. If anything, you just illustrated my point that PCs outperform consoles?
No, you are missing the point: I was not comparing the Xbox One to a 770, but a 770 to "weaker" GCN cards.
Remember, this is what I was trying to demonstrate:
Those "real world results" vary depending on the workload. The paper specs are there for you to check, but the actual results depend on a number of factors. We have seen games running better on a "weaker" GPU than a stronger one, because GPU architectures are not born equal. Hitman Absolution ran better on a 280 than a 770. Why is that? Compute. Yet my 770 is stronger than a 280 on paper.

The GTX 770 wasn't released in 2011.
I won't promise it will be supported by Nvidia and ISVs come 2019.

Your Crysis comparison also doesn't hold, for several reasons. First, the 360 and PS3 aren't mobile PCs in a box; the XO and PS4 are. Second, the console versions of Crysis 3 were severely gimped to run on the consoles vs. the PC. You are illustrating the lack of work/support put into the PC version of a game released 2 years ago, not a power gap caused by optimization of 360/PS3 APIs.
My point stands though: those people with better hardware than consoles had to upgrade. What happens when a 670 is no longer supported? It might not run games as well as a PS4 in that case.
Optimization due to APIs is not (and that's clearly written in my post) the only reason one could have to upgrade to keep pace.
 

Serick

Married Member

Hopefully I'll save you some time here.

I literally do not care how PC Radeons compare with PC Nvidia cards. I also don't care if Nvidia supports my card until 2019. You don't even know for sure Sony will support their console until 2019. None of this serves the discussion of TW3 on console or PC. Save yourself some time.

I am only looking at comparing a GTX 770 to the PS4's hardware and maybe even the Xbox One.

Thanks for the discussion, I've gotten enough information to make up my mind on which version I'm going with.
 

UnrealEck

Member
Bringing up how Ryse runs better on GCN than Kepler to demonstrate how it's possible Witcher 3 could have a similar performance boost on the PS4 is kinda reaching.
I understand what you're getting at, that it's not always black and white, but I don't see how the consoles will see a similar situation.
 

roytheone

Member
Man, this looks so god damn amazing. The world looks gorgeous and incredibly alive, just walking around will be a fantastic experience. There also seems to be a lot of care and character in the dialog, which should help make even the most basic quest much more interesting. May 19 can't come soon enough!
 

Kezen

Banned
Hopefully I'll save you some time here.

I literally do not care how PC Radeons compare with PC Nvidia cards. I also don't care if Nvidia supports my card until 2019. You don't even know for sure Sony will support their console until 2019. None of this serves the discussion of TW3 on console or PC. Save yourself some time.

I am only looking at comparing a GTX 770 to the PS4's hardware and maybe even the Xbox One.

Thanks for the discussion, I've gotten enough information to make up my mind on which version I'm going with.

My argument has always been that architectures are not flat, if that makes sense; they have pros and cons. I was trying to illustrate that with Ryse and Hitman.
This very much applies to PS4-PC comparisons: the PS4 GPU, while weaker than a 670, may outperform it when it comes to compute, which we know will play a very important part in games.
I believe I have been consistent from the very beginning.

So when someone was torn between a PS4 and a 670 equipped PC I advised him/her to go PS4. I never said your 770 would not fare better, but my estimation is that it may not make anywhere near as big of a difference as you believe.
 

Serick

Married Member
My argument has always been that architectures are not flat, if that makes sense; they have pros and cons. I was trying to illustrate that with Ryse and Hitman.
This very much applies to PS4-PC comparisons: the PS4 GPU, while weaker than a 670, may outperform it when it comes to compute, which we know will play a very important part in games.
I believe I have been consistent from the very beginning.

No, I see the point you are trying to make. It just doesn't address my conundrum.

I mean sure, at the end of the day if my 770 ends up not being supported and I need to drop some bills on a new card in 2019 that's fine. TW3 comes out in a few days :)

I get that lack of driver/api support on the PC side can kill off superior cards before their time, that's an inherent risk with PC gaming. I also understand that games designed on GCN architecture (consoles) will benefit PC gamers using GCN cards (your ryse and hitman examples -- again my disbelief came from me thinking you were referencing a GTX 280 not a Radeon 280) and that Nvidia has to tackle this with more horsepower.
 

Kezen

Banned
No, I see the point you are trying to make. It just doesn't address my conundrum.

I mean sure, at the end of the day if my 770 ends up not being supported and I need to drop some bills on a new card in 2019 that's fine. TW3 comes out in a few days :)

I get that lack of driver/api support on the PC side can kill off superior cards before their time, that's an inherent risk with PC gaming.

It only happens when the card/architecture in question is no longer a significant part of the PC gaming landscape. PC gaming moves on, and so must developers.
Will tears be shed when a 2013 GPU isn't supported by 2019 games? I don't think so; as a developer, it's wiser to focus resources where they actually matter.

It's a foregone conclusion that you should get better results with a 770 (and even then I won't be shocked if compute genuinely favors the PS4), but how much better? I gave my estimation, but that's nothing more than that.

I also understand that games designed on GCN architecture (consoles) will benefit PC gamers using GCN cards (your ryse and hitman examples -- again my disbelief came from me thinking you were referencing a GTX 280 not a Radeon 280) and that Nvidia has to tackle this with more horsepower.
I thought it was obvious I was comparing contemporary cards.
 

Slayer-33

Liverpool-2
Me: "Is it May 19th yet?"

Geralt: "No."

Me: "Then fuck off."


Seriously, this game looks ridiculous. Eat your heart out, Bethesda. And I am saying this as a huge ES fan. Just wow.

They should have used that dialogue instead, would have been hysterical.
 

RedSwirl

Junior Member
*sigh*

i5 4670k
GTX 760 2GB
8GB RAM
128GB SSD
1080p Samsung TV

I'm praying I can somehow negotiate 1080p 60fps out of some mix of medium/high settings with the minimum level of AA. You guys think it's possible?
 

Serick

Married Member
I thought it was obvious I was comparing contemporary cards.

I have no idea why my mind went to a GTX 280. I think it's just because I've not bought a Radeon in so long that I didn't make the connection.

But again, really, thanks for the insight and the discussion. You actually have me considering going Radeon when I upgrade next :)

My biggest fear this generation is that my i7 (I use it for video stuff too, so not wasted) will go unused since the consoles' CPUs are so slow.

(I think we derailed this enough :\)

Edit:

Back on topic -- I just wanted to mention that this game gives me chills. I was totally not excited to play it until this trailer today.
 

Loris146

Member
*sigh*

i5 4670k
GTX 760 2GB
8GB RAM
128GB SSD
1080p Samsung TV

I'm praying I can somehow negotiate 1080p 60fps out of some mix of medium/high settings with the minimum level of AA. You guys think it's possible?

Maybe, though not easy at 60 fps anyway.
 

Kezen

Banned
I have no idea why my mind went to a GTX 280. I think it's just because I've not bought a Radeon in so long that I didn't make the connection.

But again, really, thanks for the insight and the discussion. You actually have me considering going Radeon when I upgrade next :)

My biggest fear this generation is that my i7 (I use it for video stuff too, so not wasted) will go unused since the consoles' CPUs are so slow.

It's hard to say to what extent DX12 will impact the expected lifetime of our CPUs. With DX12, at least, you are much less likely to be limited by your CPU.
In multiplatform DX12 games it might not be that useful, since your GPU will be the limiting factor; perhaps it would help if you want to push the LOD much farther than consoles and keep the framerate at 60fps.
 

thebloo

Member
Man, could it be? Could I actually like this one? I loved W1 and hated W2, so I have no idea on which side this will fall. But it seems to be more monster focused with the political world as a backdrop.

I have a feeling they borrowed the Ubisoft "towers" with their small towns that you free, which is amusing.
 

Randam

Member
I will reiterate it, but those shadows are from a time-of-day change letting the sun shine through a window... and Geralt is just standing next to a point light in one and not the other. They are basically the same otherwise.

How do you know it is a different time of day?
 

Daverid

Member
Man, could it be? Could I actually like this one? I loved W1 and hated W2, so I have no idea on which side this will fall. But it seems to be more monster focused with the political world as a backdrop.

I have a feeling they borrowed the Ubisoft "towers" with their small towns that you free, which is amusing.

As someone who loves both games, I always find it hard to comprehend how someone could love W1 but hate W2. Although I guess it's entirely possible; I for one loved Mass Effect but felt completely underwhelmed by ME2, though I dunno if I'd go as far as hate. Even with a more political focus, W2 still had fantastic characters and a really interesting plot. I guess if you really disliked the politics that much... but plenty of that happens in the books. Oh whatever, each to their own I guess.

Also, I don't think village liberation will be a "Tower"-like mechanic. It's entirely possible, but I don't think CDPR would be that stupid. I think it's merely a coincidence-based thing and another way to layer the living world. They also spoke in interviews about how clearing monsters on certain roads would increase merchant travel in the area and change the economy, yet I doubt that is connected to any kind of Map Icon Reveal.
So my bet's on it simply being another way of expanding on the living world, but we'll have to wait and see.
 

julrik

Member
How will TW3 run on:

- i7-4710HQ
- 8 GB RAM
- GTX 970M 3GB

@1080p?

Less than 30 FPS on high?

Should I buy it on PS4 instead?
 

thebloo

Member
As someone who loves both games, I always find it hard to comprehend how someone could love W1 but hate W2. Although I guess it's entirely possible; I for one loved Mass Effect but felt completely underwhelmed by ME2, though I dunno if I'd go as far as hate. Even with a more political focus, W2 still had fantastic characters and a really interesting plot. I guess if you really disliked the politics that much... but plenty of that happens in the books. Oh whatever, each to their own I guess.

Also, I don't think village liberation will be a "Tower"-like mechanic. It's entirely possible, but I don't think CDPR would be that stupid. I think it's merely a coincidence-based thing and another way to layer the living world. They also spoke in interviews about how clearing monsters on certain roads would increase merchant travel in the area and change the economy, yet I doubt that is connected to any kind of Map Icon Reveal.
So my bet's on it simply being another way of expanding on the living world, but we'll have to wait and see.

Paradoxically, I love the books too. It's just that the story in W2 was boring to me and the combat was lame. Anyway.

About the tower thing, yeah, I wouldn't expect CDPR to do that. But:
- monsters dead
- cinematic of villagers coming in
- a hub with a shop opens.

I may be just making weird connections, but I found it amusing.
 

Leonsito

Member
I've had the PC CE preordered since June last year... I just need to upgrade my computer next month. Tax returns, fuck yeah!
 
How will TW3 run on:

- i7-4710HQ
- 8 GB RAM
- GTX 970M 3GB

@1080p?

Less than 30 FPS on high?

Should I buy it on PS4 instead?

Awful. You might as well sell that piece of trash.

You're fine; it'll probably run at least as well as the PS4, if not better. Plus you have the option of tweaking the graphics for either better visuals or a better framerate.
 

Daverid

Member
About the tower thing, yeah, I wouldn't expect CDPR to do that. But:
- monsters dead
- cinematic of villagers coming in
- a hub with a shop opens.

I may be just making weird connections, but I found it amusing.

Well, I'm not going to completely shoot down the possibility, because you do make a good point about it.
However, I just really, really hope CDPR is smarter than falling into such an obvious shitty AAA pitfall. It's bad enough they opted for a Batman Vision mechanic, though at least in that case it makes some level of sense for Geralt (I still don't like it though); a map icon reveal mechanic via village liberation would be straight up stupidity.
 
This game isn't based in the melting pot. I am all for diversity, and in a game like say, Mass Effect, or Halo it makes absolute perfect sense to have the most diverse cast of characters possible. However adding diversity just for the sole sake of checking off a box is not the right way to go about it.

This is the socially acceptable way to say the default human is a white male.
 

Lingitiz

Member
Curious how good the mod support will be for this game. REDkit hit pretty late for TW2, and the way the game was set up didn't really support mod integration that well. This certainly seems like it could foster a Bethesda-style mod community.
 

Jarrod357

Neo Member
I've got 6950 CrossFire with an X6 1090T, so I think I may be getting this for PS4. I have been looking at a 970, but the 300 series cards from AMD sound pretty intense.
 
My argument has always been that architectures are not flat, if that makes sense; they have pros and cons. I was trying to illustrate that with Ryse and Hitman.
This very much applies to PS4-PC comparisons: the PS4 GPU, while weaker than a 670, may outperform it when it comes to compute, which we know will play a very important part in games.
I believe I have been consistent from the very beginning.

So when someone was torn between a PS4 and a 670 equipped PC I advised him/her to go PS4. I never said your 770 would not fare better, but my estimation is that it may not make anywhere near as big of a difference as you believe.

There is no way the PS4 is performing better than a 670; I think you're underestimating how much more powerful it is. There has not been a single game in which the 7850/7870 has outperformed a 670; for all the "console optimisation" talk, it's never overcome this disparity in raw power.
 
Paradoxically, I love the books too. It's just that the story in W2 was boring to me and the combat was lame. Anyway.

About the tower thing, yeah, I wouldn't expect CDPR to do that. But:
- monsters dead
- cinematic of villagers coming in
- a hub with a shop opens.

I may be just making weird connections, but I found it amusing.

Yeah, I would hope it would be more subtle: regular folks come first, then farmers, merchants or whatever after a few in-game days. Not just "monsters are dead" and then it's a thriving, bustling city after the civilian cinematic lol.

I know it might be unrealistic given deadlines, scope and all though.
 