"Is it May19th yet?" - Vampyr
"No" - Geralt
"Then fuck off" - Vampyr
laaawd
Fixed for me
"Is it May19th yet?" - Vampyr
"No" - Geralt
"Then fuck off" - Vampyr
laaawd
I think it'll do better than that.
Looking at previous titles, a GTX 770 does better than a ~20 FPS increase over the PS4 at PS4-equivalent settings. And that's assuming the PS4 will be running W3 at a solid 30 FPS.
I had to.
This. I haven't really been impressed with the consoles when it comes to multiplatform releases. Not to mention this is CDPR's first game on a Sony platform.
So to answer Serick, you have a very powerful CPU and a great GPU. Get the PC version.
That's your personal interpretation, don't put words into my mouth.
20 FPS is nowhere near my definition of "a world of a difference" (my actual words). But to each his own.
And I would like to stipulate that this is merely my speculation; I do not pretend to know the future.
What hardware are we talking about? PC hardware is a mosaic.
It seems I was not crazy:
http://www.techspot.com/review/608-hitman-absolution-performance-benchmarks/page3.html
770 is a rebadged 680.
The same goes for Ryse:
http://gamegpu.ru/images/remote/htt...PU-Action-Ryse_Son_of_Rome-test-Ryse_1920.jpg
The 280X, which is supposedly AMD's equivalent of the 770, achieves impressive results. My 970 very rarely touches a 290 in this game, yet the former is supposedly more powerful.
I know people with hardware superior to the consoles (an 8800 and a Core 2 Duo) who can't play games like Crysis 3 at all, because support for those cards has been dropped, hence my:
I won't promise you 2011 hardware will be supported in 2019, either by ISV or AMD.
I feel compelled to add that initially, I was responding to someone with a 670, not a 770.
Trained as a Witcheress, IIRC pretty much full training without the chemicals (the Trial of the Grasses), and she was trained as a Sorceress by Yen and Triss.
Hate to be that guy, but Ciri isn't a Witcher. Witcher-trained, however.
Indeed. Different words mean different things.
"Don't expect a world of difference"
"I would recommend the PS4 version" - At the 670 poster
"20fps is not anywhere near my definition of "a world of a difference" (my actual words)."
"I didn't say marginal"
Okay.
I was giving an example of a "weaker" card outperforming a stronger one. Unless that has changed, consoles are AMD GCN powered. My comparison stands. The day a PS4 outperforms a 670 is the day I won't be surprised.
When you said 280, I thought we were talking about a GTX 280. Need to clarify we're adding another variable in. We had been comparing Nvidia cards to the consoles, not PC Radeon to PC Nvidia.
No, you are missing the point; I was not comparing the Xbox One to a 770 but a 770 to "weaker" GCN cards.
Also, what your link shows me is that not only does the 770 run Ryse at higher FPS than the Xbox One, but with MUCH higher IQ.... So I'm not sure what you're trying to show here when we're talking about PC hardware vs. the consoles. If anything, you just illustrated my point that PCs outperform consoles?
Those "real world results" vary depending on the workload. The paper specs are there for you to check but the actual results depend on a number of factors. We have seen games running better on a "weaker" GPU than a stronger one. Because GPU architecture are not born equal, Hitman Absolution ran better on a 280 than a 770, why is that ? Compute. Yet my 770 is stronger than a 280 on paper.
I won't promise it will be supported by Nvidia and ISV anno 2019.
The GTX 770 wasn't released in 2011.
My point stands, though: those people with better hardware than the consoles had to upgrade. What happens when a 670 is no longer supported? It might not run games as well as a PS4 in that case.
Your Crysis comparison also doesn't hold for several reasons. First, the 360 and PS3 aren't mobile PCs in a box; the XO and PS4 are. Second, the console versions of Crysis 3 were severely gimped to run on the consoles vs. the PC. You are illustrating a lack of work/support put into the PC version of a game released 2 years ago, not a power gap caused by optimization of 360/PS3 APIs.
Snip
Hopefully I'll save you some time here.
I literally do not care how PC Radeons compare with PC Nvidia cards. I also don't care if Nvidia supports my card until 2019. You don't even know for sure Sony will support their console until 2019. None of this serves the discussion of TW3 on console or PC. Save yourself some time.
I am only looking at comparing a GTX 770 to the PS4's hardware and maybe even the Xbox One.
Thanks for the discussion, I've gotten enough information to make up my mind on which version I'm going with.
My argument has always been that architectures are not flat, if that makes sense; they have pros and cons. I was trying to illustrate that with Ryse and Hitman.
This very much applies to PS4-PC comparisons: the PS4 GPU, while weaker than a 670, may outperform it when it comes to compute, which we know will play a very important part in games.
I believe I have been coherent from the very beginning.
No, I see the point you are trying to make. It just doesn't address my conundrum.
I mean, sure, at the end of the day, if my 770 ends up not being supported and I need to drop some bills on a new card in 2019, that's fine. TW3 comes out in a few days.
I get that lack of driver/API support on the PC side can kill off superior cards before their time; that's an inherent risk with PC gaming.
I thought it was obvious I was comparing contemporary cards.
I also understand that games designed on GCN architecture (consoles) will benefit PC gamers using GCN cards (your Ryse and Hitman examples -- again, my disbelief came from me thinking you were referencing a GTX 280, not a Radeon 280) and that Nvidia has to tackle this with more horsepower.
Me: "Is it May 19th yet?"
Geralt: "No."
Me: "Then fuck off."
Seriously, this game looks ridiculous. Eat your heart out, Bethesda. And I am saying this as a huge ES fan. Just wow.
*sigh*
i5 4670k
GTX 760 2GB
8GB RAM
128GB SSD
1080p Samsung TV
I'm praying I can somehow negotiate 1080p 60fps out of some mix of medium/high settings with the minimum level of AA. You guys think it's possible?
I have no idea why my mind went to a GTX 280. I think it's just because I've not bought a Radeon in so long that it didn't make the connection.
But again, really, thanks for the insight and the discussion. You actually have me considering going Radeon when I upgrade next.
My biggest fear this generation is that my i7 (I use it for video stuff too, so not wasted) will go unused since the consoles' CPUs are so slow.
I will reiterate it, but those shadows are from a time-of-day change letting the sun shine through a window... and Geralt is just standing next to a point light in one and not the other. They are basically the same otherwise.
Man, could it be? Could I actually like this one? I loved W1 and hated W2, so I have no idea on which side this will fall. But it seems to be more monster focused with the political world as a backdrop.
I have a feeling they borrowed the Ubisoft "towers" with their small towns that you free, which is amusing.
GTX 670 FTW or PS4, guys?
How will TW3 run on:
- i7-4710HQ
- 8 GB RAM
- GTX 970M 3GB
@1080p?
Less than 30 FPS on high?
Should I buy it on PS4 instead?
As someone who loves both games, I always find it hard to comprehend how someone could love W1 but hate W2. Although I guess it's entirely possible; I for one loved Mass Effect but felt completely underwhelmed by ME2, though I dunno if I'd go as far as hate. Even with a more political focus, W2 still had fantastic characters and a really interesting plot. I guess if you really disliked the politics that much... but plenty of that happens in the books. Oh whatever, to each their own I guess.
Also, I don't think village liberation will be a "Tower"-like mechanic. It's entirely possible, but I don't think CDPR would be that stupid. I think it's merely a coincidence-based thing and another way to layer the living world. They also spoke in interviews about how clearing monsters from certain roads would increase merchant travel in the area and change the economy, and yet I doubt that is connected to any kind of map icon reveal.
So my bet's on it simply being another way of expanding on the living world, but we'll have to wait and see.
About the tower thing, yeah, I wouldn't expect CDPR to do that. But:
- monsters dead
- cinematic of villagers coming in
- a hub with a shop opens.
I may be just making weird connections, but I found it amusing.
This game isn't set in a melting pot. I am all for diversity, and in a game like, say, Mass Effect or Halo it makes absolute perfect sense to have the most diverse cast of characters possible. However, adding diversity just for the sake of checking off a box is not the right way to go about it.
Go PC!
You're fine; it'll probably run at least as well as or better than the PS4. Plus you have the option of tweaking the graphics for either better visuals or a better framerate.
This is the socially acceptable way to say the default human is a white male.
My argument has always been that architectures are not flat, if that makes sense; they have pros and cons. I was trying to illustrate that with Ryse and Hitman.
This very much applies to PS4-PC comparisons: the PS4 GPU, while weaker than a 670, may outperform it when it comes to compute, which we know will play a very important part in games.
I believe I have been coherent from the very beginning.
So when someone was torn between a PS4 and a 670-equipped PC, I advised him/her to go PS4. I never said your 770 would not fare better, but my estimation is that it may not make anywhere near as big a difference as you believe.
Paradoxically, I love the books too. It's just that the story in W2 was boring to me and the combat was lame. Anyway.
Reminds me of the Red Dead Redemption trailers.