The Witcher 3: 1080p/30fps on PS4, 900p/30fps on Xbox One

I get the feeling you really like to say that. You used to be a very hardcore Xbox crusader from what I remember from my lurking days.
You seem to have changed your mind; Microsoft's hardware choices have not been to your liking? :)

Don't worry, you are probably not the only one who has switched sides.

I'm still an Xbot. I think I always will be. But I chose PS4 this time around due to Microsoft's hardware decisions.
I still tend to fight the Xbox corner though. Yes. I know. I'm confused and conflicted and well... Human.
 
This is getting a bit silly, so here's a complete PC specced out to match the recommended requirements for The Witcher 3, for less than €650:

witcher3pcrgulj.png


(This isn't even going as cheap as possible on the mainboard and case, and it uses a 290, which is much faster than a 770. You could probably get it below €600 if you really wanted.)

I want some of that geile Dragon RAM
 
Why is this silly topic now about building PC parts?

Is it because the Xbox One is a Windows 10 device or what is the connection?
 
CoD is an awful example. Matchmaking is based on skill, so you sometimes end up with someone who has a fucking 1 s ping and ruins the game for everyone. A lot of people have shit internet.

Also, MS has only shown one demo of a game releasing in 2016. One demo isn't enough.

And prior to them showing the demo they had tech demos, which also weren't enough at the time. And just prior to that they had claims. That's generally how the adoption of tech progresses. In 2015 at E3 they will likely show actual gameplay using it. And after you dismiss that as 'only one choreographed demo' they will go on to having the press get hands on time with it, which you will dismiss as 'the press being enthused about previews as usual'. At some point the damn thing ships and you will run out of excuses though.

Let's just put it this way: MS knows more than you do about this tech. Cloudgine's founders know more than you do about this tech. Cloudgine's corporate sponsors, like Epic and Nvidia, know more than you do about this tech. All of the above are enthused enough to throw large sums of $$$ at the tech. Maybe Cloudgine are frauds... but we have a live demo showing they're not.

Be skeptical, but let's not cling to being downright ignorant. It's perfectly cool to just say 'we will see'. :)

Btw, my point about CoD was that it sells a ton and requires the internet to play its most popular game mode. Not that it was a fantastic game. MS isn't interested in throwing $$$ away on Crackdown if the tech were as untested as you suggest. Just because the public isn't aware of how well it works doesn't mean it doesn't work well.
 
Why is this silly topic now about building PC parts?

Is it because the Xbox One is a Windows 10 device or what is the connection?
The connection is that some people still, in 2015, insist that you need a "1500€" PC to play games. After being corrected repeatedly.

(After pricing it out I'm honestly surprised how cheap you can go for a system such as that one, both the GPU and the CPU are more than twice as fast as the consoles)
 
The PS4 is altogether more powerful, though. And do we actually know the CPU speeds in the PS4? This is a genuine question, btw, not asking to be facetious.

Well, not altogether, is it? There is one important system within the Xbox One that is more powerful than the PS4's: its CPU.

1.75 GHz vs 1.6 GHz.
 
Why is this silly topic now about building PC parts?

Is it because the Xbox One is a Windows 10 device or what is the connection?

Close. It was the "it would definitely cost €1500 to build a PC to match the recommended spec, when the PS4 would do that"
 
Why is this silly topic now about building PC parts?

Is it because the Xbox One is a Windows 10 device or what is the connection?

Because even though this topic has nothing to do with the PC version, people need to prove that the PC version will be better. Then other people rush in and say "$4000 computer", and the cycle continues.

You know, like a lot of these sorts of threads.
 
Wait are we back to "the gap between Xbox One and PS4 is closing" again from Xbox defenders?

Don't forget the "PS4's CPU is slower than the XBO's CPU" argument either....Ubisoft started it, and then it took off.

"By the skillful and sustained use of propaganda, one can make a people see even heaven as hell or an extremely wretched life as paradise."
 
I'm still an Xbot. I think I always will be. But I choose PS4 this time around due to Microsofts hardware decisions.
I still tend to fight the Xbox corner though. Yes. I know. I'm confused and conflicted and well... Human.

I love that the mods changed your tag to make everybody else just as confused after the 'change' :)
 
Stop listening to the random ppl here. You absolutely can add them, as they are set up in parallel.



Quote by whom? Some know-nothing posting here? Did you see my other post with Nick Baker's quote in it? Actual real-world games/apps have been measured using 150 GB/s with just the eSRAM alone. Did you read my post at all?



Are you even reading what I typed? 176 GB/s is the theoretical peak on PS4. 204 GB/s + 68 GB/s (272 GB/s) is the theoretical combined peak on X1. Yes, it is much easier to hit the peak on PS4, but no game ever will. That isn't my point. If it were my point, I would have been quoting 272 GB/s as the bandwidth for X1, which I wasn't. My point was that real-world, actual games/apps have been measured hitting 150 GB/s using ONLY the eSRAM, and using that as an efficiency estimate for the main RAM nets us around 200 GB/s total bandwidth. Straight from the guy who designed it. He knows more than you do.

Even if we severely lowball the DDR3 RAM's real-world usage efficiency, it STILL easily eclipses the real-world usage of the PS4's GDDR5. You could have 38% usage efficiency for the DDR3 in X1 and 100% perfect efficiency on PS4 and they'd still be equal.



...no, they aren't, actually. One only does compression/decompression and some DMA work (PS4); the other does that on more than twice as many voices, along with a bunch of other stuff. Stop.

Why is it so difficult for you guys to admit that X1 has hardware advantages in some areas over PS4?

You are portraying outlier scenarios as the norm. You may get 133 GB/s with some alpha-blending scenarios on X1, but that is by no means the norm, which is closer to 100 GB/s. Real-world bandwidth for the DDR3 is probably sub-50 GB/s. So probably closer to 150 GB/s total system bandwidth, spiking up near 190 GB/s in certain situations. PS4 is likely around 150 GB/s consistently. So they are around the same in raw bandwidth numbers. However, the efficiency and effectiveness you get with that bandwidth is far better on PS4, since there is much less copy overhead with the unified memory and HSA/hUMA-like capabilities (via the volatile flag implementation), as well as managing far fewer abstraction layers, OSes, etc.
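For anyone trying to follow the numbers being thrown around, here's a minimal sketch that reproduces both posters' back-of-the-envelope bandwidth estimates. The peak figures are the publicly stated specs quoted in the thread; the efficiency percentages are the posters' own claims, not benchmarks:

```python
# Theoretical peak bandwidths (GB/s), as quoted in the thread.
PS4_GDDR5_PEAK = 176
X1_ESRAM_PEAK = 204   # eSRAM, read + write combined
X1_DDR3_PEAK = 68

# Claim A: eSRAM measured at 150 GB/s in real apps; even at only
# 38% DDR3 efficiency, the X1 total matches the PS4's theoretical peak.
x1_estimate_a = 150 + 0.38 * X1_DDR3_PEAK
print(round(x1_estimate_a))   # 176 -- equal to PS4_GDDR5_PEAK

# Claim B: ~100 GB/s is the realistic eSRAM norm and the DDR3 manages
# sub-50 GB/s, putting the X1 around the PS4's claimed real-world figure.
x1_estimate_b = 100 + 50
print(x1_estimate_b)          # 150
```

Both estimates are internally consistent; the entire disagreement is over which efficiency assumptions reflect actual games.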
 
The DS4 is pretty good IMO, much improved from the DS3
I hated using the DS3, it always felt awkward to me.

The DS4 is my favorite controller. So well designed. I just wish I had waited to get extra controllers. Their first run of controllers have that shitty thumbstick peeling issue. It's happening to 2 out of the 4 I own.
 
Close. It was the "it would definitely cost €1500 to build a PC to match the recommended spec, when the PS4 would do that"
Wouldn't this just be solved by saying the PC recommended specs are not equivalent to whatever the PS4 is running? Also, this is a console warz thread and not a general platform warz thread, pl0x.
 
Ergo cloud compute is bollocks and will never get used.

Your extreme insistence is making me beg you to bet your avatar on that statement.

The bet is if Microsoft's Azure cloud computing makes it into Crackdown 3.

What do you say? Wager your avatar for 3 months?
 
From my PC upgrading days, I know that a higher-spec graphics card gave a much bigger performance increase than a small CPU overclock.

Yes of course.

But this is a console, not a PC, and some games are more CPU-bound than others. There are other issues as well, like how mature the development kit is, how long the devs have had to optimise on each platform, and whether they are taking advantage of other things like compute, etc.
 
This is getting a bit silly, so here's a complete PC specced out to match the recommended requirements for The Witcher 3, for less than €650:

http://abload.de/img/witcher3pcrgulj.png

(This isn't even going as cheap as possible on the mainboard and case, and it uses a 290, which is much faster than a 770. You could probably get it below €600 if you really wanted.)

And this is for the recommended specs CDPR released, not the minimum, so you could also go a fair bit cheaper and still be playing it. I mean, The Witcher 2 scaled really well even at release and still looked great at low settings.
 
Ergo cloud compute is bollocks and will never get used.

It'll be used in Crackdown, and Kampfheld has stated on this forum that it is not "bollocks" and can indeed help improve games graphically. I recommend reading through his post history; he's posted some interesting stuff.
 
Well, not altogether, is it? There is one important system within the Xbox One that is more powerful than the PS4's: its CPU.

1.75 GHz vs 1.6 GHz.

Yes, but the PS4 is still the more powerful system of the two.

Either way, I wish both these systems had a better CPU, but I guess you gotta cut down costs somewhere.
 
This is getting a bit silly, so here's a complete PC specced out to match the recommended requirements for The Witcher 3, for less than €650:

witcher3pcrgulj.png


(This isn't even going as cheap as possible on the mainboard and case, and it uses a 290, which is much faster than a 770. You could probably get it below €600 if you really wanted.)

How much extra would a decent monitor add?
 
This is getting a bit silly, so here's a complete PC specced out to match the recommended requirements for The Witcher 3, for less than €650:

witcher3pcrgulj.png


(This isn't even going as cheap as possible on the mainboard and case, and it uses a 290, which is much faster than a 770. You could probably get it below €600 if you really wanted.)

I don't think that really matters. Even if you just price the new video card, it's €249? (On mobile, so it's harder to see.) That's really what I'd expect; it's nowhere near breaking the bank.

But recommended doesn't necessarily get you the image shown, and I think that's his point. They're running a 980 setup, so if you want that Ultra-ish image, things will get pricey. Nothing wrong with that, and it would be a strange world where a 300-500 card couldn't actually beat the PS4.

But not everyone is going to bother with the price and setup, so it's a moot point for them; they don't care. Attempting to make them care is usually where things go off the rails and people get annoyed, etc.

Sticking to only one platform is basically impossible if you enjoy multiple types of games. Consoles don't have DOTA 2; PC doesn't have Uncharted or Mario. You need more than one, in my eyes.
 
Yes, but the PS4 is still the more powerful system of the two.

Either way, I wish both these systems had a better CPU, but I guess you gotta cut down costs somewhere.

I agree on both counts.

On the other hand, I'm impressed that they can get a game like The Witcher 3 to run at 900p/1080p at 30fps at all. They must have done an incredible amount of work to get the consoles to run it that well.

I want to see how this game runs on a GTX 750 Ti. I really do.
 
I'm still an Xbot. I think I always will be. But I chose PS4 this time around due to Microsoft's hardware decisions.
I still tend to fight the Xbox corner though. Yes. I know. I'm confused and conflicted and well... Human.

I know it's off-topic, but I also remember Reiko; I wonder what the guy's been doing since then. Specialguy is also MIA, but still posts on beyond3d under another name.
Ah, when Microsoft were unquestionably believed to come up with the beefier console. Those were the days.

I know it's off-topic but I also remember Reiko, wonder what the guy's been doing since then
Oh...Shame. I mean, he was annoying with his pseudo insider teasing but I know he meant well.
I thought he just became depressed after the specs were outed.
 
You are portraying outlier scenarios as the norm.

Stop here... what makes you assume that what I noted was 'an outlier scenario'? It wasn't cited as such in Baker's comment. Note also that what I said was that you can get higher bandwidth from X1's setup when it's properly used. I NEVER claimed nor suggested that was 'the norm'. Devs are still getting their ducks in a row on how to effectively leverage the eSRAM.
 
I agree on both counts.

On the other hand, I'm impressed that they can get a game like The Witcher 3 to run at 900p/1080p at 30fps at all. They must have done an incredible amount of work to get the consoles to run it that well.

I want to see how this game runs on a GTX 750 Ti. I really do.

It's impressive, and games like The Order and Quantum Break look amazing. I know they will never touch PC, but what they're doing this early in the generation is pretty damn good.
 
Well, not altogether, is it? There is one important system within the Xbox One that is more powerful than the PS4's: its CPU.

1.75 GHz vs 1.6 GHz.

That's sort of like saying Lindsay Lohan is hotter than Jessica Alba just because her tits are a little bigger. I mean, yeah, tits are important and all, but there are other, much more important factors to consider...

The CPU power difference is very small and Sony will most likely make up that difference if they determine there won't be any long-term issues with a small overclock.

The GPUs on the other hand are very different.
 
It'll be used in crackdown and Kampfheld has stated on this forum that it is not "bollocks" and can indeed help improve games graphically. I recommend reading through his post history, he's posted some interesting stuff.

He said it *couldn't* be used to improve graphics per se, iirc. I think he did qualify that, though, by saying it can help in areas many of us would contend are important to 'game visuals' (i.e. the dynamism of a game world, for instance, which is also crucial to making games feel visually believable).
 
I don't know, how much does a decent TV cost?
I doubt he's trying to make a point; he's probably just curious, because a lot of people (like myself) don't own a monitor, since our notebooks are sufficient for school/work.

I do want to ask something though. Is it generally simple to get your gaming PC up and running on a TV? Honest question.
 
Eh, I would still take a console over a PC at this point in time, although consoles are becoming more PC-like, which I am not sure is a good thing. I just want something to play games on. I like to keep my gaming and work separate. I began playing PC games about 25 years ago, but now I just want something streamlined to play games on. I prefer high performance, but PC gaming has issues which I find I don't particularly like anymore. Unfortunately, console games are having all the sorts of issues PC games have had for years... constant patching and all that...
 
That's sort of like saying Lindsay Lohan is hotter than Jessica Alba just because her tits are a little bigger. I mean, yeah, tits are important and all, but there are other, much more important factors to consider...

When it comes to the ps4 and xb1, both have small tits anyway.
These consoles are about the ass.
 
I do want to ask something though. Is it generally simple to get your gaming PC up and running on a TV? Honest question.

Nowadays, yes, it's very simple.

Most cards nowadays have an HDMI port, or come with a DVI-D port and include an HDMI adapter. Then you just plug an HDMI cable into the card and into the TV, and done; it's basically like connecting up a monitor.
 
Because even though this topic has nothing to do with the PC version, people need to prove that the PC version will be better. Then other people rush in and say "$4000 computer", and the cycle continues.

You know, like a lot of these sorts of threads.

Maybe you have no interest in PC, but don't spin the content of the news and the FP of this thread because of that.
 
I doubt he's trying to make a point, and is actually just curious, because a lot of people (like myself) don't own a monitor, because our notebooks are sufficient for school/work.

I do want to ask something though. Is it generally simple to get your gaming PC up and running on a TV? Honest question.

Yup, most (if not all) cards have an HDMI port on them now.

Maybe you have no interest in PC, but don't spin the content of the news of this thread because of that.

The content of this thread, the OP, is based on news of the Xbox One and PS4 version resolution and framerates. I am not spinning anything.

Also I have a very vested interest in the PC version of the game:

4pDyMagl.png
 
I'm still an Xbot. I think I always will be. But I chose PS4 this time around due to Microsoft's hardware decisions.
I still tend to fight the Xbox corner though. Yes. I know. I'm confused and conflicted and well... Human.

thank you for saying what I've been thinking for a while
 
I doubt he's trying to make a point, and is actually just curious, because a lot of people (like myself) don't own a monitor, because our notebooks are sufficient for school/work.

I do want to ask something though. Is it generally simple to get your gaming PC up and running on a TV? Honest question.

So long as you can sort out the space/cables, and you have an HDMI port, yea, just run one HDMI and connect it. That's that.
Playing D3 on my TV in my living room at launch while we got connection errors was a lot of fun.
 