What? No.
If anything, PC gaming has a wider assortment of games and genres.
You know, thinking about it, I think it's just that the diversity on consoles and PCs is different. I stand by the rest though. PC gaming is a huge pain in the ass.
I hadn't even thought of that, that's pretty insane. I don't remember that ever happening before; the Xbox 360 couldn't fit a whole Xbox game's worth of content in its memory, the PS3 couldn't for the PS2, etc.
And along with the memory being larger, they can make more effective use of streaming since more assets will be able to be vacated from the memory as the scene changes. Hopefully a lot of devs do it right and bring us games with few load screens.
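The eviction idea above can be sketched as a budgeted LRU cache (a toy illustration with made-up names and sizes, not how any real engine manages memory):

```python
from collections import OrderedDict

class AssetCache:
    """Toy LRU cache with a fixed memory budget: least-recently-used
    assets are vacated as new ones stream in."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.assets = OrderedDict()  # name -> size in MB

    def request(self, name, size_mb):
        if name in self.assets:
            self.assets.move_to_end(name)  # mark as recently used
            return
        # Evict least-recently-used assets until the new one fits.
        while self.used_mb + size_mb > self.budget_mb and self.assets:
            _, evicted_size = self.assets.popitem(last=False)
            self.used_mb -= evicted_size
        self.assets[name] = size_mb
        self.used_mb += size_mb

cache = AssetCache(budget_mb=4096)   # a bigger budget means fewer evictions
cache.request("city_block_a", 1500)
cache.request("city_block_b", 1500)
cache.request("city_block_c", 1500)  # evicts city_block_a to make room
```

With more RAM the budget grows, so fewer assets get evicted and fewer mid-level loads are needed.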
I don't think so. We knew about them ahead of time - but now it seems they are unified.
I seriously can't understand how people say that there isn't any leap lol
new direct feed demo posted by Killzone.com, watch in 1080p...
GameCube had 43MB of RAM & N64 cartridges were only 32MB I think.
The sense of scale with this level of graphics alone makes this a true Next Gen leap.
Yea, that's why pretty much all launch consoles suffered the YLOD defect (including my European 60GB launch console).
Absolute nonsense. The PS3 launched in November 2006 in NA and in March 2007 in Europe. Crysis launched in November 2007. I bought my gaming PC in July 2008 (which at the time was somewhat outdated and rather middle-class in terms of performance) and I only upgraded the OS/external peripherals.
(CPU: Intel Core 2 Quad Q9450, 2666 MHz (4x 2.66 GHz), MB: Asus P5K, RAM: 4GB DDR RAM (too lazy to look up which one), GFX: nVIDIA GeForce GTX 260 896MB (Zotac))
And I can still run every modern game at 1080p maxed out (most of the time even with 16xAA/AF!) at 30-60fps if not more.
Also, you guys should be aware that at least ~2GB will be reserved for the OS. Sure, it has a little co-processor thing for the recording/streaming, but it still needs lots of RAM (especially if you want good quality videos, i.e. 720p streaming/recording).
Considering you can already do that on a PC with 1GB of ram and a bloated Windows OS, I think the rumored 512MB for the OS is more than enough for all these functionalities.
I agree that for some less technologically inclined people the PC can be a slight annoyance, but the view that you have to constantly search for drivers and look up errors is 5-8 years outdated. It's still not as simple as on consoles, but with the way console games have headed this generation, the differences in simplicity are narrowing rapidly.
I don't really mean drivers, but things like having to sign on to whatever server Blizzard or whoever wants me to, finding out that Civilisation 5 doesn't have cross-play between Mac and PC, and having to tweak the game settings so it doesn't slow down constantly. With a console I just put the disc in the machine and press play. I can't say that I've had a single PC gaming experience that didn't require me to do something besides just playing the damn thing.
Yeah, PC gaming does take a couple more steps. What I'm saying is, with mandatory installs, constant updates when booting, games freezing, having to clear the cache (I had to do this recently on the 360 for FIFA 13) and more, console gaming is moving towards PC levels of complexity. I hope Sony and MS have stricter QA requirements going forward.
I also think PC gaming is far more limited genre-, accessibility- and simplicity-wise.
I am comparing hardware. Crysis runs on High end graphics cards from the end of 2006/ early 2007 (think 8800 GTX), and the PS3 never really ran Crysis all that well throughout its lifetime, even after they retooled the game for consoles.
Pretty disappointed in the CPU and GPU, but I guess you can't have everything. I assume affordable PCs will get significantly further ahead in about 4 or 5 years as well, so basically a repeat of the previous gen. Really wish they went for something like a top-end i5 at least. This will do... let me watch that 1080p video of Killzone.
Uh, incorrect. The 7850 BLOWS away the 6870.
Consider that the 7850 is able to run Crysis 2 at high settings @ 1080p, 60fps.
As far as graphics cards go, the 7850 is pretty damn powerful. It will run virtually every game on the market at 1080p 60fps, including Crysis 3.
So an ARM CPU (for background tasks) + the (AMD) APU? Are they on the same die?
The CU info is great. Sounds like PS4 will be really flexible.
I don't think so but I will ask a friend.
Anyone think Sony is pulling a PS3? Announce how powerful your system is and that it'll launch alongside the next Xbox, only for it to be delayed a year? I don't, but they have done this before.
4GB to 8GB is a pretty huge deal; that's not free watts or a simple motherboard change.
Not the same situation, I think. I remember reading that they had to change their specs quite late, so that probably was a factor. I'm not sure it'll be the same this time.

Blu-ray and Cell were factors in the PS3's delay since they were new technology and yields were low. Since the PS4 is mostly using off-the-shelf parts, they shouldn't run into this problem. 8GB of GDDR5 might be one, but I hope for their sake that they actually manage to pull it off.
I stopped reading when he mentioned the 1000 dollar titan, what sensationalist tripe.
Even a slow HDD has no problems storing an HD stream while also reading data concurrently. That's how DVRs work.

Except that on PCs you aren't recording video using only 1GB of RAM. When I record video, albeit lossless 1080p, it takes up a decent chunk of RAM, and unless the PS4 is going to use a comparatively slow HDD, they'll end up using RAM for the video recording feature (which I love). I'd say 2GB is a decent estimate, especially given their ambitions for multitasking.
They've either been debating this for quite some time (well before the latest dev kits went out) or Sony was waiting on confirmation of bigger modules (1GB).
Safe to assume it'll be 2.5D?
Even a slow HDD has no problems storing an HD stream while also reading data concurrently. That's how DVRs work.
PS4 has dedicated ICs for decoding/encoding video. It wouldn't be a jump to assume they're essentially going straight from the framebuffer to the HDD.
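Some back-of-the-envelope arithmetic (assuming a ~10 Mbit/s compressed HD stream and a conservative ~60 MB/s of sustained HDD throughput; both figures are assumptions) shows why even a slow drive copes:

```python
# Rough figures; real bitrates and drive speeds vary.
stream_mbit_s = 10                 # typical compressed 1080p stream (assumed)
stream_mb_s = stream_mbit_s / 8    # = 1.25 MB/s to write
hdd_mb_s = 60                      # conservative sustained HDD throughput (assumed)

# Fraction of the drive's bandwidth the recording actually consumes:
fraction = stream_mb_s / hdd_mb_s
print(f"Recording uses {fraction:.1%} of disk bandwidth")  # ~2.1%
```

At around 2% of the drive's bandwidth, there is plenty left over for concurrent game reads, which is exactly the DVR situation.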
4GB is at least $50 of GDDR5 (the 7850 2GB vs 1GB is a $35 difference), and I would imagine the 4GB they already have is another $10-15 more. Let's just say the touch controller is $5 more and the graphics card is $15 more. We are looking at a console that costs at least $100 more from a company that can't afford it.
They'll both have a camera, so that's a wash. I really bet we see a $299 Xbox and a $499 PS4.
Or the PS4 isn't launching this year. Or it's highly limited.
Good point, forgot about the dedicated encoder. I was working with lossless 1080p60 numbers, which can be up to 3.5GB per minute, something 7200RPM drives still struggle with for long periods. The PS4 will be storing the compressed output from the dedicated encoder, I imagine at a decent compression level. Probably 200MB per minute, max.
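For context, raw uncompressed 24-bit 1080p60 works out to roughly 22 GB per minute, so a 3.5 GB/min lossless figure already implies around 6:1 compression (a rough sketch; real capture formats differ):

```python
# Raw frame-by-frame data rate for 24-bit RGB 1080p at 60fps.
width, height = 1920, 1080
bytes_per_pixel = 3          # 24-bit RGB
fps = 60

raw_bytes_per_min = width * height * bytes_per_pixel * fps * 60
raw_gb_per_min = raw_bytes_per_min / 1e9
print(f"Raw 1080p60: {raw_gb_per_min:.1f} GB/min")   # ~22.4 GB/min

lossless_gb_per_min = 3.5    # the figure quoted in the post above
ratio = raw_gb_per_min / lossless_gb_per_min
print(f"Implied lossless compression: {ratio:.1f}:1")
```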
Ok, what are your costs?

You would think that after being so wrong in the past, some people would finally stop playing armchair analyst and concede the fact that
a) no one has any idea about the return on investment or the research and development that goes into any of this,
b) and pretending that you do is pointless.
But here we are.
RibbedHero said: but the view that you have to constantly search for drivers and lookup errors is 5-8 years outdated.

It's gotten better - primarily thanks to the market being console driven and PC getting ports that had the benefit of stricter QA.
I recently heard that 512MB modules came to exist this Jan. Are they really expecting double the density already?
This is true, but adding 4GB of GDDR5 is not free; it drives everything up.

The PS3 cost like 800 bucks to manufacture originally. PS4 will cost less.
you can easily get away with 10mbit/s for 1080p.

Vudu HDX's nominal bitrate isn't even 10mbit... it's something like 8mbit. Granted that's at 24p, so 30p would probably bump it up to around 10.
The entire 15 minute stream ends up being 1125MB or something along those lines.
EDIT: which is 75MB / minute.
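That figure falls straight out of a 10 Mbit/s stream (the bitrate here is an assumption based on the posts above):

```python
# File size for a constant-bitrate stream: bitrate * duration.
bitrate_mbit_s = 10
minutes = 15

mb_per_s = bitrate_mbit_s / 8          # 1.25 MB/s
total_mb = mb_per_s * minutes * 60     # 1125.0 MB for the full stream
per_min = total_mb / minutes           # 75.0 MB per minute
print(total_mb, per_min)               # 1125.0 75.0
```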
Although I'm not complaining about it, I find the choice to go with 8GB of GDDR5 to be quite perplexing. The PS4 combines what is essentially a mid-range GPU (Radeon 7850) and a slightly less than mid-range CPU with more high-speed RAM than the Titan.
There's a reason why high-end PC graphics cards aren't paired with so much video RAM: modern games, even the most graphically-intensive ones, don't use anywhere near 8GB -- in most cases not even 2GB.
So either the OS and streaming/recording functions are going to require massive amounts of RAM (like 2-4GB worth), and/or developers will start using the RAM to constantly load data from the Blu-ray/storage device into RAM long before it needs to be processed by the console, so as to eliminate loading.
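That preloading idea can be sketched as a background prefetcher that pulls the next area's data into RAM before it's needed (a toy illustration; all names here are made up):

```python
import queue
import threading

def prefetch_worker(requests, loaded):
    """Load queued assets into memory on a background thread."""
    while True:
        name = requests.get()
        if name is None:                    # sentinel: shut down
            break
        loaded[name] = b"\x00" * 1024       # stand-in for reading from disc
        requests.task_done()

requests, loaded = queue.Queue(), {}
worker = threading.Thread(target=prefetch_worker, args=(requests, loaded))
worker.start()

# Game logic queues up the *next* area's assets while the current one plays.
for name in ["area2_textures", "area2_geometry", "area2_audio"]:
    requests.put(name)
requests.join()      # by the time the player arrives, the data is in RAM

requests.put(None)   # stop the worker
worker.join()
```

With 8GB to spare, a console game can afford to keep far more of this prefetched data resident than previous generations could.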
It's just a strange choice to pair mid-spec compute parts with absurd amounts of RAM -- like a puppy with enormous paws: the paws make sense when he grows up into a full-size dog, but in this case the puppy will never grow up because they're not going to upgrade the graphics card down the road.
Maybe they're thinking ahead to 2015 and beyond when PC graphics cards will all be carrying 8GB GDDR5 or faster and want to ensure that RAM levels are still comparable, even though the compute power of the future PC cards will be many times greater? Or maybe just looking at the past and seeing that RAM prices nosedive over time and figuring that the investment in more GDDR5 now will pay off in time.
In any case, if what Digital Foundry has reported about Durango is true, the performance gap between the PS4 and Durango is fairly massive. Would love to have been a fly on the wall at the meetings over at Microsoft this morning.
Let me ask: can the rumoured 16GB flash chip play a part here?