gaming_noob
Member
GAF will be glorious once ps4 specs are released. The meltdowns are already happening.
Insider knowledge or just a guess?
Yeah, I'm hoping at least one of them will be a true beast. I'm getting both and a gaming PC either way, just like this gen.
Here's the bucket of tears your highness. Now be silent and drink from the fountain of sorrow for it tastes delicious.
PS4 may have decent specs when unveiled but the true uproar will commence when the XB3's specs are revealed (provided they are superior).
I agree. Sony would be extremely short-sighted to release an underpowered machine. They fostered an entire first party on the bleeding edge of visual fidelity. Both consoles should be within around 10% of each other. The true differences will lie with the first parties, who will code specifically for their platforms.
Insider knowledge or just a guess?
I wish people would stop stating their opinion as fact.
The only way I see BC happening is if an actual piece of hardware is included in the PS4 that can run it, just as you say.
The only technically feasible way of including BC in the PS4 is adding the Cell to the system. Of course, that would make too much sense for the idiots in charge of SCE, if we are to believe the rumours.
As crazy as it sounds, Jeff has shown AMD's interest (well, you didn't need Jeff to say it) in providing a system that can plug modules together to get a single coherent system. If they can provide the Jaguar (or whatever the rumoured cores are) along with something that can handle PS3 BC, it could work well. It wouldn't be used JUST for PS3, but for PS4 games as well.

Yeah, no. The last thing we need is to increase the cost of the system by throwing Cell in there and either cutting back on performance or passing the cost on to the consumer.
They wouldn't have to use the PS3 version; they could use the PC version if one exists. PS-only/first-party titles may have some problems, but they could address that with an emulator or by rewriting for PC.

That would NEVER happen. The undertaking of licensing PC versions of games when they already have PS3 versions is just silly. And once again, first party would be a huge problem, and that is one of their biggest selling points. They aren't going to throw millions into first party and then just dump it.
Don't cloud gaming services such as Gaikai/OnLive use high-end PCs? A PS3 would be cheaper than that. They'd probably nickel-and-dime the process too and end up making money on the PS3 back catalogue.
The PS3's transition to a more PC-type environment in the PS4 means the end of the BC difficulty from next gen onwards.
There are things to worry about, though. Will the process of transferring my PSN account over to PS4 make my digital-only PS3 games obsolete?
General numbers like 10% are a little meaningless.
Our current porting twins have differences far larger than that in some cases, in different aspects, going both ways.
But importantly they're still capable of sharing content.
I think something like '10%' - depending on what you're actually talking about - is far too tight a margin to place on things before there'd actually be a question mark over multiplatform-ism between the two.
Some folks are really ignorant. BC through Gaikai? BC through an add-on? It's freaking hard, if not outright impossible.
Impossible? Would they have really bothered patenting something twice if it was impossible?
Gaikai works because you pay for it. So Sony would have to charge you for BC on the PS4 if they use Gaikai, or it will cost them money.
Gaikai works in the same way that OnLive works: they have numerous server farms running the games and compressing the framebuffer to send out to the end user, and it still ends in a sub-par game experience.
If you don't have a Cell with the memory and interfaces being pretty much exactly the same (timings being the most important) you won't get hardware BC.
Anything else would require re-licensing software - at your cost probably.
Where is brain_stew when you need him!
The problem with the PS3 is simple: Blu-ray. It carried a premium price at the start of the gen, caused the PS3 to release a year later, and, since it took a slice of the budget, it stopped them from increasing the amount of RAM.
A PS3 with a DVD drive and even just a 256/512 RAM configuration over the 256/256 it shipped with would have slaughtered the Xbox 360 in terms of graphics.
At the same time, a PS4 with, say, 4 gigs of system RAM and 2 gigs of VRAM could get slaughtered visually by an Xbox 720 with 6 gigs of system RAM and 3 gigs of VRAM, or worse, 4 gigs.
They could also go with more and faster RAM, which would increase the visual gap even more.
Right now we have video cards with 2/3/4 gigs of RAM on them, but that RAM is mostly just used to drive higher resolutions. If a developer actually targeted those RAM amounts you would see a huge increase in graphical fidelity, but of course you'd give up the higher resolutions and FSAA amounts.
If the PS4 ships in November of 2013 and the Xbox is delayed until 2014, MS could potentially be on the right side of a process-node shrink, meaning they would be able to use bigger chips and more RAM at prices similar to Sony's console. So they might not even have to take such huge losses to put out a better console.
The reason people attribute these advantages to MS and not Sony is because of the problems Sony as a company is having. MS can eat a few quarters or years of losses in the Xbox division because the rest of the company makes billion-dollar profits.
Sony, on the other hand, has been losing money or selling off parts of itself to stay afloat, so it could well be that big losses on the PS4, coupled with weak demand for games, end up sinking them.
It will be really interesting to see what happens next generation.
According to you the PS4 was going to be as weak as the Wii U; now it will have decent specs!
Which one is it?
Also, 'no word on the RAM and GPU in the PS4' is a falsehood you have been spouting in this thread.
Samara (20W) + 7950M in an 8000 or 9000 series GPU (~70W) using a common Ultrawide IO memory pool.

Since the 6990M was running at 100W and was based on the 6870 core, I assumed that the card which will be used for the flagship of the 7000M series will have a similar TDP to the desktop 6870.
The card that best matches these criteria is the 7950, which does consume a little more than the 6870, but it's very close to it.
Thus, the next thing to do was to look at power efficiency, and the 7950 is between 20-25% more power efficient than the 6870.
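If you want to sanity-check that estimate, the arithmetic is simple enough to sketch in a few lines of Python (the 100W figure for the 6990M and the 20-25% perf/W gain are the estimates quoted above, not official specs):

```python
# Rough sanity check of the TDP reasoning above.
# Assumptions (taken from the posts, not official specs):
#   - the 6990M (6870-based mobile flagship) ran at ~100 W
#   - the 7950 core is 20-25% more power efficient than the 6870
BASE_MOBILE_TDP_W = 100.0          # estimated 6990M TDP
PERF_PER_WATT_GAIN = (0.20, 0.25)  # claimed 7950 vs 6870 efficiency gain

for gain in PERF_PER_WATT_GAIN:
    # Hold performance constant and let the efficiency gain reduce power draw.
    est_tdp = BASE_MOBILE_TDP_W / (1.0 + gain)
    print(f"{gain:.0%} perf/W gain -> roughly {est_tdp:.0f} W")

# Prints roughly 83 W and 80 W: the same ballpark as the ~70 W figure quoted
# above, and not far from the ~65 W Eurogamer reports for the 7970M later in
# this thread.
```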
So what do you think they are targeting, 120W as a whole?

Same power envelope as the launch PS3.

That was just used as an example. The key here is that the ultrawide IO mobile standard is going to make changes in laptop, notebook and phone designs.
Now, answer the question: why are they both proceeding with customised mobile parts? No, it's not to keep in line with possible energy ratings or law changes. It's not cost/watt, because when you increase the clock you decrease yields and decrease the cost-benefit ratios you were hoping for by going mobile. Technology efficiency gains are more achievable in-engine on more open hardware rather than forcing through locked-down hardware design choices. Keep digging, Jeff.

A game console is a locked-down device and is more efficient than a PC. That is primarily software, but it's similar thinking for the hardware in embedded designs.
mrklaw said: performance per watt? Even if they go with a launch PS3 power envelope, that's maybe 120W for the GPU (guess). What performance would you get with a desktop part at that power level vs a mobile part?

It's the wide IO bus that is the ONLY difference, barring TDP power limitations in a laptop. The wide IO bus allows full HSA at 200GB/sec to 1TB/sec, rather than the current A10 with 27GB/sec plus a second GPU with GDDR5 at 200GB/sec that has a PCIe bottleneck to the A10.
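For context, peak memory bandwidth is just bus width times transfer rate, which is where figures like these come from. A quick sketch (the wide-IO configuration at the end is a hypothetical example, not a leaked spec):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_gt_s: float) -> float:
    """Peak theoretical bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
    return (bus_width_bits / 8) * transfer_rate_gt_s

# Typical A10 APU memory: 128-bit dual-channel DDR3-1866
print(peak_bandwidth_gb_s(128, 1.866))   # ~29.9 GB/s, close to the ~27 GB/s quoted above
# Discrete GPU with GDDR5: 256-bit at 6 GT/s
print(peak_bandwidth_gb_s(256, 6.0))     # ~192 GB/s
# PCIe 3.0 x16 link between APU and discrete GPU (~1 GB/s per lane each way)
print(16 * 1.0)                          # ~16 GB/s - the bottleneck mentioned above
# Hypothetical wide-IO stacked memory: 1024-bit at 2 GT/s
print(peak_bandwidth_gb_s(1024, 2.0))    # ~256 GB/s, the kind of figure wide IO aims for
```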
@jeff_rigby: Do you have a benchmark for a mobile APU that is anywhere near realistically plausible for a 1.8TFLOPS+ next-gen console?
The performance just isn't there.
Eurogamer said: The Pitcairn core is fairly small, occupying 212mm2 of area. Compare that with the 240mm2 of the RSX in the launch version of the PlayStation 3 and the 180mm2 of the Xbox 360's original 90nm Xenos GPU and we have a ballpark match. Of more interest is power consumption: at full tilt, the 7970M sucks up around 65 watts of power. That's not going to be especially good news for a laptop running on battery power alone, but considering that the launch versions of the Xbox 360 and PS3 both consumed around 200W in total, again we see an eminently suitable match.
AMD 7970m is supposed to be around 7850-7870 levels
http://www.eurogamer.net/articles/df-hardware-radeon-7970m-alienware-m17x-r4-review
what would an 8xxxm bring to the table?
Maybe I worded it wrong. But we're not discussing mobile gpu variants. I'd be interested to see the best of the best apus.
I thought we were expecting a custom APU based on customer requirements? So you could put a bunch of Jaguar CPU cores alongside a GPU of your choice, with optional eDRAM?
eDRAM can be ultrawide IO. In the past it was on the logic or GPU, connected via an interposer. If you have ultra-wide 512-bit IO on a 2.5D interposer outside, on a combination interposer/MCM carrier, you don't need it on top of the GPU or CPU.
4GB of RAM. Or more...
Early rumours suggest PS4 will have 2 to 4GB of RAM (that's four to eight times the size of PS3's 512MB of RAM, which is split between 256MB for video and 256MB for the system). Wii U has 2GB: 1GB for menus and the system and 1GB for video. However, the latest rumours suggest Microsoft's next console will have a staggering 8GB.
Why is RAM important? It allows your console to keep more programs in main memory at any one time, rather than swapping between programs (via loading), hugely raising performance. If your hard drive is the size of your filing cabinets, RAM is how much you can fit on your desk. Differences in the RAM structure of PS3 (2x 256MB) and 360 (unified 512MB) were one factor cited as causing the notorious Skyrim lag.
More RAM allows higher-resolution textures and less loading - like when you enter buildings in an open world. When we saw Square Enix's Luminous next-gen game engine demo at E3, all we were told is that it was running on high-end PC specs with 'a lot of RAM'. If Microsoft opts for 8GB of RAM, it may force Sony's hand - even 4GB might cause issues when porting code across consoles. The downside is that RAM is expensive, but Sony can't afford to scrimp.
It'll use a Quad-Core AMD chip
Sony will opt for AMD's quad-core APU (accelerated processing unit) codenamed 'Liverpool,' according to multiple reports in June. It's tipped to be built on a 28-nanometer process. The smaller this number, the more transistors can be fitted into the same space on the chip, and the lower the power consumption, but the more complicated the chip is to build. For context, PS3's Cell processor shrank from 90nm to 45nm over the console's six-year life.
The clock speed is 3.2 GHz, which, while not lightning fast, will be supplemented by powerful graphics hardware - the Radeon HD 7970 (currently £300 on its own) is being linked to PS4.
Sony will be looking to assemble PS4 from 'off the shelf' PC parts, reducing costs and making it easier to program for. This is in contrast to PS3's Cell chip, which its creator Ken Kutaragi once envisioned appearing in freezers and TVs as part of a parallel processing network. And look how that worked out.
AMD's chips allow for easy porting of code, theoretically preventing the issues surrounding, say, the PS3 port of Skyrim compared to Xbox 360. It'd be easier for developers to get PS4 games up and running, without waiting years for them to learn its unique tricks.
How many TFLOPS does the 7970 have?
http://www.computerandvideogames.co...-of-the-next-playstation-4/?page=1#top_banner
Single precision 3.7 TFLOPS, double precision 947 GFLOPS. Don't expect this to be in the PS4; it's a big standalone chip that eats 250W. The PS4 will most likely use only an APU with a GPU section in line with the 7850/7870 at most [1.8-2.4 TFLOPS].
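Those figures fall straight out of shader count x 2 ops per clock (fused multiply-add) x clock speed. A quick check in Python against the 7970's published specs:

```python
# Theoretical peak FLOPS = stream processors x 2 ops/clock (FMA) x clock speed
def peak_tflops(stream_processors: int, clock_ghz: float) -> float:
    return stream_processors * 2 * clock_ghz / 1000.0

# Radeon HD 7970: 2048 stream processors at 925 MHz
sp = peak_tflops(2048, 0.925)
print(f"7970 single precision: {sp:.2f} TFLOPS")              # ~3.79 TFLOPS
print(f"7970 double precision: {sp / 4 * 1000:.0f} GFLOPS")   # 1/4 rate -> ~947 GFLOPS

# For comparison, the 7850 (1024 SPs at 860 MHz) lands around 1.76 TFLOPS,
# which is why a 7850/7870-class GPU sits roughly in the quoted TFLOPS range.
print(f"7850 single precision: {peak_tflops(1024, 0.86):.2f} TFLOPS")
```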
I figure this may be a good place to ask a general question on next-gen tech.
I bought a Samsung 7000 series 3D LED (1080p, 240Hz) in the spring and I've noticed with some games that the graphics actually look worse in the 240Hz mode than they did at 60Hz. Will the tech on consoles be upgraded enough next gen to utilize the extra refresh rate on newer TVs, or should that not matter at all?
Like I said, it may be a dumb question, but it's definitely something I've noticed.
7970 has a 250W TDP ... so even with a shrink and HD8xxx move that won't cut it.
The 120 and 240Hz settings aren't allowing your games to run at a higher frame rate natively. What happens on higher-end TVs (like yours) is that the TV takes a 30 or 60fps source, in this case your game, and then interpolates, creating new images in between the native frames to make the motion appear smoother.
Lets say you have a video feed of a ball rolling across the screen. Here are two frames:
[_O_____]
[_____O_]
What the 120 and 240 settings do, is create more frames in between by analyzing the before and after images... so you get:
[_O_____]
[___O___]
[_____O_]
or even better:
[_O_____]
[__O____]
[___O___]
[____O__]
[_____O_]
The more frames you have, the smoother it looks. The problem is that, because the TV (NOT THE CONSOLE) creates the new images, those new images often have flaws - especially with the extremely "jaggy", low-image-quality console games we have today - so the created frames can look worse.
Unless the next gen brings more 60fps games, you won't see a frame rate increase. 120fps is used in maybe a couple games, and one of those is Super Stardust HD on PS3, and that's only because of 3D.
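For anyone curious, here's a toy Python sketch of what the interpolation is doing with that rolling-ball example: it just invents in-between positions from the two real frames. A real TV does this per pixel with motion estimation, which is exactly where those artefacts come from.

```python
# Toy motion interpolation: given the ball's position in two real frames,
# generate the in-between frames by linear interpolation.
WIDTH = 7  # playfield width used in the ASCII frames above

def frame(pos: int) -> str:
    """Render one ASCII frame with the ball 'O' at column `pos`."""
    return "[" + "_" * pos + "O" + "_" * (WIDTH - 1 - pos) + "]"

def interpolate(start: int, end: int, extra_frames: int) -> list[str]:
    """Real frames at `start` and `end`, plus `extra_frames` invented between them."""
    total = extra_frames + 1
    return [frame(round(start + (end - start) * i / total)) for i in range(total + 1)]

# 60Hz source gives two real frames; a 240Hz panel invents three frames in between.
for f in interpolate(1, 5, 3):
    print(f)
# [_O_____]
# [__O____]
# [___O___]
# [____O__]
# [_____O_]
```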
Unless the next gen brings more 60fps games, you won't see a frame rate increase. 120fps is used in maybe a couple games, and one of those is Super Stardust HD on PS3, and that's only because of 3D.
Super Stardust has a frame sequential 1080p "60fps gameplay" 3D? :-O
Thank you!
Very nice, that makes sense to me. So the images manufactured by the TV are what is creating the "bubbles" I'm seeing around characters at times in games? The worst offender so far is WKCII, but I've seen the bubbles/shadows in other games as well. I guess it just brings out the imperfections even more. I never noticed these before on my older Samsung that ran at 60Hz.
Super Stardust has a frame sequential 1080p "60fps gameplay" 3D? :-O
720p. Looks ace though.
I'll have to fire that up and see it in motion, I haven't played it since I got my TV. I'm very interested in giving it a ride now.