PS4K information (~2x GPU power w/ clock+, new CPU, price, tent. Q1 2017)

Why would you release a new version of your console right after a holiday season, potentially burning and pissing off millions of new customers who just got a PS4 for Christmas?

From a business standpoint Christmas would be the perfect time to finish offloading the old model PS4.
 
Then why even release the 4K ....

If the PS4 stays main platform nobody needs the PS4K

If the PS4K is the main platform every PS4 owner is pissed off.

It's just bad no matter how you look at it.

Nah, I disagree. PSVR is a huge reason to release PS4K.. That, and something like the PS4K serves as a mid-generation upgrade. Considering exponential advances in technology, many wouldn't mind a console upgrade. That said, I anticipate something like the PS4K to be eased into the market, and then later on taking its place as the main PS4 SKU.
 
Then why even release the 4K ....

If the PS4 stays main platform nobody needs the PS4K

If the PS4K is the main platform every PS4 owner is pissed off.

It's just bad no matter how you look at it.

Because of better resolution, framerate, additional effects and whatever else for the more enthusiasts console players? I think that is the focus for, at least, 2 or 3 years until they release an even more powerful PS4k2 so developers can decide if they support first PS4 sku or just the last two ones.
 
Hmm, bought my PS4 just a couple of weeks before all this kicked off. Guess the excitement around this thread is whether people like me would have delayed purchasing. Personally no, but would absolutely have been a consideration.
Don't see the point in this as I'm happy with set, long-term console cycles.

This (if it comes to pass as indicated) can only have a negative effect on sales, but not as badly as many posting in this thread would like. The amount of interest in this thread won't be reflected in NPDs due to the PS4's momentum. Still think it's dumb though.
 
Then why even release the 4K ....

If the PS4 stays main platform nobody needs the PS4K

If the PS4K is the main platform every PS4 owner is pissed off.

It's just bad no matter how you look at it.

I dunno how many times this needs to be stated: the mass market will not care, nor will they be pissed. The mass market cares more about affordable prices. PS4 will sell great to the mass market at 299. PS4K is for enthusiasts.....as you put it....

[QUOTE=mike4001_;201199216]nobody needs the PS4K[/QUOTE]

You're absolutely right. You DON'T need it. Your PS4 games Will. Run. The. Same. As. They. Did. Be. Fore. And. That. Will. Not. Change.

Do I need to even mention the uncertainty around this thing? DF and Kotaku are reporting vastly different info than retailers. Depending on who you believe, this thing is either coming really soon with just minor upgrades, or it's a really long way off and packing lots more power. We simply don't know what the truth is, beyond this: you needn't worry about it. Really. Not your precious PS4, nor consoles, nor videogames themselves are going anywhere. This gen we've had massive day one patches, huge game-breaking glitches, draconian season pass prices, microtransactions galore, utterly broken and unplayable games, big delays, you name it. Yet, gaming thrives (yes, X1 is doing just fine as well). None of that has managed to kill the industry. This won't either.

Now, people like Queso, Enigma, Zoetis, Gopher, Osiris and myself have tried speaking reason to panic. Unfortunately, you simply cannot seem to be made to see it. I regret setting foot in this thread again. If I were a mod, I'd lock it immediately for people's sanity. I'm bowing out. Dark Souls 3 and R&C won't play themselves on my soon-to-be dead plastic slab.
 
Why would it break bc?

Zen has SMT, so a quad-core Zen can run 8 threads and the software would never know the difference.

I agree with the person you quoted; a quad-core Zen would be way more powerful than an 8-core Puma.
IPC-wise it would be faster, but I'm not sure if programming 7 physical cores (Jaguar/latest SDK) would translate to 7 logical cores (Zen) without issues... SMT gives a 20-30% boost.
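The quad-core-Zen-vs-eight-core-Jaguar argument boils down to a throughput estimate. Here's a minimal sketch of that arithmetic; the 1.8x per-core IPC figure for Zen is a made-up assumption for illustration only, and the 25% SMT uplift is just the midpoint of the 20-30% range quoted above:

```python
# Back-of-envelope throughput model for the comparison above.
# The relative-IPC figure is an illustrative assumption, not a spec.

def effective_throughput(cores, relative_ipc, smt_uplift=0.0):
    """Crude aggregate throughput: cores x per-core IPC x (1 + SMT gain)."""
    return cores * relative_ipc * (1.0 + smt_uplift)

# Baseline: 8 Jaguar cores, per-core IPC normalized to 1.0, no SMT.
jaguar = effective_throughput(cores=8, relative_ipc=1.0)  # 8.0

# Hypothetical quad-core Zen with SMT (8 hardware threads),
# assuming ~1.8x Jaguar per-core IPC plus a 25% SMT boost.
zen = effective_throughput(cores=4, relative_ipc=1.8, smt_uplift=0.25)  # 9.0
```

Under these (again, assumed) numbers the quad-core Zen edges out the octa-core Jaguar, which is the poster's point; swap in different IPC assumptions and the conclusion moves with them.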
 
Then why even release the 4K ....

If the PS4 stays main platform nobody needs the PS4K

If the PS4K is the main platform every PS4 owner is pissed off.

It's just bad no matter how you look at it.

The only thing that makes sense about this idea to me is Sony trying to push their 4K service.
 
Btw, the latest NX rumors say that NX will have a Polaris-like (semi-custom) GPU... considering that Nintendo is a cheapskate when it comes to hardware, why wouldn't the PS4.5 have a Polaris GPU as well?

I remember some people saying that Polaris is expensive. It doesn't make sense, unless it's 14nm that drives up the cost (due to low yields or something).

http://wccftech.com/nintendo-nx-all...lkan-support-luigis-mansion-3-development-ip/

2x PS4 performance... sounds like NX is on par with the rumoured PS4.5.
 
You sold your PS4 in anticipation of a rumored console that's not coming for at least 9 months, if it even actually happens. Uh... okay. That better have been near full price.

Sold it for $400.

Which floored me because I paid a significantly lower price when it first launched here in Canada, thanks to Gift Cards and some serious discounts from friends working at Target, lol

Worst case scenario: PS4K was an elaborate lie and I end up settling for a Slim PS4.

T_T
 
Sold it for $400.

Which floored me because I paid a significantly lower price when it first launched here in Canada, thanks to Gift Cards and some serious discounts from friends working at Target, lol

Worst case scenario: PS4K was an elaborate lie and I end up settling for a Slim PS4.

T_T
400 USD or CAD?
 
Btw, the latest NX rumors say that NX will have a Polaris-like (semi-custom) GPU... considering that Nintendo is a cheapskate when it comes to hardware, why wouldn't the PS4.5 have a Polaris GPU as well?

I remember some people saying that Polaris is expensive. It doesn't make sense, unless it's 14nm that drives up the cost (due to low yields or something).

http://wccftech.com/nintendo-nx-all...lkan-support-luigis-mansion-3-development-ip/

2x PS4 performance... sounds like NX is on par with the rumoured PS4.5.

Well, it's 1 rumor from 1 source. I wouldn't count on it.

edit: It's two sources now apparently. hmm
 
IPC-wise it would be faster, but I'm not sure if programming 7 physical cores (Jaguar/latest SDK) would translate to 7 logical cores (Zen) without issues... SMT gives a 20-30% boost.

Yeah, the results could be unpredictable. That's why I think they have no other option than using Puma.
 
Assuming the PS4 and PS4k use the same architecture, optimizing for one and optimizing for the other is the same thing in 99% of the cases.

This is micro-optimization. Yes, their new tricks will enable better effects with fewer cycles and both machines will benefit.

But on a macro level, you can't optimize for both. Why? There are tasks that will not scale, especially things you do on the CPU. How much CPU time will you spare for the AI, for example? And now we're beginning to see the GPU used for tasks that don't scale like typical GPU jobs (typical tasks like resolution, texture filtering quality, AA). How much GPU will you be using for those tasks? At least the rest of the GPU can be scaled up or down more easily than the things you do on your CPU.

So no, optimization isn't done solely for the architecture; it's also done with respect to how much you have of what, and no, you can't fully utilize both.
 
Yeah, the results could be unpredictable. That's why I think they have no other option than using Puma.
Even Puma is a slightly different/enhanced microarchitecture though...

Let's say that a hypothetical PS3.5 used Bobcat cores... would PS4's Jaguar cores be able to run the exact same x86 code without unpredictable results?

That's why I've been saying that consoles are not PCs. PCs can easily switch from Jaguar to Zen or i7 without issues, because the software is programmed with scalability/forward compatibility in mind.

Console games are programmed with fixed specs in mind. Totally different philosophy.

I guess the most BC-proof solution is to keep using Jaguar cores at 14nm, with higher frequency (2.5 GHz) for PS4.5 games and a "legacy/BC frequency" (1.6 GHz) for PS4 games.

Does anyone remember that MS/IBM had to emulate CPU-GPU bus latency in the XCGPU @ 45nm? Consoles need 100% predictability for flawless BC.

http://arstechnica.com/gaming/2010/08/microsoft-beats-intel-amd-to-market-with-cpugpu-combo-chip/

"It would have been easier and more natural to just connect the CPU and GPU with a high-bandwidth, low-latency internal connection, but that would have made the new SoC faster in some respects than the older systems, and that's not allowed. So they had to introduce this separate module onto the chip that could actually add latency between the CPU and GPU blocks, and generally behave like an off-die FSB."
 
Even Puma is a slightly different/enhanced microarchitecture though...

Let's say that a hypothetical PS3.5 used Bobcat cores... would PS4's Jaguar cores be able to run the exact same x86 code without unpredictable results?

That's why I've been saying that consoles are not PCs. PCs can easily switch from Jaguar to Zen or i7 without issues, because the software is programmed with scalability/forward compatibility in mind.

Console games are programmed with fixed specs in mind. Totally different philosophy.

Even if you program entirely in assembly, you'll get the same results across pretty much any modern x86 processor. They implement the same instruction set at the end of the day. It's no longer the early 2000s when the x86_64 spec was being finalized, SIMD instructions were being introduced, hyper threading was being introduced, multi-core processors were being introduced, etc. There haven't been any major new features added to x86 in quite some time. For the most part, it's been improvements in pipelining, task scheduling, etc which the devs don't really take control of.
 
Even if you program entirely in assembly, you'll get the same results across pretty much any modern x86 processor. They implement the same instruction set at the end of the day. It's no longer the early 2000s when the x86_64 spec was being finalized, SIMD instructions were being introduced, hyper threading was being introduced, multi-core processors were being introduced, etc. There haven't been any major new features added to x86 in quite some time. For the most part, it's been improvements in pipelining, task scheduling, etc which the devs don't really take control of.
The instruction set is the same (there are over 1000 x86 opcodes), but different microarchitectures do not execute x86 opcodes the same way...

Here's an example of what I'm talking about: http://www.hugi.scene.org/online/co...zing cort optimizing for sse a case study.htm

ps: I edited my previous post to add more stuff.
 
The instruction set is the same (there are over 1000 x86 opcodes), but different microarchitectures do not execute x86 opcodes the same way...

Here's an example of what I'm talking about: http://www.hugi.scene.org/online/co...zing cort optimizing for sse a case study.htm

ps: I edited my previous post to add more stuff.
In your example he is porting from 3DNow! (AMD) to SSE (Intel). That involved a lot of maneuvering back in the earlier days, but it's not the same kind of challenge we see in this situation. I find it hard to believe the jump from a recent AMD microarchitecture to a more recent one is anywhere near as daunting.

I'll say again that people are overstating the complexity a bit in here.
 
At the risk of stating the blooming obvious... surely any changes needed to ensure compatibility/BC would be designed into the upgraded chips?

I mean whatever PS4K turns out to be it was almost certainly planned even before PS4 launch and I'm sure AMD/Sony know what they're doing.

I'm more interested in how they go about selling/explaining this new change for the console model. The casual consumer really doesn't like change at least until it is explained to them.
 
The instruction set is the same (there are over 1000 x86 opcodes), but different microarchitectures do not execute x86 opcodes the same way...

Here's an example of what I'm talking about: http://www.hugi.scene.org/online/co...zing cort optimizing for sse a case study.htm

ps: I edited my previous post to add more stuff.

Like I said, we haven't seen those kind of extensions added since the early 2000s.
No need for it as a lot of the audio decoding, video decoding, 3D math, physics, etc all are offloaded to the GPU these days.

The 360 example also doesn't work because they weren't creating a more powerful 360, and they didn't want devs to get good performance on one 360 and not the other, as they were otherwise supposed to be the same spec. That's not the case with the PS4K, where the whole point is that it can get better performance or better visual fidelity than the existing PS4.
 
In your example he is porting from 3DNow! (AMD) to SSE (Intel). That involved a lot of maneuvering back in the earlier days, but it's not the same kind of challenge we see in this situation. I find it hard to believe the jump from a recent AMD microarchitecture to a more recent one is anywhere near as daunting.
I'm talking about this part:

"A brief note on cycle timings: the timings I give for each function were achieved when running the sample code on my 500 MHz Pentium 3. Different machines will have different timings (for instance, the Pentium 4 offers significantly faster SIMD performance than the Pentium 3). But the absolute timings are mostly irrelevant - what matters is the relative timings, which show off the performance gains from each successive optimization."

Sure, Jaguar -> Puma would not be such a radical change, but it's still a different uArch...

I remember ND saying that they write cache-optimized code for the Jaguar to maximize performance. What would happen if Puma or Zen have different cache timings?

Like I said, we haven't seen those kind of extensions added since the early 2000s.
No need for it as a lot of the audio decoding, video decoding, 3D math, physics, etc all are offloaded to the GPU these days.
I agree that it's all about GPGPU these days, but they keep adding CPU SIMD extensions (e.g. AVX2, AVX-512). It's no different than MMX/SSE back in the 90s.
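What those wider extensions buy is mechanical: more 32-bit lanes per register, so fewer instructions for the same array. A rough sketch (the register widths are the architectural sizes; everything else is just arithmetic):

```python
import math

# SIMD register widths: SSE is 128-bit, AVX/AVX2 256-bit, AVX-512 512-bit.
# Wider registers mean more 32-bit float lanes per instruction.

def vector_ops_needed(n_floats, register_bits):
    """Instructions needed to sweep n 32-bit floats, one full register at a time."""
    lanes = register_bits // 32
    return math.ceil(n_floats / lanes)

n = 1024
sse = vector_ops_needed(n, 128)     # 256 instructions with SSE (4 lanes)
avx2 = vector_ops_needed(n, 256)    # 128 instructions with AVX2 (8 lanes)
avx512 = vector_ops_needed(n, 512)  # 64 instructions with AVX-512 (16 lanes)
```

This is of course only the instruction count; real speedups depend on memory bandwidth, clocks, and how well the loop vectorizes, which is exactly why per-microarchitecture tuning keeps mattering.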
 
This is micro-optimization. Yes, their new tricks will enable better effects with fewer cycles and both machines will benefit.

But on a macro level, you can't optimize for both. Why? There are tasks that will not scale, especially things you do on the CPU. How much CPU time will you spare for the AI, for example? And now we're beginning to see the GPU used for tasks that don't scale like typical GPU jobs (typical tasks like resolution, texture filtering quality, AA). How much GPU will you be using for those tasks? At least the rest of the GPU can be scaled up or down more easily than the things you do on your CPU.

So no, optimization isn't done solely for the architecture; it's also done with respect to how much you have of what, and no, you can't fully utilize both.

Basically this.

Imagine you're optimizing for PS4K and the general game logic takes 50% of your CPU time, while the other 50% is consumed by something graphics-related. Then you go to do things with the PS4, and while there are no micro-optimization problems, the PS4K has, say, a 50% more powerful CPU, so now the game logic takes 75% of your CPU time and the PS4K's graphics work would need another 75%. Since the PS4 is not powered by programming proverbs or Russian election fraud, and you can't optimize the game logic further (or else you already would have on PS4K), you now have to reduce the graphical CPU processing by a factor of 3. Throw in communication buses, RAM issues, the GPU, and so on, and you have quite a kerfuffle.
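To make that budget arithmetic explicit, here's the same calculation with the assumed CPU speedup as a parameter (the 1.5x figure and the 50/50 split are illustrative assumptions, not specs):

```python
# A task that fills fraction f of the faster CPU's frame time fills
# f * speedup of the slower CPU's frame. All figures are illustrative.

def fraction_on_slow(fraction_on_fast, cpu_speedup):
    """Frame-time fraction the same task occupies on the slower CPU."""
    return fraction_on_fast * cpu_speedup

logic_slow = fraction_on_slow(0.50, 1.5)       # 0.75: game logic on the base console
graphics_needed = fraction_on_slow(0.50, 1.5)  # 0.75: what the CPU-side graphics work would need
graphics_budget = 1.0 - logic_slow             # 0.25: all that's actually left in the frame
shrink = graphics_needed / graphics_budget     # 3.0: cut the graphics CPU work by 3x
```

The useful thing about parameterizing it is that the squeeze factor grows much faster than the speedup: a modestly faster CPU on the high-end SKU translates into a multi-x cut on whatever work doesn't fit on the base SKU.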
 
I don't see this thing being at E3. At all. No mention. I do expect there to be a God of War 4 trailer with a "gameplay running real-time on a Playstation 4 system" message right at the start, prompting endless arguments about whether it was PS4K or OG PS4.
 
I don't see this thing being at E3. At all. No mention. I do expect there to be a God of War 4 trailer with a "gameplay running real-time on a Playstation 4 system" message right at the start, prompting endless arguments about whether it was PS4K or OG PS4.

Nah. Maybe initially.


E3 stage demos are usually played again at various livestream shows, where devs can showcase what the demo is running on and answer community questions.
 
If he doesn't plan on playing anything on the PS4 until the thing comes out, it isn't a terrible idea.

Not when he has no idea when it comes out.

Sold it for $400.

Which floored me because I paid a significantly lower price when it first launched here in Canada, thanks to Gift Cards and some serious discounts from friends working at Target, lol

Worst case scenario: PS4K was an elaborate lie and I end up settling for a Slim PS4.

T_T

o_o Okay, never mind. Well, ignoring that the CAD is poop right now :P
 
I don't see this thing being at E3. At all. No mention. I do expect there to be a God of War 4 trailer with a "gameplay running real-time on a Playstation 4 system" message right at the start, prompting endless arguments about whether it was PS4K or OG PS4.

I agree because of PSVR. It makes no sense why they'd cut the legs out from under those sales by announcing this 5 months before PSVR releases.
 
I agree because of PSVR. It makes no sense why they'd cut the legs out from under those sales by announcing this 5 months before PSVR releases.

I don't see them as competing products. If what they've said is true, then the PS4K would only enhance the PSVR experience. They've most likely already sold through their initial allotment of VR units anyway, and if the HTC Vive and Oculus are any guide, it's also probably going to be supply-constrained. There's also no point in having another press conference about PSVR; they've done enough on that over the past 6 months. Sure, it will be at their booth, but their conference is going to be about games only, and potentially new hardware.
 
I agree because of PSVR. It makes no sense why they'd cut the legs out from under those sales by announcing this 5 months before PSVR releases.

I don't know if I'm alone here, but I get the feeling that announcing the PSVR price and date behind closed doors at a developer conference, the way they did, had reasons beyond it coming a bit later than expected.

Just a niggle I have that may well be very wrong.
 
I don't see this thing being at E3. At all. No mention. I do expect there to be a God of War 4 trailer with a "gameplay running real-time on a Playstation 4 system" message right at the start, prompting endless arguments about whether it was PS4K or OG PS4.

Unless they need to get the word out before PSVR arrives that PS4K will be needed for some PSVR games to work.

Imagine the backlash if they announce that after PSVR arrives.

As for GoW in 4K, I'm sure people will know off the bat whether it's from the new or the old one. The differences will be very noticeable.

I agree because of PSVR. It makes no sense why they'd cut the legs out from under those sales by announcing this 5 months before PSVR releases.

Sales would be lost up front, for sure. But if 4K means more frames, and more frames mean higher-fidelity games, then the payoff would be greater in the long run.
 
This is micro-optimization. Yes, their new tricks will enable better effects with fewer cycles and both machines will benefit.

But on a macro level, you can't optimize for both. Why? There are tasks that will not scale, especially things you do on the CPU. How much CPU time will you spare for the AI, for example? And now we're beginning to see the GPU used for tasks that don't scale like typical GPU jobs (typical tasks like resolution, texture filtering quality, AA). How much GPU will you be using for those tasks? At least the rest of the GPU can be scaled up or down more easily than the things you do on your CPU.

So no, optimization isn't done solely for the architecture; it's also done with respect to how much you have of what, and no, you can't fully utilize both.

Think about a game like FFXV. It could run at 1080p/60fps, or at least at a super-stable 30fps, with better AA and enhanced effects. These are the kinds of improvements I think we can expect.
 
I agree because of PSVR. It makes no sense why they'd cut the legs out from under those sales by announcing this 5 months before PSVR releases.

Sony is going to sell every PSVR that they can make this holiday so I don't think PSVR will be affected by a PS4K announcement this year.
 
Sony is going to sell every PSVR that they can make this holiday so I don't think PSVR will be affected by a PS4K announcement this year.

Strategically it's going to be very interesting to see how they position it alongside PSVR and OG PS4.

I wonder how "your current PS4 is good enough for PSVR, but if you want the best experience we have PS4K" is going to go down with the masses. Probably fine, but who can tell until it happens.
 
Like I said, we haven't seen those kind of extensions added since the early 2000s.
No need for it as a lot of the audio decoding, video decoding, 3D math, physics, etc all are offloaded to the GPU these days.

The 360 example also doesn't work because they weren't creating a more powerful 360, and they didn't want devs to get good performance on one 360 and not the other as they were otherwise supposed to be the same spec. This is not the case of PS4K where the intended thing is the PS4K can get better performance or better visual fidelity over the existing PS4.
AMD is offloading audio decoding, video decoding, most OpenVX, some OpenCL and post-processing to Xtensa accelerators, and soon the PS4 and XB1 will too. This is why I have been speculating that the PS4K is misunderstood and could just be a firmware update to enable the Southbridge.

Offloading these processes results in saved CPU and GPU cycles without changing the architecture of the PS4 APU, as the accelerators are in the Southbridge. What if the PS4.5 has a larger, more efficient and more powerful Xtensa DPU in the Southbridge? It would make no difference to programs that don't take advantage of it, and would accelerate programs that offload CPU and GPU cycles to the accelerator. Later AMD APUs have more powerful and efficient Xtensa DPUs, and we should expect the same, maybe more so, with the PS4 revisions.

This theory agrees with the early PS4K and PS4.5 rumors, but not the 2x GPU. Is that two GPUs, as onQ123 has speculated, though not a second GCN, since the Xtensa accelerators can emulate a GPU and certainly provide GPGPU, which is abstracted as OpenVX and OpenCL? Or is it 2x the GPU performance, which Eurogamer says is not possible? Is Sony going to add a second GPU to the Southbridge in addition to the Xtensa accelerators? Increase the Southbridge memory from 256MB to a couple of GB? That would be far cheaper than changing the PS4 APU.

Eurogamer still thinks the PS4 APU contains a UVD and VCE, as do most who haven't thought the GDDR5 memory issue through. There was no GDDR5 memory controller for ARM IP, and the AMD UVD is a Xtensa DPU on an ARM AXI bus controlled by a TrustZone processor. The XB1 APU contains an ARM bus, which is why it must use DDR3, while Sony moved the ARM IP out of the APU so they could use GDDR5. This may keep Eurogamer from reaching the conclusions I'm speculating about.
 
Eurogamer still thinks the PS4 APU contains a UVD and VCE, as do most who haven't thought the GDDR5 memory issue through. There was no GDDR5 memory controller for ARM IP, and the AMD UVD is a Xtensa DPU on an ARM AXI bus controlled by a TrustZone processor. The XB1 APU contains an ARM bus, which is why it must use DDR3, while Sony moved the ARM IP out of the APU so they could use GDDR5. This may keep Eurogamer from reaching the conclusions I'm speculating about.
This is contradictory.

PC AMD GPUs have VCE/UVD and GDDR5 memory. Care to explain this one?

Even the TrueAudio DSP uses GDDR5 memory (64MB allocation).
 
This is contradictory.

PC AMD GPUs have VCE/UVD and GDDR5 memory. Care to explain this one?

Even the TrueAudio DSP uses GDDR5 memory (64MB allocation).
VCE is AMD, not ARM, and there is a special host-guest IOMMU between the x86 bus and the ARM bus. There is no GDDR5 ARM memory controller; GDDR5 requires drive transistors and memory speeds not found in ARM designs, and there was no AMD analog for the NoC features of the ARM AXI bus when the PS4 and XB1 were designed.

AMD APUs do not use GDDR5 except for the PS4; there is no GDDR5 memory controller in any AMD APU except the PS4's. The newer AMD APUs (Kaveri and Carrizo) are also responsible for network standby and media power modes (same as the XB1), so they use DDR3 and DDR4. Carrizo does not have a VCE; all codec encode and decode takes place in the UVD, which is a Xtensa accelerator, same as in the XB1. Network standby was the launch explanation for the PS4 Southbridge, which has its own 256MB of DDR3, but they also mentioned that the TrustZone processor is in the Southbridge. TrustZone manages the ARM AXI bus, can support a root-of-trust boot for the APU, and is typically used to support a TEE for media. There is no need for the added expense of duplicating an ARM bus and providing an IOMMU in the PS4 APU when it's already in the Southbridge. Additionally, it allows for an easier upgrade of the APU we are now discussing. Edit: I would think the Southbridge having 256MB of its own memory points to more than network standby. The PS3 also has 256MB of system memory, so what can be accomplished with that amount of memory shouldn't be underestimated.

Large AMD dGPUs cannot support network standby, but they can support a TEE with video codecs and a player with the GPU off, using GDDR5 at a lower clock via an IOMMU. The Xtensa accelerators in AMD APUs and dGPUs will also be used for OpenVX vision processing and essentially all post-processing; the player, codecs, DRM keys, HDCP, PlayReady porting-kit software and Eyefinity are ARM IP on a TrustZone-managed ARM bus.

For the PS4, assume a large AMD dGPU and move the ARM blocks out of it to a Southbridge with its own DDR3 memory. It can also now support network standby, CEC, and wake-on-key-phrase (voice turn-on), and any features supported via the Southbridge don't take cycles from game memory or the GPU.

Cadence, which provides the ARM IP and host-guest IOMMU for AMD and the XB1, has no GDDR5 memory controller.
 
Basically this.

Imagine you're optimizing for PS4K and the general game logic takes 50% of your CPU time, while the other 50% is consumed by something graphics-related. Then you go to do things with the PS4, and while there are no micro-optimization problems, the PS4K has, say, a 50% more powerful CPU, so now the game logic takes 75% of your CPU time and the PS4K's graphics work would need another 75%. Since the PS4 is not powered by programming proverbs or Russian election fraud, and you can't optimize the game logic further (or else you already would have on PS4K), you now have to reduce the graphical CPU processing by a factor of 3. Throw in communication buses, RAM issues, the GPU, and so on, and you have quite a kerfuffle.

So why use a PS4K as your starting point? Why not start with PS4, then adjust on the faster PS4K?

If you start out developing a new game and know that in 2-3 years' time, when it's released, you want it to run on both systems, why aim for PS4K first and cause yourself problems trying to scale down to PS4? Why not aim for PS4, then see what extra overhead the PS4K provides for extra features as time allows?
 
Sony is going to sell every PSVR that they can make this holiday so I don't think PSVR will be affected by a PS4K announcement this year.

Strategically it's going to be very interesting to see how they position it alongside PSVR and OG PS4.

I wonder how "your current PS4 is good enough for PSVR, but if you want the best experience we have PS4K" is going to go down with the masses. Probably fine, but who can tell until it happens.

Yeah, Sony has nothing to worry about in terms of PS VR this year. The device is going to launch well since there's enough hype from early adopters.

What they should worry about is 2017+ and balancing out how they push the device. The device will be for all PS4 users but if there are many examples that show that the device works better on PS4K then that may impact PS VR sales for people who are sticking with the original PS4.
 
Basically this.

Imagine you're optimizing for PS4K and the general game logic takes 50% of your CPU time, while the other 50% is consumed by something graphics-related. Then you go to do things with the PS4, and while there are no micro-optimization problems, the PS4K has, say, a 50% more powerful CPU, so now the game logic takes 75% of your CPU time and the PS4K's graphics work would need another 75%. Since the PS4 is not powered by programming proverbs or Russian election fraud, and you can't optimize the game logic further (or else you already would have on PS4K), you now have to reduce the graphical CPU processing by a factor of 3. Throw in communication buses, RAM issues, the GPU, and so on, and you have quite a kerfuffle.

Are we talking about hypothetical PS4 exclusives? Most real-life games have versions for XB1 and PC, and those are a lot bigger "problems" than PS4 vs. PS4.5, which is very simple in comparison.
 
So why use a PS4K as your starting point? Why not start with PS4, then adjust on the faster PS4K?

If you start out developing a new game and know that in 2-3 years' time, when it's released, you want it to run on both systems, why aim for PS4K first and cause yourself problems trying to scale down to PS4? Why not aim for PS4, then see what extra overhead the PS4K provides for extra features as time allows?

That's the point: you can't optimize for both. Either the PS4K gets under-utilized, or the PS4 suffers immensely. It's like the PC situation, where newer hardware doesn't get properly utilized.

Are we talking about hypothetical PS4 exclusives? Most real-life games have versions for XB1 and PC, and those are a lot bigger "problems" than PS4 vs. PS4.5, which is very simple in comparison.
Good point, but this time around the XB1 and PS4 rocking nearly identical CPUs with very similar output set a really good baseline to shoot for.
 
So why use a PS4K as your starting point? Why not start with PS4, then adjust on the faster PS4K?

If you start out developing a new game and know that in 2-3 years' time, when it's released, you want it to run on both systems, why aim for PS4K first and cause yourself problems trying to scale down to PS4? Why not aim for PS4, then see what extra overhead the PS4K provides for extra features as time allows?

You are correct that optimizing for the OG PS4 first and throwing in a PS4K enhancement or two later would prevent that from happening most of the time. The problem is that I don't trust marketing teams to recommend that publishers have the developers do just that. The logical thing from a marketing viewpoint, since they show one version of a trailer, would be focusing on the more powerful hardware. And, depending on your interpretation of the leaks so far, this may already be happening.
 
Do we yet have any proof this actually exists outside of a devkit at Sony?

I find it absolutely bizarre that major third parties are rumored to have never heard of it while the detailed information about it was given at a retailer conference.
 
That's the point: you can't optimize for both. Either the PS4K gets under-utilized, or the PS4 suffers immensely. It's like the PC situation, where newer hardware doesn't get properly utilized.

Surely a good business model would be to provide a period of overlap between systems, when the PS4K gets better but not necessarily fully utilized, as you put it. Then, when the overlap period ends, the PS4 is no longer supported and the problem goes away?

You are correct that optimizing for the OG PS4 first and throwing in a PS4K enhancement or two later would prevent that from happening most of the time. The problem is that I don't trust marketing teams to recommend that publishers have the developers do just that. The logical thing from a marketing viewpoint, since they show one version of a trailer, would be focusing on the more powerful hardware. And, depending on your interpretation of the leaks so far, this may already be happening.

Surely any business wanting to make a profit would target where they can maximise sales. If PS4 users > PS4K users...
 
Surely a good business model would be to provide a period of overlap between systems, when the PS4K gets better but not necessarily fully utilized, as you put it. Then, when the overlap period ends, the PS4 is no longer supported and the problem goes away?



Surely any business wanting to make a profit would target where they can maximise sales. If PS4 users > PS4K users...


This, 100%.

The PS4 is selling like hot cakes and will have a 50-million-plus lead by the time the PS4K comes out; developers will target PS4 specs for at least 3 years from now.
 
Jeff, why do you keep insisting that dedicated co-processors/DSPs need a separate/dedicated memory controller?

It doesn't work like that. There is only one memory controller (tied to the GPU), and its bandwidth is shared with the other processors (CPU, UVD, VCE, TrueAudio DSP, etc.) over internal buses.
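A toy model of that sharing, for illustration: one controller, one bandwidth pool, many clients on the internal buses. The 176 GB/s figure is the PS4's GDDR5 peak; the per-client demands below are made-up numbers, not measurements:

```python
# One memory controller, one pool of bandwidth, shared by every client.
# 176 GB/s is PS4's GDDR5 peak; the client demands are invented examples.

PEAK_BW_GBPS = 176.0

def remaining_bandwidth(total, clients):
    """Bandwidth left over after the listed clients take their share."""
    used = sum(clients.values())
    if used > total:
        raise ValueError("clients oversubscribe the memory controller")
    return total - used

clients = {"cpu": 20.0, "uvd_video_decode": 1.0, "trueaudio_dsp": 0.5}
gpu_share = remaining_bandwidth(PEAK_BW_GBPS, clients)  # 154.5 GB/s left for the GPU
```

The point of the model is that the DSPs and decode blocks don't need their own controller; they just subtract (a small amount) from the one shared pool, which is how AMD's PC parts are wired as well.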
 
Do we yet have any proof this actually exists outside of a devkit at Sony?

I find it absolutely bizarre that major third parties are rumored to have never heard of it while the detailed information about it was given at a retailer conference.

Yeah, this is why I have serious doubts.
 
You are correct that optimizing for the OG PS4 first and throwing in a PS4K enhancement or two later would prevent that from happening most of the time. The problem is that I don't trust marketing teams to recommend that publishers have the developers do just that. The logical thing from a marketing viewpoint, since they show one version of a trailer, would be focusing on the more powerful hardware. And, depending on your interpretation of the leaks so far, this may already be happening.

It doesn't seem like it would make much sense not to optimize for the lowest spec first, which also has the largest install base. That's where they're going to get most of their sales.
 
Do we yet have any proof this actually exists outside of a devkit at Sony?

I find it absolutely bizarre that major third parties are rumored to have never heard of it while the detailed information about it was given at a retailer conference.

Yeah, this is why I have serious doubts.

"The absence of evidence is not the evidence of absence, simply because you don't have evidence that something does exist does not mean you have evidence of something that doesn't exist."

 
I'm just surprised there was ever a time when someone selling pancakes was so overwhelmed with demand that a whole idiom came out of it.

Like, if I saw some dude making pancakes I wouldn't think "that dude's on his way to the penthouse".

Pancakes that big? I know they have a whole International House for them, but I just don't see any pancake rush anymore.

But I guess the fact that I don't see this big pancake bonanza isn't confirmation that such a bonanza doesn't exist. Maybe I'm just missing out on pancakemania.
 