Every time Amazon's cloud service goes down, it breaks the Internet. What would happen if MS Azure went down?
It's not prebaking when it is being calculated in real time
You take greyscale images, which compress extremely well, and apply them to textures.
The result may be a couple of frames off or so, but for GI? It's great stuff.
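A rough illustration of why that claim is plausible (this is a made-up example, not any real pipeline): a smooth greyscale light map is one byte per texel with slowly varying, quantised values, so it deflates to a small fraction of its raw size, while a busy full-colour texture of the same resolution barely shrinks at all.

```python
import math
import zlib

SIZE = 64  # 64x64 texels

# Smooth greyscale irradiance map: brightness falls off from the centre,
# quantised to 16 levels so neighbouring texels often share a byte value.
lightmap = bytes(
    int(255 * math.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 512)) & 0xF0
    for y in range(SIZE) for x in range(SIZE)
)

# A busy pseudo-random RGB texture (3 bytes per texel) for comparison.
texture = bytes(
    ((x * 73 + y * 151 + c * 97) * 177) % 256
    for y in range(SIZE) for x in range(SIZE) for c in range(3)
)

lightmap_packed = zlib.compress(lightmap, 9)
texture_packed = zlib.compress(texture, 9)
print(len(lightmap), len(lightmap_packed))  # greyscale map shrinks a lot
print(len(texture), len(texture_packed))    # noisy texture barely shrinks
```

So streaming a light map from a server costs far less bandwidth than streaming the texture it lights.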
Other things can be handled by the cloud too... foliage, for example.
Cloud computing is not an MS thing. Anybody can use cloud computing today. Making network calls to a server has been around for years. Server farms have been around for years.
I don't understand why Microsoft keeps pushing this as a unique feature.
Gaikai does not compare to Azure.
What they claim is that when connected to the cloud, the game will be prettier, and the game will still be perfectly playable when offline. I can't see how that is DRM.
Oh, your dad laughed at it? Guess we can put this one to rest.
Environmental stuff can be far more powerful. Other things that can benefit plenty are special effects of any kind, cutscenes, and calculations that affect your player indirectly, like mobs of moving enemies, wind, or anything that isn't millisecond-input-sensitive. Lovely flowing waterfalls, anyone?
Details: games can have far more detail by offloading it from the hardware. That's it.
This is HOW it works; it's in Microsoft's hands to make good use of it.
That was the exact same claim made for SimCity: that calculations HAD to be done on a cloud server so that the system requirements would be lower. It turned out that the entire game could be played offline, thanks to a hacker. Not hard to see why people are skeptical.
They're talking about how they could do all of this pre-calculation of lighting and other things in the cloud, things that aren't going to be impacted by lag, as reasons for having always-on.
What I want to know is why this is any different from doing pre-calculation of that kind of information before the information is pressed to the disc?
I mean it sounds to me like they're saying "we can bake your lighting while your level loads so it's more awesome thanks to the cloud!" because if it's not real-time, frame by frame lighting, isn't it baked lighting? Haven't we been doing baked lighting forever?
I'm ignorant, though, and incredibly, spitfoamingly anti-anti-consumer. Educate me.
SimCity claimed it works offline?
People are actually starting to believe this BS?
seriously?
seriously?!
Just not going to happen. We aren't going to get games with the tagline "you must have an X Mbit connection in order to play this game's single-player mode." This type of stuff is entirely dependent on the user's connection.
It will require a ton of coding to account for variances in connections, and devs will just deem it not worth the extra effort. Just like with most multiplatform games, they will go for the lowest common denominator when developing.
That's my prediction.
They said 40x an Xbox 360, FYI.
They say the one console is 10x a 360.
So that would mean one in your home plus three in the cloud equals 40x a 360.
Is simple math really that difficult?
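The arithmetic above, spelled out (the multipliers are Microsoft's marketing figures, not measured numbers):

```python
# Microsoft's claimed multipliers, taken at face value: the Xbox One is
# said to be 10x an Xbox 360, and the cloud is said to contribute the
# equivalent of three more Xbox Ones per console.
xbox_360 = 1                  # baseline unit of compute
xbox_one = 10 * xbox_360      # "10x 360"
cloud_share = 3 * xbox_one    # "three Xbox Ones in the cloud"
total = xbox_one + cloud_share
print(total)  # 40 -> the claimed "40x Xbox 360"
```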
They aren't pre-calculating.
The game connects to a cluster of servers; they do the heavy operations there and send info back and forth between the servers and the X1.
It's like having a massive CPU with a small, high-latency link in between.
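A minimal sketch of that pattern, with a thread pool and an artificial delay standing in for the server and its network latency (all names and numbers here are illustrative). The frame loop never blocks on the remote result: it keeps using the last value it received, which is fine for slow-changing work like ambient lighting or wind, and useless for anything input-sensitive.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def remote_compute(scene_time, latency=0.05):
    """Stand-in for a heavy server-side job plus round-trip latency."""
    time.sleep(latency)
    return round(0.6 + scene_time, 3)   # placeholder lighting value

def run_frames(frames=10, frame_time=0.016):
    executor = ThreadPoolExecutor(max_workers=1)  # the "cloud"
    ambient = 0.5        # default, used until the first result arrives
    pending = None
    history = []
    for frame in range(frames):
        if pending is None:
            pending = executor.submit(remote_compute, frame * frame_time)
        elif pending.done():            # non-blocking: never stall a frame
            ambient = pending.result()
            pending = None
        history.append(ambient)         # "render" with possibly-stale data
        time.sleep(frame_time)          # ~60 fps frame budget
    executor.shutdown(wait=False)
    return history

print(run_frames())
```

The first few frames render with the stale default; once the round trip completes, the fresh value is swapped in with no frame ever waiting on the network.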
How can lighting be pre-calculated? That makes no sense to me. How does it know which direction I'll be facing by the time the response is sent back from the "cloud"?
What's the minimum internet connection bandwidth needed to actually take advantage of cloud computing anyway?
Hm. I'll believe it when I see it.
Sending light maps for textures.
It won't be handling reflected or refracted light.
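A sketch of what "light maps for textures" could mean at its simplest (this is an illustrative toy, not Microsoft's actual pipeline): the baked map is just a per-texel brightness factor that gets multiplied into the diffuse texture, which covers direct illumination only.

```python
def apply_lightmap(albedo, lightmap):
    """albedo: list of (r, g, b) texels; lightmap: brightness in 0..255."""
    assert len(albedo) == len(lightmap)
    lit = []
    for (r, g, b), light in zip(albedo, lightmap):
        scale = light / 255.0           # baked brightness as a 0..1 factor
        lit.append((int(r * scale), int(g * scale), int(b * scale)))
    return lit

albedo = [(200, 180, 160)] * 4          # flat stone-coloured quad
lightmap = [255, 128, 64, 0]            # baked brightness per texel
print(apply_lightmap(albedo, lightmap))
# [(200, 180, 160), (100, 90, 80), (50, 45, 40), (0, 0, 0)]
```

Reflections and refractions depend on the view direction, so a static per-texel factor like this can't capture them.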
One thing is for sure: if both consoles release and PS4 games continually end up looking better, Microsoft will find it hard to live down these sorts of claims.
What they are proposing is a logistical and technical nightmare. It seems to complicate the development pipeline further, but that's just my opinion. To me, they would be better suited just using those servers to offer dedicated servers (which I think they are) for lower ping, less lag in online games, bigger maps, more players, etc.
Sounds like the same stuff I've been saying in threads, but nobody comments.
They have already said games will have dedicated servers.
What's the benefit of doing that in the cloud? Wouldn't light maps for textures that won't change based on user interactions be something that could just be loaded in the background on the system? I thought that stuff was very non-processor intensive overall?
They should be touting that line more strongly, not saying the system will be 40x more powerful than the 360 with the cloud...
And then, if this does work, what stops Nintendo, Sony, and PC devs from doing it, essentially neutralizing the advantage?
Anyone who has even a small understanding of cloud architecture would know these things are possible, but on GAF it gets lost in all of the fanboy drivel.
3-4 teraflops? Holy mother of... It's not really bullshit; he's just emphasizing how much computational power there is relative to one console.
He could have said hundreds.
I assume, though, that when he says 3, it may imply there's a 3-4 TF limit per session, since it's developer-handled.