Gamesindustry: Xbox Live Compute (Cloud Servers) offered free to devs

There is still a host, though. I was under the impression that the server simply accepts all the connections from all players, regardless of their NAT status, then sends that data to the game host in one big sweet lump that is free of NAT compatibility issues.



It's just regular P2P with the NAT issues removed; you'll still get host advantage because the host's machine has the final say on every shot fired.

But in P2P, doesn't one machine act as the host and determine damage, hits, scoring, everything? In Killzone, YOUR PS4 determines damage taken, your position on the map, hits, etc. The host is only receiving scoring information. It's really not difficult to understand.

"Almost all logic relating to you (e.g. moving, shooting and taking damage) runs on your local PS4, with only a very small portion of the game logic (i.e. mission/scoring logic) running on the ‘session master’, one PS4 in the game selected for its connection quality."
 
For those whinging about virtual servers, how much processing power and RAM do you think dedicated servers need?

For Rackspace's OpenStack, which Sony will be using:

Suggested system requirements

To install Private Cloud Software, you will need compute resources for the following 2 key roles within the infrastructure.

Chef server (optional - Opscode Hosted Chef may be used)
16GB RAM
60GB disk space
4 CPU cores


OpenStack controller node
16GB RAM
144GB disk space (more if Glance resides on the same server)
4 CPU cores

http://www.rackspace.com/cloud/private/openstack_software/
 
(quoting the Rackspace OpenStack system requirements above)

Keep in mind Sony isn't exclusively using Rackspace. The vast majority of their previous games with dedicated servers used Amazon.
 

Hmm, that does go well with your argument then.

The "immediately available to your game" part is the most interesting

If that's true, then it has to be a per-unit estimate and not an overall estimate divided by the projected install base, right?
 
There is a difference; virtual is better for online gaming of this type. There are a lot of advantages.

So? Even if they had physical servers, they would still run out if there weren't enough of them. The advantage of virtual servers is that they are more versatile. What I don't get is why you don't understand this.

So you think they won't have enough servers for launch? You think a server host won't have enough servers? Ok, I'm done.

People see the difference. The problem here is that some people (you, for example) can't grasp the reason why virtual servers are a better solution than allocating a physical server in every region of the world for every single game (that would be incredibly stupid).

When did I ever say virtual was a bad thing? And of course they will be running virtual machines; anything else would be stupid. What I still don't get is why you don't get it.

If there are 300k physical ones, then let's just say they can do 20 times that with virtual.
If there are 300k virtual ones, is that it? (Still good, though.)

Or is there some rule in the TOS I missed that forbids me from asking a question about something I didn't know? And from you three it seems that none of you know either.
 
So how are physical servers better? They are limited by their hardware as well. So what is your point?

Dedis that still need a player's PS4 as a host...

If you expect a different kind of dedi server coming for X1, you're in for a nasty surprise imo.
 
(quoting the Rackspace OpenStack system requirements above)

I am speaking about running a game's dedicated server that is processing a single game instance. What are the resources required for that?
 
Facepalm. It's the opposite.

Both you guys are getting tied up in different scenarios and semantics.

He is saying you need less resources on virtual because of load balancing.

You are saying you need more resources on virtual because of redundant memory usage.

Both are correct.
 
I really don't know how to dumb this down any further.

The more physical hardware you have, the more resources you have.

The more virtual servers you have, the more physical resources you need.

So your argument is that virtual servers will never be as good, because if you want 3 virtual servers you need physical hardware 3x more powerful than a single physical server functioning as a single physical server.

GASP mind blown.

Who'd have thunk it. Thank you for enlightening us all.
 
It shouldn't happen, because the allocation of CPU for a single user will greatly exceed the CPU power required to run a dedicated server for several players together.

So let's say you are playing in a game with 16 people, and each user has 5.25 GHz of CPU available to their machine (based upon the 3x CPU statement),

but the dedicated server itself may only be using 1 GHz to run the game for those 16 players. That means there are 83 GHz of CPU power across all of those users sitting there unused, and while playing a game there are going to be limited uses for that power. Some of it may be handling notifications or doing background matchmaking, but generally there should always be enough to go around, especially early on when the cloud isn't going to be used to render anything or do anything truly intensive.
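As a sanity check on that arithmetic (the 5.25 GHz and 1 GHz figures are the post's assumptions, not published specs):

```python
# Back-of-the-envelope check of the figures above (all assumptions from
# the post, not published Azure specs).
players = 16
cpu_per_user_ghz = 5.25     # the "3x the console's CPU" claim
server_cost_ghz = 1.0       # assumed cost of one dedicated-server instance

total = players * cpu_per_user_ghz    # 84.0 GHz allocated in aggregate
unused = total - server_cost_ghz      # 83.0 GHz left over
print(f"allocated: {total} GHz, spare after hosting the match: {unused} GHz")
```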

As for the proof, if it is free to leverage, why wouldn't they use it?

That's not how it works. At all.
 
What?


Facepalm. It's the opposite.

No, it is not the opposite. If I have one server running 100 virtual servers, I require *LESS* hardware.

(quoting the 300k physical vs. virtual question above)

Because it would be a huge waste of money. How many multiplayer games do you think will be released this gen? Using your example, that would be 6 million virtual servers.
 
(quoting the P2P / Killzone session-master post above)

In P2P I believe your console makes all the decisions about everything, but then the data is sent to the host, and the host's machine decides whether or not your machine's calculation of where everything is in the game is correct.
I thought that the reason for rubberbanding in games was that your machine decides you have moved, but the host's has decided you haven't, so it puts you back where you were.

If it is such a simple solution, then why don't they already process the logic this way using P2P?
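For what it's worth, the reconciliation described above can be sketched in a few lines (a toy example with made-up names and thresholds, not any shipping game's netcode):

```python
# Toy illustration of rubberbanding: the client predicts movement locally,
# but the authoritative host rejects it and snaps the client back.
# Hypothetical simplification, not any shipping game's netcode.

def host_validate(old_pos, new_pos, max_step=1.0):
    """Host-side check: reject any move larger than the allowed step."""
    return new_pos if abs(new_pos - old_pos) <= max_step else old_pos

client_pos = 0.0
predicted = client_pos + 5.0        # client thinks it moved 5 units
authoritative = host_validate(client_pos, predicted)
if authoritative != predicted:
    client_pos = authoritative      # snap back: this is the "rubberband"
print(client_pos)                   # 0.0 -- the host put you back
```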
 
Hilarious, we're now back to the same exact argument people were making when MS unveiled Azure for the X1 (but teh vm!). By pretty much the same people, too.

Running on such a large scale wouldn't really be possible (or feasible) without virtualization. It also allows you to utilize your hardware much more efficiently. And none of this affects actual performance in any way. People also seem to be ignoring that physical machines can be overloaded just as easily. Hell, pretty much every single dedicated server out there is going to be hosting more than one game...
 
(quoting the P2P / Killzone session-master post above)

A lot is done on the host/server. That is why dedicated servers are so important. We're talking about your positioning, whether you've received a hit or have been hit, for everyone in the entire match.

It also depends on the game, because some stuff is handled more client-side and some stuff is handled server-side. For example, if hit detection is handled entirely client-side, that is absolutely terrible. Games today have hit detection based on both the client and the host server, with the server acting as a sort of validation that a hit is indeed a hit.

More stuff on the client = more hackable as well. The more stuff on the server side, the more hackable from the server's end; if it is a dedicated server, you remove that vulnerability. The more that is handled by the server at that point, the less people can hack.

As for visual fidelity, it is a boost for everyone. I think some games have been holding back visuals in MP because the host player would get the crappy visuals, and thus no one would want to be host. Since most games have neither purely client-side nor purely server-side hit detection, there isn't a huge host advantage anymore either (although it still exists in a smaller way). So devs had to choose between giving the host the short stick or giving everyone equality. I think many chose equality.
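A minimal sketch of that hybrid model, with the server validating a client-reported hit (illustrative names and thresholds only):

```python
# Sketch of client-reported hits being validated server-side, the hybrid
# model described above. Names and thresholds are made up for illustration.
import math

def server_confirm_hit(shooter_pos, target_pos, max_range=50.0):
    """Server sanity-checks a client's hit claim against its own state."""
    distance = math.dist(shooter_pos, target_pos)
    return distance <= max_range   # reject physically impossible hits

# The client claims a hit; the server only confirms it if its own
# positions for the players agree the shot was plausible.
print(server_confirm_hit((0, 0), (30, 40)))    # True  (distance 50)
print(server_confirm_hit((0, 0), (300, 400)))  # False (aimbot-range claim)
```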
 
If you expect a different kind of dedi server coming for X1, you're in for a nasty surprise imo.
Gears 3 had real dedicated servers; sadly, there were too few of them. The beta was magical, though. They have never played with the definition of dedicated servers.
I'm going to make it as simple as possible, using an example:

100 physical machines -> 50 physical servers for CoD and 50 physical servers for Halo.

100 physical machines-> 50 virtual servers for CoD and 50 virtual servers for Halo.

There is no difference in performance.

The advantage of virtual servers is that you can switch the ratios on the fly. So the software detects that more people are playing CoD, and it becomes:

100 physical machines -> 80 virtual servers for CoD and 20 virtual servers for Halo. With physical servers you are now in a queue in CoD, while Halo has too many empty slots.

When the servers are empty you can use them for other purposes. With physical servers you are just burning cash idling.
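That ratio-switching can be sketched as a toy reallocation function (illustrative only; real schedulers account for far more than raw player counts):

```python
# Toy version of the ratio-switching described above: give each game a
# share of the 100 machines proportional to current player demand.

def reallocate(machines, demand):
    """Split `machines` across games proportionally to player demand."""
    total = sum(demand.values())
    return {game: round(machines * players / total)
            for game, players in demand.items()}

print(reallocate(100, {"CoD": 500, "Halo": 500}))   # {'CoD': 50, 'Halo': 50}
print(reallocate(100, {"CoD": 800, "Halo": 200}))   # {'CoD': 80, 'Halo': 20}
```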
 
(quoting the P2P / Killzone session-master post above)

The host never really ran everything in P2P; it's not streaming the game to you like Gaikai. All local information (your movements, damage, etc.) is calculated on your machine and then uploaded; the host simply organised it with respect to the other players, which is where the bandwidth issue comes in. Your console only received the pre-sorted data and then uploaded its own, whereas the host actually needed to receive the data from all sources and send data out to all sources, which, considering typical consumer-level bandwidth (combined with distances, etc.), is where the issue comes in. It's also why the host has the advantage: he received the data from those players directly, whereas you're receiving the data from him (you don't receive any data directly from any of the other players).
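The host's bandwidth problem described above is easy to put numbers on (the per-player stream size below is an assumption for illustration):

```python
# Rough arithmetic for the star topology described above: in host-based
# P2P the host's upload grows with player count, everyone else's doesn't.
# The per-player stream size is an assumption for illustration.

players = 16
stream_kbps = 20                             # assumed per-player state stream

host_upload = (players - 1) * stream_kbps    # host sends to everyone else
client_upload = stream_kbps                  # a client only sends to the host
print(f"host upload: {host_upload} kbps, client upload: {client_upload} kbps")
# host upload: 300 kbps, client upload: 20 kbps
```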
 
We can argue Killzone:SF servers until we're all blue in the face. The main thing is, it's the same implementation as Killzone 2, and that shit ran like a champ. We'll see how things pan out come launch.
 
Not gonna lie, but something like free servers to devs is something that might actually benefit me as a consumer.

If MS keeps it up, they might have another Xbone owner.
 
(quoting 2san's CoD/Halo ratio example above)

There is another benefit to virtual servers/machines: the things running inside them are segregated. They cannot go awry and use up all the resources on the box; a misbehaving one would just kill its own instance, and everything else stays up.
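A toy illustration of that segregation, using OS processes to stand in for VMs (purely illustrative; real hypervisors enforce much stronger isolation):

```python
# Sketch of the segregation benefit described above: each game instance
# runs in its own sandboxed unit, so one failure doesn't take the box down.
# Processes stand in for VMs here purely for illustration.
from multiprocessing import Process

def game_instance(name, crash=False):
    if crash:
        raise RuntimeError(f"{name} went awry")  # only this instance dies
    print(f"{name} still serving players")

if __name__ == "__main__":
    procs = [Process(target=game_instance, args=("match-1", True)),
             Process(target=game_instance, args=("match-2",))]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    # match-1's crash is contained; match-2 ran to completion regardless.
```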
 
Because it's not P2P, it's using a dedicated server to connect all the players together.

Ugh

Read my wall of text on the last page please

Running more logic on the client side has been happening since the day WoW was released; it's not to our benefit, it's done to decrease bandwidth usage (with huge disadvantages).

Running more logic on the client side with P2P multiplayer has also been a sad trend that has been going on for several years; Guerrilla didn't invent this.

It's P2P multiplayer, simple as that, with every downside that entails.
There is no WIN here for the gamer with the system Guerrilla uses.
It's PR spin, like how MS took dedicated servers, called it 'power of the cloud', and suggested it would make your Xbox games look better.

Hold on to your dignity as a consumer, please, and don't just buy every PR spin story you are told.
Stand up for yourself, not for a brand name.

Some magic words related to client-side prediction, packet loss, and the game developer choosing the game servers, which long-time WoW players will understand:
flash of light/charge bug
Telia
 
Because it's not P2P, it's using a dedicated server to connect all the players together.

It's a server dedicated to routing traffic, not a server dedicated to making all the decisions about what is happening with 16 players in the game.
 
How does it work then?

Because you can't just magically divide GHz and do whatever you want with it. You really need to stop for a second and ask yourself if your examples make sense. Sharing a defined X amount of resources won't be the same for 1 application or 30. Running one copy of Windows isn't the same as running 5. And so on and so forth.

Anyway, this is perhaps a good way to start:

http://computer.howstuffworks.com/server-virtualization.htm

Pay attention to page 4 in particular.

2san said: (the CoD/Halo ratio example above)

Except that it's not a 1:1 ratio between physical and virtual. That's the whole freaking point of it: have more (virtual) servers run on a smaller number of (physical) servers.
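Some quick, made-up consolidation math to show why the ratio isn't 1:1 (none of these figures are Azure's real sizing):

```python
# Quick consolidation math for the "not 1:1" point above. All figures are
# illustrative assumptions, not Azure's real sizing.
import math

host_cores, host_ram_gb = 32, 256
vm_cores, vm_ram_gb = 2, 4          # one game-server VM
overcommit = 2.0                    # CPU overcommit, typical for bursty loads

per_host = min(int(host_cores * overcommit / vm_cores),   # CPU-bound limit
               host_ram_gb // vm_ram_gb)                  # RAM-bound limit
hosts_needed = math.ceil(1000 / per_host)
print(f"{per_host} VMs per host -> {hosts_needed} hosts for 1000 game servers")
# 32 VMs per host -> 32 hosts for 1000 game servers
```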
 
(quoting the Gears 3 reply and 2san's CoD/Halo ratio example above)

I think the whole integrity of your argument breaks down when you realize that physical servers aren't some static entity that can only do one thing. But yeah, at this point it isn't worth going into.
 
(quoting the 'wall of text' post about client-side logic above)
Well, then CoD is also going to be P2P for some segment of the population if we go by your definition, since they are using a mix of dedicated and listen servers. Would this be correct, then?
 
The discussion here arguing that Azure is going to lack the physical resources to support the virtual machines necessary for gaming applications is dumb. If any physical servers overload, VMs migrate to a frame with excess capacity. Are people here suggesting that Azure doesn't have the physical compute resources to support the VM load of Xbox Live on Xbox One? If so, that is a foolish argument. If this is just about people trying to understand server virtualization, there are probably better resources than this thread on this forum.
 
Wasn't there an Xbone video just the other day talking to a tech about dedicated servers in the cloud and how they act as the authority for the match, thus eliminating host advantage and making the match more secure from hacking?

Edit: 2 minutes, 30 seconds here.

There was a GAF thread about it here: http://www.neogaf.com/forum/showthread.php?t=697075&highlight=xbox

Depends on the game logic, and ultimately on the devs, no? It's not MS who decides the MP design of 3rd-party games.
 
Because it would be a huge waste of money. How many multiplayer games do you think will be released this gen? Using your example, that would be 6 million virtual servers.

Because among so many of the devotees it was taken as fact that it was physical, and the preacher said so as well. That was part of what I was questioning.
Now the second question I have is whether these servers are dedicated to Xbox, or whether they will be spread among all of Microsoft's products and services.

And I stress again that I only want to know. One of the main reasons I joined GAF was that the collective seemed to know a lot, but questions I had were sometimes left hanging.

Btw, we can chill about the Windows Phones and tablets; no need to involve them in the equation, if it's even needed.

nvm

Yup, only XB1 gets the discount, or "free" in this case.
 
Because you can't just magically divide GHz and do whatever you want with it. You really need to stop for a second and ask yourself if your examples make sense. Sharing a defined X amount of resources won't be the same for 1 application or 30. Running one copy of Windows isn't the same as running 5. And so on and so forth.

I understand this; I was simply saying that the resources each user in a multiplayer game has been allocated by Microsoft would significantly exceed the processing requirements of, say, running the software required for a single dedicated server.
 
Except that it's not a 1:1 ratio between physical and virtual. That's the whole freaking point of it: have more (virtual) servers run on a smaller number of (physical) servers.

And physical servers run more than one game/application, what's your point?

Virtualization using off-the-shelf software isn't going to cost you more than 5%.
 
Oh cool... so the cloud isn't finite, it's... INFINITE?

Did you even read the part where I said it is soft-DRM? That has nothing to do with allocation and everything to do with MS flipping a switch when they want you to buy the NEXT game.

Games as a service. Cloud computing.

Ahem... D-R-M ?

That's NOT how it works at all with Azure.

With an Azure setup, it doesn't MATTER what game is being played, since it's based on usage.

Currently, if a 4-year-old game has a small userbase, it makes sense to shut down the servers, since those servers cost money and the number of users doesn't justify the cost (the original servers were designed to handle 100 times the current small population figure).

With Azure, it doesn't matter, since you only use as much "server" as the game requires. Even before today's reveal that servers would be free, there was always going to be a benefit for developers in working with Azure, since their costs would scale dynamically with server usage.
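The shape of that cost argument in made-up numbers (illustrative only):

```python
# Illustrative cost comparison for the scale-with-usage point above.
# All numbers are made up to show the shape of the argument.
peak_players, current_players = 100_000, 1_000
players_per_server, cost_per_server = 100, 1.0   # arbitrary units

# Fixed fleet must be sized for peak; usage-based only pays for now.
fixed_fleet_cost = (peak_players // players_per_server) * cost_per_server
usage_based_cost = (current_players // players_per_server) * cost_per_server
print(fixed_fleet_cost, usage_based_cost)   # 1000.0 vs 10.0
```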
 