Are the 300k servers Microsoft promised game-changing?

Recently, as part of the Xbox Live announcement, Microsoft announced a dramatic expansion of the compute capacity they intend to add to their infrastructure. The plan, as announced, is to grow the server count from 15k to 300k, a 20-fold increase.

This is an astonishing number of new servers to add to any new service, especially if you are not expecting huge growth in the number of users.

The marketoon hypothesis

One hypothesis is that some guy in marketing asked some woman in engineering how many servers the data center could hold, the woman said it could hold 300k, and the bozo figured that would make an awesome press release.

If this is true, the groans in Microsoft Engineering would be vast and awesome…

They are trying to do something different. 

Another, more interesting hypothesis is that they are actually trying to do this:

Booty says cloud assets will be used on “latency-insensitive computation” within games. “There are some things in a video game world that don’t necessarily need to be updated every frame or don’t change that much in reaction to what’s going on,” said Booty. “One example of that might be lighting,” he continued. “Let’s say you’re looking at a forest scene and you need to calculate the light coming through the trees, or you’re going through a battlefield and have very dense volumetric fog that’s hugging the terrain. Those things often involve some complicated up-front calculations when you enter that world, but they don’t necessarily have to be updated every frame. Those are perfect candidates for the console to offload that to the cloud—the cloud can do the heavy lifting, because you’ve got the ability to throw multiple devices at the problem in the cloud.” This has implications for how games for the new platform are designed.
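To make the idea concrete, here is a minimal sketch of that offload pattern. Everything in it is an assumption for illustration: the function names, the fake "cloud bake", and the timings are invented, not Microsoft's actual API. The point is simply that the console kicks off an expensive computation (volumetric fog, say) in the background, keeps rendering with a cheap local approximation, and swaps in the cloud result whenever it arrives, because the result is not needed every frame.

```python
# Hypothetical sketch of offloading a latency-insensitive computation to the cloud.
import concurrent.futures
import time


def compute_fog_locally(region):
    """Cheap, low-quality fallback the console can always compute itself."""
    return f"low-res fog for {region}"


def request_fog_from_cloud(region):
    """Stand-in for a call to a hypothetical cloud precomputation service.

    In a real system this would be an HTTP/RPC request; here we just
    simulate a slow, expensive volumetric-fog bake.
    """
    time.sleep(2.0)  # pretend the cloud takes a couple of seconds
    return f"high-res fog for {region}"


def enter_region(region):
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        # Start the expensive cloud bake without blocking the frame loop.
        future = pool.submit(request_fog_from_cloud, region)

        # Use the cheap local result immediately; the game keeps rendering.
        fog = compute_fog_locally(region)

        for frame in range(5):
            # The cloud result is not needed every frame; swap it in
            # whenever it happens to arrive.
            if future.done():
                fog = future.result()
            print(f"frame {frame}: rendering with {fog}")
            time.sleep(0.5)


if __name__ == "__main__":
    enter_region("forest")
```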

One of the limitations of systems like the Xbox is the 5-7 year upgrade cycle. The problem with a 5-7 year cycle is the difficulty of delivering better and better experiences: extracting more performance requires more and more software tuning, until the platform is unable to give any more.

The approach Microsoft is taking is to shift some of the computational effort to the cloud and leverage the faster upgrade cycles they control there, delivering a better experience to their users without forcing them to buy new hardware.

Several startups, though they failed, demonstrated that it is possible to stream a AAA title to a device, so the idea is not implausible.

With this theoretical approach, the folks at Microsoft are attempting to square the circle: they have a stable yet rapidly decaying platform in people's homes, but they can use the hardware in their data centers to deliver increasingly better graphics through a vast amount of pre-computed data.

The problem, of course, is that in practice the amount of data you have to pre-compute and store for a 3D immersive world is so vast as to be almost impractical. Well, except perhaps if you had 20x more servers per user…
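To see why, here is some back-of-envelope arithmetic. Every number below is made up purely for illustration (probe spacing, world size, bytes per probe, number of world states), but it shows how quickly the storage blows up once you multiply a dense spatial grid by independent world states.

```python
# Back-of-envelope arithmetic for pre-computed lighting in a 3D world.
# All numbers are hypothetical assumptions, not figures from any real game.

probe_spacing_m = 1.0                                  # one lighting probe per cubic metre
world_x_m, world_y_m, world_z_m = 2_000, 2_000, 100    # a modest open world
bytes_per_probe = 108                                  # e.g. 9 SH coefficients x 3 channels x 4 bytes

probes = (world_x_m / probe_spacing_m) \
       * (world_y_m / probe_spacing_m) \
       * (world_z_m / probe_spacing_m)
bytes_per_state = probes * bytes_per_probe

# Every independent world state (time of day, weather, destroyed geometry...)
# multiplies the storage requirement.
times_of_day = 24
weather_states = 4

total_bytes = bytes_per_state * times_of_day * weather_states
print(f"per state : {bytes_per_state / 1e9:.1f} GB")   # ~43 GB
print(f"total     : {total_bytes / 1e12:.1f} TB")      # ~4 TB
```

Even with these fairly tame assumptions the result runs into terabytes per title, and the real combinatorics (player-driven changes to geometry, dynamic lights, and so on) are far worse.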

In the 2D space, this was basically the solution Google adopted for Google Maps. Confronted with the problem of how to dynamically render every tile on the client, they pre-rendered the tiles on the server and then had the client stream them.
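The tile scheme works roughly like the sketch below: the client converts a latitude/longitude and zoom level into tile indices using the standard Web Mercator ("slippy map") math and fetches one small pre-rendered image instead of rendering the map itself. The tile server URL is a placeholder of my own, not Google's actual endpoint.

```python
# Minimal sketch of a pre-rendered map tile client.
import math


def lat_lon_to_tile(lat_deg, lon_deg, zoom):
    """Convert latitude/longitude to tile x/y indices at a given zoom level."""
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y


def tile_url(lat_deg, lon_deg, zoom):
    x, y = lat_lon_to_tile(lat_deg, lon_deg, zoom)
    # Hypothetical tile server; real providers use their own URL schemes.
    return f"https://tiles.example.com/{zoom}/{x}/{y}.png"


if __name__ == "__main__":
    # Seattle at zoom 12: the client streams one 256x256 pre-rendered tile
    # rather than rendering the whole map from vector data.
    print(tile_url(47.6062, -122.3321, 12))
```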

This is going to be very interesting to see… although my money is still on the marketoon hypothesis…

 
