Nocturno999
Member
I missed tons of Wii and Wii U games over the years. Is there a chance of being able to play them on NX?
I imagine this "cloud box" will be their console, since it looks like their handhelds and consoles will have a similar game library. I suspect they will also want to encourage people in various other ways to buy both their handheld and console.

By the OP's console example at the end, it sounds like a handheld (as others have said) that provides gaming on the go, and that connects via local cloud to a box at home that boosts graphics/capabilities and allows you to play on an HD TV (or display). And if you don't have the handheld and just the box, your tablet, phone, or device will connect to it and allow you to play. If that's the case, it sounds awesome, and I wonder if it would let you game via your phone while not local by using your provider? I wonder if Apple would have a cow?
Nice thoughts, Thraktor. I've also thought that this would be something they might choose to launch a year or two after the initial NX hardware, as the console itself will already be an investment for consumers. Also, what kind of message would it send if Nintendo's new hardware already needed a 32X-esque add-on right out of the gate? At the same time, I could see them marketing it partially as an external HDD/NAS, especially if they end up doing two NX console SKUs (one with an optical drive and one with an internal HDD), as that previous patent implied. In that case, it might make sense to release such a device sooner, if only to supplement the optical drive SKU's storage.
Good question. I suppose it could just pause the game, like when the Wii U Gamepad gets moved out of range, if a disconnect happens unexpectedly. Perhaps it could be set to automatically connect to the next closest available SCD if the device you're using has a specific timer associated with its use, or, in the case of a mobile device, if it senses that you're moving out of range.
Great analysis. One question about the ratio of NX+ to NX. Let's say Nintendo wants content streaming at launch and doesn't want to wait for an NX+ launch to implement it, but they also don't want to spend on a server farm (or at least a server farm in the standard sense of the term).
Could they turn whatever NX in-store kiosks they have planned into supercharged NX servers of sorts? If we assume that pretty much every Target and Walmart in the USA is going to have one of these kiosks, and we have 4,177 Walmart stores and 1,789 Target stores, that's 5,966 kiosks available to act as dedicated servers in the US alone (plus whatever superstores I am forgetting...maybe Toys R Us, Sears, and Kmart depending), and they'll already be in strategic locations around the country.
Would this be enough to get things started if the system sold, let's say, three million units in the first six months of availability? And if not, what kind of ratio are we talking about to make this feasible?
If it could work, it seems to me the cheapest way to go about things, since multi-year distribution deals are already set up with those retail partners, and Nintendo already has its army of reps and technicians in place to deal with any issues they would have with kiosk maintenance.
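To put the ratio question in very rough numbers (the store counts are from above; the streams-per-kiosk and peak-concurrency figures are pure guesses on my part):

```python
# Back-of-the-envelope sketch only: store counts from the post above,
# everything else is a guess.
walmart_stores = 4177
target_stores = 1789
kiosks = walmart_stores + target_stores              # 5,966 potential kiosk "servers"

install_base = 3_000_000                             # hypothetical first six months of sales
streams_per_kiosk = 4                                # guess: concurrent streams one kiosk could host
peak_concurrency = 0.05                              # guess: 5% of owners streaming at the same time

print(f"Consoles per kiosk: {install_base / kiosks:.0f}")                     # ~503
print(f"Streamers at peak:  {install_base * peak_concurrency:.0f}")           # 150000
print(f"Kiosk capacity:     {kiosks * streams_per_kiosk} concurrent streams") # 23864
print(f"Shortfall factor:   {install_base * peak_concurrency / (kiosks * streams_per_kiosk):.1f}x")
```

Under those made-up assumptions the kiosks alone would fall well short at peak demand, so they'd be a head start rather than the whole network.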
Interesting points. There's one problem you didn't address, though, and that is getting the game to the source NX. If you are relying on the person having bought the game in the first place, you are cutting down the number of available machines considerably. You would also need to have it stored digitally, because no one's going to be able to change discs, and therefore you would need so much storage it'd be impractical.
There's also the issue of data caps limiting the number of available systems. I basically think it'd be unworkable at present as an internet-based system. However, within your subnet I can see it working to some degree. I haven't read the entire patent yet, but it seems like consoles and SCDs can process data requests from other consoles, with SCDs being limited by having no audio, controller, etc. So you could have one in the bedroom, one in the lounge, and use resources from each other if needed. I would still be concerned with latency and bandwidth, but if you're on, say, a wired network with 1-2ms latency, that's 4ms both ways, so as long as the other machine can render, encode and get the data back to you within 14ms (or about 28ms at 30fps) you might be able to use it as a rendering platform, à la Wii U -> GamePad.
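To put rough numbers on that budget (illustrative arithmetic only, using the 1-2ms wired-LAN latency mentioned above):

```python
# Illustrative arithmetic only: how much time is left for the remote box to
# render and encode once the network round trip has been paid.
def remote_budget_ms(fps: float, one_way_latency_ms: float) -> float:
    """Frame budget minus the round trip (request out, rendered data back)."""
    frame_budget = 1000.0 / fps
    round_trip = 2 * one_way_latency_ms
    return frame_budget - round_trip

for fps in (60, 30):
    for latency in (1.0, 2.0):
        print(f"{fps} fps, {latency} ms each way -> "
              f"{remote_budget_ms(fps, latency):.1f} ms left to render and encode")
# At 60 fps / 2 ms each way that's ~12.7 ms, and ~29.3 ms at 30 fps, roughly
# the 14 ms / 28 ms ballpark above.
```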
It's all very interesting nonetheless.
And there would have to be some kind of downside to letting someone use your personal resources, otherwise they probably wouldn't offer rewards for doing it.
Not necessarily. The main downside to the user is that sharing the SCD over the Internet will eat into your bandwidth and electricity bill. This distributed cloud compute network is a clever idea, but in order for it to reach its full potential, hundreds of thousands, if not millions, of SCD owners will have to commit to making their unit available as much as possible (without it eating into their own gaming time). It only makes sense for Nintendo to offer rewards to keep people engaged in the initiative.
The word supplemental has been floating around in my head for a few days now. I'm wondering if I've been looking at this all wrong, and the SCD is a PowerPC device that locally gives access to the legacy Wii U/Wii/GC game library, and also offers some cloud features like game streaming to the NX handheld or whatever cloud processing Nintendo has in mind for their new online infrastructure. The Wii U SOC is probably pretty cheap these days, and it would allow the NX to "absorb" the Wii U architecture as Iwata once described.
Nintendo's hypothetical console can connect to multiple supplemental devices, measure their latency and performance characteristics, and assign appropriate workloads, all with the goal of improving primary console performance. These supplemental devices are shown in the patent's figures as being wired directly to the primary console, but the actual patent text makes it clear that supplemental devices could also connect via Wi-Fi or Bluetooth.
Cluster-based console gaming?
The system Nintendo describes in its patent application sounds more like a compute cluster than a traditional platform. Game streaming is still in its infancy, but virtually every major player has something in the works. The Wii U arguably pioneered local game streaming, though the feature didn't drive sales the way motion controllers drove the original Wii. Today, Sony and Nvidia have PlayStation Now and GeForce Now, the PlayStation Vita can stream at least some PS4 titles, and the Xbox One can stream games to any Windows 10 PC running on a compatible network. Microsoft also has a cloud computing backend available for additional processing, though I'm not aware of any shipping titles that currently use the feature.
When the Wii U shipped, reviewers noted that the console's unique controller also limited its overall performance. While the console could technically support more than one controller (albeit at a performance hit), in practice, Nintendo assumed a single Wii U GamePad. The supplemental devices that Nintendo's patent contemplates don't sap the performance of the console; they improve it.
This patent doesn't clarify what kinds of scenarios Nintendo believes would be suitable for offloading to supplemental devices, the hardware involved in doing so, or how the company would compensate for the still-significant latency hit of performing calculations remotely. Presumably there would be a way for developers to specify which tasks could be farmed out to supplemental devices, while core gameplay ran natively. It's also not clear if Nintendo envisions a system in which end-users purchase multiple devices simultaneously, or how much that hardware would cost. Consumers are used to buying consoles as a distinct unit, so selling them on the idea of supplemental hardware could be tricky.
On the other hand, clustering hardware together could offer an interesting way for gamers to invest more money in exchange for better overall performance. Both the Xbox One and PS4 have often struggled to maintain frame rates (the Xbox One has the larger overall problem, but neither console is guaranteed to be lag-free). I don't know how many Sony or Microsoft fans would pay an additional premium to alleviate these issues and guarantee a smooth 30-60 FPS at 1080p, but I'd bet some would. Whether Nintendo can launch and ramp such an approach is open to debate; network-related features and online gaming are far from the company's traditional strengths.
Nintendo has already said it expects to announce the NX console's release date at next year's E3, and that the new system will be a complete break from the Wii's architecture. This last can't come quickly enough.
lol here we go, another season of listening to idiots hype up the secret power sauce of "the cloud" only for it to not really go anywhere.
So the supplemental device works to share processing resources over the cloud? Kinda like Sony promised w/ Cell and PS3? This patent is crazy...
tldr: Nintendo should be able to achieve a streaming service with "good enough" latency and zero latency variability, even with a relatively small install base of NX+ units.
This patent is device-agnostic. It doesn't just apply to a theoretical NX console or handheld; it could also be used to power their mobile games. The supplementary processing devices can also be any computing device. So it could be a dedicated SCD, but even the NX console or handheld itself could act as an SCD. The patent also allows for non-dedicated hardware to be used as SCDs, which means potentially PCs and Macs.
This generation, both Nintendo and MS got burned trying to appeal to core and casual. This patent seems to me to be Nintendo decoupling elements of the hardware so that they can appeal to everyone effectively. They could release a console for between $150-$250 with their innovation/gimmick to appeal to the casuals, and then have SCDs for $150-$500 (possibly even more) to appeal to the Sony/MS audience, as well as free online gaming. They could then release upgraded SCDs annually. It would be disruptive if they can do this. It would mean they kind of get a 2-3 year head start over PS5/XB2 and makes extended console generations no longer viable for anyone.
If they do allow open hardware like PCs to act as SCDs, then that would mean people with PCs, or living near people with PCs, can engage in high-end console gaming plus gimmicks for only the price of a base unit. In fact, with the local cloud or an SCD purchase, it means the NX would be more powerful than the PC for games, maybe a lot more powerful.
the handheld could be used out and about and when you get home, all of a sudden the graphic quality enhances because it uses an SCD.
What would be the point though? Sony have tried twice to bring home console style gaming to handhelds, and it hasn't proved popular. I don't see playing say Zelda U on the move with lower quality textures etc as a very marketable concept.
A lot of the big 3DS releases are console-style games these days: Xenoblade, Zelda, Smash, Mario Kart, ...
Has this changed very recently?!
ExtremeTech's take on the patent:
Nintendo has already said it expects to announce the NX console’s release date at next year’s E3
Guys, nothing indicates this patent could not also be used for the Wii U...
Err, and what, add a break-out box for CPU/GPU boosts over...USB 2? Even Thunderbolt limits performance, let alone USB 3, let quadruply alone USB 2.
Besides...Nintendo is definitely getting ready to bury that, as much as they talk about wanting to please its buyers. There aren't going to be any major additions to it like a buddy-system console, even if that was a possibility with its inputs. The patent also describes it with a physical connection, after all, which makes it more feasible than a cloud one.
Heck, over local wifi it would be worse than USB 2, especially if you're doing two wireless hops: buddy console to wifi, wifi to Wii U. You may see 9MB/s if you're lucky on wifi that way.
9MB/s could still push a frame buffer. Compress it and it's less. Latency is what I'd be worried about. Although they didn't mention it, I assume it would work over ethernet too, which is good for me.
Regarding the connection between the main unit and the SCD, it mentioned a physical connection, which could also be PCIe (I don't know if Nintendo would go for that). But Thunderbolt or USB-C would definitely be workable.
Yup, there is certainly lots of potential with the system as outlined. As you implied, the handheld could be used out and about and when you get home, all of a sudden the graphic quality enhances because it uses an SCD.
I never thought about a PC or Mac as an SCD but of course the patent doesn't discount this. Not sure Nintendo would put a "game player" on a platform like that but man wouldn't it be great.
For instance, the supplemental computing device(s) may include processor(s), memory for storage, and interface(s) for coupling to game consoles, but may be free from display drivers, audio drivers, a user control interface for interfacing with the control 110, or the like.
Could you perhaps explain how a configuration such as described in the patent would be more difficult than a Wii U/Gamepad setup, in which the entirety of the A/V signal is sent from the Wii U to the Gamepad over a local wireless connection? Along with the user inputs being sent from Gamepad to Wii U over the same connection?
It can't just be a frame buffer though. If it's an assistance system, and not running the whole game since the main console would be doing part of that too, they have to communicate assets, inputs, the whole shebang back and forth.
Dual hop over wifi is a bad idea for most uses anyways. That's why the sneaker net is still so popular!
(Heck, 9MB/s for sending 30 or 60 framebuffers a second, that alone would be hard/not possible.)
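For anyone who wants the arithmetic behind that parenthetical, here's a quick sanity check (resolutions, colour depth and bitrate are my own illustrative picks):

```python
# Quick sanity check: raw (uncompressed) frame buffers alone blow well past
# 9 MB/s, which is why a wireless link would have to carry a compressed video
# stream (as the Wii U GamePad link does), not raw frames.
def raw_bandwidth_MBps(width: int, height: int, bytes_per_pixel: int, fps: int) -> float:
    return width * height * bytes_per_pixel * fps / 1_000_000

print(f"720p  @ 30fps raw: {raw_bandwidth_MBps(1280, 720, 3, 30):.0f} MB/s")   # ~83 MB/s
print(f"720p  @ 60fps raw: {raw_bandwidth_MBps(1280, 720, 3, 60):.0f} MB/s")   # ~166 MB/s
print(f"1080p @ 60fps raw: {raw_bandwidth_MBps(1920, 1080, 3, 60):.0f} MB/s")  # ~373 MB/s

# An H.264-class stream at, say, 10-20 Mb/s (1.25-2.5 MB/s) fits comfortably
# inside 9 MB/s, at the cost of encode/decode latency.
```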
So I think this definitely won't be for Wii U, partly for the input limits, partly because they're ready to be done with it soon. They'll support it a bit longer, but nothing major like added processing hardware for it.
For the latter part, I don't really see a reason for using PCI-E, as that's a standard for all PCs to use. A console doesn't need to abide by any standards for compatibility with other systems; I don't know any consoles that use PCI-E, it's all proprietary. Thunderbolt is likewise carried over PCI-E. USB-C gen 2 reaches 10Gb (small b, i.e. gigabits) per second, which only brings it in line with Thunderbolt 1, and Thunderbolt 2 was still bottlenecking GPUs alone, let alone a CPU+GPU in the buddy console.
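For reference, rough theoretical peak figures for the interfaces being compared (quoted from memory, so treat them as ballpark):

```python
# Rough theoretical peaks (real-world throughput is lower).
interfaces_gbps = {
    "USB 2.0":               0.48,
    "USB 3.0":               5.0,
    "USB-C (USB 3.1 gen 2)": 10.0,
    "Thunderbolt 1":         10.0,
    "Thunderbolt 2":         20.0,
    "PCIe 3.0 x16":          126.0,   # ~15.75 GB/s, what a desktop GPU actually gets
}

for name, gbps in interfaces_gbps.items():
    print(f"{name:>22}: {gbps:6.2f} Gb/s (~{gbps / 8:5.2f} GB/s)")
```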
Could just be 100% proprietary, why not, it's a console.
Actually, the patent describes it the other way around. The SCD would always be dedicated hardware from Nintendo, but the "game console" could be a PC, tablet, smart phone...pretty much anything. I find this part most interesting indeed.
Is it possible to use these devices as basically terminals, but also have them handle a bit of the mundane processing tasks, such as I/O and display processing? The other question I have is whether such a setup would require custom hardware, such as the Broadcom chip in the Wii U Gamepad, or could the SoCs in most smart devices these days basically be capable enough to carry out the mundane processing and decompression on their own?
Also, to those more knowledgeable in such subjects: Is having the A/V drivers on such an "enhanced terminal" something which would make sense in a modern graphics pipeline? I'm thinking back to this part of the patent just for reference:
With the Gamepad the regular controller inputs are sent to the Wii U, and a video stream is fed back.
From my understanding of the first page patents, you would have a "main" console that could do work on it's own, and a "buddy" console that could help it along. Thus you're not simply sending video from A to B, you're splitting work between them, which means sharing assets and a lot more bandwidth than just a video feed.
That's how I see the OP patents anyways. People were also talking about this as possibly a base system that could run games on it's own, and an upgrade add on, which would necessitate the added bandwidth the way I see it.
Maybe I'm overthinking it and it would be more like Playstation Remote Play, where you're either playing a full Vita game, or streaming a full PS4 game to it, not combining the assets of both.
But then the part of the patent where you have a "buddy" console that can also use it's resources to help others outside your house, and you get some sort of points for it, is still in question.
Iwata said:
Since we are always thinking about how to create a new platform that will be accepted by as many people around the world as possible, we would like to offer to them "a dedicated video game platform with a brand new concept" by taking into consideration various factors, including the playing environments that differ by country.
Right. Obviously, if we're talking things such as textures and framebuffer, that's going to require massive bandwidth. That can't be what they have in mind. Not for more distant communications and likely not for close-range communication either, if they are using WiFi and ethernet as examples.
But they've got something in mind. What if it did basically work like Gamepad streaming or Vita Remote play, except with the final display/audio signal processing being performed by the tablet/PC/whatever? Along with this, what if that same tablet or PC was also processing user input and other gameplay before beaming the results to the SCD, which would process AI and more complex physics along with most of the graphics and sound? Does this sound feasible?
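Purely as a thought experiment, here's roughly what that split could look like: the local device owns input and display every frame, while the SCD chews on heavier work asynchronously and its results get folded in whenever they arrive. Everything here is invented for illustration; none of it comes from the patent.

```python
# Hypothetical sketch, invented names throughout: the SCD's results arrive
# asynchronously and the local device never blocks on them.
import concurrent.futures as cf
import time

def scd_heavy_work(world_tick: int) -> dict:
    """Stand-in for AI/physics work shipped off to the SCD."""
    time.sleep(0.005)                              # pretend this takes a while
    return {"ai_plan": f"plan@{world_tick}"}

def game_loop(frames: int = 5) -> None:
    world = {"tick": 0, "ai_plan": None}
    pending = None
    with cf.ThreadPoolExecutor(max_workers=1) as scd:
        for frame in range(frames):
            world["tick"] = frame                  # input + basic gameplay stay local
            if pending is not None and pending.done():
                world.update(pending.result())     # fold in SCD results when they arrive
                pending = None
            if pending is None:
                pending = scd.submit(scd_heavy_work, frame)   # keep the SCD busy
            print(f"frame {frame}: rendered locally, AI plan = {world['ai_plan']}")
            time.sleep(1 / 60)                     # simulate a 60 fps frame

game_loop()
```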
Actually, the patent describes it the other way around. The SCD would always be dedicated hardware from Nintendo, but the "game console" could be a PC, tablet, smart phone...pretty much anything.
Because the sole or primary function of the supplemental computing device(s) may be to enhance the gaming experience by supplementing resources of the game console 102, in some instances the hardware of the supplemental computing device(s) is purposefully limited.
implementations herein are not limited to the particular examples provided, and may be extended to other environments, other system architectures, other types of merchants, and so forth
In still other instances, different portions of a game or other application may be stored across multiple supplemental computing devices such that these portions are "closer" to users' game consoles and therefore may be rendered faster as compared to storing the data at remote servers. For instance, different portions of a map of a game may be stored across a group of supplemental computing devices that are within a relatively close network distance to one another such that the associated game consoles may each access these parts of the game, when needed, relatively quickly.
Relatively close supplemental computing devices may be able to provide services at a nearly real-time speed (e.g. processing real-time graphics and sound effects), while relatively far away devices may only be able to provide asynchronous or supplementary support to the events occurring on the console (e.g. providing for weather effects in games, artificial intelligence (AI), etc.).
After identifying supplemental computing devices within range (e.g., having a threshold connection strength, threshold latency, etc.), the module 214 may present this information to a user of the game console, who may select one or more supplemental computing devices to connect with. After receiving a user selection, the module 214 may attempt to establish a wireless connection with the selected devices and may begin utilizing the devices if successful.
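A sketch of what that discovery-and-selection step could look like in practice (the data structures and thresholds are invented for illustration, not taken from the patent):

```python
# Sketch of the discovery step described above; data structures and thresholds
# are invented, not Nintendo's.
from dataclasses import dataclass

@dataclass
class CandidateSCD:
    name: str
    latency_ms: float
    signal_strength: float     # 0.0 - 1.0

def discover(candidates, max_latency_ms=10.0, min_signal=0.5):
    """Keep only devices within the thresholds, best latency first."""
    usable = [c for c in candidates
              if c.latency_ms <= max_latency_ms and c.signal_strength >= min_signal]
    return sorted(usable, key=lambda c: c.latency_ms)

found = discover([
    CandidateSCD("Living room SCD", 2.0, 0.9),
    CandidateSCD("Bedroom SCD", 6.5, 0.6),
    CandidateSCD("Neighbour's SCD", 25.0, 0.3),    # fails both thresholds
])
print([c.name for c in found])                     # the list a user would pick from
```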
For instance, a console could send a different AI algorithm to different supplemental computing devices and may therefore receive multiple different results calculated using the different algorithms. The game console may then select which of the different results to act on or otherwise render on the display. The supplemental computing devices may additionally or alternatively provide any other type of support to the game console, including resources for performing specialized graphics processing, storing uncompressed game data so that the game console doesn't have to perform the de-compression on the fly, and the like.
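And a sketch of the "different AI algorithm to different supplemental computing devices" idea, with each worker thread standing in for a remote SCD (names and the scoring rule are made up):

```python
# Sketch of farming different AI strategies out to different SCDs and letting
# the console pick a result; the strategies and scoring rule are made up.
import concurrent.futures as cf

def aggressive_ai(state):
    return {"move": "rush", "score": state * 0.7}

def defensive_ai(state):
    return {"move": "turtle", "score": state * 0.9}

def flanking_ai(state):
    return {"move": "flank", "score": state * 0.8}

def run_on_scds(game_state, algorithms):
    # each worker thread stands in for work shipped to a different SCD
    with cf.ThreadPoolExecutor(max_workers=len(algorithms)) as pool:
        futures = [pool.submit(algo, game_state) for algo in algorithms]
        results = [f.result() for f in futures]
    return max(results, key=lambda r: r["score"])  # console chooses which result to act on

print(run_on_scds(10, [aggressive_ai, defensive_ai, flanking_ai]))
```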
In still other instances, the local supplemental computing device 104 may make this determination and, hence, may seek to couple to a remote supplemental computing device for buttressing the processing resources and/or storage available to the game console 102.
users that share resources may similarly utilize other supplemental computing devices, potentially in equal amounts of what they shared (measured in time or resource amounts).
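That "equal amounts of what they shared" wording suggests some kind of credit accounting. A toy version, purely to make the idea concrete (nothing like this is actually specified in the patent):

```python
# Toy ledger for "share time, earn time"; entirely made up.
class SharingLedger:
    def __init__(self):
        self.balance_minutes = {}

    def record_shared(self, user: str, minutes: float) -> None:
        """Credit a user whose SCD did work for somebody else."""
        self.balance_minutes[user] = self.balance_minutes.get(user, 0.0) + minutes

    def try_consume(self, user: str, minutes: float) -> bool:
        """Spend earned credit to borrow someone else's SCD."""
        if self.balance_minutes.get(user, 0.0) >= minutes:
            self.balance_minutes[user] -= minutes
            return True
        return False

ledger = SharingLedger()
ledger.record_shared("alice", 120)        # Alice's SCD helped others for two hours
print(ledger.try_consume("alice", 45))    # True: she can borrow 45 minutes of compute
print(ledger.try_consume("bob", 10))      # False: Bob hasn't shared anything yet
```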
Right. Obviously, if we're talking things such as textures and framebuffer, that's going to require massive bandwidth. That can't be what they have in mind. Not for more distant communications and likely not for close-range communication either, if they are using WiFi and ethernet as examples.
But they've got something in mind. What if it did basically work like Gamepad streaming or Vita Remote play, except with the final display/audio signal processing being performed by the tablet/PC/whatever? Along with this, what if that same tablet or PC was also processing user input and other gameplay before beaming the results to the SCD, which would process AI and more complex physics along with most of the graphics and sound? Does this sound feasible?
You don't really need to be sharing assets in real time. If they really intended to have a setup like this, the SCD would probably have enough storage that things could be mostly preloaded.
Maybe! There's no reference point for such a system so far, so it's kind of hard to say, but perhaps. But one processing gameplay and input while the other calculated AI, physics, and graphics, as you give an example of, still sounds like a lot more communication than the relatively simple video feed of the Gamepad. i.e., AI responds to your input to a degree, so more direct interactions would need quick turnaround. Could the round trip be done in one frame? Lots of questions we can't know until it's out!
An NXCast would be interesting too...One unit tucked away somewhere, cheap dongles for every TV in the house? That could be cool.
Synchronization would still take something more than a raw video feed of bandwidth though, see early SLI bridges and current SLI/Crossfire PCI-E bandwidth needs. Even when each card has the same assets within VRAM, there has to be a lot of communication between them. Even for a simple setup like split frame rendering, which also happens to be inefficient since something cool could only be happening in one part of the screen.
That's an interesting alternative take on it, although if the WiiU SoC is so cheap, why not include it in the console in the first place?
I suppose they could sell it as a storage expansion which also gives you BC, if they throw a decent size drive in there.
This doesn't say anything about portable games though, right?
Online-only games require an online connection, which is a luxury that portable games don't always have available.
My thought was that the supplemental unit would be external storage + cloud, and having the Wii U SOC in there for backwards compatibility and other general cloud computation would mean that it would have some very compelling features while remaining completely optional. You don't need backwards compatibility or the cloud device for the NX to function, but if you purchase it, you get storage and cloud processing access as well as full compatibility with a huge library of games. It would be an attractive option for lapsed Nintendo fans who may want to go back and play something they missed but never would have bought a console for, plus it would allow Wii and Wii U owners to potentially transfer their digital software directly onto the SCD for play on NX.
If you want to do an SLI sort of setup, then of course you will need a lot of bandwidth. However, that is only one of many ways this could be configured. There are other tasks in video games that don't directly involve rendering part of the image and which require a lot less bandwidth. We really don't know how (or if) Nintendo plans to use this, so assuming a dual-GPU setup like that is a bit premature.
You don't really need to be sharing assets in real time. If they really intended to have a setup like this, the SCD would probably have enough storage that things could be mostly preloaded.
Pokemaniac is right - in such low-latency distributed systems the traffic is highly asymmetrical - nobody sends back and forth assets, at least not at playtime.
Unit A and unit B both have all the assets they need to operate with, in advance. Then, assuming a master-slave configuration (which is natural, given the control inputs normally arrive at one unit only, say A), unit A requests some work done by unit B. But that does not need to happen per-frame, every frame. Say, A can sent a slim request 'compute the distance field of geometry X and light-source Y, animated by Z*, for the next N hundred frames' and for the duration of the next N hundred frames, B sends back a distance field map (the fat response), per frame, every frame. Everything A needs to send back is a 'keep alive' type of acknowledge, periodically, but not necessarily per frame.
* where X, Y and Z are assets.
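A minimal sketch of that traffic pattern, with invented classes (X, Y and Z are just asset IDs that both sides already hold): one slim request out, fat per-frame responses back, and only periodic keep-alives in the other direction.

```python
# Illustrative only: the asymmetric request/response pattern described above.
class UnitB:                                   # the supplemental ("slave") unit
    def accept_request(self, geometry_id, light_id, animation_id, n_frames):
        self.job = (geometry_id, light_id, animation_id)    # the slim request
        self.remaining = n_frames

    def next_frame_result(self):
        self.remaining -= 1
        return f"distance field for {self.job} ({self.remaining} frames left)"  # the fat response

    def keep_alive(self):
        pass                                   # A only has to say "still here" occasionally

class UnitA:                                   # the "master" unit receiving controller input
    def __init__(self, b: UnitB):
        self.b = b

    def play(self, n_frames: int = 4) -> None:
        self.b.accept_request("X", "Y", "Z", n_frames)      # sent once, not per frame
        for frame in range(n_frames):
            field = self.b.next_frame_result()              # arrives every frame
            if frame % 2 == 0:
                self.b.keep_alive()                         # periodic acknowledgement
            print(f"frame {frame}: composite locally using '{field}'")

UnitA(UnitB()).play()
```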
Since it's looking like this thing could be heavily reliant on the Cloud, what should they call the thing then? The Nintendo Sky? The Nintendo Horizon?
Thanks for the detailed reply. I see what you're trying to say, and I agree there could be some sort of solution that could technically work, but would it be worth it in the end for the amount of engineering and cost required to develop this part of the system? Ethernet-connected units and physically bus-connected ones, no worries, it'd probably be pretty good. But wireless out onto the net, to the few machines around that have the necessary data and availability to process what you want, and that need to be turned on, just feels like you'd be left with a subset of possible connections of less than 1!
Actually, from a statistical point of view, it's extremely likely that there's a connection available that's "close" to you, even with a very low install base of NX+'s (the 1 million I used in my example above is basically Virtual Boy level numbers).
Let's have a quick look at the numbers again, keeping the install base at 1 million for a start. The expected number of NX+'s on your exchange in that case comes to just over 6 (99.8% is the probability that at least one NX+ is on your exchange, but obviously there could be more). Let's say that there are 6, and four of those NX+'s are being used (which would represent very high demand). Let's also say that each NX+ has 20 games loaded on it (1TB of space and 50GB per game), and assume those games are loaded in simple statistical proportion to the number of players who want to play them (a more advanced technique that intentionally spreads games around geographically would achieve better results, but let's keep it simple for now).
For the purposes of estimating the distribution of games, I'm going to use Steam, as it's the only readily available source of accurate data. I'm going to assume that there are 200 games on the platform, and that the usage statistics of those games match the top 200 games on Steam. Taking these numbers, and doing a bit of work in Excel, tells me that there's an 81.8% chance that the game you want to play will be on at least one of the two remaining available NX+ units.
Now, even in this unlikely scenario, there's still a small chance that there won't be someone on your exchange (i.e. with minimum physically possible latency), so let's examine that. The next lowest latency would be with people connected to exchanges on the same ISP that are "neighbours" to your exchange (i.e. either directly connected to your exchange, or both connect to the same ISP hub). These should add low single-digit milliseconds of latency to your stream, and still maintain a low variability of latency. Let's say there are five of these exchanges. The six exchanges (including your own) would now have an estimated 37.5 NX+'s, and assuming 66.7% are used, as above, we have 12 free NX+ to stream from, with almost minimal physically possible latency. Redoing the calculations, the probability that your game is on at least one of these 12 NX+ units comes to 99.996%.
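Since I don't have the Steam top-200 usage data used above, here's a Monte Carlo version of the same calculation with a Zipf-style popularity curve standing in; it should land in the same ballpark as the 81.8% figure rather than matching it exactly.

```python
# Monte Carlo sketch of the calculation above. I don't have the Steam top-200
# usage data, so a Zipf-style (1/rank) popularity curve stands in, which is
# why this lands near, not exactly on, the 81.8% quoted.
import itertools
import random

N_GAMES = 200             # assumed size of the library
GAMES_PER_UNIT = 20       # 1TB of space, ~50GB per game
AVAILABLE_UNITS = 2       # 6 expected on the exchange, 4 of them busy
TRIALS = 5_000

popularity = [1 / rank for rank in range(1, N_GAMES + 1)]
cum_popularity = list(itertools.accumulate(popularity))

def random_game() -> int:
    """A game index drawn in proportion to popularity."""
    return random.choices(range(N_GAMES), cum_weights=cum_popularity, k=1)[0]

def unit_library() -> set:
    """20 distinct games per NX+, popular titles more likely to be loaded."""
    library = set()
    while len(library) < GAMES_PER_UNIT:
        library.add(random_game())
    return library

def trial() -> bool:
    wanted = random_game()
    return any(wanted in unit_library() for _ in range(AVAILABLE_UNITS))

hits = sum(trial() for _ in range(TRIALS))
print(f"P(your game is on a free NX+ on your exchange) ~ {hits / TRIALS:.1%}")
```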
So, in a situation where:
- The NX+ sells almost as badly as the Virtual Boy
- Nintendo implements the laziest possible game pre-loading technique
- Two-thirds of NX+ units are in use (much higher than typical demand should be expected to be)
you are still effectively guaranteed to be connected to an NX+ that has your game and can offer next-to-minimal latency, and you have a greater than 80% chance of being connected to an NX+ right on your exchange, which will give you the lowest latency any video game streaming service could possibly provide.
If we re-do the calculations with more realistic assumptions (5 million NX+'s, and 25% usage rate), the probabilities start to get a little crazy. In this case you have:
- a 99.9999998% chance that your game is available on an NX+ on your exchange
- a 99.99999999999999999999999999999999999999999999999999% chance that your game is available on an NX+ in a neighbouring exchange
That is, you are more than twice as likely to be hit by lightning the same second you're hit by a meteor, on the same day you've been drafted by the NBA, while holding winning lottery tickets for both the Powerball and the Euromillions, than you are to not be able to stream a game with almost minimal latency.
Now, granted this model I'm using is only a very rough approximation of reality, and more real-world data (particularly on the topologies of real ISP's networks) would help make it more accurate, but I feel it still does a decent enough job of demonstrating a simple statistical fact:
Under any reasonable set of assumptions, you would be able to stream from an NX+ that is, by the standards of the internet, extremely close to you.
Even my estimates for total latency are on the conservative side. By 2019, the average fixed-line broadband speed in North America is expected to hit 43.7Mb/s, in Western Europe 49.1Mb/s and in Japan over 100Mb/s (PDF source, page 19). The technologies that bring in these higher speeds (such as FTTH, DOCSIS 3.1 and VDSL/G.fast with fibre to a street-level cabinet) also reduce latency in the access network. It's quite possible that for a large number of users, the total latency added by the streaming solution would be so low that it would be hidden entirely within the vsync delay, making the service indistinguishable from playing locally.
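To get a rough feel for that claim: the encoded frame size, encode/decode cost and access-network RTT below are my own assumptions, and only the link speeds come from the figures above.

```python
# My own assumptions: one encoded 1080p60 frame at ~15 Mb/s is roughly 31 kB,
# plus a 5 ms access-network RTT and 5 ms of encode/decode.
def added_latency_ms(link_mbps: float, frame_kB: float,
                     access_rtt_ms: float, encode_decode_ms: float) -> float:
    transfer_ms = frame_kB * 8 / link_mbps     # serialisation delay of one encoded frame
    return access_rtt_ms + transfer_ms + encode_decode_ms

VSYNC_60HZ_MS = 1000 / 60                      # ~16.7 ms
for region, mbps in [("North America", 43.7), ("Western Europe", 49.1), ("Japan", 100.0)]:
    total = added_latency_ms(mbps, frame_kB=31, access_rtt_ms=5, encode_decode_ms=5)
    print(f"{region}: ~{total:.1f} ms added vs a {VSYNC_60HZ_MS:.1f} ms vsync window")
# With these optimistic assumptions the added delay sits at or below one 60 Hz
# frame, which is the point being made above.
```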
I hear ya, but...I'm in New Zealand. We, and I imagine MANY other countries, won't have the density you suggest.