What do you mean "generated at runtime on the player's side?" Because the persistence of the universe and the "if one player finds a tree, you can visit that planet and will see the same tree" stuff goes against total randomization IF the planet has been charted and uploaded to the servers. My takeaway from the explanation is that whatever set of variables a player finds on planet X is uploaded for everyone in exactly the same fashion, but was wholly unique to whoever found and decided to share it first.
As for the Milky Way numbers, I understand how enormous the universe is in real life, but I'm having a tough time visualizing that coming from a game. Like, if there's persistence, there has to be a "videogamey" limit.....right? And the planets can't actually be full sized planets......right? Are......are these thoughts what next-gen feels like?
Basically: procedural isn't random. It's driven by pseudorandom generators, but those, given the same input (the same "seed"), will always produce the same result. So whenever a game offers you actually random levels, it's really randomizing the seeds too, usually by deriving them from something that constantly changes, like the date and time. It is therefore quite feasible to design your procedural algorithms so they rely on a single, simple seed for the whole universe. That is, in fact, how the original Elite worked, on a much smaller scale, and how the Midwinter games could offer much bigger territories than current sandboxes.
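Here's a minimal Python sketch of that idea (the function and the "level" format are just illustrative, not from any actual game):

```python
import random
import time

def generate_level(seed):
    """Toy 'level': a list of tile heights derived purely from the seed."""
    rng = random.Random(seed)           # local PRNG, isolated from global state
    return [rng.randint(0, 9) for _ in range(8)]

# Same seed -> identical level, every time, on every machine.
assert generate_level(42) == generate_level(42)

# "Actually random" levels just randomize the seed itself,
# e.g. by deriving it from the clock:
fresh_seed = int(time.time())
level = generate_level(fresh_seed)
```

The key point is that `generate_level` is a pure function of the seed, so nothing about the level itself ever needs to be saved or transmitted, only the seed.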
That's also how you get around memory/scale limits: since the same seed always gives the same result, if you can generate everything you need fast enough, you only need to store the seeds, which are very compact (each can be a single number). It's just like how streaming works in something like GTA, except instead of loading more detailed "chunks" from disk as you get closer, you generate them. Then, when you leave, you just save a list of the changes the player made that are worth remembering, so you can reapply them next time you need that chunk.
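A rough sketch of that regenerate-then-reapply pattern (chunk size, tile names, and the edit format are all made up for the example):

```python
import random

CHUNK_SIZE = 4  # tiny chunks so the example stays readable

def generate_chunk(cx, cy):
    """Regenerate a chunk from scratch; the coordinates act as the seed."""
    rng = random.Random((cx * 73856093) ^ (cy * 19349663))  # mix coords into one seed
    return [[rng.choice(["grass", "rock", "water"]) for _ in range(CHUNK_SIZE)]
            for _ in range(CHUNK_SIZE)]

# Only the player's *changes* are stored, keyed by chunk:
player_edits = {(0, 0): [((1, 2), "stump")]}   # they chopped down a tree here

def load_chunk(cx, cy):
    chunk = generate_chunk(cx, cy)             # cheap to redo, never stored
    for (x, y), tile in player_edits.get((cx, cy), []):
        chunk[y][x] = tile                     # reapply remembered changes
    return chunk
```

The save file only ever grows with what the player actually did, not with the size of the world.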
Usually a balance has to be struck between size and speed so everything gets made in time. And the system isn't just non-random, it can be driven: by choosing the kind of seeds you use, you keep control and can generate things that go your way. For example, in the Midwinter games the world was seeded by a basic map of roughly 200x100 pixels that would be expanded into a world of hundreds of thousands of square kilometers, and in Spore the seed for a creature was a rough skeleton over which the player had control.
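A toy version of that "driven seed" idea, where a tiny hand-authored map steers deterministic detail (the biome letters and height ranges are invented for the example, not how Midwinter actually did it):

```python
import random

# Coarse designer-authored "control map": M = mountains, F = forest, W = water.
# The broad layout is authored; only the fine detail is generated.
control_map = ["MMFF",
               "FFWW"]

def tile_height(region_x, region_y, sub_x, sub_y):
    """Deterministic fine detail, steered by the coarse control map."""
    biome = control_map[region_y][region_x]
    # Mix the four coordinates into one seed so every sub-tile is stable.
    seed = (region_x * 73856093) ^ (region_y * 19349663) \
         ^ (sub_x * 83492791) ^ (sub_y * 15485863)
    rng = random.Random(seed)
    if biome == "M":
        return rng.randint(60, 100)   # rugged high ground
    if biome == "F":
        return rng.randint(20, 40)    # gentle forested hills
    return 0                          # water stays flat
```

Each region cell could expand into thousands of detailed tiles on demand, but the generator can never wander away from the layout the designer drew.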
There is a videogamey limit, but it can be stretched very, very far by making the whole thing very hierarchical in structure. E.g. imagine dividing your galaxy into 1000-kilometer cubes. The "seed" for each cube can be its coordinates, since they're unique to each cube, so there's not even a need to store them. With 32 bits per coordinate (standard for integer numbers) that gives a whopping 2^96, roughly 80 billion billion billion, unique 1000-kilometer cubes that take no room outside of RAM (the seed is implicitly the coordinates), and in RAM the worst case is having to store eight of those generated cubes at a time (because you're standing right at a corner shared by all eight). That's a lot! If only 1% of them contain a single unique star system, that's already a lot to explore.
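The coordinates-as-seed trick can be sketched in a few lines (the mixing primes and the 1% density are arbitrary choices for illustration):

```python
import random

def cube_seed(x, y, z):
    """The coordinates *are* the seed: nothing is stored per cube."""
    # Mix the three coordinates into one integer (primes from a common
    # spatial-hash trick; any good integer mix would do).
    return (x * 73856093) ^ (y * 19349663) ^ (z * 83492791)

def cube_has_star(x, y, z, density=0.01):
    """Deterministically decide whether this 1000 km cube holds a star."""
    return random.Random(cube_seed(x, y, z)).random() < density

# The same cube always gives the same answer -- no database needed:
assert cube_has_star(10, -4, 7) == cube_has_star(10, -4, 7)
```

The real generator would of course derive a whole star system from that seed rather than a yes/no, but the storage cost is the same: zero bytes per unvisited cube.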
So yeah, the numbers boggle the mind, though of course a procedural generator's output will easily feel samey if not carefully designed.