Paul Weir's (NMS Audio Lead) GDC talk is now available. The first half is more of an intro to procedural audio; the second half dives into the technical details of their creature audio and the generative soundtrack. Their soundtrack tool is called Pulse, and he actually shows it in action and explains many of the ways it decides when and how to shift the soundtrack.
Some specific bits:
CREATURES:
- about a dozen distinct creature audio archetypes (no wonder they mostly sound the same)
- Weir knows the creature sounds aren't very good ("it makes me cringe sometimes at what the creatures sound like"), but he had limited time to learn to "play" the vocalization instrument they created and to tweak it before shipping; their toolchain was also laborious to use at the time
- he hopes to redo all the creature audio, and also hopes to tie sounds to the animation length
- creature "performances" were done using MIDI iPad app to control the parameters, but would like to use machine learning in the future
IMO right now the creature audio all sits in a narrow frequency band at the high end. They should vary pitch with creature size: diplos and other large animals should have really low, rumbling calls, for instance.
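Something like this is what I have in mind: a rough Python sketch (the function name, reference size, and octave slope are all my own made-up values, nothing from the talk) that maps a creature's body size to a pitch multiplier for its vocalizations.

```python
import math

def pitch_scale_for_size(size_m: float,
                         reference_size_m: float = 1.5,
                         octaves_per_doubling: float = 0.75) -> float:
    """Return a pitch/playback-rate multiplier for a creature of the given
    body size (in metres). Each doubling of size transposes the voice down
    by `octaves_per_doubling` octaves; all constants are invented for
    illustration only."""
    octaves_down = octaves_per_doubling * math.log2(size_m / reference_size_m)
    return 2.0 ** (-octaves_down)

# A 1.5 m creature keeps its base pitch (multiplier 1.0), while a 10 m
# diplo-sized creature comes out roughly two octaves lower (~0.24).
```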
MUSIC:
- the system contains over 2500 musical elements, drawn from the 65dos soundtrack plus sounds not on the soundtrack
- individual elements are controlled (audio source position in xyz, volume, etc.) based on player direction, movement, frequency of interactions, and so on (rough sketch of the idea after this list)
- soundtracks aren't created for specific planets; a playlist is generated when the game starts and then tweaked depending on what the player does
- the Pulse tool isn't as complex as people think; it's more about managing the complex pipeline of elements (I think he's being a bit humble there, but a big part of the soundtrack's success is certainly the fantastic sounds 65dos made for it)
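To make that per-element idea concrete, here's a toy sketch of the general technique (this is not how Pulse actually works; every name, threshold, and number here is my own invention): each element carries its own world position and volume, and an update step nudges the volume based on how busy the player currently is.

```python
from dataclasses import dataclass

@dataclass
class SoundElement:
    name: str
    position_xyz: tuple[float, float, float]  # where the audio source sits in the world
    volume: float                              # 0.0 (silent) .. 1.0 (full)

def update_element(elem: SoundElement,
                   player_speed: float,
                   interactions_per_min: float) -> None:
    """Hypothetical rule: the busier the player (faster movement, more
    interactions), the louder this element is allowed to get; calm play
    lets it fade back toward a quiet bed."""
    # Normalise the two inputs into a rough 0..1 "activity" score.
    activity = min(1.0, 0.5 * min(player_speed / 20.0, 1.0)
                        + 0.5 * min(interactions_per_min / 10.0, 1.0))
    target_volume = 0.2 + 0.8 * activity
    # Ease toward the target so the mix shifts gradually rather than snapping.
    elem.volume += 0.1 * (target_volume - elem.volume)

# e.g. an ambient pad that swells while the player sprints around and interacts a lot:
pad = SoundElement("ambient_pad", (0.0, 0.0, 0.0), volume=0.2)
for _ in range(50):
    update_element(pad, player_speed=18.0, interactions_per_min=8.0)
# pad.volume has drifted up toward ~0.88
```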
Yup. Patch notes:
http://www.no-mans-sky.com/2017/03/4411/