Reputation Activity

  1. Like
    NQ-Giantsmoy got a reaction from Serula in DevBlog r0.16 Alpha 2 Ambient Sound System   
    Hi Noveans!
     
    I’m Maxime Ferrieu, aka NQ-Giantsmoy, Lead Audio at Novaquark Paris. You may already know me as the music composer for Dual Universe, but today I’d like to tell you more about something we haven’t showcased yet, and that we think is really important: the sound design! At Novaquark, I’m responsible not only for the music but for basically everything you hear in the game, especially what we are going to discuss today: ambient sound!
     
    Before going further, I want to stress right away that we will dive deep down the rabbit hole, so this will be rather technical.
     
    Ambient sound is critical to building an immersive soundscape, and in Dual Universe, this raises quite a number of challenges. The emergent, player-edited world means we can never know in advance what a player’s environment will look like at any given time or location. In a traditionally designed game world, ambient sound switching is generally done via trigger volumes: the player enters the trigger volume, we fade out the current ambient sound and fade in a new one. Job’s done. Simple. But this standard method doesn’t apply to DU, because players can terraform and design their own geometry.
     
    A second solution would be to rely upon the underlying biome that the player is currently in. But what if players completely wipe out all of the trees in a forest biome? Without the forest, would its ambient sound still be sensible or relevant? Again, the creative freedom given to the players forces us to take an alternative approach to this problem.
     
    After a period of trial and error, we decided on a nested detection approach. It may sound complicated, but it’s actually very simple! The graphic below shows how the ambient system is currently implemented in the game:
     
    [Diagram: the nested detection tree driving the ambient sound system]
    The idea here is to ask the game engine questions about the player’s environment; the answers tell the ambient sound system which sounds to render in real time. We are using Wwise, by the good folks at Audiokinetic, to manage the audio in Dual Universe. It’s a versatile and powerful tool that lets us author complex sound behaviors very easily, and it can receive real-time parameters from the game engine to modulate sound properties. That’s perfect for our scenario.
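
    To give a rough idea of how such a parameter travels from the game to Wwise, here is a minimal sketch using the Wwise SDK’s RTPC (Real-Time Parameter Control) API. The parameter name "Indoor_Factor" and the listener object ID are illustrative assumptions for this sketch, not Dual Universe’s actual setup:

    ```cpp
    #include <AK/SoundEngine/Common/AkSoundEngine.h>

    // Push a gameplay-derived value (here, the indoor factor described below)
    // to Wwise once per frame. The authoring-side RTPC curves then crossfade
    // and filter the ambient layers accordingly.
    void UpdateAmbientParameters(AkGameObjectID listenerId, float indoorFactor)
    {
        // "Indoor_Factor" is assumed to be defined in the Wwise project
        // with a 0..1 range.
        AK::SoundEngine::SetRTPCValue("Indoor_Factor",
                                      static_cast<AkRtpcValue>(indoorFactor),
                                      listenerId);
    }
    ```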
     
    The first question we ask is: is the player underwater? If so, we play an underwater ambience and filter the current above-water ambience. Simple, yet effective.
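
    A binary condition like this maps naturally onto a Wwise state (or switch), letting the project handle the crossfade and filtering. A minimal sketch with placeholder state names, not the ones actually used in DU:

    ```cpp
    #include <AK/SoundEngine/Common/AkSoundEngine.h>

    // Flip a global water state; the Wwise project is assumed to react by
    // fading in the underwater ambience and low-pass filtering the rest.
    void SetUnderwaterState(bool isUnderwater)
    {
        AK::SoundEngine::SetState("Water_State",
                                  isUnderwater ? "Underwater" : "Above_Water");
    }
    ```
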
    Things get more interesting above water. We have to detect whether the player is indoors or outdoors, and as I mentioned before, there is no way to rely on trigger volumes for this task. Instead, we went for a raycast approach. Every frame, a ray is cast in a random direction and tells us whether it collided with geometry or not. The ratio of rays that collide gives us an approximation of what we call the ‘Indoor Factor’. In the game, you are now hearing a blend of indoor and outdoor ambience based upon this value, which is cool, but we can go further than this.
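
    The accumulation itself can be as simple as a sliding window over the per-frame hit results. The following is not the actual DU code, just a sketch of the idea (the window size is an arbitrary choice):

    ```cpp
    #include <cstddef>
    #include <deque>
    #include <numeric>

    // Approximates the 'Indoor Factor' from one random-direction raycast per
    // frame: feed in whether the ray hit geometry, read back the hit ratio.
    class IndoorFactorEstimator {
    public:
        void Tick(bool rayHitGeometry)
        {
            hits_.push_back(rayHitGeometry ? 1.0f : 0.0f);
            if (hits_.size() > kWindow)
                hits_.pop_front();  // sliding window keeps the value responsive
        }

        // 0.0 = fully outdoors, 1.0 = fully enclosed.
        float IndoorFactor() const
        {
            if (hits_.empty()) return 0.0f;
            return std::accumulate(hits_.begin(), hits_.end(), 0.0f)
                   / static_cast<float>(hits_.size());
        }

    private:
        static constexpr std::size_t kWindow = 120;  // ~2 s at 60 fps (assumed)
        std::deque<float> hits_;
    };
    ```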

    For the indoor ambience, we can also check whether the geometry a ray hits is a soil voxel, a construct voxel, or an element. With the same type of operation, we can determine a cave/construct ratio and render a blend of sounds that suits the situation. We can also compute the average length of the rays before they collide, using simple math. This gives us an approximation of the volumetric size of the interior, which lets us scale the reverberation accordingly, and reverberation plays a big part in the indoor ambience as a whole.
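
    Averaging the collision distances and normalizing the result gives a simple “room size” value that a reverb parameter can consume. Again a sketch, with an assumed normalization constant:

    ```cpp
    #include <algorithm>
    #include <vector>

    // Larger average hit distances suggest a bigger enclosed volume. The
    // result is normalized to 0..1 so it can drive, for example, a reverb
    // decay parameter. kMaxInteriorSize is an illustrative constant.
    float EstimateRoomSize(const std::vector<float>& hitDistances)
    {
        constexpr float kMaxInteriorSize = 50.0f;  // metres, assumed
        if (hitDistances.empty()) return 0.0f;

        float sum = 0.0f;
        for (float d : hitDistances) sum += d;
        const float average = sum / static_cast<float>(hitDistances.size());

        return std::clamp(average / kMaxInteriorSize, 0.0f, 1.0f);
    }
    ```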
     
    For outdoors, we run a simple asset detection around the player. We place a base wind layer, and on top of that, we ask the game engine how many tree assets are within a given radius of the player. How many small vegetation assets? With those values, we can easily render deserts, prairies, forests, and everything in between. The main challenge here is that the detection must be tightly optimized because it is CPU-intensive, and I think we reached a solid compromise in the latest releases. We keep working on improving it, though. As we’ve stated multiple times in the past, we’re still in Alpha and actively developing the game, so optimization is still a work in progress!
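
    Turning those counts into blend weights can be as simple as normalizing them against reference densities. The following sketch shows the idea; the reference constants are made up for illustration:

    ```cpp
    #include <algorithm>

    // Normalized vegetation densities used to blend between desert, prairie
    // and forest ambient layers.
    struct VegetationDensity {
        float trees;            // 0 = barren, 1 = dense forest
        float smallVegetation;  // 0 = bare ground, 1 = lush ground cover
    };

    VegetationDensity ComputeVegetationDensity(int treeCount, int smallVegCount)
    {
        constexpr float kTreesForFullForest   = 40.0f;   // assumed tuning value
        constexpr float kSmallVegForFullCover = 200.0f;  // assumed tuning value
        return {
            std::clamp(treeCount     / kTreesForFullForest,   0.0f, 1.0f),
            std::clamp(smallVegCount / kSmallVegForFullCover, 0.0f, 1.0f),
        };
    }
    ```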
     
    In addition to the above, we also detect atmospheric density. This allows us to change the base wind sound depending on whether we are at sea level or at higher altitudes transitioning into space.
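
    Conceptually, this can be read as a crossfade weight between a dense-atmosphere wind layer and a thin-atmosphere one. A tiny sketch, assuming the density value is already normalized to 1 at sea level and 0 in space:

    ```cpp
    #include <algorithm>

    // Crossfade weights for two wind layers based on atmospheric density
    // (assumed to be pre-normalized: 1 at sea level, 0 in space).
    struct WindBlend {
        float surfaceWind;       // full-bodied, sea-level wind
        float highAltitudeWind;  // thin wind fading towards silence in space
    };

    WindBlend ComputeWindBlend(float atmosphericDensity)
    {
        const float d = std::clamp(atmosphericDensity, 0.0f, 1.0f);
        return { d, 1.0f - d };
    }
    ```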
     
    The version we are shipping for Alpha 2 on July 11th is the first iteration of the environment sound system. It gives us a solid technical foundation to build upon when adding extra layers of detail in the future. This means adding branches to the tree diagram shown above, with weather variations, temperature, and hygrometry (which help us determine biomes more precisely), and things such as unseen wildlife (no ETA for this yet). Basically, everything we can think of to give the player richly detailed, immersive, and interactive ambient sound.
     
    We will show you examples of this sound system in the upcoming Dev Diary video, so stay tuned!
    Hope you guys enjoyed this article, and I’m looking forward to interacting with you in the game! Enjoy Alpha 2!
     
    Maxime FERRIEU / (NQ-Giantsmoy)
    Audio Lead - Novaquark Paris