Updating Godot's audio system #5704
Replies: 3 comments
-
To generate audio programmatically, you can use an AudioStreamGenerator. Granted, AudioStreamGenerator works pretty badly right now. For me, it outputs silence at anything above a 4000 Hz sample rate, which is awful. I would tend to favor scrapping the buses as they currently exist and replacing them with audio nodes that can be manipulated in the inspector. The fact that audio works so differently from every other system is a real pain.
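For reference, a minimal sketch of the push-based AudioStreamGenerator workflow being described, using the documented AudioStreamGeneratorPlayback API (the 440 Hz tone and buffer settings are arbitrary):

```gdscript
extends AudioStreamPlayer

var _phase := 0.0

func _ready() -> void:
    var generator := AudioStreamGenerator.new()
    generator.mix_rate = 44100.0   # frames per second handed to the mixer
    generator.buffer_length = 0.1  # seconds of audio to keep buffered
    stream = generator
    play()

func _process(_delta: float) -> void:
    # Keep the generator's buffer topped up with sine frames.
    var playback := get_stream_playback() as AudioStreamGeneratorPlayback
    var increment := 440.0 / (stream as AudioStreamGenerator).mix_rate
    while playback.get_frames_available() > 0:
        playback.push_frame(Vector2.ONE * sin(_phase * TAU) * 0.5)
        _phase = fmod(_phase + increment, 1.0)
```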
-
Yes, fully agreed. The current bus system is easy to understand when coming over from DAWs, but it's far less flexible than, say, Reaper's more modular bus approach, and it doesn't sit well in Godot's scene-node paradigm at all. I had some thoughts on how to implement this that I didn't get around to putting in the main body of the discussion, since it was already a bit wordy, so I'll put them below.
Idea 4: Replace AudioStreamPlayer and audio buses with AudioObject and AudioEnvironment. Relating to the first idea, and part of an overall modular / graph-focused audio rework. It would function roughly as below (see the sketch after the list):
- AudioObject3D/2D
- AudioEnvironment
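Purely to visualise the idea, here is a speculative sketch of how such nodes might sit in a scene tree; neither AudioObject3D nor AudioEnvironment exists in Godot today, and the layout below is only illustrative, loosely mirroring how WorldEnvironment overrides rendering for a scene:

```gdscript
# Speculative scene layout, not a real API:
#
# Level (Node3D)
# ├── AudioEnvironment         # would hold the scene-wide routing/effects graph (replacing buses)
# ├── Player (CharacterBody3D)
# │   └── AudioObject3D        # sound emitter, routed into the enclosing AudioEnvironment
# └── Door (Node3D)
#     └── AudioObject3D
```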
-
I believe that the graph workflow (while useful) has a major drawback, one that MetaSounds currently has: it is not friendly to music or music-synced events. FMOD, with its timeline workflow, on the other hand, makes it easy to transition in sync, vertically layer sounds (in sync), and horizontally transition between sounds and music, while still offering async functionality.
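For contrast, this is roughly what hand-rolling a bar-synced transition looks like with the current API; a graph-only system would still need something like this built on top. The bpm/beats_per_bar values and _start_next_section() below are placeholders rather than engine API:

```gdscript
extends AudioStreamPlayer

@export var bpm := 120.0
@export var beats_per_bar := 4

# Wait until the start of the next bar, then hand over to the next section.
func queue_transition() -> void:
    var seconds_per_bar := 60.0 / bpm * beats_per_bar
    # Estimate the current song position, compensating for mix and output latency.
    var song_pos := get_playback_position() + AudioServer.get_time_since_last_mix() - AudioServer.get_output_latency()
    var time_to_next_bar := seconds_per_bar - fmod(song_pos, seconds_per_bar)
    await get_tree().create_timer(time_to_next_bar).timeout
    _start_next_section()

func _start_next_section() -> void:
    pass  # e.g. crossfade to another AudioStreamPlayer or swap the stream here
```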
-
I've seen some talk from @reduz about overhauling how Godot handles audio to better support the needs of sound designers, in light of tools like FMOD or Unreal's MetaSounds, and was looking for discussions and proposals for how this could be implemented. As such, this post is more or less a high-level sketch of how I think this could be achieved. For background, I got into game development doing sound design and music implementation for several indie projects and mods. This proposal isn't based on any existing tool or workflow; it stems from things I'd like to be able to do with audio in the Godot engine, and how I could see some issues with the current implementation being tackled.
Current issues
Godot's current audio workflow is pretty inflexible when it comes to audio playback and DSP, and only supports series routing via single buses out of the box. This is fine for simple use-cases where sound designers are doing most of their work in a DAW like Reaper, and then importing rendered audio into Godot. However, this doesn't really hold up to current industry standards, and many game engines either have very tight integration with dedicated middleware, and/or implement much more advanced features directly in-engine.
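To make the "series routing via single buses" point concrete, this is roughly the extent of what the current API allows: each bus has a name, a single send target, and a serial effect chain (the $AudioStreamPlayer node path below is assumed):

```gdscript
extends Node

func _ready() -> void:
    # Create a "Reverb" bus that sends into Master and carries one effect.
    AudioServer.add_bus()
    var reverb_idx := AudioServer.bus_count - 1
    AudioServer.set_bus_name(reverb_idx, "Reverb")
    AudioServer.set_bus_send(reverb_idx, "Master")
    AudioServer.add_bus_effect(reverb_idx, AudioEffectReverb.new())

    # A player can only target one bus, and each bus has exactly one send:
    $AudioStreamPlayer.bus = "Reverb"
```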
Idea 1: Modularise audio playback and effects using GraphEdit as a frontend
In much the same way that AnimationTrees allow for much more flexible handling of animations, the same approach could, and I would argue should, be used for audio. GraphEdit's workflow is already very similar to how modular synthesis and audio 'patching' work, and it's an easy-to-grasp paradigm that's familiar to most sound designers. This would allow effects to be freely routed, and give much greater control over how audio is processed in general. This workflow could be used both on buses for scene-wide effects and on individual audio player nodes for semi-procedural / parametric audio. Graph node types would be roughly as follows:
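Whatever the exact set of node types ends up being, the editor-side plumbing for this already exists. Below is a minimal sketch of the frontend only: GraphEdit and GraphNode are real classes, but nothing here touches the audio engine, and the node titles are placeholders:

```gdscript
extends GraphEdit

func _ready() -> void:
    var source := _make_node("SamplePlayer")
    var effect := _make_node("Reverb")
    var output := _make_node("MasterOut")
    # Wire source -> effect -> output. A real implementation would mirror these
    # connections in the audio engine when the connection_request signal fires.
    connect_node(source.name, 0, effect.name, 0)
    connect_node(effect.name, 0, output.name, 0)

func _make_node(title: String) -> GraphNode:
    var node := GraphNode.new()
    node.name = title
    node.title = title
    node.add_child(Label.new())  # each slot maps to a child control
    node.set_slot(0, true, 0, Color.WHITE, true, 0, Color.WHITE)
    add_child(node)
    return node
```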
Idea 2: In-engine sound generation via Braids code
Although Godot can play back samples, there is currently, as far as I am aware, no way to generate audio programmatically; this is becoming more common in game engines, with systems like Unreal's MetaSounds. Mutable Instruments maintains a huge amount of MIT-licensed DSP code for their Braids oscillator module. This has been successfully ported to other devices, including the Arturia MicroFreak and MiniFreak and Dirtywave's M8 tracker, as well as several pure software implementations that run on consumer PC hardware. This is admittedly the sort of thing that could very likely live in an extension instead of core, but I feel it's worth suggesting nonetheless.
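Caveat: as pointed out in the replies, the AudioStreamGenerator class does already exist for push-based generation. The sketch below uses it only to show where ported Braids-style DSP could slot in, with a naive two-operator FM voice standing in for a real oscillator model; nothing here beyond the generator classes is engine API.

```gdscript
extends AudioStreamPlayer

var _carrier_hz := 220.0
var _mod_ratio := 2.0   # modulator frequency as a multiple of the carrier
var _mod_index := 3.0   # modulation depth
var _phase := 0.0

func _ready() -> void:
    stream = AudioStreamGenerator.new()
    play()

func _process(_delta: float) -> void:
    var playback := get_stream_playback() as AudioStreamGeneratorPlayback
    var rate: float = (stream as AudioStreamGenerator).mix_rate
    while playback.get_frames_available() > 0:
        # In a real port, these two lines would call into the ported DSP code.
        var modulator := sin(TAU * _phase * _mod_ratio) * _mod_index
        var sample := sin(TAU * _phase + modulator)
        playback.push_frame(Vector2(sample, sample) * 0.25)
        _phase = fmod(_phase + _carrier_hz / rate, 1.0)
```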
Idea 3: Audio propagation, occlusion and reflection via SDFs
This one is a real pie-in-the-sky suggestion that I don't have the slightest clue how to implement, but would love to see happen in some capacity. From my understanding of similar solutions such as Steam Audio, these effects are computed on a simplified intermediate representation of the game geometry to model how sound propagates through a scene, in a manner similar to how bounced lighting is calculated. As Godot already has code for voxelising geometry and generating SDF representations of it, it would be extremely cool if there were a means to re-use that data in a way that benefits realistic audio.
It's worth noting that my experience is with sound design, not audio programming, and I'm aware I could be proposing things here that are not feasible to achieve. I hope this at least sparks some discussion as to how audio could be overhauled in Godot's future.