Unity Audio vs Wwise

To start with I wanted to do a general investigation into Wwise, the audio middleware by Audiokinetic that integrates with Unity. When I started working through it I figured it would be more interesting to look at Wwise in comparison to Unity’s own audio API and Mixer components, which have been around since Unity 5.

To do that I’m going to compare a game in three different builds. Build one is its original state, with simple scripts that run an AudioSource.Play() method. In the second build I add another layer of complexity by using the Unity built-in Mixer and see if there are any differences or advantages. Lastly I’ll redo the project with the Wwise API and investigate how that impacts build size and project complexity, and weigh it up against the previous two builds. Mostly I’m looking for differences in performance between the three builds, and in build size and complexity, and weighing that up against ease of implementation and flexibility.

I refreshed an old project called “MusicVisualiser” that I started for my Five Games in Ten Weeks Challenge. The game is like a singing solar system. There are a bunch of “planets” in the night sky that each play a set piece of music when clicked. It’s a really simple concept and project, but I think it will work for this comparison as the parameters can be limited to just a few audio tracks while we still get to play with spacing, roll-off and other advanced audio features.

Let’s have a look at the game first.

These “planets” are simple native Unity sphere meshes, each with an Audio Source component and a particle system that’s triggered when it’s clicked. You can see in the Audio Source that we are not using a Mixer for Output, so all the Audio Sources compete for resources and play at their default volume and priority.

The PlayMe script just takes in the AudioSource and plays it when its object is clicked. Only the click handler really matters here – the GetClickedObject helper is just a standard mouse raycast:

using UnityEngine;

public class PlayMe : MonoBehaviour {

    public AudioSource my_sound;
    public string my_name;

    private bool _mouseState;
    private Vector3 screenSpace;
    private Vector3 offset;
    private GameObject target;

    void Update () {
        if (Input.GetMouseButtonDown(0)) {
            RaycastHit hitInfo;
            target = GetClickedObject(out hitInfo);
            if (target != null && target.name == my_name) {
                _mouseState = true;
                screenSpace = Camera.main.WorldToScreenPoint(target.transform.position);
                offset = target.transform.position - Camera.main.ScreenToWorldPoint(new Vector3(Input.mousePosition.x, Input.mousePosition.y, screenSpace.z));

                my_sound.Play();   // This is the Audio Component!
                var expl1 = GetComponent<ParticleSystem>();
                expl1.Play();
            }
        }
    }

    // Standard mouse-click raycast: returns the object under the cursor (or null).
    private GameObject GetClickedObject (out RaycastHit hit) {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        return Physics.Raycast(ray, out hit) ? hit.transform.gameObject : null;
    }
}

Pretty simple, right? This is what the project looks like in the Profiler when it’s running and being actively engaged with. At that point we are looking at two Audio Sources playing:

This is the build size from the Editor Log with our Audio Files broken out:

Build Report (Audio.Play)
Uncompressed usage by category:
Textures 0.0 kb 0.0%
Meshes 0.0 kb 0.0%
Animations 0.0 kb 0.0%
Sounds 547.5 kb 1.4%
Shaders 188.0 kb 0.5%
Other Assets 1.4 kb 0.0%
Levels 38.3 kb 0.1%
Scripts 941.9 kb 2.4%
Included DLLs 3.9 mb 10.2%
File headers 9.3 kb 0.0%
Complete size 38.6 mb 100.0%

Used Assets and files from the Resources folder, sorted by uncompressed size:
204.3 kb 0.5% Assets/SomethingLurks_AAS.wav
164.5 kb 0.4% Assets/Step2Down_AAS.wav
136.9 kb 0.3% Assets/Underwater_AAS.wav
41.8 kb 0.1% Assets/M1_M12_37_ThumPiano_Aflat1.wav

Unity Audio with Mixer

Now we add in the Mixer component to the project:

Then I add a couple of Channels to the Mixer to split the audio between left and right, and each Audio Source is dropped into one or the other of the Mixer Channels:

Adding the Mixer as the Output source

Next, for a bit more interest, I added some effects in the Mixer. Here is where we see the advantages of using the Unity Mixer: sounds can be manipulated in complex ways, and the audio output chain can be defined with presets and levels and so on.
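
As a rough sketch of what the Mixer unlocks from script (the mixer, group, and exposed-parameter names below are made up for illustration – this isn’t code from the project):

using UnityEngine;
using UnityEngine.Audio;

public class MixerRouting : MonoBehaviour {

    public AudioMixer mixer;        // assign the Mixer asset in the Inspector
    public AudioSource source;

    void Start () {
        // Route this Audio Source through a "Left" group instead of Master.
        AudioMixerGroup[] groups = mixer.FindMatchingGroups("Left");
        if (groups.Length > 0)
            source.outputAudioMixerGroup = groups[0];

        // Duck that group by 10 dB via an exposed parameter named "LeftVolume".
        mixer.SetFloat("LeftVolume", -10f);
    }
}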

If we have a look at our Profiler while running with the new component, we cannot really see any great differences. The ‘Others’ section of the CPU Usage is a bit higher, and the Garbage Collector in the Memory graph is pumping regularly, but the Audio stats look pretty much unchanged:

Profiler Mixer

Mind you, this is a fairly low-utilisation game, so we might get wildly different stats if we were really putting the system under the pump, but I’m not performance testing here, just comparing run states between the two builds.

Next, if we build the game and have a look at the Editor Log, the only thing that’s changed is that the “Other Assets” size is about a kilobyte higher (the Complete size is unchanged):

Build Report (Mixer)
Uncompressed usage by category:
Textures 0.0 kb 0.0%
Meshes 0.0 kb 0.0%
Animations 0.0 kb 0.0%
Sounds 547.5 kb 1.4%
Shaders 188.0 kb 0.5%
Other Assets 2.3 kb 0.0%
Levels 38.3 kb 0.1%
Scripts 941.9 kb 2.4%
Included DLLs 3.9 mb 10.2%
File headers 9.3 kb 0.0%
Complete size 38.6 mb 100.0%

Unity with Wwise

Next we are going to add Wwise to the Project. This is the basic workflow. In the Wwise Launcher we register our project and on the first tab we are presented with three Hierarchies.

Project Audio Explorer in Wwise

The Master-Mixer Hierarchy – does what it says.
The Actor-Mixer Hierarchy – where most of your game audio develops (use the SoundSFX defaults).
The Interactive Music Hierarchy – other stuff we won’t get into.

Events Tab

The next tab along is the Events tab, where you link your audio to game events. You can define your event here (use the default work unit).
Once you’ve got the event there, you can associate it with the audio in the Action List.

SoundBank Tab – this is the bit that gets imported into your project.

Next you generate a SoundBank with Wwise that includes your audio and the code for the API calls to trigger sounds. You export that SoundBank into your game engine and link up the calls in your code.

To Get Started with Wwise

To get started, make an account with Audiokinetic and download the Wwise Launcher. The Integration package for Unity can be downloaded and installed directly from the Wwise Launcher.

In the Wwise Launcher there is a WWISE tab from which you can install and start the application. Once you open it up you need to register your project within the launcher so Wwise can track you 🙂 (click on the key icon next to your Wwise project and select ‘Register your Project to obtain a License’). Wwise will run in Trial mode, which restricts the SoundBank content to 200 media assets and cannot be used for commercial purposes. Pricing for licensing is on their site, but this is not a sales piece, so if you want it you can look it up.

There are a bunch of plugins by Audiokinetic and their partners available, and also community offerings like AudioRain, a dedicated rain synth with 60 procedurally generated presets for rain. What’s not to love about that!

There is a Wwise SDK for authoring your own plugins and a Wwise API which allows you to integrate into any engine, tool or application.

Audiokinetic do certifications that cover audio integration workflows, mixing virtual soundscapes, working with sound triggering systems, and performance optimisation:
https://www.audiokinetic.com/learn/certifications/

Basically in Wwise you let the Launcher do all the setting up for you. You will install the Wwise binaries from here and manage your platform versions. Projects can be integrated here and if you don’t have the necessary plugins installed the Wwise Launcher will install them for you.

Integrating the MusicVisualiser project with Wwise.
This is how big the Wwise Integration packages and binaries are.
Applying…
Done!

That’s basically it for the setup of Wwise and integration with your project. Next up we will have a look at what this has done to the Unity Console.

Wwise in Unity

The first thing we see is a bunch of errors that can be safely ignored. Since we did not perform any configuration of our project in Wwise with audio files and events, there was no SoundBank to generate yet.

Unity – Initial Errors can be ignored if you have not generated your SoundBank yet.

Back in the Unity Editor we have a new tab: the Wwise Picker tab contains all the elements of the Wwise project that were imported with the project integration. There is also a WwiseGlobal Game Object in the Unity Hierarchy, and all the Wwise folders in the Assets folder.

Unity Editor
The WwiseGlobal Game Object

Under the Component pull-down there is a whole slew of Ak (Audiokinetic) options.

Wwise Components.
Wwise Configuration Settings.

I know there has been a lot of “show and tell” in this post but I’m going to keep going and show the process of importing the audio into the Wwise Project, creating Events, and Generating the SoundBank.

Working in Wwise

In the Wwise Project Explorer I right-click on the Default Work Unit and import the audio files that were part of my project. (I’ve stripped the raw files out of my project for now and removed all the Mixer components and so on.)

Importing Audio Files into the Wwise Project.
This is what the files look like.
Right click on the file to create a new Event (which can be called in the Unity code).
Here is the event created for “Play”.
And all my “Play” events.

Finally a SoundBank is generated from which the Unity project can access the sound files through the AudioKinetic API.

Generating a SoundBank

Wwise Audio in Unity

When we go back to our Unity Editor, refresh the project and generate SoundBanks, we are presented with the following in the Wwise Picker. We can now access these files and drag them onto our game objects directly. It’s that simple. Drag a sound from the Picker onto a Game Object and it automagically creates a component that is immediately accessible from within the editor.

The new audio imported into the Wwise Picker.

Below, the Play_Underwater_AAS event and audio file have been added to the Sphere Game Object.

The Triggers, Actions, and Callbacks can all be configured and accessed through the API. In my case I easily integrated the functionality I wanted with only a one-line change to the attached PlayMe.cs script that we looked at above. Now, instead of my audio coming from the AudioSource component referenced by my_sound, the audio is played by AkSoundEngine.PostEvent:

            //my_sound.Play();
            AkSoundEngine.PostEvent("Play_Underwater_AAS", this.gameObject);
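
One gotcha: an event will only fire if its SoundBank has been loaded first. The components you get by dragging from the Wwise Picker can take care of this for you, but since I’m calling PostEvent directly in script I load the bank myself. A minimal sketch, assuming the integration’s AkBankManager helper and a SoundBank I’ve called "Main" (substitute your own bank name):

using UnityEngine;

public class LoadMyBank : MonoBehaviour {

    void Start () {
        // Load the generated SoundBank by name before any events are posted.
        // Arguments: bank name, decode-on-load, save-decoded-bank.
        AkBankManager.LoadBank("Main", false, false);
    }
}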

Actually getting Wwise installed, set up, and integrated with my project was very easy, but not without bumps. It takes a very long time for packages to download, and I had a bit of trouble upgrading my Wwise Launcher from an old version (it got stuck and I had to remove it by hand and re-install). When I did have issues I got some excellent help from Audiokinetic: after logging a case I was emailed directly by a real person, which honestly was surprising and wonderful – that kind of support from a company when I’m on a trial license with no formal support agreement or rights.

So let’s have a look at the differences in performance and package size. The first thing you notice in the Profiler below is that there is very little difference in performance, but we can no longer see our audio stats, as audio has been abstracted away from the Unity engine. The graph still shows the resources being used by audio, and the Total Audio CPU seems to be up to a third lower than the native Unity audio statistics. It looks like it’s being clamped at just over 1.2 MB instead of regular peaks over 3 MB.

Profiler with Wwise Audio running.

The Build Report is only a couple of MB larger for the total project size:

Build Report
Uncompressed usage by category:
Textures 0.0 kb 0.0%
Meshes 0.0 kb 0.0%
Animations 0.0 kb 0.0%
Sounds 0.0 kb 0.0%
Shaders 188.0 kb 0.5%
Other Assets 7.3 kb 0.0%
Levels 38.5 kb 0.1%
Scripts 1.3 mb 3.1%
Included DLLs 3.9 mb 9.7%
File headers 13.2 kb 0.0%
Complete size 40.5 mb 100.0%

Basically a 2 MB difference! The sounds no longer appear as a separate line in the Build Report, and we can assume they are now part of “Other Assets” above.

I’m kinda blown away by how little additional file size there is in the build, considering the additional libraries, code, and available complexity that Wwise adds. There is literally a plethora of options and effects that we can play with in the Wwise package. It’s a bit like the excitement I got after installing my first real audio DAW. The scope is part boggling and part fantastical wonder at where we can go next. (Audio does get me unusually stimulated, but that’s to be expected and tempered accordingly.)

The questions I wanted to answer with this whole experiment were: 1. Would including audio middleware like Wwise make my project more complex and difficult to manage? 2. Would the added package make my build much larger? And 3. Would the performance of the audio package be as good as the simple Unity Audio API? The answers are: No, No, and Yes. So I’m pretty happy with that, and if the cost of the licensed version of Wwise balances out against the advantages of using it across the total cost of the project, then I would most definitely, one hundred percent, go for it.

Why Normalize()

I’ve been doing some work on the AI for enemy behaviours in an unreleased game, Endless Elevator, and have been delving into the book “Unity 2018 Artificial Intelligence Cookbook – Second Edition” by Jorge Palacios.

It uses the Normalize() function regularly to record the direction of an object in relation to another object, and it got me thinking about the usefulness of this function. You can see why something like knowing the direction of the Player could be good for an enemy AI behaviour, but I wanted to investigate more deeply how this works and how I could use it.

One of the things I hadn’t consciously been aware of, but is obvious once you point it out, is that a Vector3 can define a location (i.e. a point in space like 0, 0, 0), but it can also define a direction if you have a starting position and a target position.

A good example of a Vector being used to denote both a location and a direction in Unity is the Ray.
A Ray consists of two Vector3 data points: the first Vector3 is the source position the ray starts from, and the second Vector3 is the direction towards the target.
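
Here’s a quick sketch of that in code (the numbers are arbitrary – note that Unity normalizes the direction for you when the Ray is constructed):

using UnityEngine;

public class RayExample : MonoBehaviour {

    void Start () {
        // Origin is a point in space; direction is where the ray points.
        Ray ray = new Ray(Vector3.zero, new Vector3(3f, 3f, 0f));
        Debug.Log(ray.direction);    // prints (0.7, 0.7, 0.0) - already normalized
    }
}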

When a direction Vector3 is Normalized it keeps its direction, but its length (how far away one object is from another) is rescaled to exactly 1. The exception in Unity is a vector too small to be normalized (the objects practically on top of each other), which gets set to zero instead.

As an example we can simplify a bit by limiting ourselves to a Vector2.

A Vector2 can also be either a point or a direction, depending on what you want to use it for.
If it’s (3, 0) then it could be either a point at x=3, y=0, or a direction along the x axis (one dimension) with a length of 3 and a slope of 0.
If you Normalize() that example, the Vector2 becomes (1, 0). It will have a length of 1 but still be pointing in the x direction.
If you add the y value (dimension) into the picture, where x=3, y=3, the Normalized value becomes (0.7, 0.7). Without going deep into the maths – each component just gets divided by the vector’s overall length, √(3² + 3²) ≈ 4.24, and 3 / 4.24 ≈ 0.7 – you can see that the axes/dimensions impact each other to define the Normalized direction.

We can extrapolate this into three dimensions without too much difficulty, but I’m not going to describe that here as it’s not necessary for understanding. We only need to grasp what this means and what sort of use you can make of it in a Unity project.
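
If you want to check those numbers yourself, here’s a tiny throwaway script (the values are just the examples from above):

using UnityEngine;

public class NormalizeMaths : MonoBehaviour {

    void Start () {
        Vector2 v = new Vector2(3f, 3f);
        // Length = sqrt(3*3 + 3*3), which is about 4.24.
        Debug.Log(v.magnitude);              // 4.242641
        // Each component divided by the length: 3 / 4.24 is about 0.707.
        Debug.Log(v.normalized);             // (0.7, 0.7)
        // A normalized vector always has a length of 1.
        Debug.Log(v.normalized.magnitude);   // 1
    }
}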

For Example…

This is a simple use case of why you might want to use Normalize().

(Note that there is also a normalized property, which leaves the current vector unchanged and returns a new normalized copy instead.)

I’ve created an example project where the relationship between two objects (a Cube and a Sphere) gives us a direction (using Normalize()). The direction is applied to a transform.Translate function on a third object (a Cylinder), which then moves in the given direction.

In the script below, attached to our Green Cube (the unchanging point of origin – 0, 0, 0), we define the Pink Sphere as the target. In the script we get the difference between the position of the Sphere and the Cube and Normalize() it to make a direction. It doesn’t matter how far the Sphere is from the Cube, the direction stays the same.

Also, to extend the example, I’ve added a Ray with the same origin as the Cube (0, 0, 0) that will always point at the Sphere. This shows that the Normalized Vector3 is the same as the direction of the Ray pointing at the Sphere.

Normalizing Direction

The public Vector3 lineDirection is the difference between the position of the Cube and the Sphere. This is the variable we will pass to the Cylinder to move it in the required direction.

using UnityEngine;
using UnityEngine.UI;

public class HowToNormalizeDirection : MonoBehaviour {

    public Vector3 lineDirection;
    public GameObject target;
    public Text vector_text;
    public Text norm_text;
    public Text ray_direction;
    public Ray whatsTheRay;

    // Use this for initialization
    void Start () {
        lineDirection = new Vector3();
        whatsTheRay = new Ray();
    }

    // Update is called once per frame
    void Update () {
        // The difference between the two positions gives us a direction vector.
        lineDirection = target.transform.position - transform.position;
        vector_text.text = "Vector3 target.transform.position :" + lineDirection;
        // Normalize() keeps the direction but rescales the length to 1.
        lineDirection.Normalize();
        norm_text.text = "Vector3 Normalized() :" + lineDirection;
        // A Ray wants an origin and a direction (it normalizes the direction itself).
        whatsTheRay = new Ray(transform.position, target.transform.position - transform.position);
        ray_direction.text = "Ray : " + whatsTheRay.ToString();
    }
}

Moving the Capsule in the Same Direction

In the script below, attached to the Cylinder, we get the direction from the lineDirection variable above, but we could also have used the built-in Ray.direction property to get the same result.

using UnityEngine;

public class MoveCapsule : MonoBehaviour {

    public Vector3 direction;
    public float speed;

    // Use this for initialization
    void Start () {
        direction = Vector3.zero;
        speed = 0.50f;
    }

    // Update is called once per frame
    void Update () {
        // Find the Cube and read the direction its script calculated.
        var script = GameObject.FindWithTag("Cube").GetComponent<HowToNormalizeDirection>();
        direction = script.lineDirection;
        //direction = script.whatsTheRay.direction;   // this gives the same result
        transform.Translate(direction * speed * Time.deltaTime);
    }
}

In the video below you can see this demonstrated. The position of the Pink Sphere is being manually manipulated using the Transform on the right. The text areas expose the Vector3 values being accessed by the objects and scripts.

So what’s going on in this amateur-hour video?

The three lines of text at the top of the game scene represent:

1. The Vector3 position of the Pink Sphere.

2. The Vector3 Normalized direction, which is the relationship between the position of the Green Cube (0, 0, 0) and the Pink Sphere (x, y, z).

3. The uses of Vector3 in the Ray: first the position of the origin, and then the direction of the Ray, which is exactly analogous to the Normalized Vector3 in point 2 above.

The uses for something like this could extend to a weapon aiming system or a custom controller, or it could be passed into a path-finding routine. This is not the only option, of course; there are other built-in methods, like transform.LookAt(target), which may accomplish your programming goal.
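
For instance, a minimal sketch of the enemy AI idea that started this post might look like the following (the player reference and the speed value are assumptions for illustration):

using UnityEngine;

public class ChasePlayer : MonoBehaviour {

    public Transform player;    // assign the Player in the Inspector
    public float speed = 2f;

    void Update () {
        // The normalized difference is a pure direction, so the enemy moves
        // at a constant speed no matter how far away the player is.
        Vector3 direction = (player.position - transform.position).normalized;
        transform.position += direction * speed * Time.deltaTime;
    }
}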

During the research for this post I found the following links helpful.

https://docs.unity3d.com/ScriptReference/Vector3.Normalize.html
https://docs.unity3d.com/ScriptReference/Vector3-normalized.html (the related property mentioned above)
https://forum.unity.com/threads/what-is-vector3-normalize.164135/
https://answers.unity.com/questions/52881/vector-normalization-question.html
https://www.dummies.com/education/math/calculus/finding-the-unit-vector-of-a-vector/ (for why you really don’t want to know about the math or do it by hand)
https://www.mathsisfun.com/algebra/vector-unit.html (best for simple vector explanation)
https://docs.unity3d.com/ScriptReference/Ray.html
https://docs.unity3d.com/ScriptReference/Ray-ctor.html

Animating the Running Dog with Unity and GIMP

I finally got the game I started in November into some semblance of how I wanted it to look. The running dog animation, anyway. I’m still playing with different backgrounds, but this one is sure “fun”.

It took a long time to get all the dog actions done, drawing each one by hand and getting a lot of unsatisfactory results. The animation is all “flip-book” style, with each frame a different drawing. There were between 70 and 80 pictures for the entire set.

The art was done in GIMP with help from some nice veterinarian manuals on the dog gait.

I mocked up a few transitions from an opening scene to a nice zoom down to playing position.

Then I moved from a sedate walk to a running gallop and a little jump. She looks like she’s really flying now!

During this time I also had a big problem with a corrupted scene after playing in the Animator window. Big tip, and a future post: do not restart Unity if it crashes without taking a copy of your temp directory first. It took me about two weeks to rebuild everything, but more on that later.

The Five Games in Ten Weeks Challenge!

Some months back I went to a local presentation by the Unity champions at The Arcade (a collaborative workspace specifically for game developers and creatives) in Melbourne. It was one of those “learn about others and gain insight into one’s self” moments. One thing I picked up from one of the presenters from Hipster Whale (makers of Crossy Road) was that when they were developing a new game they tried to keep the proof of concept to a two-week period. Some time later, with this in mind, I presented myself with the “Five Games in Ten Weeks Challenge!” to see if I could really push out five different games in ten weeks.

Five Games in Ten Weeks


New Game! Peace Run

Hi, Zulu here – we started work on a new game this week called Peace Run. I wanted to do a continuous runner and really liked the idea of having a line-drawing animation that looks like the player character “grows” out of the line. The runners will be animal silhouettes, like the dog shown here.

Doing the frame-by-frame animation has been really fun, but it took a lot of trial and error to get to this point – which, as you can tell, isn’t perfect yet.

All the artwork was done in GIMP 2 and the game design is in Unity3D. We used a simple animation script to play the start of the walk sequence. I can’t wait to get this puppy into a run and a full sprint.