Continuing along with our audio tech series – last week I talked about wrapping Pure Data, and this week I’ll take a look at exactly what a message is, so that I can move on to how I went about dispatching messages from/to the right Unity scripts and Pure Data sub-patches.
When we want to tell a patch in Pure Data to do something, we need to send it a message. Whenever FRACT needs to tell Pure Data to change a note, modulate a synth, move an audio source in 3D space, play a sample, or do pretty much anything else, it’s done by sending a collection of messages. Pure Data veterans might find this post a little basic, but it’s important to understand this stuff before what comes after. :)
When you’re editing a Pure Data patch, the simplest kind of message is sent by a message box connected directly to an object:
Here we have a cosine wave oscillator which starts at a frequency of 440 Hz. When we click on the message, the oscillator immediately switches down an octave, to 220 Hz.
In Pure Data, each message begins with a selector (which must be a symbol, usually a single word or dash-delimited words), and then continues with zero or more entries (each of which is either a symbol or a number). The selector tells the receiving object what sort of thing to do, and the content gives more details. In object-oriented programming terms, the selector is the method name, and the entries after it are the arguments. In the above example, the selector is “float” and it’s followed by an argument of 220. The osc~ object implements a method for “float” which sets its oscillation frequency.
Why is the method called “float,” rather than something more intuitive like “set-frequency”? “float” is actually one of a few special selectors, and it signifies that we’re simply sending a single number to an object. In fact, there’s a shorthand for sending this kind of message; we can simply write:
While it looks like we’re just sending a message containing 220, when it sees a message box starting with a number Pure Data implicitly inserts the selector “float,” making the full message [float 220(. While you can mostly get by in Pure Data without knowing this detail, it comes up in corner cases (e.g. differentiating between [float 1( and [list 1(, a list containing one float) and is very important to be aware of if you’re sending messages from code. So remember, a message always has a selector, even if you can’t necessarily see it.
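As a sketch of the implicit-selector rule, here’s a small Python helper (hypothetical, not part of Pure Data or libpd) that normalizes a message box’s contents the way Pure Data does:

```python
def normalize_message(atoms):
    """Mimic Pure Data's implicit-selector rule for a message box.

    A message is a selector (a symbol) followed by zero or more atoms.
    If the box starts with a number, Pd inserts the selector itself:
    "float" for a single number, "list" for several atoms.
    """
    first = atoms[0]
    if isinstance(first, (int, float)):
        if len(atoms) == 1:
            return ("float", first)
        return ("list", *atoms)
    return tuple(atoms)  # the first symbol already acts as the selector

# The message box [220( is really the full message [float 220(
print(normalize_message([220]))  # ('float', 220)
```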
If you’re curious, there are a few other important selectors: “symbol,” which is usually used to send a single piece of text, and “list,” which is used in a lot of built-in objects for manipulating lists of things.
From code we actually need to do something a bit more complicated, though. Manually connecting objects is fine when authoring patches in the Pure Data editor, but we can’t easily do that from Unity. Instead we’re going to use a special object called “receive” to receive the message. We give it a name, then use the “send” object to send the message.
We can also include the destination in the message itself, prefixing a semicolon to tell Pure Data we’re sending an indirect message:
So in all, a message in Pure Data consists of a symbol identifying the receiver, a symbol identifying the selector, and then any number of symbols or numbers giving the arguments.
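Putting that together, send/receive behaves like a name-keyed dispatch table: anyone can bind a name, and messages addressed to that name reach every binder. Here’s a toy Python model (my names; this is not libpd’s API):

```python
class MessageBus:
    """Toy model of Pd's send/receive objects: messages addressed
    to a named receiver are delivered to whoever bound that name."""

    def __init__(self):
        self.receivers = {}

    def bind(self, name, handler):
        """Like placing a [receive name] object in a patch."""
        self.receivers.setdefault(name, []).append(handler)

    def send(self, name, selector, *args):
        """Like [send name], or a "; name selector args" message box.
        Messages to unbound names are silently dropped, as in Pd."""
        for handler in self.receivers.get(name, []):
            handler(selector, *args)

bus = MessageBus()
notes = []
bus.bind("osc-freq", lambda sel, *a: notes.append((sel, a)))
bus.send("osc-freq", "float", 220)
print(notes)  # [('float', (220,))]
```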
There’s a problem, though, in that all the names of our receivers are hard-coded. The examples given are very simple, but FRACT’s subtractive synth takes dozens of different messages to configure different elements. On top of that, if we have two copies of the same patch loaded, how do we send the message to one without sending it to the other? We need some convention for easily assigning different names to different patches and the messages they receive, which is what I’ll dig into next week!
In last week’s post, I discussed how we decided to integrate Pure Data through a native Unity plugin. This week I’ll discuss what the plugin actually does, and what sort of role it plays in bridging Unity and Pure Data.
Pure Data works with metaphorical “patches,” sets of signal-processing or message-handling objects which are connected to each other with patch cords. There are three main ways we need to interact with a patch: we need to be able to send messages to it, to receive messages back from it, and to actually process and retrieve audio samples for playback. All of this has to happen fairly efficiently, without blocking on either side.
In order to understand how this all works, we need to know a bit about how message processing interacts with audio processing in Pure Data. Pure Data processes audio in 64-sample “ticks”: whenever audio is processed, it’s done in multiples of 64 samples. Once per tick, all pending incoming messages are handled and any outgoing messages are sent out.
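A quick back-of-the-envelope calculation (my numbers, not from the original post) shows how often that message exchange happens:

```python
SAMPLE_RATE = 44100  # samples per second
TICK = 64            # samples per Pure Data tick

tick_ms = TICK / SAMPLE_RATE * 1000  # duration of one tick in milliseconds
ticks_per_sec = SAMPLE_RATE / TICK   # message-handling opportunities per second

print(round(tick_ms, 2), round(ticks_per_sec))  # 1.45 689
```

So at 44100 Hz, messages get picked up roughly every millisecond and a half, which is plenty fine-grained for musical timing.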
This poses a bit of a problem for us. In Unity, as with virtually all other game engines, audio needs to be processed in a different thread from the game logic. This is because the audio always needs to be output at a fixed rate of 44100 (or sometimes 48000) samples per second, regardless of game logic or framerate. If so much as a single sample is missed, it’s clearly audible as an unpleasant “click” in the audio. The problem comes from the fact that we want the Pure Data audio processing to happen in our game’s audio thread, but we also want to send and receive messages from the main game update loop.
To deal with this, we need a couple of queues between the API and Pure Data, one to buffer incoming messages and one to buffer outgoing messages. (I used a lockless queue implementation from PortAudio.)
Whenever FRACTOSC sends messages to Pure Data, they get put in the incoming queue. As soon as the audio thread requests samples (in this case from Unity’s MonoBehaviour.OnAudioFilterRead callback), audio is processed and the incoming messages are handled, often resulting in new outgoing messages which are added to the outgoing queue. When the game loop runs its next update, it pulls the messages out and handles them however it wants.
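To sketch the shape of such a queue, here’s a minimal single-producer/single-consumer ring buffer in Python (illustrative only; the real PortAudio queue is C, and relies on atomics and memory barriers that Python hides):

```python
class SpscQueue:
    """Single-producer/single-consumer ring buffer.

    With exactly one writer and one reader, each index is only ever
    written by one thread, so no lock is needed and neither side
    can block the other.
    """

    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.head = 0  # advanced only by the consumer
        self.tail = 0  # advanced only by the producer

    def push(self, msg):  # producer side (e.g. the game update loop)
        nxt = (self.tail + 1) % len(self.buf)
        if nxt == self.head:
            return False  # full: drop or retry later, never block
        self.buf[self.tail] = msg
        self.tail = nxt
        return True

    def pop(self):  # consumer side (e.g. the audio thread)
        if self.head == self.tail:
            return None  # empty
        msg = self.buf[self.head]
        self.head = (self.head + 1) % len(self.buf)
        return msg

q = SpscQueue(8)
q.push(("osc-freq", "float", 220))  # game thread enqueues a message
print(q.pop())                      # audio thread drains it next tick
```

One queue of this shape per direction gives the incoming/outgoing pair described above.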
This is the complete high-level flow! Next time I’ll take a look at exactly what a message is, and how I went about dispatching messages from/to the right Unity scripts and Pure Data sub-patches.
Sorry we didn’t shout this great news from the rooftops yesterday, we had a baby with a fever on our hands and very little sleep between us.
That said, we’re thrilled to announce that FRACTOSC has been Greenlit on Steam! That means, as I’m sure you can imagine, that the game will be made available on Valve’s service. Great news for gamers, and great news for us.
We’ll be poking around with the SDK this week, kicking the tires and evaluating features, and will keep everyone posted as we know more.
Here is part 2 of the series discussing the audio tech we implemented for FRACTOSC; check out part 1 here.
So once we’d decided to go with Pure Data for audio synthesis, the question was, how could we integrate that with the rest of FRACT’s engine?
When used in a live performance or gallery setting, the most common way to use Pure Data in tandem with other tools is simply to run them side by side as separate programs. The programs then communicate with each other in some way, often over the network using MIDI or Open Sound Control (a different OSC from the OSC in FRACT’s title). This is a flexible way to use any combination of programs together, but it has a few downsides for a case like ours, where we need to put together a robust package that “just works.” There would be work involved in starting up Pure Data as a separate process, monitoring it, and closing it properly when the game exits, and since this sort of functionality is usually platform-dependent, it would all have to be rewritten for each platform. In addition, since Windows and OSX still see the two programs as separate entities, they might decide to force close one (for whatever reason) without closing the other, so you might end up with a soundless FRACTOSC (yuck) or an abandoned audio engine blaring forever (even more yuck).
Fortunately, there was another option. A fork of Pure Data called libpd wraps most of the core functionality of Pure Data into a library that can be embedded directly into native programs. In our case, we compiled it into a native plugin that we can use from Unity. (Native plugins are only available in pro or mobile versions of Unity.) This way Pure Data runs as a built-in part of our game.
There were also problems with this approach, though. Most notably, since the Unity editor runs in the same process as your game, it’s also affected by errors your game runs into. Normally these errors are in managed C# code, and are caught by Unity and logged. Unfortunately, errors in native code can’t be caught, so if I make a mistake coding the plugin, not only does it crash, it brings down the entire Unity editor with it! This wasn’t uncommon near the beginning of development, so we formed a consistent habit of saving all changes before hitting “Play” to avoid losing unsaved work.
Next time I’ll talk about how the plugin works, and how we communicate through it to Pure Data!
This will be the first post in a series discussing the audio tech we implemented for FRACTOSC. We’ve put a lot of effort into making it work (and making it work well!), and I’d like to write about the process we went through to get where we are. Along the way I’ll talk about different decisions we made and the technology we’ve built, in a good amount of detail!
When the three of us started working together we knew we wanted FRACT (which didn’t yet have the OSC suffix) to encourage musical expression on the part of the player. What we weren’t sure of at the time was what approach to take, or how far we could go. The first step was doing a few different prototypes of what we could do with the built-in Unity sound engine. Somewhat in parallel, we investigated some real synthesis options.
Working within the constraints of Unity 3.3, in theory we could get a wide range of different synth sounds just by baking a big library of samples for different combinations of modulation and pitch. This would constrain the sort of control the user would have over the sound, but could more or less work with few changes to the engine technology. We could even layer multiple samples, fading between them to combine effects. But even this limited capability depended on one thing: being able to play sounds on a consistent beat.
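As an aside on the fading idea: mixing two baked samples is just per-sample gain blending. A hypothetical equal-power crossfade (my example, not FRACT’s actual code) looks like this:

```python
import math

def crossfade(a, b, t):
    """Mix two equal-length sample buffers with an equal-power crossfade.

    t in [0, 1]: 0 is all `a`, 1 is all `b`. Using sin/cos gains keeps
    the summed power constant, avoiding a dip in loudness mid-fade.
    """
    ga = math.cos(t * math.pi / 2)
    gb = math.sin(t * math.pi / 2)
    return [ga * x + gb * y for x, y in zip(a, b)]

print(crossfade([1.0, 1.0], [0.0, 0.0], 0.0))  # [1.0, 1.0]
```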
Our brains are exceptionally good at associating what we hear with what we see, so even if a sound doesn’t match an on-screen event precisely, we still perceive the two things as one event. As a result, there’s a lot of leeway in audio timing for most games; it really doesn’t need to be that precise. The moment you try to synchronize sounds to anything resembling a regular beat, though, you can hear even small discrepancies in timing. With the audio functionality exposed in Unity 3.4, we had a lot of difficulty getting this timing to be reliable, and even then the precision went way down with a fluctuating frame rate (after all, we are running this alongside a full game!).
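To put rough numbers on the timing problem (my figures, for illustration): triggering sounds from the game update loop quantizes them to frame boundaries, which is orders of magnitude coarser than sample accuracy.

```python
SAMPLE_RATE = 44100
FPS = 30   # an optimistic, perfectly steady frame rate
BPM = 120

samples_per_beat = SAMPLE_RATE * 60 // BPM  # 22050 samples between beats
frame_jitter_ms = 1000 / FPS                # a frame-timed trigger can land ~33 ms off
sample_jitter_ms = 1000 / SAMPLE_RATE       # a sample-timed trigger: ~0.023 ms

print(samples_per_beat)  # 22050
```

A ~33 ms error is well within what the ear notices on a steady beat, and that’s before the frame rate starts fluctuating.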
After running into this obstacle and a few others, it became clear that we should at least begin investigating alternatives. I had some experience using Max/MSP and Pure Data from previous projects, and knew they’d both allow us to quickly prototype sounds. I had never tightly integrated them into another program, though, so there was some unknown territory to explore before we knew whether we’d be able to go forward with it. If we handled all the sound generation outside of Unity, we could bypass the Unity sound engine completely by making our own connection to the sound card.
Next week I’ll go into how we started integrating Pure Data with Unity, and start discussing some of the technical issues that needed to be solved to make it work in the context of FRACTOSC.
Sorry for the delay in updates! We had a busy and tiring GDC this year, with two presentations, a ton of hangouts, and a party or two.
Henk’s talk on how we do our synthesizer magic went splendidly, and I think my talk about just what the hell we’ve been doing for the past 2 years went OK too. We met with friends, made new ones, and got inspired by what other indies are up to!
I also put together a new little teaser trailer for our GDC presentations, which we published last week and which got a ton of great feedback. This also helped out with our Greenlight traffic; thanks, everyone!
The presentations and the trailer also brought a lot of new and renewed attention to FRACT, which is awesome! We’ve also been getting a lot of requests for the old IGF-award-winning prototype from 2011, and a few questions as to why it’s no longer available.
Basically, we’re in the final stretch of finishing FRACTOSC, which is completely new from the ground up and shares nothing with the old prototype. And while we’re very proud of the old FRACT prototype, it doesn’t paint an accurate picture of where we’re going with FRACTOSC. If you’re still really eager to try it, send us a message via the contact form and we can get it to you.
Otherwise, we’re working hard to finish the game – stay tuned for more updates!
Another quick sample of our new Core (‘curated’ sounds pretentious) Synths. I actually stepped away from my computer mid puzzle test/tweak, and came back into the office to this lovely wash of sound. The beats, as always, are temporary :)