Basic tutorial
The "Getting Started" section covered playing and rendering audio by constructing "signal processes" modelled as the abstract type Signal. In this basic tutorial, we'll construct various such signals and explore the operators available to combine them.
Phasors and sines
Oscillators form the basic signals for composition purposes. The phasor and sinosc are two of the most basic oscillators available. The code below assumes you have already done using Synth.
Play a 440Hz tone at 0.25 amplitude for 2.0 seconds.
> play(sinosc(0.25, 440.0), 2.0)

Play a middle C note at 0.25 amplitude for 2.0 seconds.
> play(sinosc(0.25, midi2hz(60)), 2.0)

Here, midi2hz is used to convert a MIDI note number to a frequency value in Hz. This function also supports constructing signal processes, so you can add a vibrato using, for example, midi2hz(60 + sinosc(0.3, 5.0)).
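Putting that together (a small sketch; the 0.3 semitone depth and 5Hz rate come from the snippet above):

> play(sinosc(0.25, midi2hz(60 + sinosc(0.3, 5.0))), 2.0)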
Play a harsh sawtooth wave.

> play(0.25 * phasor(440.0), 2.0)

The phasor goes from 0.0 to 1.0 at the given frequency and loops around. It therefore has a discontinuous jump from 1.0 to 0.0.

Play a sawtooth, square or triangular wave.
> play(0.25 * tri(440.0), 2.0)
> play(0.25 * saw(440.0), 2.0)
> play(0.25 * sq(440.0), 2.0)

These are "softened" versions in which the sharpness of the waveforms is mitigated using a low-pass filter with a large cutoff. (See protect).

You can combine these waveforms using common addition and modulation operations.
> play(0.2 * tri(330.0) + 0.4 * tri(440.0) + sinosc(0.2, 660.0), 2.0)

Yes, the multipliers can themselves be signals, so the following works too.
> play(sinosc(1.0, 10.0) * sinosc(0.25, 440.0), 2.0)
Filters
Many types of linear time-invariant filters are available, including first and second order, biquad, and FIR filters. A "filter" attenuates or enhances a set of frequencies present in the spectrum of a signal.
For example, to use a bandpass filter (bpf) on a noise signal,
> play(bpf(noise(), 400, 1000.0), 2.0)

In general, you're probably looking to use the "biquad" filters named lpf, bpf, bpf0 and hpf.
For these filters, the filter parameters themselves can be treated as time-varying "signal processes". This is a common pattern in this package: wherever it is reasonable to want to vary a parameter over time, it is usually permissible to do so using a signal.
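For example, here is a sketch of sweeping the noise example above with a slow sine. This assumes the second argument of bpf is the center frequency, as in the earlier call; the sweep range and rate are arbitrary.

> play(bpf(noise(), 400.0 + 200.0 * sinosc(1.0, 0.5), 1000.0), 4.0)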
Clocks
One kind of "signal process" that is useful in a number of ways, not unlike how phasor is a flexible generator, is the "clock". You can create clocks with fixed or varying tempos using clock and clock_bpm. The values generated by a clock increase monotonically, following the instantaneous tempo, which makes them useful for scheduling events on a virtual time track.
While clocks are signals, they don't produce a meaningful sound on their own, so we'll defer their use until later.
Basic realtime control
A control is a signal whose value can be updated in near realtime. So you can use it to control, for example, the frequency of a sine wave after it has started playing.
> f = control()
> f[] = 440.0
> play(sinosc(0.25, f), 10.0)
> sleep(1.0)
> f[] = 880.0
# Now the frequency of the tone should shift to 880Hz
# with a little bit of smoothing.

Since abrupt changes are usually detrimental to sound processes, control will "dezipper" the changes using a simple low pass filter to ensure some degree of smoothness. Given sufficiently small changes, this should work well.
control is intended to be used with controls exposed via a graphical user interface (work in progress). Until then, you can use it to manipulate signal parameters in real time from the REPL.
Fanout
Consider the following piece of code -
> s1 = sinosc(0.3, 330.0)
> play(s1 * s1, 2.0)

The above arrangement is not permitted as is because the signal s1 can only be "used" in one place since it has state. So one of the following must be done instead -
You can make a new sinosc ...
> s1 = sinosc(0.3, 330.0)
> s2 = sinosc(0.3, 330.0)
> play(s1 * s2, 2.0)

This has negligible overhead compared to the previous one (assuming it was expected to work, that is). Another approach, if you don't really want to make another process like s2, is to wrap the first with the ability to "fanout" ...
> s1 = fanout(sinosc(0.3, 330.0))
> play(s1 * s1, 2.0)

That is fine since s1 is now a signal that can be used in multiple "receiving positions".
While "fanin" is accomplished using regular + operator and is not really a thing done in a programming language without such an operator, "fanout" needs to be carefully dealt with when constructing networks of signal processes.
Most Signal types in the package don't support fanout by default. So there is a separate fanout operator that wraps such processes with the ability to fanout under the assumption that time always moves monotonically forward. The operator ends up being a no-op for those processes that support fanout on their own.
Signal processes which support fanout natively (including Fanout) implement the abstract type SignalWithFanout which is itself a Signal.
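For example, based on the description above, re-wrapping a signal that already supports fanout is harmless (a sketch; whether fanout returns the signal unchanged or an equivalent wrapper is an implementation detail):

> s1 = fanout(sinosc(0.3, 330.0))
> s2 = fanout(s1)   # effectively a no-op, since s1 already supports fanout
> play(s1 * s2, 2.0)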
Stereo signals
Synth.jl is centered around monophonic signals. However, there is some basic support for stereo signals through pan and stereo.
Note: Support for an arbitrary number of channels is seen as a complexity that defeats the pedagogical purpose of this package. It may be added in the future if a design that maintains the simplicity of the rest of the system becomes possible.
Stereo signal processes are represented by the Stereo{L,R} type. You turn two mono signal processes into a stereo signal process using stereo. It is important that the two mono processes support "fanout" or are not used anywhere else. This is reflected in the <: SignalWithFanout type constraint. The stereo signal itself is therefore fanout-capable and is a subtype of SignalWithFanout.

The usual mono operations of +, * and - are supported on stereo signals directly as well. So you can modulate them and mix them without much fanfare.

If you have a stereo signal, you can get at its left and right channels as mono signals using left and right.

You can mix down a stereo to a mono signal using mono, which will add the left and right channels in the simplest case, but also accepts a "panner" signal for more complex mixing. render supports rendering stereo signals.

A Stereo signal is itself a Signal which, when used as a mono signal, will result in the mix down of the left and right channels.
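Here is a small sketch putting the above together. Each channel is wrapped with fanout to satisfy the SignalWithFanout constraint mentioned earlier; the frequencies are arbitrary.

> s = stereo(fanout(sinosc(0.25, 330.0)), fanout(sinosc(0.25, 440.0)))
> play(mono(s), 2.0)   # mix the two channels down and play as mono
> play(left(s), 2.0)   # listen to just the left channel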