
Mapping and Visualization with SuperCollider

By Marinos Koutsomichalis
About this book
SuperCollider is an environment and programming language used by musicians, scientists, and artists who work with sound. It has built-in graphical features that are used in conjunction with the sound synthesis server to create audio-visual mappings and sound visualizations. If you wish to create data visualizations by acquiring data from audio and visual sources, then this book is for you.

Digital sound artists need to analyze, manipulate, map, and visualize data when working on a scientific or an artistic project. By means of its numerous code examples, this book provides the necessary knowledge of SuperCollider's practical applications, so that you can extract meaningful information from audio files and master its visualization techniques. It will help you prototype and implement sophisticated visualizers, sonifiers, and complex mappings of your data.

The book takes a closer look at SuperCollider features such as plotting and metering functionality to dispel the mysterious aura surrounding the more advanced mappings and animation strategies, and it walks you through a number of examples that help you create intelligent mapping and visualization systems. Throughout the course of the book, you will synthesize and optimize waveforms and spectra for scoping, and extract information from audio signals. The later sections focus on advanced topics such as emulating physical forces, designing kinematic structures, and using neural networks, so that you can develop visualizations with natural motion, structures that respect anatomy, and intelligent encoding mechanisms. By the end, you will have everything you need to build intelligent audio-visual systems that extract and visualize audio-visual data.
Publication date: November 2013
Publisher: Packt
Pages: 222
ISBN: 9781783289677

 

Chapter 1. Scoping, Plotting, and Metering

Visualizing audio signals and numerical datasets can be very straightforward in SuperCollider, thanks to its built-in scoping, plotting, and metering functionalities. The corresponding GUI objects are simple to use, yet highly customizable and extremely powerful. In this chapter we will introduce a series of fundamental techniques and learn how to design both basic and more advanced custom visualizers. Note that all the examples herein assume normalized datasets and test signals; the complexities of data mapping and signal optimization are discussed in depth in subsequent chapters.

The topics that will be covered in this chapter are as follows:

  • Plotting audio, numerical datasets, and functions

  • Scoping waveforms and spectra

  • Metering signals and data

  • Nonstandard and complex visualizers

 

Plotting audio, numerical datasets, and functions


Before discussing how we can scope audio signals in real time, it is worth reviewing the various ways in which we can create static graphs and charts out of arbitrary numerical datasets or signals.

Using plot and plotGraph

SuperCollider provides us with a very handy plot method. We can use this method in different situations to create graphs on the fly from instances of Function, ArrayedCollection, Env, Buffer, SoundFile, Wavetable, and a series of other objects (depending also on what extensions we have installed). An example of this is shown in the following code:

{SinOsc.ar(100)}.plot(0.1);              // plot 0.1 seconds of a sine wave
[5,10,100, 50, 60].plot;                 // plot a numerical dataset
Env([0,1,0],[1,1],[-10,2]).plot;         // plot an envelope
Signal[0,1,0.5,1,0].plot;                // plot a signal
Wavetable.chebyFill(513,[1]).plot;       // plot a wavetable

( // plot the contents of a sound file
Server.default.waitForBoot({ // wait for Server to boot
  Buffer.read(Server.default, Platform.resourceDir +/+ "sounds/a11wlk01.wav").plot;
});
)

Tip

Downloading the example code

You can download the example code files for all Packt books you have purchased from your account at http://www.packtpub.com. If you purchased this book elsewhere, you can visit http://www.packtpub.com/support and register to have the files e-mailed directly to you.

In all cases, the resulting graphs are automatically normalized with respect to the kind of data plotted, so that each dimension's display range is determined by the minimum and maximum quantities it has to represent; that is to say, the plot's graph is content-dependent. Additionally, the graphs' meaning depends upon the receiver (that is, the kind of object plotted): for instances of Array, Wavetable, or Signal, the graph represents value per index; for UGen graphs, amplitude per unit time; for instances of Env, value per unit time; and for instances of Buffer, amplitude per frame. Since its behavior differs for different kinds of objects, plot is said to be polymorphic. We should always consider the implicit consequences of these two properties. For example, the following two waveforms could easily be mistaken as identical, even though they are not:

(  // plot two sinusoids of different amplitude
{SinOsc.ar(100)}.plot(bounds:Rect(0,0,400,400));
{SinOsc.ar(100)*2}.plot(bounds:Rect(400,0,400,400));
)

To compensate for such a phenomenon, we need to explicitly set the minima (minval) and maxima (maxval) arguments. Interestingly enough, we can also plot abstract functions, as long as they take a single argument and return an arithmetic value. We can do this with the plotGraph method, as follows:

{arg x; tan(x**2);}.plotGraph(100,-pi,pi); // graph out of a function

Here, the interpreter evaluates the given function for 100 different values in the range of ±π and populates the graph with the results; the horizontal axis represents the indices of the sampled points and the vertical axis represents the function's output.
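Returning to the display-range issue mentioned earlier, we can make the two sinusoids from the previous example visually comparable by fixing a shared range explicitly. The following is a minimal sketch; the range of ±2 is an arbitrary choice that accommodates both signals:

(  // plot both sinusoids against the same, explicitly set range
{SinOsc.ar(100)}.plot(bounds: Rect(0,0,400,400), minval: -2, maxval: 2);
{SinOsc.ar(100)*2}.plot(bounds: Rect(400,0,400,400), minval: -2, maxval: 2);
)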

Note

Buffer objects have a finite capacity measured in frames; each frame holds exactly one sample per channel, therefore a frame can be thought of as the container of a sample.

Polymorphism in Computer Science refers to the ability in programming to present the same interface for different underlying forms.
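As a quick illustration of the frame/sample distinction, the relevant counts are available directly on a Buffer object. A minimal sketch, assuming a booted default server:

(
Server.default.waitForBoot({
  var buffer = Buffer.alloc(Server.default, 44100, 2); // 1 second of stereo audio at 44.1 kHz
  buffer.numFrames.postln;                             // -> 44100 frames
  buffer.numChannels.postln;                           // -> 2 samples per frame
  (buffer.numFrames * buffer.numChannels).postln;      // -> 88200 samples in total
});
)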

Using Plotter

Both plot and plotGraph are convenience methods which are essentially just abstractions over a series of tasks. Whenever they are invoked, a parent Window is created containing an instance of Plotter whose specifications are configured accordingly. Explicitly creating and using Plotter allows sophisticated control over the way our data is plotted. The following code exemplifies a number of features of the Plotter object:

(  // data visualization using custom plotters
// the parent window
var window = Window.new("Plotter Example", Rect(0,0,640,480)).front;

// the datasets to visualize 
var datasetA = Array.fill(1000,{rrand(-1.0,1.0)});// random floats
var datasetB =  [ // a 2-dimensional array of random floats
  Array.fill(10,{rrand(-1.0,1.0)}),
  Array.fill(10,{rrand(-1.0,1.0)})
];

// the plotters
var plotterA = Plotter("PlotterA",Rect(5,5,630,235),window);
var plotterB = Plotter("PlotterB",Rect(5,240,630,235),window);

// setup and customize plotterA
plotterA.value_(datasetA);       // load dataset
plotterA.setProperties(          // customize appearance
  \plotColor, Color.red,         // plot color
  \backgroundColor, Color.black, // background color
  \gridColorX, Color.white,      // gridX color
  \gridColorY, Color.yellow)     // gridY color
.editMode_(true)   // allow editing with the cursor
.editFunc_({ // this function is evaluated whenever data is edited
  arg plotter,plotIndex,index,val,x,y;
  ("Value: " ++ val ++ " inserted at index: " ++ index ++  
    ".").postln;
});

// setup and customize plotterB
plotterB.value_(datasetB);   // load datasetB
plotterB.superpose_(true);   // allow channels overlay
plotterB.setProperties(
  \plotColor, [Color.blue,Color.green], // plot colors
  \backgroundColor, Color.grey, // background color
  \gridOnX, false,              // no horizontal grid
  \gridOnY, false)              // no vertical grid
.plotMode_(\steps);             // use step interpolation
)

The result is illustrated in the following screenshot:

The comments pretty much explain everything. The first Plotter object is editable, which means that we can alter the graph by clicking and dragging on it with the mouse. Whenever we do so, editFunc will be evaluated with the following passed as arguments:

  • The Plotter object.

  • The plot index (which is only meaningful if there is more than one graph, such as for multichannel signals, of course).

  • The index position (horizontal axis value).

  • The value of the vertical dimension.

  • The x and the y positioning of the cursor.

In this case, while clicking or dragging with the mouse, a simple message is printed in the console.

The second Plotter object, which operates on a multichannel dataset, will create ramps out of every individual channel and superimpose them on the same graph using different colors. Using plotMode, we can select between the following alternative data representation modes: \linear (linear interpolation), \points (data points only), \plines (both lines and points), \levels (horizontal lines), and \steps (ramps).
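These modes are easy to compare side by side. The following is a minimal, self-contained sketch (the window layout and the random dataset are arbitrary) that plots the same data twice with two different modes:

(  // compare two plot modes on the same data
var data = Array.fill(20, { rrand(-1.0, 1.0) });
var window = Window("plotMode comparison", Rect(0, 0, 640, 480)).front;
var upper = Plotter("plines", Rect(5, 5, 630, 235), window).value_(data).plotMode_(\plines);
var lower = Plotter("levels", Rect(5, 240, 630, 235), window).value_(data).plotMode_(\levels);
)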

Using SoundFileView

In a visualization context, we may encounter situations wherein we need to plot the contents of some audio file. We could do so with Buffer and Plotter, yet there does exist a dedicated class for such cases, namely, SoundFileView as shown in the following code:

(  // display the contents of a soundfile
// create the view
var view = SoundFileView.new(Window.new("A SoundFileView Example", 640@480).front,640@480);

// load a soundfile in the view using a SoundFile
var file = SoundFile.new;   // create a new SoundFile
file.openRead(Platform.resourceDir +/+ "sounds/a11wlk01.wav");  
// read a file
view.soundfile_(file);           // set the soundfile
view.read(0, file.numFrames);    // read the entire soundfile (**for big soundFiles use .readWithTask instead**)
file.close;     // we no longer need the SoundFile

// configure appearence
view.timeCursorOn_(false);         // no time cursor
view.gridOn_(false);               // no grid
view.background_(Color.green);     // background color
view.waveColors_([Color.magenta]); 
// waveform color (it has to be an array)
)

Again, the code is pretty straightforward; the only subtlety being that we need to open and read the actual file with a SoundFile object before we can read its contents into the SoundFileView object. When large sound files are involved, we should use readWithTask instead, to avoid overloading our computer's memory. Then, if needed, we can use the zoom (or zoomToFrac) and scrollTo methods to display only portions of the file or to animate its contents. For example, the previous code could continue as shown in the following code:

// animate the contents of the file 
fork{ 100.do { arg counter; 
  { // every time we put some GUI-related operation in a Routine we need to defer it so that it is scheduled in the AppClock instead
    view.zoomToFrac(counter/100); // the total zooming range is 0-1
    view.scrollTo(counter/100); // the total scrolling range is 0-1
  }.defer; 
  0.1.wait; // speed of animation
}}

Note that SuperCollider will refuse to schedule any GUI-related operation on the SystemClock; hence, we have to use defer whenever such operations are involved, so that they are implicitly scheduled on the AppClock instead.
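In other words, GUI calls made from inside a Routine have to reach the AppClock one way or another, either via defer or by scheduling on the AppClock directly. The following is a minimal, self-contained sketch of both idioms; the Button is just a placeholder GUI element for the demonstration:

(
var window = Window("AppClock example", 220@120).front;
var button = Button(window, Rect(10, 10, 200, 100)).states_([["waiting"]]);
fork{
  1.wait;
  { button.states_([["set via defer"]]) }.defer;  // defer the GUI call from the Routine
  1.wait;
  AppClock.sched(0, { button.states_([["set via AppClock"]]); nil }); // or schedule it directly
};
)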

 

Scoping signals


Plotter and SoundFileView can be exploited in several ways, but they are not really efficient for scoping real-time audio signals. SuperCollider features dedicated built-in visualizers that we use to easily scope signals in both time and frequency domains.

Scoping waveforms

As far as signals are concerned, we can easily plot their waveforms in real time simply by invoking scope on UGen graphs and on instances of Bus or Server. The scope method is a convenience, too: it creates an instance of Stethoscope in the background, the latter being a fully featured virtual oscilloscope. An example of this is shown in the following code:

( // Stethoscope Example
Server.default.waitForBoot({ // wait for server to boot
  {SinOsc.ar}.scope;  // scope a UGen graph
});
)
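As mentioned above, scope is not limited to UGen graphs. A minimal sketch of the Server and Bus variants, assuming the default server is already booted and something is playing on its first two output channels:

Server.default.scope(2);                       // scope the server's first two output channels
Bus.new(\audio, 0, 2, Server.default).scope;   // equivalently, scope an audio-rate Bus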

An instance of Stethoscope features dedicated controls so that we can configure its display ranges; select which, and how many, instances of Bus to plot; and switch between overlay, non-overlay, or Lissajous (that is, X/Y) representational modes. We can design custom oscilloscopes through ScopeView, which is a powerful, highly parameterized waveform visualizer in its own right. However, at the time of writing, and although Stethoscope is fully functional on both kinds of servers, ScopeView only cooperates with the internal one. Other than this, its use involves linking it with a manually allocated instance of Buffer whose contents are to be constantly updated using a ScopeOut UGen (and not with an Out UGen). In the following code, we have implemented a custom waveform/phase scope:

(  // a custom dual oscilloscope
Server.default = Server.internal;  // make internal the default server
Server.default.waitForBoot({

  var waveScope, phaseScope; // the two scopes
  
  // allocate two audio buffers
  var bufferA = Buffer.alloc(Server.default, 1024,2);
  var bufferB = Buffer.alloc(Server.default, 1024,2);

  // a stereo signal
  var sound = {
    var signal = Resonz.ar(
      [ ClipNoise.ar(1.7), ClipNoise.ar(1.8) ],
      SinOsc.ar(1000).range(100,500)); // a stereo signal
    ScopeOut.ar(signal, bufferA); // update first buffer
    ScopeOut.ar(signal, bufferB); // update second buffer
    Out.ar(0,signal); // write to output
  }.play;

  // create the main Window
  var window = Window("Dual Oscilloscope", 640@320).front
  .onClose_({ // on close stop sound and free buffers
    sound.free;
    bufferA.free;
    bufferB.free;
  });
  window.addFlowLayout; // add a flowLayout to the window 

  // create the ScopeViews and set their buffers
  waveScope = ScopeView(window,314@310).bufnum_(bufferA.bufnum);
  phaseScope = ScopeView(window,314@310).bufnum_(bufferB.bufnum);

  // customize waveScope
  waveScope.style_(1)   // overlay channels
  .waveColors_([Color.red, Color.yellow]).background_(Color.magenta(0.4))
  .xZoom_(1.7).yZoom_(1.2);   // scaling factors
  
  // customize phaseScope
  phaseScope.style_(2)   // lissajous mode
  .waveColors_([Color.magenta]).background_(Color.cyan(0.3))
  .xZoom_(1.2).yZoom_(1.2);   // scaling factors
})
)

Our custom scope is shown in the following screenshot:

Note

Lissajous curves, named after the 19th century French physicist Jules Antoine Lissajous, are produced by plotting one signal against another, and are typically used as phase scopes to visualize the phase differences between the left and right channels of a stereo signal.

Scoping spectra

Frequency domain refers to the representation of signals in terms of their frequency content; in a spectrum display, frequency is mapped to the horizontal dimension and amplitude to the vertical dimension. As far as real-time plotting in the frequency domain is concerned, and much like waveform scoping, we can either use FreqScope to globally scope the default output of Server, use the scopeResponse method to scope UGen graphs on the fly, or use the more sophisticated FreqScopeView to design custom frequency visualizers. Yet, in spite of them being very similar in spirit, there are a couple of major differences between the latter and ScopeView, as illustrated in the following code:

(  // a custom Frequency Analyzer
Server.default = Server.local; // set local as the default server
Server.default.waitForBoot({
  // create the parent window
  var window = Window("Frequency Analyzer", 640@480).front
  .onClose_({ // on close
    sound.free;  // stop sound
    scope.kill;  // kill the analyzer
  });

  // the bus to scope
  var bus = Bus.audio(Server.default,2);  
  
  // a stereo signal
  var sound = {
    var signal = Resonz.ar(
      [ ClipNoise.ar(1.7), ClipNoise.ar(1.8) ],
      SinOsc.ar(1000).range(100,500)); // a stereo signal
    Out.ar(bus,signal); // update bus for scoping
    Out.ar(0,signal);   // write to output
  }.play;

  // the frequency scope
  var scope = FreqScopeView(window,640@480).active_(true); 
// activate it
  scope.background_(Color.red).waveColors_([Color.yellow]); 
// set colors
  scope.dbRange_(120);  // set amplitude range (in decibels)
  scope.inBus_(bus); // select Bus to scope
})
)

Here, we read the signal directly from an instance of Bus, rather than Buffer. Moreover, we have to explicitly set the active variable of FreqScopeView to true, else no scoping will occur. Ironically enough, as of this writing, FreqScopeView only works with instances of the localhost Server, thereby making it impossible to have both ScopeView-based and FreqScopeView-based visualizers scoping the very same signal (although we can do so using Stethoscope instead).
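For quick checks where a custom FreqScopeView is unnecessary, the global FreqScope mentioned at the beginning of this section is a one-liner. A minimal sketch, assuming a booted default server; the mouse-controlled oscillator is only there to give the analyzer something to display:

(  // global spectrum analyzer on the default output
Server.default.waitForBoot({
  FreqScope.new(640, 480);  // width and height of the analyzer window
  {SinOsc.ar(MouseX.kr(100, 5000, 1), 0, 0.1)}.play;
});
)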

 

Metering levels


Besides plotting the actual signal or dataset, there are situations where we merely want to monitor changes in some magnitude. The most typical scenario is metering the amplitude of some signal, but we could meter anything really, as long as it is represented by some numerical value.

Monitoring signals

Generic metering in SuperCollider is primarily addressed by the LevelIndicator class. To monitor some magnitude specific to a signal, we first need to track it, write the resulting values to some control-rate instance of Bus or to some instance of Buffer, and later use an instance of Routine to manually update the value of LevelIndicator as appropriate. For now, we will limit ourselves to using the Amplitude UGen to track only the amplitude; in Chapter 6, Data Acquisition and Mapping, we will discuss how to track other kinds of magnitudes and how to extract information out of audio signals. Note also that a convenient meter method does exist, yet it is limited to instances of Server and to monitoring the global I/O streams of all its default channels (for example, Server.default.meter). Consider the following example:

(  // Simple Level Metering
Server.default.waitForBoot({

  // create the parent window
  var window = Window.new("Level Metering", Rect(200,400,60,220)).front
  .onClose_({   // stop routine when the window is closed
    updateIndicator.stop;
    sound.free;
  });

  var bus = Bus.control();  	// create a Bus to store amplitude data

  // an audio signal
  var sound = { 
    var sound = WhiteNoise.ar(Demand.kr(Dust.kr(20),0,Dbrown(0,1,0.3)));
    var amp = Amplitude.kr(sound);  // track the signal's amplitude
    Out.kr(bus, amp);  // write amplitude data to control bus
    Out.ar(0,sound);   // write sound to output bus
  }.play;

  // create and customize Indicator
  var indicator = LevelIndicator(window,Rect(10,10,40,200))
  .warning_(0.5)           // set warning level
  .critical_(0.7)          // set critical level
  .background_(Color.cyan) // set Color
  .numTicks_(12)           // set number of measurement lines
  .numMajorTicks_(3)       // set number of major measurement lines
  .drawsPeak_(true);       // draw Peak Values

  // update the Indicator's value with a routine
  var updateIndicator = fork{loop{
    bus.get({   // get current value from the bus
      arg value;
      {indicator.value_(value);     // set Indicator's value
        indicator.peakLevel_(value); // set Indicator's peak value
      }.defer(); // schedule in the AppClock
    });
    0.01.wait; // indicator will be updated every 0.01 seconds
  }};
});
)

Again, note that we use a defer block to schedule anything GUI-related on the AppClock.

Monitoring numerical data

Apart from a signal's magnitude, LevelIndicator can be used to monitor any kind of data we may be interested in. In the following code, we loop through an eight-channel multidimensional dataset:

(  // Monitoring a complex numerical Dataset
var indicators, updateIndicators; 
var index = 0;  // a global index used to iterate through the dataset
var dataset = Array.fill(8,{Array.fill(1000,{rrand(0,1.0)})}); 
// a multi-dimensional dataset

// create window
var window = Window.new("Monitoring a complex numerical dataset", 360@210).front.onClose_({ updateIndicators.stop });
window.addFlowLayout; // add flowLayout

// create and customize 8 Level indicators
indicators = Array.fill(8, {LevelIndicator(window,40@200)});
indicators.do { arg item;
  item.warning_(0.8).critical_(0.9).background_(Color.cyan).drawsPeak_(true);
};

// update the indicators with a routine
updateIndicators = fork{loop{
  indicators.do{ arg item, i; {
    var value = dataset[i][index];  // read value from the dataset
    item.value_(value);             // set each Indicator's value
    item.peakLevel_(value);         // set each Indicator's peak value
  }.defer();  // schedule in the AppClock
  };
  // increment index, or reset to 0 once the end of the dataset is reached
  if (index < 999) {index = index + 1;} {index = 0;};
  0.1.wait; // indicators will be updated every 0.1 seconds
}};
)

This time an array of LevelIndicator objects is used instead of a single element, and of course, there is no need for any specialized tracking UGen. We merely use an instance of Routine to access the dataset by means of a global index, which is incremented each time a datum is read so that it always points to the next entry. We also need an if construct to zero the index once the end of the dataset is reached, so that we reiterate from the beginning.

 

Nonstandard and complex visualizers


Having discussed the basic ways in which we can visualize numerical data and audio, we will now demonstrate how to exploit the built-in GUI elements to implement more complicated or nonstandard visualizers. In particular, we will discuss how we can reappropriate GUI elements originally meant to carry out different tasks, and how to combine the various built-in visualizers into more complex ones.

Nonstandard visualizers

Despite the existence of dedicated objects that cater to all our basic scoping, plotting, and metering needs, simpler and less sophisticated GUI elements are sometimes worth considering because of their characteristic crudeness, which may be just what we are after in certain projects. As in the previous example, we can manually set the value of almost all GUI objects, and that being so, we can exploit them to design imaginative, uncanny visualizers. For example, we could adapt the previous code so that it monitors both the values and the distance between any two consecutive entries in our dataset using RangeSliders:

updateSliders = fork{loop{
  sliders.do{ arg item, i; {
    var value;
    // store current and previous values in an array and sort it so that the smaller number is always the first
    value = [dataset[i][index-1], dataset[i][index]].sort;
    // set each RangeSlider's value
    item.setSpan(value[0],value[1]);
  }.defer; };
  if (index < 999) {index = index + 1;} {index = 1;}; // increment, or wrap back to 1 so that index-1 stays valid
  0.1.wait; // sliders will be updated every 0.1 seconds
}};

The entire code can be found online.
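If you want to try the fragment above without fetching the complete listing, a minimal, self-contained version could look like the following; the variable names simply mirror those the fragment assumes, and the index starts at 1 so that index - 1 is always a valid position:

(  // monitoring consecutive dataset entries with RangeSliders
var sliders, updateSliders;
var index = 1; // start at 1 so that index-1 is always valid
var dataset = Array.fill(8, { Array.fill(1000, { rrand(0.0, 1.0) }) });
var window = Window("RangeSlider monitoring", 360@210).front
  .onClose_({ updateSliders.stop });
window.addFlowLayout;
sliders = Array.fill(8, { RangeSlider(window, 40@200) });
updateSliders = fork{loop{
  sliders.do{ arg item, i; {
    // sort so that the smaller of the two values always comes first
    var value = [dataset[i][index-1], dataset[i][index]].sort;
    item.setSpan(value[0], value[1]);
  }.defer };
  if (index < 999) { index = index + 1 } { index = 1 };
  0.1.wait; // sliders update every 0.1 seconds
}};
)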

A complex scope

Having discussed all the major built-in signal visualizers in SuperCollider, it is trivial to combine them into a single visualizer. Nonetheless, the server inconsistency of ScopeView and FreqScopeView is an obstacle that is not easy to surmount. It is probably only a matter of time before a future version of SuperCollider solves this problem, but it still makes sense to attempt it, even if only for educational reasons. The code for the MyFancyStereoScope class is given online. We have to save it in a file with the .sc extension, copy it into our extensions folder (which can always be retrieved by evaluating Platform.userAppSupportDir), and recompile SuperCollider's class library before we can use it, as shown in the following code snippet:

( // MyFancyStereoScope Example
Server.default.waitForBoot({ // wait for server to boot
  MyFancyStereoScope.new();
  {[Saw.ar(400), Saw.ar(402)]}.play;
})
)
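The class itself is provided online, but as a reminder of what a user-extension class file looks like in general, here is a hypothetical, minimal skeleton; the class name and its contents are placeholders and are not the book's MyFancyStereoScope:

// Save as MySkeletonScope.sc inside the Extensions folder
// (Platform.userAppSupportDir +/+ "Extensions") and recompile the class library.
MySkeletonScope {
  var window;

  *new { ^super.new.init }

  init {
    window = Window("MySkeletonScope", 640@320).front;
    ^this
  }
}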

If all scopes were functional, our custom stereo scope would appear as shown in the following screenshot:

 

Summary


In this chapter we have learned how to design both simple and more advanced visualizers in order to plot, scope, or meter audio signals and normalized numerical data in various ways. We have implemented custom spectrum, phase, and waveform scopes, signal and dataset plotters, meters, and even nonstandard visualizers using a wide range of objects and methodologies. However, before advancing to even more sophisticated techniques, we need to elaborate on the visual aspects of signals themselves, so that they both sound and look interesting enough to scope.

In the next chapter, in particular, we will deal with waveform synthesis and discuss a series of techniques to synthesize appropriate waveforms.

About the Author
  • Marinos Koutsomichalis

    Marinos Koutsomichalis (Athens, 1981) is an artist and scholar working with sound and a wide range of other media. His artistic work interrogates the specifics of site, perception, technology, and material. His academic interests include computer programming, generative art, new aesthetics, and environmental sound and noise. He has performed, exhibited, and lectured widely internationally and has held residencies at various research centers and institutions. He holds an MA by research in composition with digital media from the University of York and, as of writing, he is a PhD candidate in Music, Sound, and Media Art at De Montfort University. He is on the board of the Contemporary Music Research Center (KSYME-CMRC) and is also the director of its class of Electronic Music and Sound Synthesis. As of writing, he is a research fellow at the University of Turin.
