The aim of this hack is to explore turning the structures of the First Folio texts, marked up in Text Encoding Initiative (TEI) XML, into notes using the ChucK, PHP and Processing languages.
I wanted to explore processes for transforming the texts for the user and different ways of presenting the textual data. In part this comes from similar experiments by groups such as Einstürzende Neubauten, with their percussion treatment of the First World War, “Der 1. Weltkrieg (Percussion Version)”, and the Tone Matrix.
This is derived from the data created for a JavaScript visualization which showed the speaker of each line in a play, along with the type of line, whether prose or verse. Using the Hamlet text marked up in TEI, this hack is an experiment with alternate interfaces to textual data.
The first experiment was to represent the matrix in both sound and visuals, using ChucK and Processing.
Taking the stage data, the acts and scenes were added to it. The resulting data is written to a file that is then interpreted by the ChucK script, which converts the integers into notes. The acts and scenes are pitched below middle C, the speakers begin at middle C, and stage directions such as entrances and exits sit higher than the speakers. Using a simple wave, an electronic representation of the entire play has been created as a sequence of changing notes.
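A minimal sketch of that note-mapping stage in ChucK, assuming one integer event code per line of the data file (the file name, the event codes and the pitch offsets here are all hypothetical, as the post does not give them):

    // read integer event codes from a file and play them as notes
    SinOsc s => dac;
    0.5 => s.gain;

    FileIO fin;
    fin.open( "hamlet-events.txt", FileIO.READ );  // hypothetical file name

    while( fin.more() )
    {
        Std.atoi( fin.readLine() ) => int code;
        // hypothetical mapping: acts/scenes land below middle C (60),
        // speakers start at 60, stage directions sit above the speakers
        48 + code => int note;
        Std.mtof( note ) => s.freq;
        200::ms => now;  // hold each note briefly
    }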
This has been extended with the Processing language. Processing, a visualization tool, is able to understand the Open Sound Control (OSC) protocol, so a matching visual representation of each note can be portrayed from the same data. At the moment, this is a simple bar with text for the scenes and acts. Moving to the JavaScript implementation of Processing would break the OSC bridge, since a sketch running in the browser cannot receive the UDP messages that OSC is usually sent over.
The processes are linked by OSC messages sent from the ChucK process to the Processing sketch. The link is soft real-time: rather than the two being exactly simultaneous, the second process runs slightly behind. This suggests that a pipeline architecture is not suitable here and that a distributed architecture might perform better, but as yet I have not measured it. Linking the processes was quite easy once the correct library had been added to Processing and the oscEvent() callback was linked to the draw() function through a global variable.
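A minimal sketch of the receiving side, assuming the oscP5 library (the post does not name the library it used) and a hypothetical /note address and port:

    import oscP5.*;

    OscP5 osc;
    int currentNote = 0;  // global shared between oscEvent() and draw()

    void setup() {
      size(400, 200);
      osc = new OscP5(this, 9000);  // hypothetical listening port
    }

    // called by oscP5 whenever a message arrives
    void oscEvent(OscMessage msg) {
      if (msg.checkAddrPattern("/note")) {  // hypothetical address pattern
        currentNote = msg.get(0).intValue();
      }
    }

    void draw() {
      background(0);
      // a simple bar whose height tracks the most recent note
      float h = map(currentNote, 0, 127, 0, height);
      fill(255);
      rect(180, height - h, 40, h);
    }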
Future work might look at changing this and moving to a streaming architecture for real time, or at changing the data flow so that one process sends the data to both processes at the same time. Performance measurements would need to be taken from the receipt of the data to the point at which each script has finished processing it.
The second experiment is audio only (a short WAV file of just a few seconds). It looks at the use of distributed processes to run different instruments for different notes, along the lines of laptop orchestras.
Using the same data, the ChucK script takes the integers and turns them into different sounds: a flute for acts and scenes, a mandolin for the speakers and a wrench for the stage directions. This script uses OSC to stream the data between processes, rather than having one process handle all the sounds.
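A minimal sketch of one such instrument process in ChucK, assuming the events arrive as OSC integers on a hypothetical /event address and port (a conductor process would send the matching messages, for example with OscOut):

    // one instrument process: plays every speaker event it receives
    OscIn oin;
    OscMsg msg;
    9001 => oin.port;                // hypothetical port
    oin.addAddress( "/event, i" );   // hypothetical address pattern

    Mandolin m => dac;

    while( true )
    {
        oin => now;                  // wait for an OSC message
        while( oin.recv( msg ) )
        {
            Std.mtof( msg.getInt(0) ) => m.freq;
            0.8 => m.noteOn;
            300::ms => now;
        }
    }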
The differing instruments were chosen for their sounds. An improvement to this script would be to make the instrument sounds more coherent with one another; the choice of instruments could then be used to affect the mood of the listener.
Future work on this might look at creating relevant instruments in ChucK, or at using samples to influence the listener. The choice of instrument (or noise) might shape the listener's experience, from a purely tonal one to an attempt to recreate either modern notes or those used in Elizabethan or Jacobean music.
Though overly simple at present, these experiments show the application of a non-textual user interface to textual data. Using these alternate presentations, different kinds of event could be brought to the user's attention.
Future work would involve providing improved glyphs to represent the actions taking place, such as a triangle pointing left for entrances and right for exits, as sketched below. This might help a user who is unfamiliar with the text to discover the events within it more easily, and would simplify the user's task.
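A sketch of those glyphs in Processing; the orientations follow the sentence above, while the sizes and positions are invented for illustration:

    // triangle glyphs for stage events: left-pointing for an entrance,
    // right-pointing for an exit
    void drawEntrance(float x, float y) {
      triangle(x, y, x + 20, y - 10, x + 20, y + 10);  // apex on the left
    }

    void drawExit(float x, float y) {
      triangle(x, y - 10, x, y + 10, x + 20, y);       // apex on the right
    }

    void setup() {
      size(200, 100);
    }

    void draw() {
      background(255);
      fill(0);
      drawEntrance(40, 50);
      drawExit(140, 50);
    }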
Using the OSC protocol, the processes and data can be distributed: either a data-parsing task or a web-defined event can be routed to a different process according to rules. I know that laptop orchestras exist, but I was wondering about using the cloud, with distributed machines hosting different instruments. This would introduce timing and latency issues across the machines.
A series of developments are in progress. I can also tie this into my MSc project in terms of testing and design, and there appears to be interest from various quarters in developing it further.
As a coda to this, the idea was presented to the Bodleian as part of the ideas hack to take it further. I won second prize for ‘If Music be the food of Loue – Sonifying Drama’.