Further developing the Processing project and my goals for it, I have started work on a playback function. My intended outcome is that a user will draw to the screen using the light, and when they are done drawing the software will loop through what they have drawn and play the audio representation of the pattern, while showing exactly where each sound is being generated from. This should provide a sense of ‘completeness’ to the activity, by which I mean that the audience will be able to draw what they intend to draw, and then hear, from start to finish, the sounds they have produced through their actions. I think this is superior to my previous implementation, where the sounds played as the user was drawing, because this way the audio is experienced as a whole and has the potential to form a cohesive piece of music (we’re probably not talking top-40, but music all the same). By reinforcing the link to the visual while the audio plays back, the connection between the user’s actions and the audio feedback can still be seen. It is important to find a way to enforce this, because if the audio simply plays back over a static drawing it may not be clear that the visual and audio patterns are related.
To start piecing playback functionality into the project, I realised I’d have to store the data gathered from the light tracking, rather than just using it to draw to the screen and then discarding it. To accomplish this, I added a new ArrayList to store the points, which I have called ‘notes’ due to the musical nature of the points. You can also see that I added a boolean variable to control whether the sketch is in ‘playback mode’ or not.
I used adding this new functionality to the program as an opportunity to clean up the code a little before it gets out of hand. The new core of the main loop checks, as it did before, whether the brightest point is above the threshold brightness level. If it is, a new function, lightDraw, is called. This function contains the code needed to handle drawing the visuals and gathering data from the movement of the light source. I also thought about how to trigger the sketch going into playback mode. Initially, for testing purposes, a mouse click was required; however, to make the project work entirely based on the light-tracking method of interaction, I have changed playback mode to trigger when no light source above the threshold brightness is detected (and the doPlayBack variable is true, which will happen after a point has been drawn, so that playback will not occur without any data to play). I think this is an acceptable solution to the problem, although it has the potential to trigger playback if the user pauses for a second without having finished. I will monitor this situation moving forwards and try to determine whether this solution causes any problems.
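The flow described above, together with the new notes list and playback flag, can be modelled roughly like this. This is a minimal plain-Java sketch of the logic rather than the actual Processing code (which reads brightness from the camera image each frame); the names threshold, brightestLevel, x and y are my own placeholders, and a float pair stands in for Processing's PVector:

```java
import java.util.ArrayList;

// A stripped-down model of the main-loop dispatch: draw while a bright
// light is visible, switch to playback once the light disappears and
// there is recorded data.
public class LoopModel {
    static final float threshold = 200;                   // placeholder brightness cutoff
    static ArrayList<float[]> notes = new ArrayList<>();  // recorded points (x, y)
    static boolean doPlayBack = false;

    // Returns a label for the branch one frame would take.
    static String step(float brightestLevel, float x, float y) {
        if (brightestLevel > threshold) {
            notes.add(new float[] {x, y});  // lightDraw: record the point
            doPlayBack = true;              // there is now data to play
            return "draw";
        } else if (doPlayBack) {
            return "playback";              // no light, but data exists
        }
        return "idle";                      // no light, nothing drawn yet
    }

    public static void main(String[] args) {
        System.out.println(step(100, 0, 0));   // below threshold, no data: idle
        System.out.println(step(250, 40, 60)); // bright point found: draw
        System.out.println(step(100, 0, 0));   // light lost, data exists: playback
    }
}
```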
The lightDraw function, in its current incarnation, takes the x and y coordinates of the brightest point on the image and, as before, plots a point to the screen at those coordinates. These values are now also added to the notes ArrayList as a vector, which allows the data to be reused later for playback. The function also sets the doPlayBack boolean to true, since data has now been drawn to the screen and the playback function can therefore be called into effect.
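The bookkeeping side of lightDraw might look something like the following. Again this is a hedged model outside Processing: the real function also plots the point on screen (noted in a comment), and a float pair stands in for PVector:

```java
import java.util.ArrayList;

// Model of lightDraw's data handling: store the point and mark that
// there is now something worth playing back.
public class LightDrawModel {
    static ArrayList<float[]> notes = new ArrayList<>();
    static boolean doPlayBack = false;

    static void lightDraw(float x, float y) {
        // In the real sketch, point(x, y) draws to the screen here.
        notes.add(new float[] {x, y});  // keep the coordinates for playback
        doPlayBack = true;              // playback is now allowed to trigger
    }

    public static void main(String[] args) {
        lightDraw(120, 80);
        lightDraw(130, 75);
        System.out.println(notes.size());  // 2
        System.out.println(doPlayBack);    // true
    }
}
```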
If the brightest point is under the threshold, and the doPlayBack variable is true, the playBack function is run. This function is currently incomplete and has a few problems, though it is a good start towards the eventual implementation I am aiming for. In its current form, the function iterates through the notes ArrayList, retrieving the vector stored at each index, playing a note which corresponds to its x and y values, and plotting a point to the screen at the relevant coordinate. This point is plotted in red to show which point in the pattern the user drew is generating the sound being played. This is an early attempt, and while helpful in the testing stages it will need to be changed for the final project as it is not very aesthetically appealing. I also intend to further emphasize the link between audio and visual in some way, although I am not yet sure what method I will use to accomplish this. As it stands, the only link is the physical location of the dots: higher dots produce higher sounds, and the volume of each sound varies with the x-location of its point. I do not feel this second link is visually strong or intuitive enough to comprehend immediately, so it is here that I will probably try to improve.
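The position-to-sound mapping described above could be computed along these lines. The post doesn't show the synthesis code, so this only derives the parameters; the frequency range, sketch dimensions, and function names are my own assumptions, and map() mirrors Processing's built-in map(). Note that screen y runs downward, so a smaller y (higher on screen) maps to a higher pitch:

```java
// Sketch of mapping a point's position to pitch and volume:
// height of the dot controls pitch, x-position controls volume.
public class PlayBackModel {
    static final float WIDTH = 640, HEIGHT = 480;  // assumed sketch size

    // Linear re-mapping, as in Processing's map() function.
    static float map(float v, float a, float b, float c, float d) {
        return c + (v - a) * (d - c) / (b - a);
    }

    // Top of screen (y = 0) gives the highest frequency.
    static float pitchFor(float y) {
        return map(y, HEIGHT, 0, 110, 880);  // 110 Hz at bottom, 880 Hz at top
    }

    // x-position to a volume in [0, 1].
    static float volumeFor(float x) {
        return map(x, 0, WIDTH, 0, 1);
    }

    public static void main(String[] args) {
        System.out.println(pitchFor(0));       // 880.0 (top of screen)
        System.out.println(pitchFor(HEIGHT));  // 110.0 (bottom of screen)
        System.out.println(volumeFor(320));    // 0.5   (mid-screen)
    }
}
```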
The code featured here gives the following outcome.
Although not the easiest thing to see on video, this showcases the current problem I’m having with the playback functionality. Because of the for-loop iteration method I have used in the playback function, as soon as playback mode is entered the entire array of notes plays essentially back to back, far faster than I would like. The points on the screen also turn from black to red all at once, after the sound has played. This is a huge issue, as the intention of the colour change is to show, note by note, which point is producing each sound, so it is important that each point changes colour as its respective note plays. Another problem with what I have so far is the need to clear the array after every playback. This is necessary because, due to the aforementioned problem with how the playback happens, if the amount of data is too large the program comes practically to a standstill while it tries to play it all at once. In the final build of the project I hope to remove the need to clear the array every time, so I can preserve the possibility of collaboration between people in creating one set of data, and therefore one audiovisual production. I will continue to work on the features I have implemented and strive to fix these issues over the coming days.