Yesterday I took the final version of my project into Weymouth House to display it on the public screens in the space. Going into this, I had a piece of visual, camera-interactive software which I was pleased worked well and was effective in the environments in which I had tested it so far. I knew the project would likely face challenges on the public screens that I had not yet encountered, and I went in prepared to learn from the experience and take on feedback.

There are multiple screens in the foyer of Weymouth House which were intended for use in displaying this project. The main groups of displays are a long, thin strip of screens mounted high on a wall, and two pairs of landscape screens, one mounted above the other, on either side of a wall close to the entrance of the building. I ended up using the lower of the two screens in the pair facing towards the entrance.

The lower of these two screens is the one my project was ultimately displayed on.

When I placed the laptop running my project in a location that enabled connecting it to the screen, I realised that this spot, facing both the front door of the building and some quite bright lights, was not well suited to brightness tracking as it was currently implemented in my project. The brightly lit environment was interfering with the tracking of the light I was using for interaction, causing other sources of light to be tracked instead. I disconnected the laptop and went into the code for the project, in the hope of adjusting some of the settings to suit the new environment. Specifically, I tested different values for the threshold brightness at which the brightest point seen by the camera is interpreted as the light held by the user. I increased this setting several times, testing the sketch each time with the laptop back in its location below the screen. Eventually, with a combination of a newly increased threshold value and moving closer to the camera with the light, I got the light to be tracked with reasonable consistency. This did, however, make it necessary for me to prompt users of the project on where to stand for the best results.
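To give an idea of the setting I was adjusting, below is a simplified Processing-style sketch of how that kind of brightness-threshold tracking can work. This is an illustrative reconstruction rather than the exact code from my project, and names such as brightnessThreshold are just placeholders: the sketch scans each webcam frame for its brightest pixel and only treats it as the user's light if it exceeds the threshold.

```java
// Illustrative reconstruction of brightness-threshold tracking,
// not the exact code from the project.
import processing.video.*;

Capture cam;

// Placeholder value: this stands in for the threshold I kept increasing on the day.
float brightnessThreshold = 220;

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  image(cam, 0, 0);
  cam.loadPixels();

  // Scan the current frame for its brightest pixel.
  float maxBrightness = 0;
  int brightestX = 0;
  int brightestY = 0;
  for (int y = 0; y < cam.height; y++) {
    for (int x = 0; x < cam.width; x++) {
      float b = brightness(cam.pixels[y * cam.width + x]);
      if (b > maxBrightness) {
        maxBrightness = b;
        brightestX = x;
        brightestY = y;
      }
    }
  }

  // Only treat the brightest point as the user's light if it beats the threshold,
  // so that general glare from the doorway and room lighting gets ignored.
  if (maxBrightness > brightnessThreshold) {
    noFill();
    stroke(255, 0, 0);
    ellipse(brightestX, brightestY, 30, 30);
  }
}
```

Raising brightnessThreshold is the equivalent of what I was doing on the day: it filters out the general glare from the doorway, at the cost of the user having to bring their light closer to the camera.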

Another issue that cropped up was that, unfortunately, I was unable to get my hands on a set of speakers more powerful than those built into my laptop. As evidenced by the video above (in combination with the high level of ambient noise in the room and my having a conversation about the technical aspects of the project), this made it even more necessary to be close to the laptop in order to hear the audio feedback of the project, which is a crucial aspect of the work.

In addition to this, a piece of feedback I received was that, since the default state of the screen in my project is a blank white background, it may be difficult for people to know how to interact with the project. One possible fix would be to add a splash-screen style message to the work, instructing people on what it is and how to use the light-based interaction. Rather than this approach, however, I think that if the installation were to be publicly displayed again, it would be best to have a physical sign present explaining the project and how to get involved, along with a few torches or lights which could be used by the audience. This would also solve the existing issue of people needing their own light source to interact with the work (although admittedly they commonly do possess one, in the form of a mobile phone).
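For what it's worth, a splash-screen message along those lines wouldn't take much to add. As a rough, hypothetical example in Processing (the function and its parameter are just made up for illustration), it could be as simple as drawing some instructional text whenever no light is currently being tracked:

```java
// Hypothetical example: overlay instructions on the blank background
// whenever the sketch isn't currently tracking a light source.
void drawSplashMessage(boolean lightDetected) {
  if (!lightDetected) {
    fill(0);
    textAlign(CENTER, CENTER);
    textSize(24);
    text("Shine a torch or phone light at the camera to begin", width / 2, height / 2);
  }
}
```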

Things didn’t necessarily go 100% according to plan, then. That isn’t to say the project was a failure, however. The people who did test it, after I’d tweaked the code to deal with the environment as best it could, seemed genuinely interested in the work. People seemed to enjoy interacting with it, something which I feel can be attributed to the ‘involved’ nature of the light-tracking-based interaction method (although this caused issues in other areas). I also received feedback from users telling me that the project was aesthetically pleasing when in use, which made me glad of the multiple visual changes I’ve made to the project.

In the future, if I were to display the project publicly again, there are changes I’d make. The project works best in a controlled environment, so I’d make sure either to get access to a screen in a more suitably lit area, or to set up an artificial environment for the project to be displayed in. This could be accomplished using techniques such as dark fabric backdrops and partitions to keep the light at a better level. I would also strive to make use of some more powerful speakers, as the ones in my laptop left something to be desired and didn’t help showcase the audio aspect of the project.

Overall, though, I think I adapted well to the challenges and restrictions I faced during the screening, and the feedback I received leads me to believe the project had a positive impact on its audience. This, in my view, is grounds to call the day a success.