This video shows the dynamic nature of a screen and how the manipulation of that screen can affect light. The screen is manipulated using an Arduino and the Firefly plugin for Grasshopper. The movement bends the paper, allowing the slits to open and close according to the radius of the bend: the tighter the radius, the more open the slits, and the more reflected light spills through. With continued use, the screen is beginning to show signs of creasing, so it no longer bends in a uniform curve but instead folds, creating nonuniform apertures.

I am currently working on a script that will let me manipulate a surface by pulling one face in a specified direction. That single move would propagate to the surrounding faces, creating a rippling effect, an undulating surface that can begin to define space.
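As a rough sketch of the falloff logic the script is after (the function names, the grid, and the Gaussian weight below are my own illustrative choices, not the actual Grasshopper definition), pulling one point and letting the move taper off over its neighbors might look something like this:

```python
import math

def ripple(points, pulled_index, pull_vector, falloff_radius):
    """Displace a grid of points by pulling one of them and letting the
    move taper off smoothly over its neighbors (Gaussian falloff)."""
    px, py, pz = points[pulled_index]
    displaced = []
    for (x, y, z) in points:
        # distance from the pulled point, measured in the base plane
        d = math.hypot(x - px, y - py)
        # weight: 1.0 at the pulled point, fading toward 0 past the radius
        w = math.exp(-(d / falloff_radius) ** 2)
        displaced.append((x + w * pull_vector[0],
                          y + w * pull_vector[1],
                          z + w * pull_vector[2]))
    return displaced

# a flat 5 x 5 grid of points on 1-unit spacing
grid = [(i, j, 0.0) for j in range(5) for i in range(5)]
# pull the center point 2 units up; nearby points follow, distant ones barely move
undulated = ripple(grid, pulled_index=12, pull_vector=(0, 0, 2.0), falloff_radius=2.0)
```

A Gaussian weight is only one option; a linear or cosine falloff would give the ripple a stiffer or softer edge, and inside Grasshopper the same idea could just as well be built from native attractor components instead of a script.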
The input data driving this movement would come from space habitation: the longer someone lingers in a space, the more the surface would respond, either moving downward to engage them or moving upward into a draped surface. This would begin to map a data set similar to the one Jeff Maas and I are exploring in our installation in the Library (visuaLatency). As an independent object in space, the Arduino would have to stand alone, processing the data and performing the manipulation natively. In a short-term study, the object could be tethered to a computer running Grasshopper, which would allow a different data set to be applied, be it a shading study from Ecotect or environmental data from Pachube. This would let the user interact directly with the canopy, opening up an endless range of variations and studies.
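A minimal sketch of how the dwell-time mapping could work, written here in Python rather than as the actual Arduino or Firefly logic; the rest angle, travel limits, and response rate are placeholder assumptions:

```python
def canopy_angle(dwell_seconds, engage=True,
                 rest_angle=90, min_angle=30, max_angle=150,
                 response_rate=2.0):
    """Map how long someone has lingered under the canopy to a servo angle.

    The longer the dwell time, the farther the surface moves from its rest
    position: downward (toward min_angle) to engage the occupant, or upward
    (toward max_angle) into a draped surface, depending on the chosen mode.
    All numeric values here are illustrative, not measured from the piece.
    """
    offset = response_rate * dwell_seconds  # degrees of travel earned so far
    if engage:
        angle = rest_angle - offset         # dip down toward the occupant
        return max(min_angle, angle)        # clamp at the mechanical limit
    else:
        angle = rest_angle + offset         # lift away into a draped surface
        return min(max_angle, angle)

# someone who has stood under the canopy for 20 seconds, engage mode
print(canopy_angle(20))                 # 50 degrees: surface dips toward them
# the same dwell time, but the canopy retreats instead
print(canopy_angle(20, engage=False))   # 130 degrees: surface lifts and drapes
```

On the standalone Arduino, the same mapping would have to run natively against whatever presence sensor feeds it, while the tethered Grasshopper version could simply swap the dwell-time input for the Ecotect or Pachube stream.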