An interactive digital installation at the Chicago Museum of Contemporary Art, projecting a virtual tree that responds to multi-sensory input, reflecting the ever-changing energy of the physical space.
Chicago MCA - Harvest Tree
In partnership with the Chicago Ideas Week conference, Harvest Tree was a digital installation at the Chicago Museum of Contemporary Art, designed to reflect the organic nature of ideas that grow when people gather to share knowledge.
The installation projected an abstract, procedurally generated and constantly evolving digital tree onto the gallery wall. The tree reacted to a range of sensory inputs from the surrounding environment and the activity of its observers.
The more people who entered the room, the larger and more mature the tree grew: sprouting new branches, twisting and turning, its boughs thickening. Sensors monitored the flow of traffic into and out of the gallery space, and as people left, the tree shrank back to a sapling.
As people moved around the room, their collective movement was tracked through cameras and visual sensors. The more movement in the room, the more the tree bent and swayed in a virtual wind.
As the level of conversation in the room rose, leaves sprouted from the branches in growing numbers. If the audio sensors detected the volume dropping, the leaves began to wilt and fall from the tree.
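The three mappings above can be sketched as a simple parameter model. This is an illustrative reconstruction, not the installation's actual code: the sensor names, ranges, and smoothing rates (e.g. treating roughly 30 occupants as a fully grown tree) are all assumptions.

```python
def clamp01(x):
    """Limit a value to the 0..1 range used for all tree parameters."""
    return max(0.0, min(1.0, x))

class TreeState:
    def __init__(self):
        self.maturity = 0.0      # 0 = sapling, 1 = fully grown
        self.sway = 0.0          # amplitude of the virtual wind
        self.leaf_density = 0.0  # fraction of branch tips bearing leaves

    def update(self, occupancy, motion_energy, audio_level, dt=1.0):
        # Occupancy eases the tree toward a target maturity; when the
        # room empties, it shrinks back toward a sapling the same way.
        target = clamp01(occupancy / 30.0)  # assumption: ~30 people = full tree
        self.maturity += (target - self.maturity) * 0.05 * dt

        # Collective motion drives how strongly the tree bends and sways.
        self.sway = clamp01(motion_energy)

        # Conversation volume grows leaves; silence makes them wilt and fall.
        leaf_target = clamp01(audio_level)
        self.leaf_density += (leaf_target - self.leaf_density) * 0.1 * dt

state = TreeState()
for _ in range(100):  # simulate a busy, lively room for 100 ticks
    state.update(occupancy=20, motion_energy=0.4, audio_level=0.8)
```

The easing toward a target, rather than jumping to it, is what gives the growth and wilting their organic feel.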
The system continuously monitored Twitter for specific hashtags, which caused fruit and flowers to blossom on the tree, encouraging the audience to engage on social channels.
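The trigger logic might look like the following sketch: a poller compares fresh mention counts against the previous poll and queues one blossom event per new tweet. The hashtags are illustrative, and `fetch_tweet_count` is a hypothetical stand-in for whichever Twitter search endpoint the installation actually called.

```python
HASHTAGS = ("#HarvestTree", "#ChicagoIdeasWeek")  # illustrative tags

def poll_hashtags(fetch_tweet_count, last_counts, blossom_queue):
    """Emit one blossom event per new mention since the previous poll.

    A tag seen for the first time emits nothing, so a fresh start
    does not replay the hashtag's entire history onto the tree.
    """
    for tag in HASHTAGS:
        count = fetch_tweet_count(tag)
        new = count - last_counts.get(tag, count)
        for _ in range(max(0, new)):
            blossom_queue.append(tag)
        last_counts[tag] = count
    return last_counts

# Stubbed usage: pretend three new #HarvestTree tweets arrived.
counts = {"#HarvestTree": 10, "#ChicagoIdeasWeek": 5}
queue = []
poll_hashtags(lambda tag: counts[tag] + (3 if tag == "#HarvestTree" else 0),
              dict(counts), queue)
```

After the poll, `queue` holds three `#HarvestTree` blossom events for the renderer to consume.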
Fractal algorithms ensured that every generated iteration of the tree structure was unique. The custom visual rendering engine incorporated Lindenmayer systems, Voronoi tessellations, triangular surface subdivision, Perlin noise and inverse kinematics to create a cohesive yet dynamically morphing system of infinite variation.
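A minimal example of the Lindenmayer-system idea: each rewriting pass replaces a branch segment with a branching pattern, and choosing randomly among candidate rules makes every generated tree unique. The rule set here is a textbook-style assumption, not the installation's actual grammar.

```python
import random

# Bracketed L-system: "F" is a branch segment, "+"/"-" turn the drawing
# direction, and "[" / "]" push and pop the branching state.
RULES = {
    "F": ["F[+F]F[-F]F", "F[+F]F", "F[-F]F"],  # illustrative rule set
}

def grow(axiom, iterations, rng):
    """Rewrite the axiom string, picking a random rule for each symbol."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rng.choice(RULES[c]) if c in RULES else c for c in s)
    return s

tree = grow("F", 3, random.Random(7))  # seeded for a repeatable tree
```

The resulting string is then interpreted turtle-graphics style to place branch geometry, with techniques like Perlin noise perturbing the angles so no two trees look alike.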
Chicago Museum of Contemporary Art
Technical Concept, Technical Direction, Delivery Oversight
Interactive Installation, Projection, Gestural Interface, Data-Driven Visualisation, Procedurally Generated Graphics, Multi-Sensory Inputs (body-count, movement, volume), Kinect Sensor, Arduino, Twitter API Integration