microcontroller (Flwrs), sensors, custom software
Garden enfolds a dynamically responsive audio-visual environment within a third-party video conferencing platform. The sum of the virtual and attendant participants' movements modulates the ecological tone. I'm interested in investigating the potential, or lack thereof, to cultivate a feeling of co- within a virtual environment without relying on easily mappable causal relationships between the participants, the network technology, and the virtual space.
Garden, as a system, mediates a participatory feedback loop within a video chat environment:
1) A video chat participant acts within the virtual environment, influencing local light output.
2) Light sensor input and control data mapping modulate parameters of the virtual environment.
3) The virtual environment is output back to the video chat participants.
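One pass through this loop could be sketched as follows. This is a minimal illustration, not the piece's actual software: the sensor-read, parameter-set, and render callables are hypothetical placeholders for the microcontroller and video layers, and the 0-1023 input range is an assumed analog-sensor resolution.

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly map a light reading into a parameter range."""
    if in_max == in_min:
        return out_min
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def loop_step(read_light, set_param, render):
    """One pass: 1) participant action arrives as local light,
    2) the sensor reading is mapped onto an environment parameter,
    3) the environment is output back to the video chat."""
    reading = read_light()                          # step 1: sensor input
    value = map_range(reading, 0, 1023, 0.0, 1.0)   # step 2: control mapping
    set_param(value)
    render()                                        # step 3: output to chat
```

Because the participants' effect passes through ambient light rather than a direct control channel, the mapping stays loose: many bodies, screens, and lamps sum into one reading.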
The system has five modules:
The system is designed to react to changes in local light intensity by dynamically adjusting the maximum and minimum available light. While it is clearly inferior, within a video chat environment, to the more advanced body and face tracking available in software, I like that it is a low-tech attachment agile enough to be in relationship with both local bodies and virtual ones simultaneously.
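The dynamic adjustment of the maximum and minimum could work along these lines: track a running low and high bound of the sensor readings, let the bounds slowly relax so old extremes are forgotten, and normalize each new reading against the current span. This is a sketch under assumed behavior; the class name, decay rate, and normalization are all illustrative, not the piece's actual code.

```python
class AdaptiveRange:
    """Track a drifting min/max of light readings so the mapping stays
    responsive as ambient conditions change."""

    def __init__(self, decay=0.01):
        self.lo = None        # running minimum
        self.hi = None        # running maximum
        self.decay = decay    # how quickly the bounds relax toward new readings

    def update(self, value):
        value = float(value)
        if self.lo is None:
            self.lo = self.hi = value
            return
        # relax each bound slightly toward the new reading (forget old extremes),
        # then expand immediately if the reading falls outside the span
        self.lo = min(self.lo + self.decay * (value - self.lo), value)
        self.hi = max(self.hi + self.decay * (value - self.hi), value)

    def normalize(self, value):
        """Map a reading into [0, 1] relative to the current bounds."""
        if self.lo is None or self.hi == self.lo:
            return 0.0
        t = (value - self.lo) / (self.hi - self.lo)
        return min(1.0, max(0.0, t))
```

The point of the decaying bounds is that the system recalibrates itself to whatever light a given room and set of screens happen to produce, rather than assuming fixed conditions.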