This paper expands on the methods used and threads woven into the proposed project, Everything is Going According to Plants. Although much of this was covered in my previous essay, Emergence and Complexity in 'Everything is Going According to Plants', I will summarize the primary goals and components of the project here as a foundation for the discussion of working methods.

The project is conceived as part sacred garden, part simulation, part interactive installation, part data visualization / sonification machine, and part instrument. Each of these aspects requires a different approach and serves a different purpose within the context of the work at large. The short-term goal of the project is to create a provocative experience of the digital embedded within the material, an environment rich with both plant life and digital “life”, where viewers act as pollinators, unwittingly carrying data from point to point and coaxing along a process with no predetermined end. In the long term, the purpose is to seed a body of future work that, on the one hand, explores non-traditional and fringe ways of knowing and, on the other, investigates the possibility of plant consciousness as an active agent in the development of human and global trajectories. At its core, Everything is Going According to Plants (EIGATP) is a complex generative audio-visual simulation of human/plant interactions. A mediated dialogue between plants, humans, and machines, it gives agency to all three, allowing each to act upon the others with little or no ability to understand or predict the potential impact of acting.

Questioning the dominant modern western assumption that humans are the greatest and perhaps the only conscious beings, and by extension, the keystone agents of effecting change, EIGATP places primary agency in the limbs of plants while shifting human agency to a secondary role. To this end, a substantial part of the project is the inclusion of actual sacred botanical specimens within the installation environment. These sacred plants will be cultivated throughout the development process and presentation period and monitored for various biometric characteristics, including nutrient uptake, transpiration, turgor pressure, and rate of growth. This information will be collated over the entire cultivation period leading up to the beginning of the simulation to create fixed data sets for use within the virtual environment. Once the presentation period begins, the collected data will define the edge conditions of the space, while new data exerts real-time influence on both the physical and virtual environments, affecting light, sound, and perhaps even physical layout. To facilitate this, it will be necessary to create a laboratory / garden / hot-house environment. This space will house the biological specimens, provide a test bed for early prototypes, and create an embodied dialogue between the plants, programmers, and virtual agents. Requirements for the space include controlled lighting to simulate (and actually facilitate) both diurnal and nocturnal photosynthesis, sufficient HVAC to maintain or manifest specific temperatures and levels of humidity, and enough space to allow for small audiences during exhibitions and larger, though still intimate, audiences during performances. The space will ideally be located within or near the Denver city limits and have a connection to a relevant institution such as the Denver Botanic Gardens.
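
As a rough illustration of how the cultivation-period logs might be collapsed into fixed edge conditions, the Python sketch below derives per-channel bounds from recorded readings and normalizes live readings against them. The channel names and values are hypothetical placeholders, not project data, and the real pipeline may aggregate the logs quite differently.

```python
# Minimal sketch: deriving "edge conditions" for the simulation from
# biometric logs gathered during the cultivation period. Channel names
# and values are hypothetical placeholders, not actual project data.
from dataclasses import dataclass

@dataclass
class EdgeCondition:
    low: float   # lowest value observed during cultivation
    high: float  # highest value observed during cultivation

    def normalize(self, value: float) -> float:
        """Map a live reading into 0..1 relative to the fixed bounds."""
        if self.high == self.low:
            return 0.0
        clamped = min(max(value, self.low), self.high)
        return (clamped - self.low) / (self.high - self.low)

def derive_edges(log: dict[str, list[float]]) -> dict[str, EdgeCondition]:
    """Collapse a full cultivation log into per-channel edge conditions."""
    return {name: EdgeCondition(min(vals), max(vals)) for name, vals in log.items()}

if __name__ == "__main__":
    cultivation_log = {                      # placeholder readings
        "transpiration": [0.8, 1.1, 1.4, 0.9],
        "turgor_pressure": [5.2, 5.9, 6.3, 6.0],
    }
    edges = derive_edges(cultivation_log)
    # A live reading arriving during the presentation period:
    print(edges["transpiration"].normalize(1.2))
```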

The question of which plants to use for the project is important to address early on. The field of ethnobotany demonstrates that every culture the world over has considered or does consider certain plants to be sacred. Plants can be elevated to sacred status for a wide variety of reasons, such as sustenance, pigmentation, medicinal characteristics (both curative and poisonous), and, finally, the ability to elicit altered states of consciousness. While the main focus of this project is on the last category, plants with properties often referred to as psychotropic, psychedelic or, more recently, entheogenic or ecodelic (plants with the ability to alter the consciousness of humans), future projects will incorporate the whole array of sacred plants. Extensive monitoring of these specimens will directly influence the virtual simulation space and, consequently, the visual and sonic displays of the virtual agents. This in turn alters the visual and sonic character of the physical space that the plants themselves inhabit, creating a plant/machine/environment feedback loop. In addition to exerting an inherent impact based on the biometric data, the plants also serve as attention vectors, enticing unknowing or even knowing human agents to observe. This interaction and giving of attention will amplify the degree to which the plants govern the virtual agents within the simulation.

Preliminary to the creation of the lab, a number of factors relative to the choice and viability of particular specimens need to be considered. First and foremost is the question of legality. While many of the plants under consideration are legal to cultivate, there are gray areas around that legality. Finding an institution or establishment that will allow me to do this research is an important and pressing concern, and the proper language and theoretical context will be key to success in this pursuit. Wade Davis, author of One River and ethnobotanist for National Geographic, and Richard Doyle, author of Darwin's Pharmacy, both provide great role models for grounding the study of sacred plants in a reputable field of research: Davis for his ability to frame the historical context through a biographical and phenomenological lens, and Doyle for his ability to ground the investigation firmly within a modern philosophical and, to a somewhat lesser extent, scientific context. Another, equally important concern is either selecting specimens that share ideal growing conditions or finding a space that can accommodate isolated growing environments.

Visualization and Sonification: Data Dramatization and the VIA

Visualization and sonification of data, both real-time and recorded, are extremely important factors in EIGATP. Unlike some visualizations that seek to distill information to a kernel, the goal here is to create a complex ecosystem of data. The virtual agents that populate the simulation will develop based on two layers of data: a genetic code matrix that is passed on and evolved through reproduction, and a memetic layer that is expanded and shared through interactions both within and external to the system. The genetic layer would cover things like the array of possible visual states, the particular frequency spectrum of sounds available to a given agent, the average lifespan, and so on. The memetic matrix, on the other hand, would include things like social behaviors, vocabulary (the particular shape of sonic expressions), and affinities toward certain modes of interaction such as play. The particulars of these two code bases still need to be fleshed out, but the important part is that change can occur based on both experiential data and evolutionary direction.
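
To make the distinction between the two code bases concrete, the following sketch separates a heritable genome, copied and mutated only at reproduction, from a memetic layer that spreads through encounters during an agent's lifetime. Every field name here is provisional, standing in for particulars that remain to be fleshed out.

```python
# Sketch of the two-layer agent code base: a heritable genome and a
# shareable memetic layer. Field names are provisional placeholders.
import random
from dataclasses import dataclass, field

@dataclass
class Genome:
    visual_states: list[str]             # palette of possible visual states
    frequency_band: tuple[float, float]  # available sonic spectrum, in Hz
    lifespan: float                      # average lifespan in simulation ticks

    def mutated(self, rate: float = 0.05) -> "Genome":
        """Copy passed to offspring, with small random drift in lifespan."""
        return Genome(
            visual_states=list(self.visual_states),
            frequency_band=self.frequency_band,
            lifespan=self.lifespan * (1 + random.uniform(-rate, rate)),
        )

@dataclass
class MemeLayer:
    behaviors: set[str] = field(default_factory=set)   # e.g. social habits
    vocabulary: set[str] = field(default_factory=set)  # shapes of sonic expression
    play_affinity: float = 0.0                         # affinity toward play

    def absorb(self, other: "MemeLayer") -> None:
        """Memes spread through interaction rather than inheritance."""
        self.behaviors |= other.behaviors
        self.vocabulary |= other.vocabulary
        self.play_affinity = (self.play_affinity + other.play_affinity) / 2

@dataclass
class Agent:
    genome: Genome     # changes only across generations
    memes: MemeLayer   # changes within a lifetime, via encounters
```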

The visual and sonic environment will rely heavily on abstraction, using visual and sonic techniques similar to those of previous NoiseFold installations such as 100 Monkey Garden (2005) and iIi (2007). These techniques include applying video feedback to the geometries of the virtual forms, sonifying geometry data, using generative textures to thwart standard expectations of 3D, and allowing the forms to replicate and evolve. By creating an abstract visual environment and keeping a slight distance from traditional western musical vocabularies, EIGATP seeks to create a wilderness of new forms and experiences that defies easy categorization. At times this may illustrate the data, while at other times it may obfuscate or even completely obscure it. Juxtaposing different data sets lets the unexpected emerge, and the inclusion of live data, human and plant agents, and virtual agents with a certain degree of autonomy ensures that no two runs, or even two days, of the simulation will be the same. Admittedly, there is also no guarantee of compelling aesthetic spaces. In some ways this mirrors the human discovery of various admixtures that are highly unlikely to have been found by chance, but nonetheless have been found and constitute a breathtaking array of recipes.
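
One of these techniques, the sonification of geometry data, can be reduced to mapping vertex positions of a virtual form onto oscillator parameters. The sketch below works from that reduction, with invented vertex data and frequency ranges, and is only a stand-in for, not a reproduction of, the NoiseFold pipeline: each vertex contributes one sine partial, and a frame of geometry becomes a short block of audio.

```python
# Reduced sketch of geometry sonification: each vertex of a virtual form
# contributes one sine partial whose frequency follows its height and whose
# amplitude follows its distance from the origin. Vertex data is invented.
import math
import struct
import wave

SAMPLE_RATE = 44100

def sonify(vertices: list[tuple[float, float, float]], seconds: float = 1.0) -> list[float]:
    """Turn one frame of geometry into a block of audio samples."""
    partials = []
    for x, y, z in vertices:
        freq = 110.0 + abs(y) * 440.0                          # height -> pitch
        amp = 1.0 / (1.0 + math.sqrt(x * x + y * y + z * z))   # distance -> loudness
        partials.append((freq, amp))
    n = int(SAMPLE_RATE * seconds)
    return [
        sum(a * math.sin(2 * math.pi * f * t / SAMPLE_RATE) for f, a in partials) / len(partials)
        for t in range(n)
    ]

if __name__ == "__main__":
    frame = [(0.2, 0.5, -0.1), (-0.3, 1.2, 0.4), (0.9, 0.1, 0.7)]  # placeholder vertices
    samples = sonify(frame)
    with wave.open("frame.wav", "w") as out:
        out.setnchannels(1)
        out.setsampwidth(2)
        out.setframerate(SAMPLE_RATE)
        out.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))
```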

Some specific datasets that will be used are the plant DNA of the selected botanical species, the chemical compounds that are responsible for the more active properties of the sacred plants, climate data pulled from the native regions of each plant, the growth and transpiration patterns of the garden, and a number of additional real-time inputs, described in more detail below.
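
One provisional way to keep these heterogeneous sources straight is to register each with its kind (fixed versus live) and the simulation parameters it is allowed to drive. The source and parameter names below are illustrative only and will change as the data sets are finalized.

```python
# Illustrative registry of data sources feeding the simulation. Source and
# parameter names are placeholders; "fixed" sets are collated before the
# presentation period, while "live" sources stream in during it.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataSource:
    name: str
    kind: str                 # "fixed" or "live"
    drives: tuple[str, ...]   # simulation parameters this source may influence

SOURCES = [
    DataSource("plant_dna", "fixed", ("genome_seed",)),
    DataSource("active_compounds", "fixed", ("agent_palette", "spectral_bias")),
    DataSource("native_climate", "fixed", ("virtual_weather",)),
    DataSource("growth_transpiration", "live", ("agent_vitality", "current_strength")),
    DataSource("human_presence", "live", ("pollination", "meme_spread")),
]

def live_sources() -> list[DataSource]:
    """Everything that must be polled in real time during the presentation."""
    return [s for s in SOURCES if s.kind == "live"]
```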

Interaction

A defining aspect of the project is the role of interaction. Two-way interactions between humans and plants, plants and machines, and humans and machines will determine the growth conditions and final aesthetic outcome of the simulated environment. This interaction may be direct or indirect, intentional or incidental, with ramifications large or small, brief in manifestation or permanent and course altering in scope. Bridging the digital and analog worlds and allowing for this communication is a mesh network of physical sensors. These will monitor human and plant agents as well as environmental conditions.
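
A minimal sketch of that bridge, assuming each node in the mesh can simply be polled for a dictionary of readings, might look like the following; the node interface and channel names are hypothetical, and the actual transport (serial, radio, or otherwise) is deliberately left open.

```python
# Minimal sketch of the sensor-mesh bridge: poll every node, timestamp the
# readings, and hand them to the central simulation. The node interface is
# hypothetical; the real transport (serial, radio, ...) is not decided here.
import time
from typing import Protocol

class SensorNode(Protocol):
    node_id: str
    def read(self) -> dict[str, float]: ...   # channel name -> value

def poll_mesh(nodes: list[SensorNode]) -> list[dict]:
    """One sweep over the mesh, returning timestamped readings."""
    sweep = []
    for node in nodes:
        sweep.append({
            "node": node.node_id,
            "time": time.time(),
            "readings": node.read(),
        })
    return sweep

def run(nodes: list[SensorNode], push, interval: float = 1.0) -> None:
    """Continuously feed the central brain of the system via the `push` callback."""
    while True:
        push(poll_mesh(nodes))
        time.sleep(interval)
```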

For sensing human interaction, Microsoft Kinect cameras will be stationed overhead throughout the space, recording movement and rest and feeding that data into the central brain of the system. Possible outcomes from human interactions could include cross-pollination of different environments (by stopping to pause at one plant, then another), reinforcement of certain visual and sonic expressions (through paying special attention to screen space during a particular moment, or perhaps the clothes that people are wearing could augment the texture palette used by certain virtual populations), or alteration of virtual currents and consequently interactions among virtual agents (from standing in one place too long, like a rock in a river). Some of these may affect the possible range of mates, changing the genetic makeup of the environment, while others would act more on the memetic layer.
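
The cross-pollination idea in particular reduces to tracking how long a visitor dwells near each plant and carrying a trace of the previous stop to the next. The sketch below encodes that reduction with invented thresholds and identifiers; it stands in for, rather than reproduces, whatever the Kinect tracking pipeline will ultimately provide.

```python
# Sketch of dwell-based cross-pollination: when a tracked visitor lingers
# near a plant long enough, they "pick up" its signature, and depositing it
# at the next plant links the two plants' virtual populations. Thresholds
# and identifiers are invented for illustration.
DWELL_THRESHOLD = 5.0  # seconds of stillness that counts as paying attention

class Visitor:
    def __init__(self, visitor_id: str):
        self.visitor_id = visitor_id
        self.carried: str | None = None   # signature of the last plant attended

def update_visitor(visitor: Visitor, plant_id: str, dwell_seconds: float,
                   pollinations: list[tuple[str, str]]) -> None:
    """Record a pollination event when attention moves from one plant to another."""
    if dwell_seconds < DWELL_THRESHOLD:
        return
    if visitor.carried and visitor.carried != plant_id:
        pollinations.append((visitor.carried, plant_id))  # data carried plant -> plant
    visitor.carried = plant_id

if __name__ == "__main__":
    events: list[tuple[str, str]] = []
    v = Visitor("kinect-track-07")
    update_visitor(v, "plant_a", 8.0, events)
    update_visitor(v, "plant_b", 6.5, events)
    print(events)  # [('plant_a', 'plant_b')]
```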

Plant and environment sensing is a larger task, with sensors for light (including both infrared and UV), soil moisture, nutrient levels, turgor pressure, transpiration of gases, atmospheric makeup, and electromagnetic fields, as well as empirical observations such as rate of growth, all coming into play. Each plant species, in addition to contributing to the overall environmental factors that govern the virtual world, will also feed data into a particular subset of the virtual population. Those agents particularly affected by any given plant will exhibit the impact of plant-human interactions more actively. These affiliations will develop and change over time based on the movement of forms within the system and on patterns of human interaction.
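
These shifting affiliations could be held as a simple weight table that interactions reinforce and time erodes, as in the sketch below. The constants and update rule are placeholders for whatever model the finished system uses.

```python
# Sketch of plant / agent-population affiliations as a weight table that is
# reinforced by interaction and decays over time. Constants and the update
# rule are placeholders, not the project's final model.
from collections import defaultdict

REINFORCE = 0.1   # how much one interaction strengthens an affiliation
DECAY = 0.01      # how much every affiliation fades per simulation tick

class Affiliations:
    def __init__(self):
        # keys are (plant_id, population_id); values are weights in 0..1
        self.weights = defaultdict(float)

    def reinforce(self, plant_id: str, population_id: str) -> None:
        key = (plant_id, population_id)
        self.weights[key] = min(1.0, self.weights[key] + REINFORCE)

    def tick(self) -> None:
        for key in list(self.weights):
            self.weights[key] = max(0.0, self.weights[key] - DECAY)

    def dominant_population(self, plant_id: str) -> str | None:
        """Which subset of agents currently expresses this plant most strongly."""
        candidates = {pop: w for (p, pop), w in self.weights.items() if p == plant_id}
        return max(candidates, key=candidates.get) if candidates else None
```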

Instrumentation

The final piece of EIGATP is the aspect of instrumentation. As an instrument, it seeks to challenge the performer by constantly changing and complexifying its raw materials. The performer is asked to enter into an improvisational dialogue with plant-based intelligences and virtual agents, a call and response with a barely known other. In addition to the range of interactive possibilities outlined above, the performer has access to the entire history of the space and can move backward in time, reintroducing possibilities that exhausted themselves or that never came to pass. The performer can also amplify the impact of any or all input sources, allowing for dramatic changes to the environment or agents in short periods of time. This power is not without cost, however, as all the changes that the performer makes are persistent, permanently altering the environment. Much like the role of the shaman in many cultures, the performer acts as a radical conduit to otherworldly ways of knowing and experiences, experiences that change participants, who in turn change the world around themselves.
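
One plausible substrate for this behavior is an append-only history: rewinding re-reads earlier states and re-injects them as new events rather than erasing what came after, and amplification simply scales an input before it is recorded. The sketch below encodes that idea with invented event fields.

```python
# Sketch of the performer's interface to the system history: an append-only
# log that allows re-injection of past states ("moving backward in time")
# and amplification of inputs, while every action remains persistent.
# Event fields and gain handling are illustrative only.
import time
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Event:
    timestamp: float
    source: str        # "plant", "human", "virtual", or "performer"
    magnitude: float   # strength of the input's influence on the simulation

class History:
    def __init__(self):
        self._events: list[Event] = []   # never truncated: all changes persist

    def record(self, source: str, magnitude: float, gain: float = 1.0) -> Event:
        """Log an input, optionally amplified by the performer's gain."""
        event = Event(time.time(), source, magnitude * gain)
        self._events.append(event)
        return event

    def reintroduce(self, before: float) -> list[Event]:
        """Replay everything older than `before` as fresh performer events."""
        revived = [replace(e, timestamp=time.time(), source="performer")
                   for e in self._events if e.timestamp < before]
        self._events.extend(revived)    # the rewind itself becomes part of history
        return revived
```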

Conclusion

EIGATP requires a wide breadth of technical skills and research foundations, from botany to micro-controller sensor networks to motion tracking and improvisation. If successfully integrated, these different paths will converge in a space that allows a mediated dialogue between us and our physical and digital environments. At a time when undiscovered species of plants and animals are disappearing at alarming rates, we need more than ever to re-evaluate our relationships with and impacts upon our environment, so that a possible future might see us as the shepherds of a golden age, not the heralds of an earthly apocalypse.