Sonic Ecosystems:

A project that aims to highlight the relativity and co-dependence between the self, others, and the environment. Sonic ecosystems are built by processing ambient sound in real time using live weather data.





A group of listeners equipped with headphones is guided through a biologically diverse environment. The role of this environment is to introduce the listener to a greater contrast, and to create connections and experiences that could later carry over into everyday environments as well.



A choice is offered on the website as well: listeners can enable an overlay of the raw ambience and set the ratio of raw to processed ambience.
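The raw/processed ratio control could be implemented as a simple crossfade. The sketch below is a hypothetical illustration, not the actual website player: it uses an equal-power crossfade so that perceived loudness stays roughly constant as the listener moves the ratio between fully processed and fully raw.

```python
import math

def mix(raw: float, processed: float, ratio: float) -> float:
    """Equal-power crossfade between one raw and one processed sample.

    ratio = 1.0 -> fully raw ambience, ratio = 0.0 -> fully processed.
    (Illustrative sketch; function name and signature are assumptions.)
    """
    theta = ratio * math.pi / 2
    return math.sin(theta) * raw + math.cos(theta) * processed

# At the extremes, only one source is audible:
# mix(1.0, 0.0, 1.0) -> ~1.0 (all raw)
# mix(0.0, 1.0, 0.0) -> ~1.0 (all processed)
```

In practice the same gain curve would be applied per audio buffer rather than per sample, but the mapping from the website's ratio slider to the two gains is the same.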




The audio processing is done in PureData, a graphical programming language that allows complex audio synthesis and audio-visual system building. Using PureData, I connected to weather data APIs and continuously streamed up-to-date information, which was then used internally to set the parameters of a granular synth/delay processing the live ambient sound.
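The actual patch is graphical, but the mapping step can be sketched in code. The parameter names and value ranges below are illustrative assumptions (the source does not specify which weather fields drive which synth controls); the idea is simply that each incoming weather reading is rescaled into a usable synthesis range.

```python
def scale(value: float, in_lo: float, in_hi: float,
          out_lo: float, out_hi: float) -> float:
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def weather_to_synth_params(weather: dict) -> dict:
    """Map one weather reading to hypothetical granular synth/delay parameters.

    The field names and ranges here are assumptions for illustration;
    in the real system the resulting values would be fed to the Pd patch
    (e.g. over a local network connection) to drive the processing.
    """
    return {
        # Stronger wind -> shorter, more turbulent grains.
        "grain_size_ms": scale(weather["wind_speed"], 0, 30, 250, 20),
        # Higher humidity -> denser, longer delay tails.
        "delay_feedback": scale(weather["humidity"], 0, 100, 0.1, 0.85),
        # Temperature shifts the grain playback rate around unity.
        "playback_rate": scale(weather["temperature"], -5, 35, 0.5, 1.5),
    }

params = weather_to_synth_params(
    {"wind_speed": 15, "humidity": 50, "temperature": 15})
```

Clamping the scaled values keeps the synth stable even when the API reports weather outside the expected range.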



The processed sound then flows into radio servers accessed remotely by the listener's device. This also makes it possible to conduct the exhibition remotely if needed.

The first exhibition was presented in the Falmouth University gardens in 2022, with plans to implement the project in different environments in the near future.