As part of the Living Observatory initiative, researchers at the Media Lab's Responsive Environments Group are developing sensor networks that document ecological processes and allow people to experience the data at different spatial and temporal scales. Some of the small, distributed sensor devices capture climate and other environmental data, while others stream audio from high in the trees and underwater. Visit at any time from dawn till dusk and again after midnight, and check the weather report below for highlights; if you’re lucky you might just catch an April storm, a flock of birds, or an army of frogs.
We've created a cross-reality browser using the Unity game engine to experiment with presence and multimodal sensory experiences. We're looking for new ways to explore and experience data about your environment. Currently we are using pre-recorded audio from the site, but we will soon stream live sound to accompany the visuals and data sonifications. The flashes and ukulele notes you hear accompany new data updates arriving in real time from Tidmarsh, with the pitch indicating the relative temperature at each sensor (hotter locations within the site produce higher pitches). Enjoy!
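The temperature-to-pitch mapping described above can be sketched as follows. This is a minimal illustration, not the project's actual implementation: it assumes a linear mapping from each sensor's temperature, relative to the site's current range, onto MIDI note numbers, and all function names and note ranges here are hypothetical.

```python
# Hypothetical sketch of a temperature-to-pitch sonification:
# hotter locations within the site map to higher notes.

def temperature_to_pitch(temp_c, site_min_c, site_max_c,
                         low_note=60, high_note=84):
    """Linearly map a temperature within the site's current range
    to a MIDI note number (middle C = 60 up to C two octaves above)."""
    if site_max_c == site_min_c:
        return low_note  # degenerate range: play the lowest note
    frac = (temp_c - site_min_c) / (site_max_c - site_min_c)
    frac = min(max(frac, 0.0), 1.0)  # clamp readings outside the range
    return round(low_note + frac * (high_note - low_note))

def midi_to_frequency(note):
    """Convert a MIDI note number to a frequency in Hz (A4 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

# A sensor at the site's hottest spot sounds the highest note:
print(temperature_to_pitch(30.0, 10.0, 30.0))  # → 84
# A4 (MIDI note 69) comes out at the standard 440 Hz:
print(round(midi_to_frequency(69)))            # → 440
```

Normalizing against the site-wide minimum and maximum (rather than fixed thresholds) keeps the full pitch range in use as conditions change through the day and seasons.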
Learn more about the Tidmarsh project and see some sensor nodes being installed in this documentary video produced by EMC. See a video of the virtual experience, featuring sounds from both day and night.
To see more of our work with real-time sensor data browsers, check out DoppelLab.