We are building an iOS app that will allow users worldwide to each create a unique, evolving pattern-shape that reflects and reacts to their movements. A scalable view, from global to personal, will let users see all logged-in participants at their chosen scale. In the graphical view, pattern-shapes will evolve in a 3D environment that viewers navigate by tapping, tilting, and moving their device.

The project will also track global ice via Arctic sensors that send real-time data streams; the ice, too, will create pattern-objects within the piece. For sound, the ice data streams will be fed through an emotive classifier, which will determine the human-scale emotive equivalent of each stream's wave pattern. After classification, each stream will be encoded with human vocal formants appropriate to its class and rendered as sound, letting the ice speak.
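The sound pipeline above (data stream in, emotive class out, class mapped to vocal formants, formants rendered as audio) could be sketched as follows. This is a minimal illustration only: the classifier rule, the two example classes, and the formant frequencies are placeholder assumptions, not the project's actual models.

```python
import math

# Placeholder emotive classes mapped to rough vowel formants (F1, F2 in Hz).
# The real classifier and formant tables are assumptions for this sketch.
FORMANTS = {
    "calm": (300, 870),      # roughly /u/-like
    "agitated": (850, 1610), # roughly /a/-like
}

def classify(samples):
    """Toy emotive classifier: a highly variable wave pattern reads as 'agitated'."""
    mean = sum(samples) / len(samples)
    variance = sum((s - mean) ** 2 for s in samples) / len(samples)
    return "agitated" if variance > 0.1 else "calm"

def vocalize(samples, rate=8000, duration=0.5):
    """Encode a stream's emotive class as two summed formant sine waves."""
    f1, f2 = FORMANTS[classify(samples)]
    n = int(rate * duration)
    return [
        0.5 * math.sin(2 * math.pi * f1 * t / rate)
        + 0.5 * math.sin(2 * math.pi * f2 * t / rate)
        for t in range(n)
    ]

# A flat stream classifies as calm; a rapidly oscillating one as agitated.
flat = [0.0] * 100
choppy = [(-1.0) ** i for i in range(100)]
```

In a real deployment the classifier would be trained on sensor data and the output buffer handed to an audio engine, but the shape of the pipeline, classify then map to formants then synthesize, stays the same.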
Last Updated: 5/14/12