Image Credit: Joseph Popper
Digital Collaboration Hubs is an interdisciplinary research project that explores the potential of Mixed Reality (MR) technologies to enrich design collaborations in co-located and distributed work scenarios – focussing on the capacity of MR to support design processes in spatial design and urban planning. It comprises a constellation of business and research partners, which form multiple research streams based in interaction design, spatial design and applied psychology.

In our stream at the Critical Media Lab, we seek to develop design principles for the integration of MR into physical spaces and for the interaction of physical and digital layers in networked studio environments. We work closely with VITRA International AG, our project business partner, to develop our research questions and to address specific collaboration concerns in tangible design contexts. Our design research focuses on responsive environments, in which we test relationships between physical and digital spaces through sensor-actuator systems: user behaviour is captured by sensors and the data is mapped onto the parametric settings of physical objects and spaces. We experiment by prototyping objects and furniture pieces, embedding them in different ways and testing their potential for producing data and for creating adaptive, dynamic hybrid spaces.
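
As an illustration of this kind of sensor-actuator mapping, the sketch below shows one plausible chain: a raw motion reading is normalised and translated into a setpoint for a motorised table. The sensor value, the height range and the function names are hypothetical stand-ins for whichever sensors and furniture parameters a given prototype actually uses.

```python
# Hypothetical sketch of a sensor-actuator mapping: a raw motion reading is
# normalised and mapped onto one parametric setting of a physical object
# (here, the height of a motorised table). Names and ranges are illustrative.

def normalise(value: float, lo: float, hi: float) -> float:
    """Clamp a raw sensor value into the 0..1 range."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def map_to_actuator(activity: float, min_height_mm: int = 650, max_height_mm: int = 1250) -> int:
    """Map normalised user activity onto a table-height setpoint in millimetres."""
    return int(min_height_mm + activity * (max_height_mm - min_height_mm))

if __name__ == "__main__":
    raw_motion = 0.42                             # e.g. averaged accelerometer magnitude
    activity = normalise(raw_motion, lo=0.0, hi=1.0)
    setpoint = map_to_actuator(activity)
    print(f"table height setpoint: {setpoint} mm")
```

The same pattern generalises to any sensed behaviour and any actuated parameter: the mapping function is where design decisions about responsiveness are made.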

Our interest in MR starts from its potential to bridge the physical separation between distributed collaborators. To what extent can a collaborator’s presence and situation be made more perceivable, and how can experiences be shared across local or global distances?

The main collaboration experiment involved two people working together to design an office space from separate locations, using different media tools. We developed two identical workstations with VITRA furniture and installed them in separate rooms in the Critical Media Lab. Test participants each worked from an individual station and were connected to their collaborator via wireless microphones and augmented reality headsets. The environments were technologically augmented with an array of RGB lights and surround-sound speakers, which produced real-time awareness cues corresponding to the participants’ movements and biofeedback as they worked. These ambient data representations were mapped into both workstations, enabling the participants to see their own biofeedback and that of their collaborator simultaneously. The cues were designed to amplify the collaborators’ awareness of each other’s physical, emotional and psychological states and changing working activity, and to help correlate and signify moments of synchronisation as they worked together.
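
As a rough sketch of how such a setup could be wired, the example below turns a biofeedback reading into an ambient colour cue and shares it with the partner workstation over the local network. The heart-rate source, the UDP address and port, and the colour ramp are all assumptions made for illustration, not a description of the system actually used in the experiment.

```python
# Hypothetical sketch of an awareness-cue pipeline: a local biofeedback
# reading (heart rate) is mapped onto an RGB colour and broadcast to the
# partner workstation, so both stations can display both collaborators.
# Address, port, ranges and names are illustrative assumptions.

import json
import socket

PARTNER_ADDR = ("192.168.0.12", 9000)   # assumed address of the other workstation

def heart_rate_to_rgb(bpm: float) -> tuple[int, int, int]:
    """Map a heart-rate reading onto a calm-blue to alert-red colour ramp."""
    t = max(0.0, min(1.0, (bpm - 55.0) / (120.0 - 55.0)))
    return (int(255 * t), 40, int(255 * (1.0 - t)))

def send_cue(sock: socket.socket, bpm: float) -> None:
    """Send the local reading and its colour cue to the partner station."""
    payload = json.dumps({"bpm": bpm, "rgb": heart_rate_to_rgb(bpm)})
    sock.sendto(payload.encode(), PARTNER_ADDR)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_cue(sock, bpm=72.0)            # local reading, e.g. from a wrist sensor
    print(heart_rate_to_rgb(72.0))      # colour shown on the local RGB array
```

Mirroring the same payload into both rooms is what lets each participant see their own state alongside their collaborator’s, which is the basis for noticing moments of synchronisation.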

Academic Partners

Institute for Research and Development of Collaborative Processes (FHNW APS)
Institute for Interactive Technologies (FHNW HT)

Partners

VITRA International AG
City of Zurich
afca AG