Practice-based Experiential Learning Analytics Research and Support/PELARS (2014-2017)
PELARS explored ways of generating ‘analytics’ (data about the learning process and analysis of this data) to help learners and teachers by providing feedback from hands-on, project-based, and experiential learning situations (Spikol et al., 2018).

An analysis and comparative user study on interactions in mobile Virtual Reality games (2017)
This project explored interactions in Mobile Virtual Reality (MVR) games. A set of MVR games was analyzed with a special focus on head gaze, categorizing and isolating the game mechanics implemented with this common MVR input technique. The analysis formed the basis for an MVR test application that was compared against a traditional gamepad controller in three different challenges (Bothén et al., 2018).
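As an illustration of the head-gaze mechanic examined in the study, the sketch below shows a generic dwell-time selection loop in Python: a target is selected once the head-direction ray has rested on it for a fixed time. The class name, update interface, and dwell threshold are illustrative assumptions, not code from the project.

    # Generic head-gaze dwell selection, a common MVR mechanic.
    # Names and the dwell threshold are assumptions for illustration only.
    import time

    DWELL_TIME = 1.5  # seconds the head-gaze ray must rest on a target

    class DwellSelector:
        def __init__(self, dwell_time=DWELL_TIME):
            self.dwell_time = dwell_time
            self.current_target = None
            self.gaze_start = None

        def update(self, hit_target):
            """hit_target: object hit by a ray cast along the head direction, or None."""
            now = time.monotonic()
            if hit_target is not self.current_target:
                # Gaze moved to a new target (or off all targets): restart the timer.
                self.current_target = hit_target
                self.gaze_start = now if hit_target is not None else None
                return None
            if hit_target is not None and now - self.gaze_start >= self.dwell_time:
                # The target was held long enough: trigger a selection and reset.
                self.gaze_start = now
                return hit_target
            return None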

Wearable personal assistant for surgeons (2013-2015)
Addressing the constant need for situation-specific information in hospital corridors, wards, and operating theaters, we designed a wearable personal assistant for surgeons. Three realistic scenarios were selected and assessed together with professionals (Jalaliniya & Pederson, 2015). The system’s ability to offer quick and precise audiovisual guidance from remote medical colleagues was highly appreciated, as was the touch-less, head-based gesture control of devices in the operating theater, which built on previous work (Mardanbegi et al., 2012) and was implemented in this project on the Google Glass platform.
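The head-based gesture control in the project built on eye-based detection of head gestures (Mardanbegi et al., 2012). Purely as a rough illustration of the general idea of touch-less head-gesture input, the Python sketch below recognizes a nod from a stream of pitch angular-velocity samples, such as those from a head-worn gyroscope; the function, thresholds, and sample data are assumptions for illustration and do not reflect the project’s implementation.

    # Rough illustration of touch-less head-gesture input: detect a "nod"
    # (a downward then upward head movement) from pitch angular-velocity samples.
    # Thresholds and names are illustrative assumptions only.
    PITCH_THRESHOLD = 1.0   # rad/s magnitude counted as a deliberate movement
    MAX_GAP = 15            # max samples between the downward and upward phases

    def detect_nod(pitch_rates):
        """Return True if the samples contain a down-then-up head movement."""
        down_index = None
        for i, rate in enumerate(pitch_rates):
            if rate < -PITCH_THRESHOLD and down_index is None:
                down_index = i                 # head started moving down
            elif rate > PITCH_THRESHOLD and down_index is not None:
                if i - down_index <= MAX_GAP:
                    return True                # upward phase followed soon after
                down_index = None              # too slow; treat as unrelated
        return False

    # A short downward burst followed by an upward burst reads as a nod.
    print(detect_nod([0.0, -1.2, -1.5, -0.4, 0.2, 1.3, 1.1, 0.0]))  # True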

The project was supported by the EU Marie Curie Network iCareNet under grant number 264738. Partners: Rigshospitalet, Copenhagen; IT University of Copenhagen; ITX Hospital simulation facility, Copenhagen.

Smart kitchen (2012-2014)
Building on results from past projects on modeling and classifying everyday activities in a VR-simulated environment (easyADL, 2005-2007) and in living-lab settings (wireless sensor-networked smart objects, 2007-2008; distributed multimodal interaction through gestures and speech, 2008-2010), the Kitchen As-A-Pal project implemented adaptive and personalized support in a real-world kitchen. The spatial relationships between smart objects (“containers”, “surfaces”, and actuators) were egocentrically modeled using the Situative Space Model, while the interplay between multiple human actors was modeled and sensed exocentrically using computer vision techniques (Surie et al., 2013).
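As a minimal sketch of the kind of egocentric modeling involved, assuming the Situative Space Model partitions smart objects by whether the user can currently perceive them or act on them, the Python example below groups objects around one user into such sets. The class, set names, and distance thresholds are illustrative assumptions, not the Kitchen As-A-Pal implementation.

    # Minimal sketch of an egocentric partitioning of smart objects around one
    # user, in the spirit of the Situative Space Model. Set names, thresholds,
    # and classes are illustrative assumptions, not the project's code.
    from dataclasses import dataclass
    from math import dist

    PERCEPTION_RANGE = 3.0  # metres: objects the user can plausibly perceive
    ACTION_RANGE = 0.8      # metres: objects within reach for manipulation

    @dataclass
    class SmartObject:
        name: str
        kind: str        # e.g. "container", "surface", "actuator"
        position: tuple  # (x, y, z) in the kitchen's coordinate frame

    def situative_spaces(user_position, objects):
        """Partition objects into egocentric perception and action sets."""
        perceivable, actionable = [], []
        for obj in objects:
            d = dist(user_position, obj.position)
            if d <= ACTION_RANGE:
                actionable.append(obj.name)
            if d <= PERCEPTION_RANGE:
                perceivable.append(obj.name)
        return {"perceivable": perceivable, "actionable": actionable}

    kitchen = [
        SmartObject("coffee jar", "container", (0.5, 0.0, 1.0)),
        SmartObject("worktop", "surface", (0.3, 0.0, 0.9)),
        SmartObject("kettle", "actuator", (2.5, 0.0, 1.0)),
    ]
    print(situative_spaces((0.0, 0.0, 1.0), kitchen))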


Figure: Smart objects augmented with interactive mediators providing access to virtual objects within the easy ADL ecology (Surie, 2012).
