Past projects

Practice-based Experiential Learning Analytics Research and Support/PELARS (2014-2017)

PELARS explored ways of generating ‘analytics’ (data about the learning process and analysis of these data) to help learners and teachers by providing feedback in hands-on, project-based, and experiential learning situations (Spikol et al., 2018).

Learners engaged in a collaborative hands-on activity while being monitored by the PELARS sensor infrastructure (Spikol et al., 2018).
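
To give a flavor of the kind of processing involved, the following minimal Python sketch (an illustration only, not the PELARS implementation) aggregates timestamped interaction events from a sensing infrastructure into simple per-learner indicators; the event kinds and metrics are hypothetical.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Event:
    """One timestamped observation from the sensing infrastructure (hypothetical schema)."""
    t: float        # seconds since the session started
    learner: str    # anonymised learner id
    kind: str       # e.g. "hand_motion", "speech", "block_connected"

def indicators(events, session_seconds):
    """Aggregate raw events into simple per-learner activity indicators."""
    counts = defaultdict(lambda: defaultdict(int))
    for e in events:
        counts[e.learner][e.kind] += 1
    report = {}
    for learner, kinds in counts.items():
        total = sum(kinds.values())
        report[learner] = {
            "events_per_minute": 60.0 * total / session_seconds,
            "speech_share": kinds.get("speech", 0) / total if total else 0.0,
        }
    return report

# Example: a ten-minute session with two learners.
demo = [Event(12.0, "A", "speech"), Event(13.5, "B", "hand_motion"),
        Event(40.2, "A", "block_connected"), Event(55.0, "B", "speech")]
print(indicators(demo, session_seconds=600))
```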

An analysis and comparative user study on interactions in mobile Virtual Reality games (2017)

This project explored interactions in Mobile Virtual Reality (MVR) games. A set of MVR games was analyzed with a special focus on head gaze, categorizing and isolating the game mechanics implemented with this common MVR technique. The analysis formed the basis for an MVR test application that was compared to a traditional gamepad controller in three different challenges (Bothén et al., 2018).

Head gaze interaction with circular menu: several options appear as buttons floating around an interactive object after gazing at it (Bothén et al., 2018).
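
The head-gaze mechanic studied in the project boils down to two pieces: testing whether the camera’s forward vector points at an object, and confirming the selection after a short dwell. The Python sketch below illustrates this under assumed thresholds; it is not the implementation from Bothén et al. (2018).

```python
import math

def gaze_hits(cam_pos, cam_forward, target_pos, max_angle_deg=5.0):
    """True if the head-gaze ray (the camera's forward vector) points at the target
    within a small angular threshold -- the basic test behind head-gaze selection."""
    to_target = [t - c for t, c in zip(target_pos, cam_pos)]
    d = math.sqrt(sum(v * v for v in to_target))
    f = math.sqrt(sum(v * v for v in cam_forward))
    if d == 0 or f == 0:
        return d == 0
    cos_a = sum(a * b for a, b in zip(cam_forward, to_target)) / (d * f)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a)))) <= max_angle_deg

class DwellSelector:
    """Confirms a selection once the gaze has rested on the same target long enough,
    so no controller button is needed. The dwell time is an illustrative assumption."""
    def __init__(self, dwell_seconds=1.0):
        self.dwell_seconds = dwell_seconds
        self.current = None
        self.elapsed = 0.0

    def update(self, hit_target, dt):
        """Call once per frame with the currently gazed-at target (or None)."""
        if hit_target != self.current:
            self.current, self.elapsed = hit_target, 0.0
        elif hit_target is not None:
            self.elapsed += dt
            if self.elapsed >= self.dwell_seconds:
                self.elapsed = 0.0
                return hit_target   # selection confirmed, e.g. open the circular menu
        return None

# Example: the user keeps gazing at a "crate" object until the dwell confirms it.
selector = DwellSelector()
looking_at = "crate" if gaze_hits((0, 0, 0), (0, 0, 1), (0.02, 0.0, 3.0)) else None
picked = None
while picked is None:
    picked = selector.update(looking_at, dt=1 / 60)
print(picked)  # "crate" after about one second of dwell
```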

Wearable personal assistant for surgeons (2013-2015)

Addressing the constant need for situation-specific information in hospital corridors, wards, and operating theaters, we designed a wearable personal assistant (WPA) for surgeons. Three realistic scenarios were selected and assessed together with professionals (Jalaliniya & Pederson, 2015). The system’s ability to offer quick and precise audiovisual guidance from remote medical colleagues was highly appreciated, as was the touch-less, head-based gesture control of devices in the operating theater, based on previous work (Mardanbegi et al., 2012) and implemented in this project on the Google Glass platform.
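
As a rough illustration of touch-less head-gesture input (a minimal sketch under assumed thresholds and using gyroscope pitch rate as one possible signal, not the Glass-based implementation from the project), a head nod can be detected as a downward burst of head rotation followed shortly by an upward one:

```python
def detect_nod(pitch_rates, dt, rate_threshold=1.5, max_gap=0.6):
    """Detect a nod in a stream of gyroscope pitch angular velocities (rad/s),
    sampled every dt seconds. Thresholds are illustrative assumptions."""
    down_at = None
    for i, rate in enumerate(pitch_rates):
        t = i * dt
        if rate < -rate_threshold:                 # fast downward head movement
            down_at = t
        elif rate > rate_threshold and down_at is not None:
            if t - down_at <= max_gap:             # upward movement soon after: a nod
                return True
            down_at = None
    return False

# Example: synthetic 50 Hz samples containing one nod-like down/up burst.
samples = [0.0] * 10 + [-2.0] * 5 + [0.0] * 5 + [2.0] * 5 + [0.0] * 10
print(detect_nod(samples, dt=0.02))  # True
```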

A surgeon (right) uses the Google Glass-based WPA prototype for touch-less interaction with X-rays and MRIs shown on the big display (left), without needing to involve the nurse (in the background), which would otherwise typically be necessary (Jalaliniya & Pederson, 2015).

The project was supported by the EU Marie Curie Network iCareNet under grant number 264738. Partners: Rigshospitalet, Copenhagen; IT University of Copenhagen; ITX Hospital simulation facility, Copenhagen.

A remote surgeon (right picture) uses a tablet computer to provide guidance to the local surgeon (left picture). The local surgeon sees the visual guidance on the WPA Head-Mounted Display in real-time (Jalaliniya & Pederson, 2015).

Smart kitchen (2012-2014)

Building on results from modeling and classifying everyday activities in past projects, first in a VR-simulated environment (easyADL, 2005-2007) and then in living lab settings (wireless sensor networked smart objects, 2007-2008; distributed multimodal interaction through gestures and speech, 2008-2010), the Kitchen As-A-Pal project implemented adaptive and personalized support in a real-world kitchen. The spatial relationships between smart objects (“containers”, “surfaces”, and actuators) were modeled egocentrically using the Situative Space Model, while the interplay between multiple human actors was modeled and sensed exocentrically using computer vision techniques (Surie et al., 2013).
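
As a toy illustration of the egocentric modeling idea (a minimal sketch, not the project’s Situative Space Model implementation), the Python code below classifies smart objects into a perception space and an action space using distance alone; a fuller model would also consider what the person can actually see and reach, as well as the selected and manipulated sets shown in the figures below. The ranges and object names are assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class SmartObject:
    name: str
    position: tuple   # (x, y, z) in metres, world coordinates

def situative_spaces(user_pos, objects, perception_range=5.0, action_range=1.2):
    """Toy egocentric classification: objects within perception range are taken to be
    perceivable, objects within arm's reach to be actable upon. Ranges are assumptions."""
    perception, action = [], []
    for obj in objects:
        d = math.dist(user_pos, obj.position)
        if d <= perception_range:
            perception.append(obj.name)
        if d <= action_range:
            action.append(obj.name)
    return {"perception_space": perception, "action_space": action}

# Example: a user standing at the origin of a (hypothetical) kitchen.
kitchen = [SmartObject("kettle", (0.7, 0.8, 0.0)),
           SmartObject("fridge", (3.0, 0.0, 0.0)),
           SmartObject("cupboard", (7.0, 1.5, 0.0))]
print(situative_spaces((0.0, 0.0, 0.0), kitchen))
```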

Perception space, action space, selected set and manipulated set captured in an immersive VR-simulated easyADL ecology (Surie, 2012).
Smart objects augmented with interactive mediators providing access to virtual objects within the easyADL ecology (Surie, 2012).

Body posture sensing by fusing skeletal tracking and face recognition (Surie et al., 2013).