Printout from Malmö University's website www.mah.se

Egocentric Interaction research group

Group picture
Human behaviour is influenced by what is going on here and now, i.e. the "situation". Our aim is to facilitate the design of interactive systems that provide cognitively economic and personalized experiences, taking mental, physical, and digital aspects of situations into account.

News

2019.03.01 Group visit to the Humanities Lab, Lund University

Photo: Magnus Johnsson.

2018.12.03 Some available MSc and BSc thesis projects

  • Player/object tracking system for a collaborative/competitive tabletop board game (EI01)
  • Wizard-of-Oz tabletop gameplay tracking simulator app for Android or iOS mobile devices (EI02)
  • Analysing gaze patterns during tabletop game (EI04)
  • Intention (strategy) recognition in collaborative/competitive tabletop board game (EI06)

More info here.

Collaboration

We work with partners in both small and large projects. Application areas that group members have experience in, or a current interest in, include: healthcare / smart homes and hospitals, digital games, pedagogy, marketing / smart shopping, smart airports, individual everyday decision-making and long-term behavioural change, and ethics in semi-autonomous systems. Contact: Thomas Pederson.

More about us

In the Egocentric Interaction research group we develop and study interactive systems that aim to take a human perspective on situations and adapt system behaviour accordingly. We also sometimes choose such a body/mind-centric perspective when analysing and designing interactive systems, e.g. by applying the Situative Space Model pictured below. This "egocentric" design stance complements more common device-centric approaches to Human-Computer Interaction in a world where a growing number of digital systems influence our individual thoughts and actions (also when we collaborate with others) in increasingly subtle ways, e.g. through wearable and embedded sensors/actuators, and through local and global processing of biometric data.

Situative Space Model applied to a breakfast scenario. The spaces indicate presence and spatial relationships among physical and virtual objects based on what the specific human agent can perceive (perception space) and manipulate (action space) at the given moment in time (Pederson, Surie, & Janlert, 2011).
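As an illustration only, the perception/action-space idea behind the Situative Space Model can be sketched in a few lines of code. Everything below (class names, the distance-based membership test, the range thresholds, and the breakfast objects) is an illustrative assumption for this sketch, not the published model of Pederson, Surie, & Janlert (2011), which defines the spaces in terms of what the agent can actually perceive and manipulate, not plain distance.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: names and thresholds are assumptions,
# not part of the published Situative Space Model.

@dataclass
class Entity:
    name: str
    distance_m: float        # distance from the human agent, in metres (assumed proxy)
    is_digital: bool = False # physical vs. virtual object

@dataclass
class SituativeSpaces:
    """Partition entities into perception and action spaces around one agent."""
    perception_range_m: float = 10.0  # assumed: perceivable within 10 m
    action_range_m: float = 1.0       # assumed: manipulable within arm's reach
    entities: list = field(default_factory=list)

    def perception_space(self):
        """Entities the agent can perceive at this moment."""
        return [e for e in self.entities if e.distance_m <= self.perception_range_m]

    def action_space(self):
        """Entities the agent can manipulate; a subset of the perception space."""
        return [e for e in self.entities if e.distance_m <= self.action_range_m]

# Breakfast scenario, loosely following the figure caption
spaces = SituativeSpaces(entities=[
    Entity("coffee cup", 0.4),
    Entity("radio", 3.0),
    Entity("calendar app", 0.5, is_digital=True),
])
print([e.name for e in spaces.perception_space()])  # all three entities
print([e.name for e in spaces.action_space()])      # only the two within reach
```

The point of the sketch is the structural one made in the caption: the action space is contained in the perception space, and both physical and virtual objects can appear in either.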

Members

Current research projects and themes

Selected past projects

Popular scientific talks 

Selected publications

Last updated by Thomas Pederson