
Egocentric Interaction research group

Group picture
Human behaviour is influenced by what is going on here and now, i.e. the "situation". Our aim is to facilitate the design of interactive systems that provide cognitively economic and personalized experiences, taking mental, physical, and digital aspects of situations into account.

News

2020.06.04 External funding from Crafoord foundation for research on Situation Recognition

Arezoo has received €40,000 to support research in the Augmented Attention project, specifically looking at Situation Recognition. Congratulations! More details soon.

2020.06.03 InvAI'20 – Workshop on Invisible Artificial Intelligence at NordiCHI 2020

Exploring the When, Why, and How of invisible AI-driven HCI systems. Organized by several members of the group in collaboration with colleagues at Rochester Institute of Technology and the University of Malta. Website and call for papers to appear very soon.

2020.04.20 Unconscious Computing through Emerging Wearable Systems – upcoming journal special issue edited by Thomas

What if future wearable systems could blend information and attentional cues into the perceived surrounding environment so subtly that they do not disturb ongoing thoughts and actions, yet still influence the everyday micro-decisions of the individual (or group)? Special issue in the MDPI open access journal Information. Deadline for manuscript submissions: 30 Nov 2020. More info here.

2019.12.17 BoostingHCI – External funding from STINT for collaboration with Rochester Institute of Technology, USA

Through the granted €20,000 BoostingHCI project, four of our PhD students will get the opportunity to perform experiments in 2020/2021 at, and in collaboration with, distinguished Human-Computer Interaction labs at RIT.

2019.11.26 MSc and BSc thesis project proposals for spring 2020

For spring 2020 we propose more than 15 BSc or MSc thesis projects related to Augmented Reality, Digital Games Research, Situation Awareness (AI), Persuasive Technologies, Eye Tracking, and Interaction Design. Many of the thesis proposals are tied to the Augmented Attention project. Examples:

  • Storytelling techniques for influencing behaviour in everyday situations
  • A player/object tracking system for a collaborative/competitive tabletop board game
  • A Wizard-of-Oz tabletop gameplay tracking simulator app for Android or iOS
  • A head-worn AR gaze guidance prototype for tabletop games
  • Classification of game players based on behavior in an AR board game
  • Interaction design of an AR tabletop game
  • Human intention recognition in a smart environment

More info here.

2019.03.01 Group visit to the Humanities Lab, Lund University

Photo: Magnus Johnsson.

2018.12.03 Some available MSc and BSc thesis projects

  • Player/object tracking system for a collaborative/competitive tabletop board game (EI01)
  • Wizard-of-Oz tabletop gameplay tracking simulator app for Android or iOS mobile devices (EI02)
  • Analysing gaze patterns during tabletop gameplay (EI04)
  • Intention (strategy) recognition in collaborative/competitive tabletop board game (EI06)

More info here.

Collaboration

We work with partners in both smaller and larger projects. Application areas in which group members have experience, or a current interest, include: healthcare (smart homes and hospitals), digital games, pedagogy, marketing (smart shopping), smart airports, individual everyday decision-making and long-term behavioural change, and ethics in semi-autonomous systems. Contact: Thomas Pederson.

More about us

In the Egocentric Interaction research group we develop and study interactive systems that take a human perspective on situations and adapt system behaviour accordingly. We also sometimes adopt such a body/mind-centric perspective when analysing and designing interactive systems, e.g. by applying the Situative Space Model pictured below. This "egocentric" design stance complements more common device-centric approaches to Human-Computer Interaction in a world where a growing number of digital systems influence our individual thoughts and actions (also when collaborating with others) in increasingly subtle ways, e.g. through wearable and embedded sensors/actuators, and through local and global processing of biometric data.

Situative Space Model applied to a breakfast scenario. The spaces indicate presence and spatial relationships among physical and virtual objects based on what the specific human agent can perceive (perception space) and manipulate (action space) at the given moment in time (Pederson, Surie, & Janlert, 2011).
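As an illustration only (this is not software from the group, and the class and object names below are hypothetical), the model's core distinction between what an agent can perceive and what it can manipulate at a given moment can be sketched as a small data structure:

```python
# Illustrative sketch of the Situative Space Model's two key spaces
# (Pederson, Surie, & Janlert, 2011): objects present in the situation are
# classified by whether the human agent can currently perceive them,
# manipulate them, or both. All names here are invented for the example.
from dataclasses import dataclass, field

@dataclass
class SituativeSpaces:
    """One agent's momentary perception and action spaces."""
    perception_space: set = field(default_factory=set)  # objects the agent can perceive
    action_space: set = field(default_factory=set)      # objects the agent can manipulate

    def perceived_and_manipulable(self):
        # Objects the agent can both see and act on right now,
        # e.g. the cup within reach at the breakfast table.
        return self.perception_space & self.action_space

# Breakfast scenario: the agent perceives the cup and the radio,
# but only the cup is within manipulation range.
spaces = SituativeSpaces(perception_space={"cup", "radio"},
                         action_space={"cup"})
print(spaces.perceived_and_manipulable())  # {'cup'}
```

In the full model the spaces shift continuously as the agent moves and attends to different things; a system tracking them could, for instance, place a notification only on objects currently inside the perception space.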

Members

Current research projects and themes

Selected past projects

Popular scientific talks 

Selected publications

Last updated by Thomas Pederson