ASGARD

Advanced System for Gesture and Activity Recognition and Detection
 
This project stems from a Ph.D. thesis born out of a collaboration between the University of Bedfordshire (UK) and the University of Applied Sciences of Western Switzerland in Fribourg (CH).
 
ASGARD is a system that allows people to interact with Smart Environments.
To enhance the interaction experience, it combines:
  • Gestures for natural interaction between humans and the smart environment
  • Activity recognition and context awareness to support users’ activities
  • Augmented reality for user feedback

Current prototype
ASGARD is a novel context-aware system for deictic gesture interaction with smart environments. The current prototype tracks multiple users and recognizes the inhabitants' postures and gestures in real time. This information, enriched with the coordinates of the smart objects, is combined into a 3D model that supports the recognition process. Finally, the system executes the programmed tasks to support the users' activity. Two Microsoft Kinect depth cameras are used to acquire the data, and a framework handles communication with the smart objects. A video is available in the Demos section.
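As an illustration of how a deictic gesture can be resolved against smart object coordinates in a shared 3D model, the sketch below casts a pointing ray from the user's shoulder through the hand and selects the closest object within an angular tolerance. This is only a minimal sketch, not the ASGARD implementation: the object positions, joint values, and the 15° cone threshold are hypothetical, and all coordinates are assumed to already be expressed in one calibrated world frame.

```python
import numpy as np

# Hypothetical smart object positions (metres, world frame).
SMART_OBJECTS = {
    "lamp":       np.array([1.8, 1.2, 3.0]),
    "television": np.array([-0.5, 0.9, 3.5]),
}

def pointing_target(shoulder, hand, objects, max_angle_deg=15.0):
    """Return the smart object closest to the user's pointing ray.

    shoulder, hand: 3D joint positions from the skeleton tracker.
    objects: mapping of object name -> 3D position.
    max_angle_deg: tolerance between the pointing ray and the
    shoulder-to-object direction; None is returned if no object
    falls within this cone.
    """
    ray = hand - shoulder
    ray = ray / np.linalg.norm(ray)

    best_name, best_angle = None, max_angle_deg
    for name, pos in objects.items():
        to_obj = pos - shoulder
        to_obj = to_obj / np.linalg.norm(to_obj)
        angle = np.degrees(np.arccos(np.clip(ray @ to_obj, -1.0, 1.0)))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name

# Example skeleton joints (right shoulder and right hand) from one frame.
shoulder = np.array([0.0, 1.4, 1.0])
hand     = np.array([0.3, 1.3, 1.4])
print(pointing_target(shoulder, hand, SMART_OBJECTS))  # -> "lamp"
```

In a multi-camera setup such as the one described above, the same test would run per tracked user after the skeletons from both Kinect cameras have been merged into the common coordinate frame.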
 
[Figure: scenario.png]
 