The eGlasses project is focused on the development of an open platform in the form of multisensory electronic glasses and on the design and integration of new intelligent interaction methods using the eGlasses platform. This is an initial development focused on long-term research and technological innovation in perceptual and super-perceptual (e.g. heart rate, temperature) computing. It is an emerging technology that is also focused on the creation of mobile, perceptual media. Perceptual media refers to multimedia devices with added perceptual user interface capabilities. These devices integrate human-like perceptual awareness of the environment with the ability to respond appropriately. This can be achieved through automatic perception of an object’s properties and delivery of information about the object’s status as a result of reasoning operations. For example, using the eGlasses it will be possible to control a device recognized within the field of view through an interactive menu associated with the identified device. Other examples include the presentation of a recognized person’s name, the recognition of people with abnormal physiological parameters, protection against possible head injuries, etc. The platform will use currently available user-interaction methods and new methods developed in the framework of this project (e.g. a haptic interface), and will enable further extensions to introduce next-generation user-interaction algorithms. Furthermore, the goal of this project is to propose and evaluate new and intelligent user interactions that are particularly useful for healthcare professionals and for people with disabilities or at risk of exclusion, and to create and evaluate behavioural models of these mobile users. The main scientific and technological objectives of the project are to design and evaluate the following:

  • eye-tracking hardware and algorithms for a user who is mobile in a noisy real-world environment,
  • algorithms for perceptual media and for super-perceptual computing,
  • methods for locating objects and guiding vision towards the identified objects,
  • methods of interactions with users and objects (menu of activities for the identified person or object),
  • a haptic interface in the form of a peripheral proximity radar (a sketch follows this list),
  • methods for the recognition of the user’s own gestures and recognition of gestures of the observed person,
  • methods for context-aware behavioural studies,
  • methods for reference applications.
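
To make the haptic "peripheral proximity radar" objective more concrete, the following is a minimal illustrative sketch, not project code: it maps distance readings from sensors around the wearer's head to vibration intensities on haptic actuators. The sector count, the ranges, and the set_motor interface are all assumptions introduced here for illustration.

    NUM_SECTORS = 6        # assumption: six distance sensors around the frame
    MAX_RANGE_M = 2.0      # readings beyond this produce no vibration
    MIN_RANGE_M = 0.2      # closest distinguishable distance

    def vibration_intensity(distance_m):
        """Map a distance reading to a 0..1 vibration duty cycle:
        closer obstacles vibrate harder, out-of-range gives 0."""
        if distance_m >= MAX_RANGE_M:
            return 0.0
        clamped = max(distance_m, MIN_RANGE_M)
        # Linear ramp: MIN_RANGE_M -> 1.0, MAX_RANGE_M -> 0.0
        return (MAX_RANGE_M - clamped) / (MAX_RANGE_M - MIN_RANGE_M)

    def update_actuators(distances_m, set_motor):
        """distances_m holds one reading per sector; set_motor(i, duty)
        drives the i-th vibration motor (hardware interface is assumed)."""
        for sector, d in enumerate(distances_m):
            set_motor(sector, vibration_intensity(d))

    # Example: an obstacle 0.5 m away on the wearer's left (sector 0)
    update_actuators([0.5, 1.8, 2.5, 2.5, 2.5, 1.0],
                     lambda i, duty: print("motor %d: %.2f" % (i, duty)))

A linear distance-to-intensity ramp is only one possible mapping; a logarithmic ramp or discrete pulse rates would be equally plausible design choices.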

The result of the project will be an open platform in the form of multisensory electronic multimedia glasses and a set of new methods for intelligent user interactions, especially in the context of perceptual media.
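
As an illustration of the super-perceptual side mentioned above (heart rate, temperature), here is a minimal sketch of how abnormal physiological parameters could be flagged. The Vitals structure and the normal ranges are assumed, purely illustrative values, not taken from the project.

    from dataclasses import dataclass

    @dataclass
    class Vitals:
        heart_rate_bpm: float
        temperature_c: float

    # Illustrative resting-adult ranges (assumptions, not project data)
    HR_RANGE = (50.0, 100.0)
    TEMP_RANGE = (36.0, 37.5)

    def abnormal_parameters(v):
        """Return the names of parameters outside their normal range."""
        flags = []
        if not HR_RANGE[0] <= v.heart_rate_bpm <= HR_RANGE[1]:
            flags.append("heart rate")
        if not TEMP_RANGE[0] <= v.temperature_c <= TEMP_RANGE[1]:
            flags.append("temperature")
        return flags

    print(abnormal_parameters(Vitals(heart_rate_bpm=120, temperature_c=38.2)))
    # -> ['heart rate', 'temperature']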

Call Topic: Intelligent User Interfaces (IUI), Call 2012
Duration: 36 months
Funding support: 1,190,660 €

Project partners

  • University of Lorraine, LCOMS (France)
  • Gdansk University of Technology, FETI, Department of Biomedical Engineering (Poland)
  • Hochschule Luzern, iHomeLab (Switzerland)
  • University of Applied Sciences Upper Austria, Media Interaction Lab (MIL) (Austria)
  • University of Luxembourg, Interdisciplinary Centre for Security, Reliability and Trust (Luxembourg)