Our goal is to develop new methods and technologies in the area of HCI and BCI that take advantage of a broad range of signals monitoring human activity at the biological and psychological levels. The interfaces we consider will include, but not be limited to, recognition of human perceptual and emotional states. We discuss how these methods can be applied in several areas addressed in the call.

Our underlying research assumption is to create a technology that is widely accessible to users. First, we therefore assume the use of affordable mobile sensors in production systems [2]. Second, we assume high modularity of the resulting practical framework, so that it provides a holistic solution. We focus on collecting many information-processing-related signals [3], among which the most important are: brain activity (EEG: event-related potentials [4]; frontal alpha asymmetry [5]; MRI), physiological signals (ECG, EDA), face images (via a web/phone camera), multiple sensory signals, and body motion. We assume that the final solution will be resistant to the lack of some of them and will make decisions based on incomplete or uncertain information (thanks to proper information fusion models).
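The idea of deciding from incomplete signals can be sketched as a simple late-fusion scheme: each available modality contributes a probability distribution over affective states, modalities that drop out are skipped, and the remaining reliability weights are renormalised. The modality names, weights, and state labels below are illustrative assumptions, not the project's actual models.

```python
from typing import Dict, List, Optional

# Hypothetical modality reliability weights and state labels (illustrative only).
MODALITY_WEIGHTS = {"eeg": 0.4, "ecg": 0.2, "eda": 0.2, "face": 0.2}
STATES = ["neutral", "positive", "negative"]

def fuse_emotion_estimates(
    estimates: Dict[str, Optional[List[float]]],
) -> List[float]:
    """Late fusion of per-modality probability distributions over affective
    states. Modalities reported as None (sensor missing or unreliable) are
    skipped and the remaining weights are renormalised, so a decision can
    still be made from incomplete information."""
    fused = [0.0] * len(STATES)
    total_weight = 0.0
    for modality, probs in estimates.items():
        if probs is None:
            continue  # sensor dropped out: ignore this modality
        w = MODALITY_WEIGHTS[modality]
        total_weight += w
        for i, p in enumerate(probs):
            fused[i] += w * p
    if total_weight == 0.0:
        # No modality available: fall back to a uniform, maximally uncertain belief.
        return [1.0 / len(STATES)] * len(STATES)
    return [x / total_weight for x in fused]
```

For example, if only EEG and ECG readings arrive in a given window, the fused distribution is a weighted average of those two estimates alone; more principled fusion models (e.g. probabilistic graphical models) would replace this weighting in a production system.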

One prospective use of these readings is the development of a computer framework for emotive brain-computer interfaces, which structures the interaction in such systems in the form of the so-called affective loop [1]. This model of interaction assumes the existence of a feedback loop: some content is presented in the system, the user reacts emotionally to it, the system detects these affective changes and modifies the content accordingly to evoke or maintain a desired emotional state. It will provide an affective foundation for novel BCIs.
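The feedback loop described above can be sketched in a few lines; `detect_affect`, the content representation, and the adaptation rule are hypothetical stand-ins for the project's actual classifiers and content models.

```python
# Minimal sketch of the affective loop: present content, detect the user's
# affective reaction, adapt the content toward a target emotional state.

def detect_affect(sensor_frame: dict) -> str:
    """Placeholder affect detector: the real system would fuse EEG/ECG/EDA
    and camera readings here; this sketch reads a precomputed label."""
    return sensor_frame["label"]

def adapt_content(content: dict, affect: str, target: str) -> dict:
    """Illustrative adaptation rule: raise content intensity until the
    detected affect matches the target state."""
    if affect != target:
        return {**content, "intensity": content["intensity"] + 1}
    return content

def affective_loop(frames, target="engaged"):
    content = {"intensity": 0}
    for frame in frames:                    # 1. content presented, user reacts
        affect = detect_affect(frame)       # 2. affective changes detected
        content = adapt_content(content, affect, target)  # 3. content modified
    return content
```

In a real deployment the loop would run continuously against streaming sensor data rather than a finite list of frames.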

From the computer science point of view, the starting point for this solution can be the AfCAI framework we proposed [2]. It is based on the context-aware systems paradigm. In this approach, we collect contextual data from various sources, both directly related to the user (such as physiological signals) and loosely related, e.g. environmental data (such as the current weather). The key task is to provide suitable AI models for multimodal data fusion.
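A context-aware record of this kind might combine user-related and loosely related environmental dimensions in one structure, with any dimension allowed to be absent; the field names below are assumptions for illustration, not the actual AfCAI schema.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ContextSnapshot:
    """One snapshot of contextual data, mixing directly user-related signals
    with loosely related environmental data; any field may be missing."""
    # user-related context (directly measured signals)
    heart_rate: Optional[float] = None        # e.g. from ECG / wearable
    skin_conductance: Optional[float] = None  # e.g. from EDA sensor
    # loosely related environmental context
    weather: Optional[str] = None
    location: Optional[str] = None

    def available(self) -> List[str]:
        """Names of the context dimensions actually present in this snapshot,
        so downstream fusion models know what they can rely on."""
        return [name for name, value in self.__dict__.items() if value is not None]
```

Downstream fusion models could then query `available()` to select which inference path to use for a partially filled snapshot.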

We are considering several possible test-beds in different application areas. These may include affective games (classical and virtual-reality-based), affective learning systems, knowledge-based augmented reality navigation systems, and advertising systems, to name a few domains we have experience with. Games, which offer a rich yet fully controllable experimental environment, are particularly important here [6].

Yet another application area comprises HCI and BCI systems that aim to use the above-mentioned outputs of the mobile systems to enhance communication with patients suffering from disorders of consciousness, as well as other medical conditions in which some of the sensory information is not available (sensory substitution and other systems aiding people with disabilities; see e.g. [7]). Finally, we will consider the possibility of using the mobile systems to enhance the cognitive functioning of healthy adults in the context of sensory augmentation and the incorporation of mobile devices [8].


[1] G. N. Yannakakis and A. Paiva, “Emotion in Games,” in The Oxford Handbook of Affective Computing, R. Calvo, S. D’Mello, J. Gratch, and A. Kappas, Eds. Oxford University Press, 2015, pp. 459–471.

[2] G. J. Nalepa, K. Kutt, and S. Bobek, “Mobile platform for affective context-aware systems,” Futur. Gener. Comput. Syst., vol. 92, pp. 490–503, Mar. 2019.

[3] A. Dzedzickis, A. Kaklauskas, and V. Bucinskas, “Human Emotion Recognition: Review of Sensors and Methods,” Sensors, vol. 20, no. 3, p. 592, Jan. 2020.

[4] M. Goyal, M. Singh, and M. Singh, “Classification of emotions based on ERP feature extraction,” in 2015 1st International Conference on Next Generation Computing Technologies (NGCT), 2015, pp. 660–662.

[5] R. Mennella, E. Patron, and D. Palomba, “Frontal alpha asymmetry neurofeedback for the reduction of negative affect and anxiety,” Behav. Res. Ther., vol. 92, pp. 32–40, May 2017.

[6] L. Żuchowska, K. Kutt, K. Geleta, S. Bobek, and G. J. Nalepa, “Affective Games Provide Controllable Context: Proposal of an Experimental Framework,” in Proceedings of the Eleventh International Workshop on Modelling and Reasoning in Context (MRC 2020), 2020, pp. 1–6.

[7] W. Kałwak, M. Reuter, M. Łukowska, B. Majchrowicz, and M. Wierzchoń, “Guidelines for quantitative and qualitative studies of sensory substitution experience,” Adaptive Behaviour, vol. 26, no. 3, pp. 111–127, 2018, doi: 10.1177/1059712318771690.

[8] M. Dresler, A. Sandberg, C. Bublitz, K. Ohla, C. Trenado, A. Mroczko-Wasowicz, et al., “Hacking the brain: dimensions of cognitive enhancement,” ACS Chemical Neuroscience, vol. 10, no. 3, pp. 1137–1148, 2018.