What is Presentation Mapping?
The term “presentation mapping” comes from the interaction model for visualization beyond the desktop proposed by Yvonne Jansen and Pierre Dragicevic in [1]. The essence of biofeedback is information that is measured from the user’s body and fed back to the user. Their interaction model inspired us to view the biofeedback loop as an information pipeline. We therefore propose a framework of biofeedback interaction, as shown in Fig 1.
In the first stage of biofeedback computing, the raw bio-signals are processed and prepared for feedback. In the second stage, two key transformations turn the bio-data into the final presentations that can be perceived by the users. The first transformation is ‘Audio-visual Mapping’: the bio-data are directly mapped to basic audio-visual attributes, such as the height of a bar or the volume of a sound. The outcome of this transformation is an abstract audio-visual form. We call it ‘abstract’ because, at this point, the data are only coupled with basic audio or visual attributes (e.g., volume, pitch, color, border, width). For instance, the data may be coupled with the heights of bars, but the color, location, scale, and direction of the bars are not yet specified. These operations are addressed in the second transformation, called Presentation Mapping.
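The first transformation can be illustrated with a minimal sketch. It maps inter-beat intervals (IBI) to bar heights, one basic visual attribute, and leaves everything else (color, position, scale, direction) unspecified. The function name and the IBI range are illustrative assumptions, not part of any real biofeedback toolkit.

```python
def audio_visual_mapping(ibi_ms, min_ibi=600, max_ibi=1200, max_height=100):
    """Map each inter-beat interval (ms) to the height of a bar.
    Only this one basic visual attribute is specified here; the
    result is still an 'abstract' audio-visual form."""
    heights = []
    for ibi in ibi_ms:
        # clamp to the assumed physiological range
        clamped = max(min_ibi, min(max_ibi, ibi))
        # linear mapping: longer interval -> taller bar
        h = (clamped - min_ibi) / (max_ibi - min_ibi) * max_height
        heights.append(h)
    return heights

print(audio_visual_mapping([600, 900, 1200]))  # [0.0, 50.0, 100.0]
```

How those bars are then arranged, colored, and oriented is exactly what presentation mapping decides.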
Presentation mapping turns the abstract audio-visual form into a fully specified presentation that can be displayed, perceived, and understood by the users. In presentation mapping, the abstract forms can be rearranged into a more representational or figurative form, such as transforming a bar chart into the Heart Bloom flower pattern. Here, we focus on designing an appropriate interface expression that represents the biofeedback data in a more understandable and meaningful way. Specifically, we think about the mapping between the interface’s dynamic action and the changing pattern of the bio-data, and the mapping between the overall appearance of the visualization and the meaning of that piece of data.
Expressive parameters for presentation mapping
Beyond the basic audio-visual attributes used in audio-visual mapping, for presentation mapping we extract a set of expressive parameters from the intended interface expression. One expressive parameter often involves multiple basic attributes of visual or audio elements, or the behavior of actuators. For instance, the frequency (basic attribute) of bird sounds and the volume (basic attribute) of wind sounds are manipulated together to control the quietness (expressive parameter) of a nature soundscape. If we consider the working of the interface as a model, the expressive parameters can be seen as the model’s external parameters that control the interface expression, while the audio-visual attributes are internal to the model.
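A small sketch of this external/internal split, following the soundscape example above: one expressive parameter, quietness, drives two basic audio attributes at once. The attribute names and ranges are illustrative assumptions.

```python
def set_quietness(quietness):
    """Quietness in [0, 1] is the external (expressive) parameter;
    the returned basic attributes are internal to the model."""
    q = max(0.0, min(1.0, quietness))
    return {
        # a quieter soundscape -> birds chirp less often
        "bird_chirps_per_minute": round((1.0 - q) * 30),
        # a quieter soundscape -> softer wind
        "wind_volume": round((1.0 - q) * 0.8, 2),
    }
```

The designer (or the bio-data) only ever manipulates `quietness`; how that intention is distributed over the individual sound attributes stays inside the model.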
Why should presentation mapping be addressed in biofeedback displays?
Most traditional biofeedback displays address only the audio-visual mapping transformation, so they often take the form of simple graphics or tones. This type of display is very common in medical biofeedback devices. But in everyday, self-guided use without the assistance of a therapist, such abstract audio-visual displays might be difficult for average users to interpret. For example, what does a high heart rate variability mean? From the physiological perspective, it means good physical resistance to stress; it means a better balance of the autonomic nervous system in coping with stress. This information should be conveyed to the users of an HRV biofeedback system, but unfortunately, few systems present it well.
During the interaction with a biofeedback system, the interface’s expression is closely connected with the users, affecting their perception, understanding, and use of the biofeedback information. We therefore suggest addressing presentation mapping as the next step after audio-visual mapping. In presentation mapping, the abstract forms are further transformed into a more specified and meaningful presentation that can be more intuitively understood and more easily utilized by the users.
For instance, in the presentation mapping of the Heart Bloom visualization, we rearranged the conventional bar chart into four concentric circles so that it looks like a flower, letting the visualization speak for itself. A pattern with a flexible shape looks like a blooming flower and indicates a healthier state, while a withered flower indicates a sub-healthy, high-stress state. In the design of BioMirror, surface #2 curves, bulges outwards, and flattens out to represent the IBI data, which guides the user in practicing deep breathing. The shape changes of the BioMirror appear to ‘imitate’ human breathing, which naturally links the IBI feedback with the user’s breathing regulation.
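The Heart Bloom rearrangement can be sketched geometrically: the same bar heights produced by the audio-visual mapping are distributed over four concentric circles, so each bar becomes a “petal” radiating outwards. The radii and layout constants are illustrative assumptions, not the actual Heart Bloom parameters.

```python
import math

def heart_bloom_layout(bar_heights, rings=4, base_radius=10.0, ring_gap=30.0):
    """Place bars evenly around `rings` concentric circles.
    Returns (x, y, angle, height) per bar: the bar is drawn
    outwards from (x, y) in direction `angle`."""
    per_ring = math.ceil(len(bar_heights) / rings)
    petals = []
    for i, h in enumerate(bar_heights):
        ring = i // per_ring            # which concentric circle
        slot = i % per_ring             # position on that circle
        angle = 2 * math.pi * slot / per_ring
        r = base_radius + ring * ring_gap
        petals.append((r * math.cos(angle), r * math.sin(angle), angle, h))
    return petals
```

Note that the bar heights themselves are untouched: presentation mapping here only re-specifies position and direction, the attributes left open by the audio-visual mapping.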
A good biofeedback display that addresses presentation mapping well can leverage the users’ existing knowledge to facilitate the decoding of information, and ultimately the understanding of the feedback’s meaning and its use in self-regulation.
Embodying the idea of “Natural Coupling” in Presentation Mapping
The term Natural Coupling comes from a design framework proposed by Stephan Wensveen in his paper ‘Interaction Frogger’ [2]. In most mechanical products, the appearance, the action possibilities, the action, and the function are all naturally coupled, which allows for intuitive interaction. As Wensveen states, to achieve intuitive interaction in electronic products, the user needs information to guide his actions towards the intended function. He suggests focusing on “the creation of information through feedback and feedforward”. Although Wensveen’s interaction framework was proposed for tangible interaction, it inspired me a lot. Whereas tangible interaction emphasizes a natural coupling between the appearance, the user’s action, and the device’s reaction, here we suggest a natural coupling between the biofeedback data and the interface expression in presentation mapping. To be honest, the idea of ‘Natural Coupling’ came very late in my Ph.D. study. After I had explored many biofeedback interface designs and looked back on these explorations, I realized that most of our designs are consistent with the idea of ‘Natural Coupling’, and I found that it could also apply to the design of biofeedback displays.
In presentation mapping, we suggest addressing the natural coupling between the biofeedback data and the interface expression by creating a metaphor. For instance, we used a visual tree or flower as a metaphor for a healthy or unhealthy state of the user. The semantic relevance between the physiological data and its representation helps users interpret the physiological implications. We used the inflation and deflation of an airbag as a metaphor for lung movement during breathing to provide breathing guidance [3]. The natural coupling between the shape changes of the interface and the user’s chest facilitates breathing regulation. The same thinking goes for the design of the RESonance display. We used the brightness transition between the lights and the changes in wind volume as a metaphor for the ‘airflow of respiration’ to present IBI feedback, which helps users with breathing regulation. We used the quietness of a nature soundscape as a metaphor for ‘peace of mind’ to represent HRV data, which indicate the results of relaxation training.
References
[1] Jansen, Y., & Dragicevic, P. (2013). An interaction model for visualizations beyond the desktop. IEEE Transactions on Visualization and Computer Graphics, 19(12), 2396-2405.
[2] Wensveen, S. A. G., Djajadiningrat, J. P., & Overbeeke, C. J. (2004). Interaction frogger: a design framework to couple action and function through feedback and feedforward. In Proc. DIS ’04, Cambridge, MA, USA, p. 177.
[3] Yu, B., Feijs, L., Funk, M., & Hu, J. (2015). Breathe with Touch: a tactile interface for a breathing assistance system. In Proc. INTERACT 2015, Bamberg, Germany.