Could we create narratives that adapt to the viewer, and vice versa?
If so, we could create media content that understands emotions in real time and changes itself to modulate our watching and listening experiences.
This type of media suggests many foreseeable use cases.
During my Master's thesis, I studied human emotions and how audiovisual content can influence the human affective state. I built a biofeedback system that uses emotion-rated media clips (from the IAPS database) to create "pseudo-narratives" that influence the affective state of the user, which in turn influences the narrative.
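The closed loop described above can be sketched in a few lines: the viewer's estimated affect selects the next clip, and the clip in turn shifts the affect estimate. This is a minimal illustrative sketch, not the thesis implementation; the clip names, the valence/arousal ratings, and the simple "drift halfway toward the clip's rating" affect model are all assumptions made for the example.

```python
# Hypothetical clip library: name -> (valence, arousal) on a 1-9
# SAM-style scale, in the spirit of IAPS normative ratings.
CLIPS = {
    "calm_lake": (6.5, 3.0),
    "storm": (3.5, 7.0),
    "puppies": (8.0, 5.5),
    "traffic": (4.0, 6.0),
}

def next_clip(affect, target, clips=CLIPS):
    """Pick the clip whose rating moves the viewer closest to the target."""
    def gap_after(rating):
        # Crude assumed model: affect drifts halfway toward the clip's rating.
        v = (affect[0] + rating[0]) / 2
        a = (affect[1] + rating[1]) / 2
        return (v - target[0]) ** 2 + (a - target[1]) ** 2
    return min(clips, key=lambda name: gap_after(clips[name]))

def run_loop(affect, target, steps=5):
    """Simulate a pseudo-narrative: clips adapt to affect, affect to clips."""
    narrative = []
    for _ in range(steps):
        name = next_clip(affect, target)
        narrative.append(name)
        v, a = CLIPS[name]
        # The viewing experience nudges the affect estimate toward the clip.
        affect = ((affect[0] + v) / 2, (affect[1] + a) / 2)
    return narrative, affect
```

In the real system the affect estimate would come from the biofeedback signals rather than this toy drift model, but the selection principle is the same: each iteration picks the rated clip expected to steer the viewer toward a target affective state.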
The system was developed with g.tec hardware and custom-made software: MATLAB for the signal processing and Java for the adaptive media player.
The visual stimuli used in the experiments were taken automatically by the software from the International Affective Picture System (IAPS), based on the clips' ratings. To better understand the database, I created a custom visualization tool that displays how the media clips are rated, making it clear which clips get chosen during "pseudo-narrative" creation.
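Automatic stimulus selection by rating could look like the following sketch: given IAPS-style records, pick the pictures whose normative valence and arousal fall inside a target window. The record layout, ids, and rating values here are invented for illustration; the real database ships as normative rating tables per picture.

```python
# Illustrative IAPS-style records: (picture_id, valence_mean, arousal_mean).
# Ids and ratings are made up for this example.
RATINGS = [
    ("1001", 7.8, 4.2),
    ("2050", 2.1, 6.9),
    ("3010", 5.0, 3.1),
    ("4500", 8.2, 6.0),
]

def pick_stimuli(ratings, valence_range, arousal_range):
    """Return picture ids whose normative ratings fall inside the window."""
    v_lo, v_hi = valence_range
    a_lo, a_hi = arousal_range
    return [pid for pid, v, a in ratings
            if v_lo <= v <= v_hi and a_lo <= a <= a_hi]
```

For example, `pick_stimuli(RATINGS, (6.0, 9.0), (3.0, 7.0))` selects only the high-valence, mid-arousal pictures from the list above; a visualization tool plotting each picture on the valence-arousal plane makes such windows easy to see at a glance.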
MSc Thesis. CSIM Master UPF