Team
John Desnoyers-Stewart
Roles:
- Creative Direction
- 3D Modelling
- Interaction Design
- Hardware Design
Katerina R. Stepanova
Roles:
- Research Design
- User Experience Design
Project Concepts
We began the process of ideating for our project first by informally discussing concepts, then through a semi-formal brainstorming session. We categorized ideas, technologies, and goals, and then further developed the ideas we were most interested in. At this stage, these ideas are quite varied, but each is promising in its artistic merit, its consideration of new forms of interaction in VR, and its potential to investigate the resulting products to better understand the emotional and experiential effects of such forms of interaction. We would appreciate any comments or opinions on the projects below as we begin to converge on the project we will select for this class (others may be explored at a later date or may become integrated within another core concept).
1. Tangible Narrative
This concept would explore interaction through non-visual senses and non-linear narrative through a VR installation featuring a table with a variety of tracked tangibles. By picking up and interacting with a variety of tracked objects, the user would uncover elements of a story, which could be visual, oral, aromatic, or haptic. An initial concept is telling a story from the perspective of a tree, which questions society's disposable products and asks the participant to consider the value of the materials used in production through ordinary objects (e.g. toilet paper, a coffee cup, a handcrafted bowl). This could also be investigated by creating an interactive tangible sculpture that participants are encouraged to explore in virtual space through touch.
Technology: VR headset (Vive), Tangibles and customized hardware (Arduino/Vive Tracker)
2. Virtual Dance Partner
Building on GrooveNet by Omid Alemi et al., we would develop a virtual dance partner. This would take the form of either mixed reality or cave-based VR to avoid simulator sickness. A Kinect would be used both to train the system and for interaction. One option would be to train the artificial intelligence system on pairs of dancers, teaching the algorithm to predict the movements of one dancer from those of the other. This could then be used to develop a mixed reality interface where the virtual partner responds to the movements of the participant (and vice versa). Alternately, the AI could be trained on professional dancers and then used in an installation where participants are encouraged to dance through enhanced feedback of their movements. Multiple users (up to 6) could serve as input and would need to move in time to produce the desired results (coherent rhythmic movements). This system could be sonified to produce sound from the movements. Dancing with a partner involves synchronizing one's movement with both the partner and the music, which can induce flow, increase the feeling of connection, and contribute to well-being. As such, this installation can explore strategies for delivering experiences that contribute to well-being and pro-social attitudes.
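The core training idea, predicting one dancer's pose from the other's, can be sketched with a toy model. The joint count, synthetic data, and use of a simple linear least-squares fit are all assumptions for illustration; GrooveNet itself uses neural networks trained on real motion-capture data.

```python
import numpy as np

# A Kinect v2 skeleton has 25 joints, each with x, y, z coordinates.
N_JOINTS = 25
DIM = N_JOINTS * 3

rng = np.random.default_rng(0)

# Stand-in for motion-capture recordings of two dancers moving together:
# each row is one frame of flattened joint positions.
lead = rng.normal(size=(1000, DIM))                                    # dancer A
follow = lead @ rng.normal(size=(DIM, DIM)) * 0.1 \
         + rng.normal(size=(1000, DIM)) * 0.01                         # dancer B

# Fit a linear map from the lead's pose to the follower's pose.
# Least squares is the simplest analogue of the learned model.
W, *_ = np.linalg.lstsq(lead, follow, rcond=None)

def predict_partner(pose):
    """Predict the virtual partner's pose from the participant's pose."""
    return pose @ W

# At runtime, each Kinect frame would be fed through predict_partner
# to drive the virtual dancer's skeleton.
predicted = predict_partner(lead[:10])
print(predicted.shape)  # (10, 75)
```

A sequence model (e.g. an RNN, as in GrooveNet) would replace the per-frame linear map so the partner's motion depends on movement history, not just the current pose.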
In one possible implementation (on the left), we can also explore a smooth transition from the real world into a virtual one. Using either an MR headset or a VR headset combined with a pre-recorded 360° video of the physical space, we can design a gradual transition from the real world into the virtual. The participant would start by seeing themselves in their real surroundings through the headset. Then a creature would appear in front of them, responding to their movement as if inviting them to dance. The more the participant engages in the dance with the creature, the more the environment around them becomes populated with virtual magical objects. If the participant doesn't want to interact with the dancing creature, the magical elements slowly fade away, returning them to the real-world view.
An alternative implementation involves beautifying one's movements so they appear as professional dance in a virtual mirror. Here, participants would merely mark the dance movements, which the virtual mirror would exaggerate into a full dance phrase. This interaction may encourage participants to explore the range of their body movements, rewarded by aesthetic feedback.
Technology: Kinect, Machine Learning, Projection or Mixed Reality headset
3. Body Mixer
A third concept is an interactive installation that combines the movements of participants into one or more shared bodies, depending on which modes of interaction work best. This would investigate participants' identity, the body ownership illusion, out-of-body experiences, the formation of a shared identity, body asynchrony, movement synchronization, and forms of play and collaboration in VR.
3.1. Projection-based (cave?) Exquisite Corpse
In one form, each participant would be given a portion of the body to control (e.g. two users: top/bottom; four users: one limb each). The resulting "exquisite corpse" would become the sum of their movements and would encourage playful interaction wherein they move in or out of rhythm with each other and the resulting virtual agent.
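The mapping from participants to body regions can be sketched simply: each tracked skeleton contributes only its assigned joints to the shared avatar. The joint names and region split below are illustrative, loosely following the Kinect's named-joint convention.

```python
# Assign each participant a region of the shared body; only joints in that
# region are copied from their tracked skeleton into the "exquisite corpse".
BODY_REGIONS = {
    "top": ["head", "neck", "spine", "shoulder_l", "elbow_l", "hand_l",
            "shoulder_r", "elbow_r", "hand_r"],
    "bottom": ["hip_l", "knee_l", "foot_l", "hip_r", "knee_r", "foot_r"],
}

def combine_bodies(skeletons, assignments):
    """Build one shared skeleton from several participants.

    skeletons:   {participant_id: {joint_name: (x, y, z)}}
    assignments: {participant_id: region_name}
    """
    shared = {}
    for pid, region in assignments.items():
        for joint in BODY_REGIONS[region]:
            shared[joint] = skeletons[pid][joint]
    return shared

# Two participants: one controls the upper body, the other the lower.
alice = {j: (0.0, 1.0, 0.0) for r in BODY_REGIONS.values() for j in r}
bob = {j: (0.0, -1.0, 0.0) for r in BODY_REGIONS.values() for j in r}
corpse = combine_bodies({"a": alice, "b": bob}, {"a": "top", "b": "bottom"})
print(corpse["head"], corpse["foot_l"])  # head from Alice, foot from Bob
```

For four users, the region table would instead list one limb per participant; the combining function stays the same.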
3.2. VR-based Movement Blending
Alternately, the experience could take place entirely in virtual reality, where the users' bodies are variably swapped. The virtual body would shift between completely swapped (the partner becomes a mirror), puppeteering (a mixture of the partner's and one's own body movement, where both are visible and conflicts are highlighted), and combined (producing a single shared body).
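The swapped, puppeteered, and combined conditions can all be expressed as one blending operation: each joint of the rendered avatar is a weighted interpolation of the two users' tracked joints. The weight values and joint name below are illustrative.

```python
# weight = 0.0 renders your own body, 1.0 your partner's (full swap),
# and intermediate values produce the puppeteering / shared-body conditions.
def blend_bodies(own, partner, weight):
    """Interpolate two skeletons; each maps joint_name -> (x, y, z)."""
    return {
        joint: tuple((1 - weight) * a + weight * b
                     for a, b in zip(own[joint], partner[joint]))
        for joint in own
    }

own = {"hand_r": (0.0, 0.0, 0.0)}
partner = {"hand_r": (1.0, 1.0, 1.0)}

print(blend_bodies(own, partner, 0.0))   # own body
print(blend_bodies(own, partner, 1.0))   # fully swapped
print(blend_bodies(own, partner, 0.5))   # single shared body
```

Animating the weight over the session would let the installation move smoothly between the three conditions rather than switching abruptly.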
3.3. Mixture: Exquisite Corpse VR
A third option would be to use the projection-based exquisite corpse with a VR headset worn by an individual who gives control of their virtual body to the other participants.
Technology: Kinect, VR Headset (Vive), Projection
4. Bioresponsive Instrument
The fourth concept is to combine movement and biosensing to produce a virtual reality instrument. Sound would be generated relative to the movement and positions tracked using a Kinect. A Muse 2 or similar would provide heart rate, breathing, and brainwaves to control parameters such as the frequency of note generation, tempo, and the qualities of the notes played, complementing the movements and producing an embodied instrument. The interaction would be visualized, completing the user's sense of embodiment as an instrument. This installation will represent the participant's internal states as an audio-visual composition, providing a complex and intriguing form of biofeedback that can encourage participants to connect with their bodies.
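One possible biosignal-to-sound mapping can be sketched as a set of scaled parameters. The input ranges, output ranges, and the idea of a "relaxation" index (e.g. derived from Muse brainwave band power) are all assumptions for illustration.

```python
def scale(value, lo, hi, out_lo, out_hi):
    """Linearly map value from [lo, hi] to [out_lo, out_hi], clamped."""
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return out_lo + t * (out_hi - out_lo)

def sound_parameters(heart_rate, breath_rate, relaxation):
    """Map biosignals to hypothetical synthesis parameters."""
    return {
        "tempo_bpm": scale(heart_rate, 50, 120, 60, 140),
        # Slower breathing -> more notes per breath cycle.
        "notes_per_breath": scale(breath_rate, 6, 20, 8, 2),
        # A more relaxed state -> softer, slower note onsets.
        "attack_s": scale(relaxation, 0.0, 1.0, 0.01, 0.5),
    }

params = sound_parameters(heart_rate=70, breath_rate=12, relaxation=0.8)
print(params)
```

In the installation, these parameters would be recomputed continuously from the live sensor streams and fed to the synthesis engine alongside the Kinect movement data.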
Technology: Kinect, Muse 2, VR Headset (Vive)
5. JeL – Physiological Synchronization in VR
JeL is a bio-responsive generative art VR installation that encourages participants to synchronize physiological functions such as breath and brainwaves. The installation aims to increase interpersonal connection between participants and the resulting pro-social behaviour. Additionally, JeL encourages participants to develop a feeling of connection with an underwater creature by embodying a jellyfish and collaboratively growing a glass sponge through synchronization with their partner. We began working on this project in Fall 2018 and have put together a semi-functional conceptual prototype. We will continue developing it by refining the user interaction, improving the L-system to generate a glass-sponge-like structure, adding generative audio, extending the interaction to include body tracking and allowing participants to pick their jellyfish through touch, increasing the richness of the environment, and exploring brainwave and movement synchronization in addition to breath synchronization.
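The L-system driving the sponge's growth can be sketched as simple string rewriting. The axiom and rules below are a classic branching example, not JeL's actual grammar; in the installation, each rewriting generation could be triggered by the partners' breath synchronization.

```python
# Rewrite rules: F lengthens existing segments, X spawns left/right branches.
# Bracketed symbols would be rendered as push/pop turtle-graphics state.
RULES = {
    "F": "FF",
    "X": "F[+X][-X]FX",
}

def grow(axiom, generations):
    """Apply the L-system rules for a number of generations."""
    s = axiom
    for _ in range(generations):
        s = "".join(RULES.get(ch, ch) for ch in s)
    return s

# Each time the partners' breathing synchronizes, one more generation
# could be applied and the string rendered as 3D geometry.
structure = grow("X", 2)
print(structure)
```

Tuning the rules toward a lattice-like grammar (rather than this tree-like one) would bring the output closer to the glass sponge's characteristic skeletal structure.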
Technology: BioPlux/Muse 2, two VR Headsets (Vive), Kinect