The other day Dav shared an article that referenced this brilliant TED talk by Chris Milk: How virtual reality can create the ultimate empathy machine. Here are two short excerpts:
It’s a machine, but inside of it, it feels like real life, it feels like truth. And you feel present in the world that you’re inside and you feel present with the people that you’re inside of it with. […] And that’s where I think we just start to scratch the surface of the true power of virtual reality. It’s not a video game peripheral. It connects humans to other humans in a profound way that I’ve never seen before in any other form of media.
So far I had been hesitant to jump on the VR excitement train, but now I am definitely on board! Also because this semester at uni we will be coding 3D pedestrian simulations viewed through a Google Cardboard, yay!
One thing Chris' talk reminded me of was my synthetic synesthesia ideas from spring 2012. Two threads came together back then: learning about synesthesia and being super fascinated by it, and visiting Olafur Eliasson's installation Your atmospheric colour atlas in Aarhus, Denmark. The installation was a room full of dense fog, with light coming from the ceiling in different intense colors. You were intensely immersed in color as you walked through it; other people in the room appeared only as vague outlines, emerging from and vanishing into their realm of colors.
I thought about the following addition I would make to the installation: wireless wristbands that send position and current emotion (measured by skin conductivity and heart rate) to a computer that controls the colors of the lights in such a way that you are at all times soaked in colored light representing your current emotional state. Let's say red for anxious, light blue for relaxed, or some scheme like that. As you encounter other people, everyone's emotional state is made fully transparent by the colored aura surrounding them. Emotions, and therefore colors, might change when you meet someone, which could turn out to be amusing, embarrassing, or something else entirely. Either way, likely an impressive and novel experience.
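To make the idea concrete, here is a minimal sketch of such an emotion-to-color mapping. Everything in it is an assumption for illustration: the normalized sensor inputs, the heart-rate range, and the two-color blend are made up, and real biosignal interpretation would be far more involved.

```python
def emotion_to_rgb(skin_conductance, heart_rate):
    """Map two rough arousal signals to a light color (a hypothetical scheme).

    skin_conductance: normalized 0.0 (calm) .. 1.0 (aroused)  -- assumed scale
    heart_rate: beats per minute
    """
    # Crude arousal estimate: average of the normalized conductance and the
    # heart rate mapped from an assumed 50-120 bpm range onto 0..1.
    hr_norm = min(max((heart_rate - 50) / 70.0, 0.0), 1.0)
    arousal = (skin_conductance + hr_norm) / 2.0

    # Blend from light blue (relaxed) to red (anxious), as in the text.
    relaxed = (173, 216, 230)   # light blue
    anxious = (255, 0, 0)       # red
    return tuple(round(r + arousal * (a - r)) for r, a in zip(relaxed, anxious))

print(emotion_to_rgb(0.0, 50))   # fully relaxed: light blue
print(emotion_to_rgb(1.0, 120))  # fully aroused: red
```

In a real installation this function would run per visitor, driving the lamp cluster above that visitor's tracked position.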
This extended version of the atmospheric colour atlas, let's call it "emotional colour atlas", fits under the self-invented term synthetic synesthesia: temporarily faking synesthesia-like perceptual abilities by technical means, with the purpose of deepening "situational awareness" and "creative presence". So a visitor to the emotional colour atlas might feel inspired to develop a higher awareness of the emotions of the people they interact with in everyday life, as an effect of having been confronted with them in the form of intense colors. And so on… more thoughts about it, with more ideas for "installations", here. Be warned though, it might be a laborious read, for it was merely a compilation for myself back then.
Besides empathy and sensitivity, could this approach serve other fields as well? As some of my previous posts indicate, I am increasingly interested in education with a focus on elegant teaching concepts. Meaning not the circumstances of teaching but the pure act of "transmitting content" so as to construct a "maximally useful and sustainable new neuronal structure" in the student's brain :)
Let me reference once again Alan Kay's excellent TED talk from 2007: A powerful idea about ideas. THE most meaningful TED talk ever for me personally. In previous posts I referred to the excellent example he uses about the qualitatively different approaches to teaching the Pythagorean theorem. Now I would like to show screenshots from a demo he gives using the software that he developed for the $100 Laptop with the Viewpoints Research Institute and others:
What you see there is the recording of a ball thrown from a roof, split into frames. Despite equal time between frames, you see how the distance the ball travels increases from frame to frame: acceleration. But what kind of acceleration?
To find out, Alan stacks together the rectangles he inserted to measure the distance between the ball positions. Now it becomes visually obvious that the acceleration is constant, because the height of the rectangles increases linearly! Finally, he uses these insights to code a tiny program that models a falling ball. And now comes the part where the teaching loop closes: he uses the video recording to show that the virtual model accurately reproduces what physical reality does! Here is a snapshot of the real and the virtual ball falling side by side:
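The physics behind that observation can be sketched in a few lines. This is not Alan's actual Etoys code, just a minimal illustration (with made-up frame rate and units) of why the rectangle heights grow linearly: under constant acceleration, the distance covered between equally spaced frames increases by the same amount every frame.

```python
G = 9.81   # gravitational acceleration in m/s^2
DT = 0.1   # assumed time between video frames in seconds

def frame_distances(n_frames):
    """Distance fallen between consecutive frames, starting from rest."""
    # Position under constant acceleration: x(t) = 0.5 * G * t^2
    positions = [0.5 * G * (i * DT) ** 2 for i in range(n_frames + 1)]
    return [positions[i + 1] - positions[i] for i in range(n_frames)]

distances = frame_distances(5)
print(distances)

# The step between successive frame distances is constant (G * DT^2):
# exactly the "rectangles grow by the same amount" observation.
deltas = [b - a for a, b in zip(distances, distances[1:])]
print(deltas)
```

Seeing that constant step in the stacked rectangles is what lets the student conclude "constant acceleration" without any calculus.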
Now we come to the point of this post. In that spirit… how about using VR to augment a student's vision with "Iron Man features" like tracking objects, velocities, trajectories and the like in real time? The software could learn about object properties and help assemble experimental setups. For instance, to count down to the moment to kick the ball off a slide so that it lands exactly on the chimney of the Lego train that will pass by a few seconds later. Maybe the physics engine wouldn't be parametrized correctly to begin with… maybe the student has to find the correct formula first, or the right constant to put into it? That way, the "tangible reward" for getting the formula right is overlays of the physical world that are actually correct! Imagine a setup where a whole class has to find a set of formulas in that way in order to assemble a grand experiment in which the digital model accurately predicts the physical one. Awesome! Or a spellchecker for your handwriting? A fact-checker of what the teacher says… ?
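The countdown idea boils down to one small formula the student would have to get right. Here is a hedged sketch under simplifying assumptions I am making up for illustration (the ball simply drops from the slide's edge, the train moves at constant speed toward the landing point, air resistance ignored); the function name and parameters are hypothetical:

```python
import math

G = 9.81  # m/s^2

def release_delay(drop_height, train_distance, train_speed):
    """Seconds to wait before releasing the ball so it meets the chimney.

    drop_height:    height of the slide's edge above the chimney (m)
    train_distance: current distance of the chimney from the landing point (m)
    train_speed:    speed of the train toward the landing point (m/s)
    """
    fall_time = math.sqrt(2 * drop_height / G)   # free-fall time from rest
    train_time = train_distance / train_speed    # time until the chimney arrives
    delay = train_time - fall_time
    if delay < 0:
        raise ValueError("Too late: the train arrives before the ball could land.")
    return delay
```

If the student plugs in a wrong value for G, the overlay's countdown would visibly miss the chimney, which is exactly the kind of tangible feedback described above.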
So, analogous to "synthetic synesthesia", the basic idea is: using technological augmentation of senses and perception temporarily, in order to stimulate an inherent desire to expand one's own abilities and knowledge.
It stands to reason that cool, immersive and highly experiential stuff with lots of wow factor is being tried out in the VR market, because stakeholders want to grab their portion of the public's attention. However, I think in the context of schools and universities it is also advisable to stay close to the curriculum with VR experiments, so as not to widen the gap between "exciting new stuff" and "boring school stuff". Also, the tool must step out of the way as soon as possible! This principle has to live at the very base of the design. Students should not become dependent on the VR tools, but get excited about learning the underlying knowledge themselves. Having it solidly anchored in their own brains must be the desire they feel when using edu-augmenting technology.