From the Vulcan Mind-Meld to Stranger Things, we have a continuing fascination with getting inside other people’s heads. These abilities are usually presented as extraordinary, but advances in neuroscience tools - like electroencephalograms (EEG) and brain-computer interfaces (BCI) - are making them a reality. How can we harness these technologies to improve our quality of life and the character of our built environment?
Taking a human-centred approach means incorporating human perspectives throughout the design process. When we design a building, we design for the experience of the occupants. How does the level of lighting make them feel? How about the acoustics or the ambient temperature? The challenge is that everyone is different and, as Dean Morris from our Melbourne office noted, current feedback methods, such as questionnaires and surveys, don’t always accurately reflect how people actually feel in these spaces.
Keen to explore the possibilities further, Will Gouthro, Melbourne’s Acoustics leader, approached Dean, Toby Welch - ‘our in-house MakerSpace guru’ - and programmer England Kwok to road-test a brain-computer interface (BCI) that could then be adapted for immersive spaces. Embracing a playful, exploratory and quick ‘out of the box’ approach, they 3D-printed an EEG head capsule out of ABS thermoplastic - the same material LEGO is made of - and wired it up to complementary open-source software.
Just as a polygraph lie-detector test gives feedback on physiological responses like heart rate and blood pressure, this device provides a live feed of the subject’s brainwaves. EEGs typically measure five brainwave bands: Delta, Theta, Alpha, Beta, and Gamma. These bands serve different functions under different circumstances, and range from very low frequencies starting at 0.5Hz (Delta: deep relaxation) to very high frequencies peaking around 100Hz (Gamma: heightened perception and cognitive processing).
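The five bands can be sketched as a simple frequency lookup. Note that the exact cut-offs between bands vary slightly between sources; the boundaries below are common conventions, chosen here for illustration only:

```python
# Classify an EEG frequency (Hz) into its conventional band.
# Boundary values are common conventions; sources differ slightly.
BANDS = [
    ("Delta", 0.5, 4.0),    # deep relaxation / sleep
    ("Theta", 4.0, 8.0),    # drowsiness, meditation
    ("Alpha", 8.0, 13.0),   # relaxed wakefulness
    ("Beta",  13.0, 30.0),  # active, engaged thinking
    ("Gamma", 30.0, 100.0), # heightened perception, cognitive processing
]

def classify_band(freq_hz: float) -> str:
    """Return the conventional EEG band name for a given frequency."""
    for name, low, high in BANDS:
        if low <= freq_hz < high:
            return name
    return "outside conventional EEG range"

print(classify_band(10.0))  # → Alpha
print(classify_band(2.0))   # → Delta
```

In a real EEG pipeline the raw voltage signal would first be decomposed into these bands (typically via band-pass filtering or a Fourier transform) before any per-band power is interpreted.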
The team exposed their volunteer subjects to various triggers including music, turning lights on and off, and physical movements like stretching and flexing limbs. The resulting emotional and sensory responses appeared on a screen, shifting from dense jagged peaks to languid sinusoidal curves and back again. ‘Even putting unconscious bias aside, people can often be quite accepting or unquestioning of aesthetics unless they really know what they want or what they like. We’re trying to rethink and reshape how we gain feedback from the earliest stages of a project,’ Dean says.
Just imagine the positive gains to human-centric design if we had a device that allowed us to understand the emotional core of a user’s experience - Dean Morris
And just like a polygraph test, an expert is required to interpret the data. ‘The key to getting firm results is in the benchmarks,’ Dean said. ‘You need a baseline for the various waves that you can equate to a range of emotions. For example, using facial recognition to attribute an emotion, we already have a data set of more than a million people in various stages of smiling or frowning. There’s a strong baseline.’ Still, BCI research remains in a nascent stage across scientific fields. Further work is also needed before we test the head capsule in an uncontrolled environment.
As further research is undertaken and greater expertise engaged, the potential for a more nuanced understanding of how humans respond to environmental inputs is huge. Our recent project ‘Casting New Light on Comfort’ explored how changes to lighting levels could simultaneously improve occupant comfort and building sustainability. ‘I see great opportunities for air-flow, lighting and acoustic environments to positively affect our happiness and energy levels,’ said Dean, noting that the natural next step for the project is to test it in the Arup SoundLab.
The possibilities are also huge when it comes to moving things with our minds. ‘Experimenting with BCIs has been happening in the creative space for a while now. Art and lighting installations and gaming have all used neuroscience techniques,’ said Dean. One example is Laura Jade’s 2015 installation Brainlight, created for her UTS Masters. The wearer dons a cap that measures the electrical activity of the brain, and the resulting waves are translated into colours that light up a laser-cut plastic brain.
If we can generate a signal from our brains using EEG technology, we can program technology to respond to that signal. This is the idea behind the BCI. Using a software interface, we can link the brain with an external device and enable the brain to direct the device. The applications for assistive technology are many and powerful, such as allowing a paralysed person to control a prosthetic limb, or to move a cursor to the next letter or word on a screen. The increasing sophistication of BCIs – like wireless options – adds even more flexibility to these applications.
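The signal-to-device loop described above can be sketched in a few lines. This is a minimal illustration, not any particular product’s implementation: the threshold, the ‘dim lights’ command and the sustained-reading rule are all hypothetical choices, and the alpha-power values stand in for a real filtered EEG stream:

```python
# Minimal BCI control-loop sketch (illustrative only).
# A stream of alpha-band power estimates is watched; when the power stays
# above a calibrated threshold for several consecutive readings, a
# (hypothetical) device command is fired.

ALPHA_THRESHOLD = 0.6  # assumed value, set from a per-user calibration baseline
SUSTAIN_SAMPLES = 3    # require consecutive readings to avoid reacting to noise

def control_loop(readings):
    """Return the device actions fired for a sequence of alpha-power readings."""
    streak, actions = 0, []
    for power in readings:
        streak = streak + 1 if power > ALPHA_THRESHOLD else 0
        if streak >= SUSTAIN_SAMPLES:
            actions.append("dim_lights")  # hypothetical device command
            streak = 0                    # reset after firing
    return actions

# Three sustained high-alpha readings trigger the action; a dip resets it.
print(control_loop([0.7, 0.8, 0.9, 0.2, 0.65]))  # → ['dim_lights']
```

The same loop structure underlies the assistive examples above: swap the alpha-power stream for a motor-imagery classifier’s output and the light command for a cursor step, and the architecture is unchanged.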
'This type of application is already being used in art installations to control lighting and sound around the world, or in games and relaxation apps via EEG controllers by companies like InteraXon, Emotiv and NeuroSky,' Dean says.
One day we might be able to control air flow or day lighting simply with our minds - Dean Morris
And there are implications for our external environment. Recent research suggests that, contrary to conventional wisdom, using brighter lights to illuminate streets at night doesn’t necessarily make us feel safer. Could this tech help us work out what actually does make people feel safer? Whether it’s accessibility, sustainability or sensory aesthetics, the emerging field of neuroscience is full of potential to improve our quality of life (as well as stimulating our pop-culture imaginations!).
We work with industry partners, governments, universities, startups and community organisations. We do this through research partnerships, and as consultants and facilitators for foresight, research, storytelling and technical writing workshops.