In research powered by machine learning, scientists found it is possible to tell which color someone is seeing by monitoring their brain activity, independent of their words for that color, adding to our understanding of how memories are formed and helping researchers better "decode" the specific features they contain.
Published April 6 in NeuroImage, the paper details an experiment in which researchers placed electrodes on the scalps of 30 people — 61 sensors each sampling brain activity 1,000 times per second — to record those participants' neural responses as they viewed colors in the context of a visual memory task.
"We presented participants with two different colors and two different orientations on the screen at the same time," Jasper Hajonides, a researcher with the Oxford Centre for Human Brain Activity who studies perception and memory, and the lead author of the paper, told The Academic Times. "Then we used machine learning to decode from the pattern of activity which colors people were seeing on the screen. We were trying to read out their perceptual processes by using their brain activity.
"What we basically showed is that we're well able to decode the colors that people see on the screen," he continued. "We can reliably predict that they're seeing blue, purple, green, red, yellow, orange."
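A minimal sketch of this kind of decoding, assuming a standard linear classifier (the paper's actual algorithm is novel and not reproduced here), might look like the following. The data are synthetic; only the shapes mirror the study's setup of 61 sensors at 1,000 Hz and 12 color classes:

```python
# Hypothetical sketch (not the authors' pipeline): decoding a color label
# from multichannel EEG-like data with a linear classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_channels, n_times = 240, 61, 100  # 100 samples = 100 ms at 1 kHz
n_colors = 12

# Synthetic "EEG": noise plus a small color-specific spatial pattern.
labels = rng.integers(0, n_colors, size=n_trials)
patterns = rng.normal(size=(n_colors, n_channels))
eeg = rng.normal(size=(n_trials, n_channels, n_times))
eeg += patterns[labels][:, :, None] * 0.5

# Flatten channels x time into one feature vector per trial.
X = eeg.reshape(n_trials, -1)

# Cross-validated multiclass decoding; chance level is 1/12.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, labels, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f} (chance = {1 / n_colors:.2f})")
```

Real EEG decoding involves far more preprocessing (filtering, epoching, artifact rejection), but the core idea is the same: a classifier learns which spatial patterns of activity go with which stimulus.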
The conclusion that the brain processes color itself suggests that low-level cognitive mechanisms are at work, Hajonides said. "The chromaticity in the color — how much red it has, how much green — is really on a continuum," he explained. "It's not linked to the labels we've assigned." This separation of visual processing from the meanings we attach to "red" or "blue" was observed across the 48 colors used in the trial, though these were grouped into 12 color classes in the paper to increase the number of samples per class.
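The grouping step can be illustrated with a short sketch; the exact grouping used in the paper is an assumption here, but the idea is that evenly spaced hues around the color wheel are collapsed into fewer, coarser classes so each class has more trials behind it:

```python
# Hypothetical illustration of binning 48 hues into 12 color classes.
import numpy as np

n_hues, n_classes = 48, 12
hues_deg = np.arange(n_hues) * (360 / n_hues)  # 0, 7.5, 15, ... degrees

# Adjacent hues share a class label: hues 0-3 -> class 0, 4-7 -> class 1, ...
class_labels = np.arange(n_hues) // (n_hues // n_classes)

print(class_labels[:8])  # [0 0 0 0 1 1 1 1]
```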
The researchers did not observe colors being processed by people's brains in real time, though doing so is theoretically possible. In a process that takes about two hours, they deciphered which colors the participants were presented with by using a novel machine-learning algorithm, which Hajonides views as a starting point to investigate and decode many different types of perceptions.
In his future experiments, for example, participants will be instructed to ignore one item and focus on another: "Then you can start asking questions like, 'How much can the brain actually block out information that is not relevant?' Or we could use this to try to decode somebody's memory. What happens after the image disappears from the screen? How long does this perceptual stage last? We can start asking these sorts of questions using machine-learning algorithms."
Current methods of tracking memory content are somewhat limited. Previous studies using human scalp electroencephalography (EEG) have successfully decoded the orientation, spatial location, frequency and motion direction of visual stimuli, but none had decoded nonspatial features such as color.
"That doesn't tell you anything inherent about what our memories look like or how our memories are formed," Hajonides said of spatial EEG experiments. "I tuned the algorithms a bit to better track the actual features contained in memories. That's why I looked at this algorithm, so it could inform all of my experimental questions."
Among those questions is one that strays into philosophy: Do other people see the sky in a different shade of blue, or are we assigning the same name to different colors? As the new paper demonstrates, there are measurable differences in how people perceive colors in relation to other colors. Some people can distinguish very well between, say, red and blue, while others see them as more similar. Or someone could see red and green as really close — especially if the person is color blind — while for others, they are plainly distinct.
"It's a question we'll probably never be able to answer, whether your blue is the same as my blue," Hajonides said. "We cannot see the objective color, but we can see how the colors relate to each other."
He has since turned his research focus to whether seeing one color shortly after another influences perception. He's finding that the brain perceives red and blue as being farther apart when they are presented back-to-back, which could be in line with recent research suggesting that our memories work better when they exaggerate small differences. "Seeing a color a second ago shifts your perspective on the color you're currently seeing," Hajonides said. "You try to make the world as dissimilar as possible to really pick up on the changes going on."
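The repulsion effect he describes can be pictured with a toy model. The function below is my own illustration, not anything from the paper: it nudges the currently perceived hue a fixed amount away from the previously seen one.

```python
# Toy model (an assumption, not the paper's model) of serial-dependence
# repulsion: the perceived hue of the current stimulus is pushed away
# from the hue of the previous stimulus.
import math

def perceived_hue(current_deg, previous_deg, gain=5.0):
    """Shift the current hue (degrees) away from the previous one."""
    # Signed circular difference in (-180, 180].
    diff = (current_deg - previous_deg + 180) % 360 - 180
    # Repulsion: push away from the previous hue by a fixed gain.
    shift = gain * math.copysign(1.0, diff) if diff != 0 else 0.0
    return (current_deg + shift) % 360

# A red (10 degrees) seen right after an orange (30 degrees) is
# perceived as slightly "redder", i.e., pushed further from orange.
print(perceived_hue(10, 30))  # 5.0
```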
"There's a lot more to perception than people think," he added. "A lot of factors influence it, and I'm very keen to show how perceptions are shaped by everything — your rewards, biases, history, attentional conditions. That's what I'm exploring."
The paper, "Decoding visual colour from scalp electroencephalography measurements," published April 6 in NeuroImage, was authored by Jasper E. Hajonides, Anna C. Nobre and Mark G. Stokes of the University of Oxford; and Freek van Ede of the University of Oxford and Vrije Universiteit Amsterdam.