A new international study has shown for the first time that people having lucid dreams during rapid eye movement sleep can perceive simple questions and answer them through signals such as eye movements and facial muscle contractions, demonstrating that two-way communication during sleep is possible.
In a paper published Feb. 18 in Current Biology, research teams from the U.S., France, Germany and the Netherlands found that sleepers having lucid dreams were able to engage in “interactive dreaming” and correctly answer questions posed by the researchers in 18.4% of trials.
Lucid dreams occur when a sleeping person becomes aware that they are dreaming. They most often take place during REM sleep, and dreamers can signal that they know they are dreaming with prearranged eye movements. Some participants in this state could do simple math, answer yes-or-no questions or distinguish between types of sensory stimuli.
“These interactions demonstrate cognitive capabilities beyond what was thought to be feasible for a dreaming person,” Ken Paller, a neuroscience professor at Northwestern University and a senior author of the paper, told The Academic Times.
“Dreams have for so long been mysterious for humanity. And we still lack a satisfactory explanation for them, but now we can ask dreamers about their experiences at the time they are having them. Hopefully, in this way we can gain further insights into why we dream, and how dreams might reflect the brain’s behind-the-scenes efforts to maintain optimal memory functions, creatively use our memories to solve problems and cope with the various mental and emotional challenges we face each day,” he said.
Four separate studies were carried out independently in labs at Northwestern University in the U.S., Sorbonne University in France, Osnabrück University in Germany and Radboud University Medical Center in the Netherlands. The researchers then collaborated for the paper published in Current Biology.
The studies took slightly different approaches, but all came to the same conclusion that real-time communication with people having lucid dreams is possible, a finding that the researchers said “opens the door to a new approach for scientific exploration of the dream state.”
The four studies featured a total of 36 participants across three categories: experienced lucid dreamers, people with minimal prior lucid dream experience whom the experimenters trained to lucid dream, and one patient with narcolepsy who frequently experienced lucid dreams. Evidence of two-way communication was found in all three categories.
In each case, the participant had to successfully fall asleep in the laboratory environment, reach REM sleep, enter a lucid dream state and indicate this with eye movements. If all this was achieved, the researchers could attempt to ask the sleepers questions and test their response to stimuli.
In 26% of the sessions, participants successfully signaled to indicate that they were in a lucid dream. The researchers were able to run 158 trials in which all the conditions were met, and the participants gave correct answers 18.4% of the time, or in 29 of the trials.
For example, one participant correctly answered the spoken math problem 8 minus 6 by moving his eyes left to right two times to indicate 2. Before falling asleep, participants were instructed to answer questions with horizontal eye movements that sweep widely from side to side; these signals stand out from typical eye movements during REM sleep.
After achieving successful two-way communication, the researchers awakened the sleepers to obtain a dream report. Participants typically reported that the experimenters' questions had reached them inside their dreams, though some questions were distorted or not recalled at all.
In some cases, participants reported that the researchers' signals seemed to come from outside the dream or were superimposed over it; other signals arrived through elements of the dream itself, such as a question heard playing through a radio.
The results also show evidence of sleep learning. When a participant awoke and reported having been asked to compute the answer to a math problem, they were demonstrating information acquired while asleep.
For future studies, the authors suggested shorter intervals for sleep staging, and using the same methods to assess cognitive abilities during dreams versus being awake. Further experiments could help verify the accuracy of dream reports that people give after they wake.
Paller said the team is currently working on applying the study’s methods outside the laboratory. They developed an app for people to use at home that could potentially help treat sleep problems like nightmares. The researchers plan to conduct follow-up studies to learn more about connections between sleep, dreams and memory processing.
“We’ve long known that cognition and consciousness are not shut off during sleep, but our results now broaden the opportunities for empirically peering inside the sleeping mind,” the authors said in the paper.
“The advent of interactive dreaming — with new opportunities for gaining real-time information about dreaming, and for modifying the course of a dream — could usher in a new era of investigations into sleep and into the enigmatic cognitive dimensions of sleep,” they said.
The study, “Real-time dialog between experimenters and dreamers during REM sleep,” was published Feb. 18 in Current Biology. Ken Paller of Northwestern University was a senior author of the paper. Isabelle Arnulf and Delphine Oudiette, of Sorbonne University, Martin Dresler, of Radboud University Medical Center, and Gordon Pipa, of Osnabrück University, also served as senior authors. Karen Konkoly, Bruce Caughran, Sarah Witkowski, Nathan Whitmore and Christopher Mazurek, of Northwestern University; Emma Chabani, Basak Turker, Smaranda Leu-Semenescu and Jean-Baptiste Maranci, of Sorbonne University; Anastasia Mangiaruga, Jarrod Gott and Frederik Weber, of Radboud University Medical Center; Kristoffer Appel, of Osnabrück University; Remington Mallett, of the University of Texas at Austin; and Jonathan Berent, of NextSense, Inc. all served as co-authors.