
Muse S Gen 2 vs Gen 1 and viability for Lucid Dream induction/analysis

Updated: Jun 14, 2022

Below are some further updates from the last post on the Muse band, as I’ve been looking at additional use cases for the hardware and comparing the Gen 2 version against the Gen 1. A follow-up model released earlier this year improved mainly the temporal sensors, which are now much larger than in the original version. I still have the old one, and you can see the sensor size difference on the right in the image. The rest of the band and hardware is much the same; the only other change worth noting is the faster battery charging:

I’ve found the data consistency to be much better on the temporal sensors, which has helped in analyzing EEG data: they pick up less noise from muscle movements and blinks than the frontal ones, so I now use them where possible, provided I haven’t dislodged the band overnight. That said, in my experience these temporal sensors are still more prone to connection issues than the frontal ones, despite being much larger.


Viability for Lucid Dream Induction

I’ve written before about the history of lucid dream induction, as well as a very low-cost approach to building an induction device using wearable Arduino boards: https://jabituyaben.wixsite.com/majorinput/post/lucid-dreaming-with-the-circuit-playground-express


For the Muse band, I’ve trained a convolutional neural network that estimates sleep stages from fairly short epochs of EEG data at a fast rate. This means it should be possible to build a workflow on top of it to attempt triggering a lucid dream.
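
As a rough illustration of the kind of classifier I mean, here’s a minimal sketch in Keras; the architecture, the 30-second epoch length at 256 Hz, the four-channel input and the five-stage labels are assumptions for this example, not the exact network I trained.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Assumed input: 30 s epochs at 256 Hz from 4 Muse channels -> shape (7680, 4)
# Assumed output: 5 sleep stages (Wake, N1, N2, N3, REM)
def build_sleep_stage_cnn(epoch_len=30 * 256, n_channels=4, n_stages=5):
    return tf.keras.Sequential([
        layers.Input(shape=(epoch_len, n_channels)),
        # 1D convolutions over time learn frequency-like features from raw EEG
        layers.Conv1D(32, kernel_size=64, strides=8, activation="relu"),
        layers.MaxPooling1D(4),
        layers.Conv1D(64, kernel_size=16, activation="relu"),
        layers.MaxPooling1D(4),
        layers.Conv1D(64, kernel_size=8, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(n_stages, activation="softmax"),
    ])

model = build_sleep_stage_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```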


Lucid dream induction based on EEG isn’t new: Hypnodyne Corp’s ZMax supports this kind of thing, but it doesn’t come cheap (https://hypnodynecorp.com/lineup.png), and I haven’t seen any feedback on the device’s capability in this space.



Induction via an EEG headband is an appealing approach when you consider some of the alternatives:

1. Checking for eye movement (EOG)

a. Most of these require some form of hardware that sits either in front of your eyes or on your face to track rapid eye movement. Something like the Muse band is less intrusive and more comfortable.

b. Often, by the time these devices pick up that you’re in REM, you are already in some kind of dream narrative, to the point where dream cues just get worked into the narrative rather than inducing lucidity.

2. Using a heart rate sensor

a. I haven’t really seen many implementations of this, but it’s a viable approach; something like this, for example: https://apps.apple.com/us/app/id1566419183


There aren’t many reviews, so I’m not sure how well this specific app works, but it is definitely possible to estimate REM phases by combining the accelerometer, HR sensor and body temperature sensor. It’s not going to be as accurate as EEG, particularly from a timing perspective, which is the main downside of this approach: most HR-based methods struggle to track anything in real time and tend to work better at calculating sleep stages retrospectively, with the whole night’s data at hand.


The challenge is that Muse no longer has a supported SDK, so there’s no officially supported way to stream the data other than Mind Monitor’s OSC streaming method. The other option is to use muse-lsl:
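
For reference, here’s a minimal sketch of pulling samples once `muselsl stream` is broadcasting the headband over LSL; reading sample-by-sample and printing is purely for illustration.

```python
from pylsl import StreamInlet, resolve_byprop

# Assumes `muselsl stream` is already running and broadcasting the Muse over LSL
streams = resolve_byprop("type", "EEG", timeout=10)
if not streams:
    raise RuntimeError("No EEG stream found - is `muselsl stream` running?")

inlet = StreamInlet(streams[0])
while True:
    # One value per channel; the Muse exposes TP9, AF7, AF8, TP10 (+ AUX)
    sample, timestamp = inlet.pull_sample()
    print(timestamp, sample)
```

Mind Monitor’s OSC route works much the same way, except you listen for its `/muse/eeg` messages with an OSC server instead of opening an LSL inlet.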


Now that I’ve established a way to estimate REM stage sleep, I will look at options to stream the data and then trigger an audio cue to attempt lucid dream induction.
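
Putting those pieces together, an induction loop might look something like the sketch below. It reuses the `model` from the earlier sketch, and the REM class index, confidence threshold, cooldown and cue file are all illustrative assumptions.

```python
import time
import numpy as np
import simpleaudio as sa
from pylsl import StreamInlet, resolve_byprop

SF = 256                  # Muse EEG sample rate (Hz)
EPOCH = 30 * SF           # 30-second epochs, matching the classifier
REM_IDX = 4               # assumed index of the REM class in the model output

# `model` is the trained network from the earlier sketch (assumed loaded here)
inlet = StreamInlet(resolve_byprop("type", "EEG", timeout=10)[0])
cue = sa.WaveObject.from_wave_file("cue.wav")  # hypothetical audio cue file
last_cue = 0.0

buffer = []
while True:
    sample, _ = inlet.pull_sample()
    buffer.append(sample[:4])                 # keep the 4 EEG channels
    if len(buffer) >= EPOCH:
        epoch = np.asarray(buffer[-EPOCH:])[np.newaxis]  # shape (1, 7680, 4)
        probs = model.predict(epoch, verbose=0)[0]
        # Play the cue only on confident REM, at most once every 5 minutes
        if probs[REM_IDX] > 0.8 and time.time() - last_cue > 300:
            cue.play()
            last_cue = time.time()
        buffer = buffer[-EPOCH // 2:]         # slide the window by half an epoch
```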


Also of interest is what EEG signals look like during a lucid dream. I haven’t managed to have a lucid dream with the Muse band yet, but I do have a couple of spectrograms of failed attempts at wake-induced lucid dreaming (WILD), each spanning about 30–40 minutes.

In these spectrograms, you can see a high amount of alpha activity along with some theta waves. The alpha levels are much higher than in any other situation I can compare against. I haven’t been able to reproduce them during meditation in the day or before sleep, only during WILD, which suggests circadian cycles play a big part, at least for me. Theta signals generally crop up during sleep onset, so that isn’t so surprising: as I say, these are failed attempts, and in both cases I eventually gave up and went to sleep despite attaining some level of sleep paralysis.
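
To quantify this rather than eyeballing the spectrogram, relative band power can be computed with Welch’s method; a minimal sketch, where the band edges are the conventional ones and the input signal is a placeholder:

```python
import numpy as np
from scipy.signal import welch

def relative_band_power(data, sf=256, band=(8, 12)):
    """Fraction of total EEG power falling in `band` (Hz), via Welch's method."""
    freqs, psd = welch(data, sf, nperseg=4 * sf)  # 4 s windows -> 0.25 Hz bins
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum() / psd.sum()  # uniform bins, so a simple sum suffices

eeg = np.random.randn(60 * 256)  # placeholder: 60 s of one channel at 256 Hz
print("alpha:", relative_band_power(eeg, band=(8, 12)))
print("theta:", relative_band_power(eeg, band=(4, 8)))
```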


What I’d expect to see for a lucid dream is slightly stronger signals across all bands than I normally have during REM. REM shows a similar mix of frequencies to the wake state, just at lower strength, so I’d imagine lucid dreams would look a little stronger than normal REM if anything. This is also based on research comparing EEG during lucid dreaming with wake and REM: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2737577/


Astral projection or out-of-body experiences (OOBEs) are observed during NREM sleep, so those have a different profile from normal REM stages, which will also be worth trying to track.


This is an example of the dashboard I’ve been using to profile sleep. It’s built from a few different Python modules; a key one for me is this one, for hypnogram and spectrogram plotting:
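
The module itself is linked above rather than named, so purely as an example of this style of one-line plotting, here’s a minimal sketch using YASA (the EEG array and hypnogram are placeholders, and YASA isn’t necessarily the module in question):

```python
import numpy as np
import yasa

sf = 256
data = np.random.randn(3600 * sf)        # placeholder: 1 h of one EEG channel
hypno = np.zeros(3600 // 30, dtype=int)  # placeholder hypnogram, 30 s epochs

# Upsample the per-epoch hypnogram to one label per EEG sample, then overlay
hypno_up = yasa.hypno_upsample_to_data(hypno, sf_hypno=1/30, data=data, sf_data=sf)
yasa.plot_spectrogram(data, sf, hypno=hypno_up)
```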


It’s clear that REM and N2 are probably misinterpreted in a few places by the model I’m using, and as a start I’ll need to look at balancing the training data for the neural network, among other tweaks, to fix some of that.
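
One simple starting point for the balancing, sketched under the assumption of integer stage labels: weight the loss by inverse class frequency, so the rarer stages count for more during training.

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

# y_train: per-epoch stage labels, e.g. 0=Wake ... 4=REM (assumed encoding)
y_train = np.random.randint(0, 5, size=1000)  # placeholder labels
classes = np.unique(y_train)
weights = compute_class_weight(class_weight="balanced", classes=classes, y=y_train)
class_weight = {int(c): w for c, w in zip(classes, weights)}

# Passed to Keras so under-represented stages get a larger loss weight:
# model.fit(X_train, y_train, class_weight=class_weight, ...)
```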

And then, just to show two different examples of what the clinical data the neural network is based on looks like, you can see how much stronger those EEG signals generally are:

As I said, the next stage will be to use muse-lsl to stream the data rather than relying on Mind Monitor, and then to see whether we can classify sleep stages as close to real time as possible.
