Tag Archives: EEG
On June 18-20 we’ll be hosting the QS15 Conference & Expo in San Francisco at the beautiful Fort Mason Center. This will be a very special year with two days of inspiring talks, demos, and discussion with your fellow self-trackers and toolmakers, plus a third day dedicated to the Activate Exposition. As we start to fill out our program we’ll be highlighting speakers, discussion leaders, sponsors, and attendees here.
We are excited to have Bill Schuller contributing to our growing QS15 Conference program with his “Quantified Talk.” Bill has been involved in the Quantified Self community since 2009 and currently organizes the Dallas Fort Worth QS meetup group.
This June, Bill will be sharing his process and what he’s learned from tracking his public speaking. Ever since his very first show&tell talk in 2010, he’s been working to figure out ways to understand and ultimately quell the butterflies and nerves that come with speaking in front of an unfamiliar crowd.
We spoke with Bill about what he’s looking forward to at the conference. Like many of our attendees, he’s interested in what others are learning from their data, what new tools are being used, and how to turn vast amounts of data into actionable information.
“I love to see what wonderful things people are learning by reflecting on their tracking. Of course there’s also the gadgets. So many gadgets. I am also very interested in how QS tools and methodologies can help individuals who happen to run small businesses improve their business outcomes.”
If you’re interested in tracking and improving public speaking, or just want to meet and mingle with our great Quantified Self community members, then register now for the QS15 Conference & Expo.
We’ve featured Bill Schuller here on the Quantified Self website before. He’s given some great talks about tracking with his children. However, Bill hasn’t been satisfied with these or his other previous public speaking efforts. So, like any good self-tracker, he set out to see what he could learn from tracking and measuring his public speaking. In this meta-talk, presented at the Bay Area QS meetup group, Bill presents some of the tools he was using in real time to measure himself and the audience. Once you’re done viewing this talk, make sure to send Bill feedback by filling out his short survey here.
Today’s post comes to us from Rain Ashford. Rain is a PhD student, researcher, and hardware tinkerer who is interested in how personal data can be conveyed in new and meaningful ways. She’s been exploring ideas around wearable data and the hardware that can support it. At the 2014 Quantified Self Europe Conference, Rain led a breakout session on Emotive Wearables during which she introduced her EEG Visualizing Pendant and engaged attendees in a discussion around wearing data and devices.
By Rain Ashford
It was great to visit Amsterdam again and see friends at the 3rd Quantified Self Europe Conference. I have spoken at the conference before, on Sensing Wearables in 2011 and on Visualising Physiological Data in 2013.
There were two very prominent topics being discussed at Quantified Self Europe 2014: firstly the quantifying of grief, and secondly privacy and surveillance. These are two very contrasting and provocative areas for attendees to contemplate, but also very important to all of us, for they’re deeply personal areas we can’t avoid having a viewpoint on. My contribution to the conference was to lead a Breakout Session on Emotive Wearables and to demonstrate my EEG Visualising Pendant. Breakout Sessions are intended for audience participation, and I wanted to use this one-hour session to get feedback on my pendant for its next iteration and also to find out people’s opinions on emotive wearables generally.
I’ve been making wearable technology for six years and have been a PhD student investigating wearables for three. During this time I’ve found wearable technology to be such a massive field that I have needed to find my own terms to describe the areas I work in and focus on in my research. Two subsets I have defined terms for are responsive wearables, which includes garments, jewellery and accessories that respond to the wearer’s environment, to interactions with technology, or to physiological signals taken from sensors worn on or around the body; and emotive wearables, which describes garments, jewellery and accessories that amplify, broadcast and visualise physiological data associated with non-verbal communication, for example the emotions and moods of the wearer. In my PhD research I am looking at whether such wearable devices can be used to express non-verbal communication, and I wanted to find out what Quantified Self Europe attendees’ opinions and attitudes would be about such technology, as many attendees are super-users of personal tracking technology and are also developing it.
My EEG Visualising Pendant is an example of my practice that I would describe as an emotive wearable, because it amplifies and broadcasts physiological data of the wearer and may provoke a response from those around the wearer. The pendant visualises the brainwave attention and meditation data of the wearer simultaneously (using data from a Bluetooth NeuroSky MindWave headset) via an LED (Light Emitting Diode) matrix, allowing others to make assumptions and interpretations from the visualisations: for example, whether the person wearing the pendant is paying attention to and concentrating on what is going on around them, or is relaxed and not concentrating.
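The pendant’s actual firmware isn’t published here, but as a rough illustration of the idea, here is a minimal Python sketch that maps the two NeuroSky eSense readings (attention and meditation, each reported on a 0–100 scale) onto the two halves of a small LED matrix as rising bars. The 8×8 layout and bar-graph mapping are assumptions for the sketch, not the pendant’s own design.

```python
# Illustrative sketch only: maps NeuroSky eSense attention and meditation
# values (each 0-100) onto the two halves of an 8x8 LED matrix as bars
# rising from the bottom. Layout and scaling are assumptions, not the
# pendant's actual firmware.

def esense_to_matrix(attention, meditation, size=8):
    """Return a size x size grid of 0/1 pixels: the left half shows
    attention, the right half meditation, as vertical bars."""
    def bar(value):
        # Scale 0-100 into 0..size lit rows, clamped to the valid range.
        return max(0, min(size, round(value / 100 * size)))

    att_rows = bar(attention)
    med_rows = bar(meditation)
    grid = []
    for row in range(size):            # row 0 = top of the matrix
        lit_from = size - row          # lit if the bar reaches this height
        left = 1 if att_rows >= lit_from else 0
        right = 1 if med_rows >= lit_from else 0
        grid.append([left] * (size // 2) + [right] * (size // 2))
    return grid

# High attention, low meditation: a tall left bar, short right bar.
grid = esense_to_matrix(attention=75, meditation=30)
for row in grid:
    print("".join("#" if px else "." for px in row))
```

On real hardware the grid would be pushed to an LED driver rather than printed, but the mapping from two slow-changing 0–100 signals to a glanceable display is the core of what an observer of the pendant would be reading.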
After I demonstrated the EEG Visualising Pendant, I invited attendees of my breakout session to participate in a discussion and a paper survey about attitudes to emotive wearables, and in particular to give feedback on the pendant. The session included a mix of genders and ages, and we had a great discussion, covering questions such as: who would wear this device, or other devices that amplify one’s physiological data? We discussed the appropriateness of such personal technology and also thought in depth about privacy and the ramifications of devices that upload such data to cloud services for processing, plus the positive and possible negative aspects of data collection. Other issues we discussed included the design and aesthetics of prominent devices on the body, and where we would be comfortable wearing them.
I am still transcribing the audio from the session and analysing the paper surveys that were completed; overall, the feedback was very positive. The data I have gathered will feed into the next iteration of the EEG Visualising Pendant prototype and future devices. It will also feed into my PhD research. Since the Quantified Self Europe Conference, I have run the same focus group three more times in London with women interested in wearable technology. I will update my blog with my findings from the focus groups and surveys in due course, plus of course information on the EEG Visualising Pendant’s next iteration as it progresses.
Rain Ashford is a PhD student in the Art and Computational Technology Program at Goldsmiths, University of London. Her work is based on the concept of “Emotive Wearables” that help communicate data about ourselves in social settings. This research and design exploration has led her to create unique pieces of wearable technology that both measure and reflect physiological signals. In this show&tell talk, filmed at the 2013 Quantified Self Europe Conference, Rain discusses what got her interested in this area and one of her current projects – the Baroesque Barometric Skirt.
Arlene Ducao came to QS through using the Wii Fit to track personal metrics. As a researcher and maker she started to apply the lessons from self-tracking to another of her interests, cycling. A frequent bike rider, she started with simple customizations like adding LEDs to her helmet. When consumer EEG devices came on the market she explored the possibility of a mind-controlled turn signal system. While that didn’t pan out, it did lead her down a path to create the Mind Rider project, a bike helmet with an integrated EEG unit and tracking system. In this talk, filmed at the New York QS meetup group, Arlene talks about how the project evolved and what they are finding out by having riders wear the helmet during their commutes.
This guest post comes to us from QS Los Angeles member, Mark Krynsky.
I went to the Los Angeles Quantified Self Meetup meeting on March 7th and had a fantastic time meeting like-minded people that are all willing to experiment and share their experiences. The meetup was held at the artisanal engineering studios of the Two Bit Circus located in the eclectic Brewery Art Colony. This made for a really great venue.
The first speaker was Brent Bushnell, the Circus ringleader. He walked us through a project his team worked on for Extreme Makeover: Home Edition, where they built a relaxation chair for a veteran suffering from PTSD. They used sensors to track his biometrics to help identify when he might be susceptible to trauma. When certain heart-rate thresholds were met, the veteran would sit in the chair, which would play soothing sounds and included an aromatherapy device.
Another breakout session preview for the upcoming QS conference: feel free to connect with the leaders in the comments!
Measuring cognitive functions is difficult, but it provides a much richer understanding of ourselves compared to the single-dimension measurements (such as steps taken, heart rate, and weight) that have been the primary focus of the QS community.
One approach to measuring cognitive functions is behavioral: inferring cognitive state from our actions and our ability to respond to stimuli. This lies at the heart of traditional psychometrics, the field of psychology concerned with such measurements. Unfortunately, traditional psychometrics mostly focused on measuring differences between individuals, treating a person as a single data point and comparing them to the general population. In QS, we care about within-person variation: how do our cognitive functions vary at different times and how does this variance relate to our actions? This kind of knowledge can lead us to choose actions that lead to desired cognitive outcomes.
Quantified Mind is a tool designed specifically for measuring within-person variation in cognitive abilities and learning which actions we can take to influence our cognitive functions. In other words, what makes you smarter? It uses short and engaging cognitive tests that are based on many years of academic research but modified to be short, repeatable and adaptive. Quantified Mind can be used by any individual to learn about their own brains, and also invites users to participate in structured experiments that examine common factors such as diet, exercise and sleep.
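Quantified Mind itself is a web tool, but the within-person logic it relies on can be sketched in a few lines: the same person’s test scores under two conditions are compared against each other, not against a population norm. The reaction times below are invented for illustration, and the caffeine contrast is just an example condition, not a Quantified Mind dataset.

```python
# Hypothetical sketch of a within-person comparison: one person's simple
# reaction times (ms) under two conditions, compared against their own
# baseline rather than against the general population. Data are invented.
from statistics import mean, stdev

baseline = [312, 298, 305, 320, 301, 315, 309, 299]   # ms, no coffee
condition = [281, 290, 276, 288, 295, 279, 284, 291]  # ms, after coffee

diff = mean(baseline) - mean(condition)               # positive = faster
# Pooled sample standard deviation gives a rough effect size (Cohen's d).
pooled_sd = ((stdev(baseline) ** 2 + stdev(condition) ** 2) / 2) ** 0.5
d = diff / pooled_sd

print(f"mean change: {diff:.1f} ms faster; effect size d = {d:.2f}")
```

With many short, repeatable tests logged over weeks, the same comparison can be run for sleep, diet, exercise, or any other factor a self-tracker cares about.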
In the session we will also briefly discuss the ‘Smartphone Brain Scanner’ — a low-cost portable cognitive measuring device that can be used to continuously monitor and record electrical activity along the scalp (EEG) in order to determine different states of brain activity in everyday natural settings. The system uses an off-the-shelf, low-cost wireless Emotiv EPOC neuroheadset with 14 electrodes, connected wirelessly to a smartphone. The smartphone receives the EEG data at a sampling rate of 128 Hz, and software on the smartphone then performs complex real-time analysis to do brain state decoding.
Please join us to discuss these topics, and bring your questions and engaged minds!
With the QS conference around the corner, we’re asking our breakout session leaders to preview their topics here on the blog. There will be many overlapping sessions to choose from, so here’s your chance to learn more about what you want to see, and connect with the leaders in the comments.
Here is Martin Sona, organizer of QS Aachen/Maastricht, on his session “EEG for Self-Experimentation:”
The subject of this breakout session will be experimentation with electrophysiological methods, such as electroencephalography (EEG).
First, a short introduction will shed some light (not a lot though…) on the puzzling phenomenon of ‘brainwaves’. I’ll also say something about how an EEG works and what the pitfalls of this method can be.
Next, you’ll get a short overview of the EEG ‘market’ accessible to enthusiasts, and we’ll visualize some data very quickly. There will be some room for questions, but if you have a particular question in mind, feel free to email me and I’ll try to answer it during the session.
Jakob Larsen and his team at the Mobile Informatics Lab at the Technical University of Denmark have developed a way to build a real-time 3-D model of your brain using a smartphone and the Emotiv EPOC game controller headset. In the Ignite talk below, Jakob describes how the fourteen sensors in this mobile EEG device rival a traditional lab EEG setup, and where he sees this inspiring project going. (Filmed at the QS Europe conference in Amsterdam.)
A better understanding of the intricate relations between our brains and behaviors is key to future improvements in well-being and productivity. Conventional tools for measuring these relations, such as functional magnetic resonance imaging (fMRI) or positron emission tomography (PET), typically rely on complex, heavy hardware that offers limited comfort and mobility for the user. This means that measuring brain activity has been confined to expensive laboratories, and that it has been a challenge to perform longer-term continuous monitoring of brain signals in real-life conditions.
Electroencephalography (EEG) is a method for recording the electrical activity along the scalp. EEG measurements can determine different states of brain activity; for instance, this signal is used by the popular Zeo Sleep Manager, with just a few electrodes, to determine when the user is in different sleep stages. More electrodes enable a richer picture of the brain state, and in laboratory settings typically 64, 128, or 256 electrodes are used. However, these systems are time-consuming and cumbersome to install, and their wiring limits user mobility and behavior.
With our ‘Smartphone Brain Scanner’ system we aim to enable continuous monitoring and recording of brain signals (EEG) in everyday natural settings. For that purpose we use an off-the-shelf, low-cost wireless Emotiv EPOC neuroheadset with 14 electrodes, which we have connected wirelessly to a smartphone. The smartphone receives the EEG data at a sampling rate of 128 Hz, and our software on the smartphone then performs complex real-time analysis to do brain state decoding: that is, it estimates the sources from which the brain activations occurred and then shows the result in a 3D model of the brain on the smartphone display. This allows the user to observe his or her brain activations in 3D in real time. The video below provides a demonstration.
The smartphone brain scanner enables complete user mobility and continuous logging of brain activity, either for real-time neurofeedback purposes or for later analysis. The user can interact with the 3D brain model on the device using touch gestures, and the system allows up to 7.5 hours of continuous recording.
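The real-time source reconstruction running on the phone is far more involved than anything that fits here, but one elementary step such an EEG pipeline needs can be sketched: estimating band power (here the 8–12 Hz alpha band) from a one-second window of a single channel sampled at 128 Hz, matching the EPOC’s sampling rate. This is a self-contained illustration of the idea, not the team’s actual software, and the signal is synthetic.

```python
# Rough sketch of per-window band-power estimation on a single EEG
# channel sampled at 128 Hz. Not the Smartphone Brain Scanner's code;
# the signal below is synthetic.
import math

FS = 128  # samples per second, matching the Emotiv EPOC stream

def band_power(samples, lo_hz, hi_hz):
    """Sum DFT magnitude-squared over bins lo_hz..hi_hz.
    Assumes len(samples) == FS, so DFT bin k sits at exactly k Hz."""
    n = len(samples)
    power = 0.0
    for k in range(lo_hz, hi_hz + 1):
        re = sum(x * math.cos(2 * math.pi * k * i / n)
                 for i, x in enumerate(samples))
        im = -sum(x * math.sin(2 * math.pi * k * i / n)
                  for i, x in enumerate(samples))
        power += (re * re + im * im) / n
    return power

# Synthetic one-second window: a 10 Hz "alpha" wave plus a 40 Hz ripple.
t = [i / FS for i in range(FS)]
signal = [math.sin(2 * math.pi * 10 * ti) + 0.2 * math.sin(2 * math.pi * 40 * ti)
          for ti in t]

alpha = band_power(signal, 8, 12)    # dominated by the 10 Hz component
gamma = band_power(signal, 35, 45)   # picks up only the small ripple
print(f"alpha power {alpha:.2f} vs 35-45 Hz power {gamma:.2f}")
```

In a continuous-logging setting this computation would run once per window per channel; the decoding and 3D source localization the team describes build on top of features like these.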
From a personal informatics perspective, the ability to obtain continuous biofeedback is interesting, as it has been shown that such biofeedback may lead to improvements in behavior, reaction times, emotional responses, and musical performance. Within the clinical domain it has been shown to have a positive effect on attention deficit hyperactivity disorder and epilepsy. For such applications, a low-cost and easy-to-use brain monitoring system enabling complete mobility could be beneficial. Furthermore, the ability to monitor and record brain signals over longer durations in natural settings might allow the user to gain new insights, and the low-cost setup even allows studying EEG signals in group settings.
More information about the smartphone brain scanner is available here: http://milab.imm.dtu.dk/eeg
With a Nokia N900 and the Emotiv EPOC headset you can try the system for yourself by downloading the brain scanner software from the Maemo repository. We also have a version for Android-based smartphones and tablets; however, that software has not been released yet.