Topic Archives: Conference

Mark Leavitt: Whipping up My Willpower

When we decide to track one thing, we sometimes find that we are indirectly tracking something else.  That is the theme of today’s talk.

When Mark Leavitt was 57, he found out that he had heart disease, a condition that runs in his family. Mark set about making some life changes. He tracked his weight while adopting a low-fat diet. His tracking showed him that he was making progress, and that progress encouraged him to keep tracking. But once Mark’s weight loss stalled and then started to backslide (though he had maintained his diet), his desire to track dwindled and was then snuffed out by a major life event.

Though he was ostensibly tracking weight, this experience gave him some insight into his motivation. He began to build a mental model of his willpower. When was it strong? When was it weak? Using his background as a doctor to make assumptions about the nature of his willpower, he used the tracking of other lifestyle changes, such as movement and strength training, to test those assumptions and better understand how to follow through on his intentions.

Watch below to see what Mark found worked for him, and if you would like to see how Mark is keeping up with his habits, you can check out his live dashboard here.


Mark Drangsholt: Deciphering My Brain Fog

One of the benefits of long-term self-tracking is that one builds up a toolbox of investigatory methods that can be drawn upon when medical adversity hits. One year ago, when Mark Drangsholt experienced brain fog during a research retreat on Orcas Island in the Pacific Northwest, he had to draw on the self-tracking tools at his disposal to figure out what was behind this troubling symptom.

Watch this invaluable talk on how Mark was able to combine his self-tracking investigation with his medical treatments to significantly improve his neurocognitive condition.

Here is Mark’s description of his talk:

What did you do?
I identified that I had neurocognitive (brain) abnormalities – which decreased my memory function (less recall) – and verified it with a neuropsychologist’s extensive tests.  I tried several trials of supplements with only slight improvement.  I searched for possible causes which included being an APOE-4 gene carrier and having past bouts of atrial fibrillation.

How did you do it?
Through daily, weekly and monthly tracking of many variables, including body weight, percent body fat, physical activity, total, HDL, and LDL cholesterol, depression, etc. I created global indices of neurocognitive function and reconstructed global neurocog function using a daily schedule and electronic diary with notes, recall of days and events of decreased memory function, academic and clinical work output, etc. I asked for a referral to a neuropsychologist and had 4 hours of comprehensive neurocog testing.
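
Mark’s write-up doesn’t spell out how his “global indices” were computed, but the general idea of rolling several daily measures into one composite score can be sketched. The example below is only an illustration with made-up variable names and values (recall_rating, work_output, sleep_hours), not Mark’s actual data or method: each measure is converted to a z-score so they share a scale, and the row-wise mean serves as a crude daily index.

```python
import pandas as pd

# Hypothetical daily log; the variable names and values are placeholders.
log = pd.DataFrame({
    "date": pd.date_range("2013-01-01", periods=5, freq="D"),
    "recall_rating": [3, 2, 4, 2, 3],        # subjective memory recall, 1-5
    "work_output":   [6, 4, 7, 3, 5],        # hours of productive academic/clinical work
    "sleep_hours":   [7.5, 6.0, 8.0, 5.5, 7.0],
}).set_index("date")

# Normalize each variable to a z-score so they can be averaged on one scale,
# then take the row-wise mean as a crude "global neurocognitive index".
zscores = (log - log.mean()) / log.std()
log["global_index"] = zscores.mean(axis=1)

print(log[["global_index"]])
```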

What did you learn?
My hunch that I had developed some neurocognitive changes was verified by the neuropsychologist as “early white matter dysfunction”. A brain MRI showed no abnormalities. Trials of resveratrol supplements only helped slightly. There was some waxing and waning of symptoms, worsened by lack of sleep and high negative stress while working. A trial with the statin simvastatin (10 mg) began to lessen the memory problems, and a dramatic improvement occurred after 2.5-3 weeks. Subsequent retesting 3 months later showed significant improvement in the category related to white matter dysfunction in the brain. Eight months later, I am still doing well – perhaps even improved further – in neurocog function.


Eric Boyd: Tracking My Daily Rhythm With a Nike FuelBand

In 2013 Eric Boyd started using a Nike FuelBand to track his activity. Not satisfied with the built-in reporting the mobile and web applications were delivering, he decided to dive into the data by accessing the Nike developer API. With access to the minute-level daily data, Eric was able to make sense of his daily patterns, explore anomalies in his data, and learn a bit more about how the FuelBand calculated its core metrics. Watch Eric’s talk from our 2013 Quantified Self Europe Conference to hear more about his experience.
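
For readers curious what pulling per-minute data from a service like this involves, here is a minimal sketch of the general REST pattern. The endpoint, token, and JSON field names below are placeholders, not the actual Nike+ developer API, whose exact paths and schema aren’t reproduced here.

```python
import requests

# Placeholder endpoint and token -- the real Nike+ developer API required an
# OAuth access token, and its exact paths are not reproduced here.
API_URL = "https://api.example.com/v1/me/activities"
TOKEN = "YOUR_ACCESS_TOKEN"

resp = requests.get(
    API_URL,
    params={"startDate": "2013-06-01", "endDate": "2013-06-02"},
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

# Walk a hypothetical response structure and print minute-level samples.
for activity in resp.json().get("activities", []):
    for sample in activity.get("metrics", []):
        print(sample.get("timestamp"), sample.get("value"))
```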


Sara Riggare: How Not to Fall

We’re always interested in the way individuals with chronic conditions use self-tracking to better understand themselves. A great example of this is our good friend, Sara Riggare. Sara has Parkinson’s Disease and we’ve featured some of her amazing self-tracking work here before. At the 2014 Quantified Self Conference, Sara gave a short talk on what she feels is her most troublesome symptom: freezing of gait. In this talk, she explains why it’s such a big part of her daily life and how she’s using new tools and techniques to track and improve her gait.


Paul LaFontaine: We Never Fight on Wednesday

Paul LaFontaine was interested in understanding his anxiety and negative emotional states. What was causing them? When were they happening? What could he do to combat them? Using TapLog, a simple Android-based tracking app (with easy data export), Paul tracked these mental events for six months, along with the triggers associated with each one. In this talk, presented at the 2014 Quantified Self Europe Conference, Paul dives deep into the data to show how he learned how different triggers were related to his anxiety and stress. While exploring his data, he also discovered a few surprising and profound insights. Watch his great talk below to learn more!
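
The kind of analysis Paul describes, relating tagged events to when they happen, is straightforward to reproduce on an exported log. The sketch below assumes a CSV with hypothetical timestamp and trigger columns (TapLog’s real export schema may differ) and cross-tabulates triggers by weekday, the sort of view that can surface a pattern like “we never fight on Wednesday.”

```python
import pandas as pd

# Hypothetical column names -- TapLog's actual CSV export schema may differ.
events = pd.read_csv("taplog_export.csv", parse_dates=["timestamp"])

# Count logged events per trigger and per weekday to look for patterns.
events["weekday"] = events["timestamp"].dt.day_name()
summary = pd.crosstab(events["trigger"], events["weekday"])
print(summary)
```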


Jenny Tillotson: Science, Smell, and Fashion

Jenny Tillotson is a researcher and fashion designer who is currently exploring how scent plays a role in emotion and psychological states. As someone living with bipolar disorder, she’s been acutely aware of what affects her own emotional states and has been exploring different methods to track them. In this talk, presented at the 2014 Quantified Self Europe Conference, Jenny discusses her new project, Sensory Fashion, which uses wearable tracking technology along with scent and sensory science to improve wellbeing. Be sure to read her description below when you finish watching her excellent talk.


You can also view the slides here.

What did you do?
I established a new QS project called ‘SENSORY FASHION’, funded by a Winston Churchill Fellowship that combines biology with wearable technology to benefit people with chronic mental health conditions. This allowed me to travel to the USA and meet leading psychiatrists, psychologists and mindfulness experts and find new ways to build monitoring tools that SENSE and balance the physiological, psychological and emotional states through the sense of smell. My objective was to manage stress and sleep disturbance using olfactory diagnostic biosensing tools and micro delivery systems that dispense aromas on-demand. The purpose was to tap into the limbic system (the emotional centre of our brain) with aromas that reduce sleep and stress triggers and therefore prevent a major relapse for people like myself who live with bipolar disorder on a day to day basis. I designed my own personalized mood-enhancing ‘aroma rainbow’ that dispenses a spectrum of wellbeing fragrances to complement orthodox medication regimes such as taking mood stabilizers.

How did you do it?
Initially by experimenting with different evidence-based essential oils with accessible clinical data, such as inhaling lavender to aid relaxation and help sleep, sweet orange to reduce anxiety and peppermint to stimulate the brain. I developed a technology platform called ‘eScent’, a wearable device that distributes scent directly into the immediate vicinity of the wearer upon a sensed biometric stimulus (body odor, ECG, cognitive response, skin conductivity, etc.). The scent forms a localized and personalized ‘scent bubble’ around the user which is unique to the invention, creating real-time biofeedback scent interventions. The result promotes sleep hygiene and can treat a range of mood disorders with counter-active calming aromas when high stress levels reach a pre-set threshold.
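
Stripped of the hardware, the control loop Jenny describes (sense a signal, compare it to a pre-set threshold, release a counter-active aroma) is simple to sketch. Everything below, including the sensor read, the threshold value, and the dispense call, is a hypothetical stand-in, not the actual eScent implementation.

```python
import random
import time

STRESS_THRESHOLD = 0.7  # placeholder threshold on a normalized 0-1 scale

def read_skin_conductance():
    """Placeholder for a real biometric sensor reading (e.g., skin conductivity)."""
    return random.random()

def dispense_aroma(name):
    """Placeholder for triggering a micro scent-delivery system."""
    print(f"Dispensing {name}...")

# Simplified biofeedback loop: when the sensed stress signal crosses the
# pre-set threshold, release a calming scent into the wearer's vicinity.
for _ in range(10):
    stress = read_skin_conductance()
    if stress > STRESS_THRESHOLD:
        dispense_aroma("lavender")
    time.sleep(1)
```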

What did you learn?
I learnt it is possible to track emotional states through body smells, for example by detecting scent signals that are specific to individual humans. In my case this was body odor caused by chronic social anxiety, from increased cortisol levels found in sweat, and it could be treated with anxiolytic aromas such as sweet orange that create an immediate calming effect. In addition, building olfactory tools can boost self-confidence and communication skills, or identify ‘prodromal symptoms’ in mood disorders; they learn your typical patterns and act as a warning signal by monitoring minor cognitive shifts before the bigger shifts appear. This can easily be integrated into ‘Sensory Fashion’ and jewelry in a ‘de-stigmatizing’ manner, giving the user the prospect of some further control of their emotional state through smell, whether by conscious control or biofeedback. The next step is to miniaturize the eScent technology and further explore the untapped research data on the science of body (emotional) odor.


QSEU14 Breakout: Passive Sensing with Smartphones

Today’s post comes to us from Freek van Polen. Freek works at Sense Observations Systems, where they develop passive sensing applications and tools for smartphones. At the 2014 Quantified Self Europe Conference, Freek led a breakout session where attendees discussed the opportunities, pitfalls, and ethical challenges associated with the increasing amount of passive data collection that is possible through the many different sensors we’re already carrying around in our pockets. We invite you to read his short description of the breakout below and continue the conversation on our forum.

Passive Sensing with Smartphones
by Freek van Polen

The session started out by using Google Now as an example of what passive sensing is, and finding out how people feel about sensor data being used in that way. It quickly became apparent that people tend to be creeped out when Google Now suddenly appears to know where they live and where they work, and they especially dislike it when it starts giving them unsolicited advice. Following this discussion we arrived at a distinction between explicit and implicit sensing, where it is not so much about whether the user has to actively switch on sensing or enter information, but rather about whether the user is aware that sensing is going on.

From there the “uncanny valley” with respect to sensing on smartphones was discussed, as well as what people would be willing to allow an app to sense. An idea for a BBC app that would keep track of how much attention you pay to what you’re watching on television, and that would subsequently try to get you more engaged, was met with a lot of frowning. It was furthermore pointed out that passive sensing might be risky in the vicinity of children, as they are easily impressionable, are not capable of assessing whether it is desirable to have passive sensing going on, and can be tricked into giving up a lot of information.


Stefan Hoevenaar: My Father, A Quantified Diabetic

Stefan Hoevenaar’s father had Type 1 Diabetes. As a chemist, he was already quite meticulous about using data, and those habits informed how he tracked and made sense of his blood sugar and insulin data. In this talk, presented at the 2014 Quantified Self Europe Conference, Stefan describes how his father kept notes and hand-drawn graphs in order to understand himself and his disease.


QSEU14 Breakout: An Imaging Mind

Today’s post comes to us from Floris van Eck. At the 2014 Quantified Self Europe Conference, Floris led a breakout session on a project he’s been working on, The Imaging Mind. As imaging data becomes more prevalent, it is becoming increasingly important to discuss the social and ethical considerations that arise when your image is stored and used, sometimes without your permission. As Floris described the session,

The amount of data is growing and with it we’re trying to find context. Every attempt to gain more context seems to generate even more imagery and thus data. How can we combine surveillance and sousveillance to improve our personal and collective well-being and safety?

We invite you to read Floris’ great description of the session and the conversation that occurred around this topic, then join the discussion on our forum.


Imaging Mind QSEU Breakout Session
by Floris Van Eck

Imaging Mind Introduction
Imaging is becoming ubiquitous and pervasive, as well as augmented. This artificial way of seeing things is quickly becoming our ‘third eye’. Just like our own eyes view and build an image and its context through our minds, so too does this ‘third eye’ create additional context while building an augmented view through an external mind powered by an intelligent grid of sensors and data. This forms an imaging mind. It is what we are chasing at Imaging Mind: all the roads, all the routes, all the shortcuts (and the marshes, bogs and sandpits) that lead to finding this imaging mind. To understand the imaging mind is to understand the future. And to get there we need to do a lot of exploring.

The amount of available imagery is growing and alongside that growth we try to find context. Every attempt to gain more context seems to generate even more imagery and thus data. We are watching each other while being watched. How can we combine surveillance and sousveillance to improve our personal and collective wellbeing and safety? And what consequences will this have for privacy?

Quantified Selfie
Our break-out session, with about 15 people, started with a brief presentation about the first findings of the Imaging Mind project (see slides below). As an introduction, everyone in the group was then asked to take a selfie and use it to quickly introduce themselves. One person didn’t take a selfie, as he absolutely loathed them. Funnily enough, the person next to him included him in his selfie anyway. It neatly illustrated the challenge for people who want to keep tabs on pictures shared online; it will become increasingly difficult to keep yourself offline. This led us to the first question: what information can be derived from your pictures now (i.e. from the selfies we started with)? If combined and analyzed, what knowledge could be discovered about our group? This was the starting point for our group discussion.

Who owns the data?
Images carry a lot of metadata, and additional metadata can be derived by intelligent imaging algorithms. As those algorithms get better in the future, new context can be derived from them. Will we be haunted by our pictures as they document more than intended? This led to the question “who uses this data?” People in the group were most afraid of abuse by governments and less so by corporations, although that was still a concern for many.
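
As a small illustration of how much a single photo already carries, the sketch below reads EXIF metadata with the Pillow library. Which tags are present (capture time, device model, sometimes GPS coordinates) depends entirely on the camera or app that produced the file, and the filename is just a placeholder.

```python
from PIL import Image
from PIL.ExifTags import TAGS

# Open a photo and print whatever EXIF metadata it carries; a phone selfie
# will often include the capture time, device model, and sometimes GPS data.
img = Image.open("selfie.jpg")  # placeholder filename
exif = img.getexif()

for tag_id, value in exif.items():
    print(TAGS.get(tag_id, tag_id), value)
```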

People carrying a wearable camera gather data about other people without their consent. Someone remarked that this is the first time that the outside world is affected. Wearable cameras that are used in public are not about the Quantified Self, but about the ‘Quantified Us’. They are therefore not only about self-surveillance, but they can be part of a larger surveillance system. The PRISM revelations by Edward Snowden are an example of how this data can be mined by governments and corporations.

Context
How are wearable cameras different from omnipresent surveillance cameras? The general consensus here was that security cameras are mostly sandboxed and controlled by one organisation. The chance that their imagery ends up on Facebook is very small. With wearable devices, people are more afraid that others will publish pictures in which they appear without their consent. This can be very confronting when combined with face recognition and tagging.

One of the things that everyone agreed on is that pictures often give a limited or skewed context. Let’s say you point at something and that moment is captured by a wearable device. Depending on the angle and perspective, it could look like you were physically touching someone, which could seem very compromising when not placed in the right context. Devices that take 2,000 pictures a day greatly increase the odds that this will happen.

New social norms
One of the participants asked me about my Narrative camera. I wasn’t the only one wearing one, as the Narrative team was also in the break-out session. Did we ask the group for permission to take pictures of them? In public spaces this wouldn’t be an issue, but we were in a private conference setting. Some people were bothered by it. I mentioned that I could take it off if people asked me, as stated by Gary in the opening of the Quantified Self Conference. This led to a discussion of social norms. Everyone agreed that the advent of wearable cameras calls for new social norms. But which social norms do we need? This is a topic we would like to discuss further with the Quantified Self community in the online forum and at meetups.

Capturing vs. Experiencing
We briefly talked about events like music concerts. Many people in the group said that they were personally annoyed that so many concertgoers are occupied with ‘capturing the moment’ with low-quality imaging devices like smartphones and pocket cameras instead of dancing and ‘experiencing the moment’. Could wearable imaging devices be the perfect solution for this problem? The group thought some people enjoy taking pictures as an activity in itself, so for them nothing would change.

Visual Memory
Wearable cameras create a sort of ‘visual memory’ that can be very helpful for people with memory problems like Alzheimer’s disease or dementia. An image or piece of music often triggers a memory that could otherwise not be retrieved. This is one of the positive applications of wearable imaging technology. The Narrative team has received some customer feedback that seems to confirm this.

Combining Imaging Data Sets
How do you combine multiple imaging data sets without hurting the privacy of the people in them? We talked for a long time about this question. Most people have big problems with mass surveillance and agree that permanently combining imaging data sets is not desirable. But what about temporarily? Someone in the group mentioned that the Boston marathon bombers were identified using footage submitted by people on the street. Are we willing to sacrifice some privacy for the greater good? More debate is needed here and I hope the Quantified Self community can tune in and share their vision.

Quantified Future
One interesting project I mentioned at the end of the session is called “Gorillas in the Cloud”, by the Dutch research institute TNO. The goal of “Gorillas in the Cloud” is to take a first step toward bringing people into richer and closer contact with the astonishing world of wildlife. The Apenheul Zoo wants to create a richer visitors’ experience. But the project also offers unprecedented possibilities for international behavioural ecology research by providing online, non-intrusive monitoring of the Apenheul gorilla community in a contemporary, innovative way. “Gorillas in the Cloud” provides an exciting environment to innovate with sensor network technology (electronic eyes, ears and nose) in a practical way. Are these gorillas the first primates to experience the internet of things, surveillance, and the quantified self in full force?

We invite you to continue the discussion on our forum.


Steven Jonas: Memorizing My Daybook

Memory, cognition, and learning are of high interest here at QS Labs. Ever since Gary Wolf published his seminal piece on SuperMemo, and its founder Piotr Wozniak, in 2008, we’ve been delighted to see how people are using spaced repetition software. Our friend and colleague, Steven Jonas, has been using SuperMemo since he read Gary’s article and slowly transitioned to daily use in 2010. Steven has been quite active in sharing how he’s used it to track his different memorization and learning projects with his local Portland QS meetup group. At the 2014 Quantified Self Europe Conference, Steven introduced a new project he’s working on, memorizing his daybook – a daily log he keeps of interesting things that happened during the day. Watch his fascinating talk below to hear him explain how he’s attempting to recall every day of his life. If you’re interested in learning more about spaced repetition, we suggest this excellent primer by Gary.


You can also download the slides here.

What did you do?
I used a spaced repetition system to help me remember when an entry in my daybook occurred.

How did you do it?
Using SuperMemo, I created a flashcard each morning. On the question side, I typed what I did the previous day. On the answer side, I typed the date. SuperMemo would then schedule the review of these cards. I also played around with adding pictures and short videos from that day to the card.
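
SuperMemo’s current schedulers are more sophisticated and proprietary, but the classic SM-2 algorithm the family grew out of gives a feel for how “SuperMemo would then schedule the review”: intervals stretch out after each successful recall and reset after a lapse. A minimal sketch, offered as an illustration rather than Steven’s or SuperMemo’s actual implementation:

```python
def sm2(quality, repetitions, interval, easiness):
    """One review step of the classic SM-2 algorithm.

    quality: 0-5 self-rating of how well the card was recalled.
    Returns the updated (repetitions, interval in days, easiness factor).
    """
    if quality < 3:
        # Failed recall: start the repetition cycle over (easiness unchanged).
        return 0, 1, easiness
    if repetitions == 0:
        interval = 1
    elif repetitions == 1:
        interval = 6
    else:
        interval = round(interval * easiness)
    # Adjust the easiness factor based on how easy the recall felt (floor of 1.3).
    easiness = max(1.3, easiness + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return repetitions + 1, interval, easiness

# A card recalled well three times in a row: the review intervals stretch out.
state = (0, 0, 2.5)  # (repetitions, interval in days, easiness factor)
for q in (5, 4, 5):
    state = sm2(q, *state)
    print("next review in", state[1], "days")
```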

What did you learn?
First, that this seems to work. I’ve built up a mental map of my experiences, unlike anything I’ve ever experienced. I also learned that I hardly ever remember the actual date for a card. Instead, it’s a logic puzzle, where I can recall certain details such as, “It was on a Saturday, and it was in October, the week before Halloween. And Halloween was on a Thursday that year.” From there, I can deduce the most likely day that it occurred. I’m also learning which details are most helpful for placing a memory. Experiences involving other people and different places are very memorable. Noting that I started doing something, like “I started tracking my weight”, is not memorable.
