Tag Archives: mobile
Sara Cambridge is an interaction designer and a frequent contributor to the Quantified Self community. This past spring she was tasked with creating a unique information visualization as part of her graduate coursework at the UC Berkeley iSchool. Given her interest in QS, she chose to use her experience tracking her diet with the Eatery mobile app as the basis for her visualization project. Using the Eatery led her down an interesting path that helped her understand her own eating habits, how she compares to others, and how people “really” rate others’ dietary choices. (Filmed at the Bay Area QS Meetup)
Jakob Larsen and his team at the Mobile Informatics Lab at the Technical University of Denmark have developed a way to build a real-time 3-D model of your brain using a smartphone and the Emotiv EPOC game controller headset. In the Ignite talk below, Jakob describes how the fourteen sensors in this mobile EEG device rival a traditional lab EEG setup, and where he sees this inspiring project going. (Filmed at the QS Europe conference in Amsterdam.)
Michael Doherty talks about the Open Source Real Time Mobile Sensor Platform he is developing, which flexibly connects a wide variety of sensors to online databases. He wants to make mobile tracking more accessible, and imagines people using it with their own sensors, as well as kids using it to collect environmental data on class field trips. (Filmed at the NY Quantified Self Show&Tell #10 at Google).
A few months ago I was fatigued and decided to try a more rigorous sleep hygiene routine to see if it would help (it did). To make the experiment fun I thought I’d look for a nifty iPhone app to track the data. After a fairly extensive search I noticed that most of the tools were either highly specialized to a domain (e.g., Sleep On It), or more general purpose (e.g., iLogger). This got me wondering about why there isn’t a universal self-tracking gadget, and what one might look like.
Below I sketch some ideas on what such a beast would need to do to support any data-driven effort. I’d love to know if this makes sense to you, and what you think. (Note: I’m excluding memories for life applications such as Gordon Bell’s MyLifeBits.)
Regardless of the particular domain – sleep, exercise, mood, sex, reading, etc. – is there a set of common tools and sensors that could satisfy the majority of data-driven activities? Overall the goal would be to help answer the types of questions we ask when self-experimenting. That is, to help us discover useful patterns. The kinds of things it would need to “know” include:
- Physiological state: Physical context like pulse and temperature. (What’s going on in your body?)
- Mental state: Cognitive context like thinking patterns, mood, and happiness. (What are you thinking? How do you feel?)
- Location: Spatial context like transitions, surroundings, environment, and activity. (Where are you? What’s going on around you? Where are you going?)
- Incidents: Temporal context like performing exercise, taking medication, attending an event, or eating. (What did you just do?)
- People: Social context (Who are you with? What interactions are you having?)
How would it collect these things? I don’t have all the answers, but I’m thinking of three sources: Direct measurement, inference, and self-reporting. The first category, direct measurement, clearly is collected by sensors, and there is exciting progress on this front. See Measuring Vital Signs From 40 Feet Away or NASA Adapts iPhone to Detect Chemicals, for example.
I’m less sure about the second category, inference, but I’m thinking of tools that deduce some of the above, such as “You’re asleep” (Zeo), “You’re at work” (Skyhook), “You’re at a party” (iCal), or “You’re around someone interesting” (MeetMoi).
The final category is the most applicable to self-tracking, but also the most problematic. The closest concept I could find was Wikipedia’s Self-report inventory entry, but the gist is there’s a lot we have to report explicitly. Think of anything you’ve tracked in the above contexts and you’ll come up with plenty of examples, such as “I feel great,” “I drank a beer,” or “I just had an argument with my spouse.” This category is problematic because self-reporting is biased, and because it requires manual input (see Gary’s Which is Better: Automated or Manual?).
I think it’s this last category of data capture that’s generally applicable to most self-tracking needs. Putting on my computer science hat, it seems there’s a fixed set of data types we’d need. The typical ones include itemized lists (mood from 1 to 5 stars, or yes/no), counts (number of push-ups), durations (minutes of exercise), numbers (weight in pounds), and text notes. All would be time-stamped, of course.
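To make that fixed set of data types concrete, here’s a minimal sketch in Python. The class and field names are hypothetical, purely for illustration; the point is that one small record shape, plus a timestamp, covers every example above.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Union

# Hypothetical sketch: one record shape covering the fixed set of
# self-tracking data types. Every observation is time-stamped.

@dataclass
class Observation:
    metric: str                    # e.g. "mood", "push_ups", "weight"
    value: Union[int, float, str]  # rating, count, duration, number, or note
    unit: str = ""                 # e.g. "stars", "reps", "minutes", "lbs"
    timestamp: datetime = field(default_factory=datetime.now)

# One example per data type mentioned above:
entries = [
    Observation("mood", 4, "stars"),         # itemized list (1 to 5)
    Observation("push_ups", 25, "reps"),     # count
    Observation("exercise", 30, "minutes"),  # duration
    Observation("weight", 172.5, "lbs"),     # number
    Observation("note", "Slept well"),       # free-text note
]
```

A gadget (or any site consuming its data) that understands just these few value kinds could accept records from any self-tracking domain.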
Pros and cons of specialized vs. general
Nothing comes for free, so what would be the trade-offs of using a general-purpose data capture device? The pros are that there’d be no reinventing the wheel, everyone would know how to use it, and manufacturing economies of scale would be possible. Also, if we assume an open data-access API, then any site could use the data, enabling custom uses, novel visualizations, and social applications.
For cons, just look at Alex’s roundup series of “vertical” tracking tools: food, location, fitness, and mood. Because these are specialized to their domain they offer benefits like precise language, customized input (such as eCBT Mood), inferred measurements, and inbuilt information such as a food/calorie database.
Workarounds are possible and would be driven by an experimental design perspective. Self-trackers would set up their experiments by specifying types of measurements, units, frequency of capture (including reminders), and measurement groupings. (An example of the latter is needing to capture a set of daily mood chart data in one shot, like exercise, medications, menses, energy, and agitation level.) By making the gadget’s UI “skinnable” we could generate interfaces automatically for each experiment.
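As a rough sketch of what such an experiment specification might look like, here’s the daily mood chart example expressed as a Python structure. All field names here are hypothetical; the idea is simply that a “skinnable” UI could be generated by walking a spec like this.

```python
# Hypothetical experiment spec for a daily mood chart: several
# measurements grouped so they're captured together in one shot.
experiment = {
    "name": "daily_mood_chart",
    "frequency": "daily",      # how often to capture
    "reminder": "21:00",       # when to nudge the self-tracker
    "measurements": [          # one capture group, entered together
        {"name": "mood",       "type": "rating",   "scale": [1, 5]},
        {"name": "exercise",   "type": "duration", "unit": "minutes"},
        {"name": "medication", "type": "boolean"},
        {"name": "energy",     "type": "rating",   "scale": [1, 5]},
        {"name": "agitation",  "type": "rating",   "scale": [1, 5]},
    ],
}

# A UI generator would render one input widget per measurement:
widgets = [(m["name"], m["type"]) for m in experiment["measurements"]]
```

The spec carries everything the gadget needs to build the capture screen, set reminders, and label the resulting data.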
Usage characteristics (or Why your phone should be a Tricorder)
So what would the thing actually look like? In addition to the physical sensors, there are characteristics required for a universal data-tracker to be usable. What comes to mind are ubiquitous availability, rapid manual entry, and notifications to the senses (“What’s that smell? Oh, it’s time to check if I’m procrastinating.”)
Fortunately we have a classic model to start with – the venerable Star Trek Tricorder. It was portable, had powerful recording and analysis capabilities, and could measure things like environmental make-up, life forms, and power sources. Combining the general-purpose and medical variants into your cell phone (the de facto does-it-all device), and adding additional sensors and controls (real buttons, please – much faster than touch screens), wouldn’t we have something that self-trackers would love?
A catalyst for citizen science?
Inspired by Kevin’s conclusion in A Web Page For Every Species, I wonder if having a universal device for self-experimentation could launch self-tracking for all.
As he puts it,
When anyone can buy a hand held species identifier, an amazing transformation will take place: everyone will become a taxonomist.
Could this be true for individual experimentation? Would everyone become a personal scientist? It’s exciting to imagine this kicking off a widespread movement to fulfill the promise of citizen science and social self-improvement. What would be the result, and how might that change how we interact with ourselves, each other, and the world?
What do you think?
- Is such a gadget possible?
- Would it apply to most self-tracking apps, or would it be too general?
- Do you use a general purpose app? How has it worked for you?
- Do you see it drawing people into the experiment-driven life?
(Matt is a terminally curious ex-NASA engineer and avid self-experimenter. His projects include developing the Think, Try, Learn philosophy, creating the Edison experimenter’s journal, and writing at his blog, The Experiment-Driven Life. Give him a holler at firstname.lastname@example.org)
Can your cell phone replace your therapist, or make cognitive behavior therapy more accessible to a wider audience? Margaret Morris of the Digital Health Group at Intel sent in an article that addresses this question, published today in the Journal of Medical Internet Research.
From the abstract:
One example is a participant who had been coping
with longstanding marital conflict. After reflecting on his mood data,
particularly a drop in energy each evening, the participant began
practicing relaxation therapies on the phone before entering his house,
applying cognitive reappraisal techniques to cope with stressful family
interactions, and talking more openly with his wife. His mean anger,
anxiety and sadness ratings all were lower in the second half of the
field study than in the first.
Therapy? There’s an app for that.
Is health going mobile?
It certainly seems so. Belgian blogger Bart Collet posted this fantastic compilation of tools and services for mobile health. I hadn’t heard of many of the companies in this presentation before, which goes to show how quickly this field is expanding.
Bart’s full post is here. He asks provocative questions like:
“If wearable solutions are the outcome, can I take them to the dry cleaners?”
“Are hospitals and practitioners ready to handle the amount of data coming their way?”
“Is mobile health going to be a successful export product of developing countries?”
What do you think?
Today we’re going to learn how to build your own multi-purpose mobile self-tracking application. The origin of this simple tracking method lies in the second QS Show&Tell, when I outlined my dream self-tracking system and expressed a wish that somebody would build it. Among other things, I wanted an easy way to capture any kind of simple self-tracking data on my phone and automatically upload it into a spreadsheet, where it could be easily graphed. Somebody mentioned that this could be done using Google Docs. At first it struck me as too complicated. But it wasn’t. Here’s how it’s done.
First, open Google Docs, go to the “New” menu in the upper left corner and select “Spreadsheet.” In the spreadsheet, name your columns with the types of things you are tracking: for this example I’ve used weight, alcohol, and systolic and diastolic blood pressure. You do not have to name a separate column for date or time; that will be added automatically by Google.
Now that you have your spreadsheet, go to the “Form” menu and choose “Create form.” A form will immediately appear. You can name it, as I did, “all about me.” Here is what you will see:
Now that you have made your form, go to the Form menu and select “Go to live form.” You will see the form as a web page. Enter some sample data. After you click submit and return to the Google spreadsheet page, you will see that the data has been entered into your spreadsheet, and given a time stamp.
Now it is time to start entering data on your mobile phone. Go back to the “Form” menu, select “Send form” and email it to yourself. Open the link with the Web browser of your mobile phone. Now you are connected to a Web page that will accept self-tracking data and automatically place it on your spreadsheet. Here is an example of how the form looks on my iPhone.
Add a shortcut to this page, if you can, in a convenient place, such as your phone’s home screen. (On the iPhone, you can click the + sign at the bottom of the browser window and select “Add to Home Screen.”)
You are done. Now you can transmit your data from a simple form on your mobile to a spreadsheet, where it is automatically time-stamped, and awaits your charting and analysis.
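Once the data is landing in the spreadsheet, getting it back out for analysis is straightforward: Google spreadsheets can be exported as CSV. Here’s a minimal Python sketch that parses such an export (shown here with inline sample data, since the actual export URL is particular to your own sheet). The column names match the example spreadsheet above, plus the “Timestamp” column the form adds automatically.

```python
import csv
import io

# Sample of what the CSV export of the form-fed spreadsheet looks like.
# The form adds the Timestamp column; the rest are the columns you named.
sample = """Timestamp,Weight,Alcohol,Systolic,Diastolic
2009-06-01 08:02:11,172,1,118,76
2009-06-02 08:05:43,171.5,0,121,79
"""

# Parse the rows and compute a simple summary statistic.
rows = list(csv.DictReader(io.StringIO(sample)))
weights = [float(r["Weight"]) for r in rows]
avg_weight = sum(weights) / len(weights)
```

From here, the time-stamped rows are ready to feed into whatever charting or analysis you like.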
In a future entry I’ll show some simple ways to extend these methods.