Tag Archives: BayAreaQS2
Here is a great talk by Ryan Grant from the last QS Show&Tell. Things got especially interesting when Ryan started talking about how the device he is making would allow you to capture, in sound, stills, and video, moments of your life that had already passed.
Below are a few excerpts from the audio transcript to whet your interest.
KK: “Quiet please!”
RG: “Hi, my name is Ryan Grant, I’m the founder of Metascopic, Incorporated. We make a memory assistant in the form of a camera you can wear. It takes tens of thousands of pictures during the day and records audio as well. I’m really excited we can get it down to a product about this size. It’s very wearable. This has not been done before, and I don’t know why….”
QUESTION FROM AUDIENCE: “Tell us a little more about the specs. What is this? Is this a still camera, is it video? I’m not really sure what it is yet, can you describe what it hopes to be?”
RG: “The first thing I can tell you is that you are going to get tens of thousands of pictures a day.”
QUESTION FROM AUDIENCE: “Still pictures?”
RG: “Still pictures.”
QUESTION FROM AUDIENCE: “What resolution?”
RG: “XGA resolution. 1024 by 768.”
QUESTION FROM AUDIENCE: “What about the viewpoint?”
RG: “The viewpoint is going to be very wide angle. For some reason nobody is doing this in existing semi-wearables.”
QUESTION FROM AUDIENCE: “How often per minute does that translate to?”
RG: “That translates to a picture every 2 to 5 seconds.”
Now, using a buffer, you could tap the device and capture the moment that had just passed – a kind of TiVo for life.
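The buffer idea can be sketched in a few lines. This is a hypothetical illustration, not Metascopic’s design; the buffer length and the three-second frame interval are assumptions (the talk only says a picture every 2 to 5 seconds).

```python
from collections import deque

# Hypothetical sketch of the "TiVo for life" buffer: keep only the
# most recent frames in memory, and save them when the wearer taps
# the device. Buffer length and frame interval are assumptions.
BUFFER_SECONDS = 60
SECONDS_PER_FRAME = 3             # "a picture every 2 to 5 seconds"
BUFFER_SIZE = BUFFER_SECONDS // SECONDS_PER_FRAME

ring = deque(maxlen=BUFFER_SIZE)  # oldest frames fall off automatically

def on_new_frame(frame):
    """Called each time the camera captures a still."""
    ring.append(frame)

def on_tap():
    """Called when the wearer taps the device: snapshot the recent past."""
    return list(ring)

for i in range(100):              # simulate a stream of stills
    on_new_frame(f"frame-{i}")
saved = on_tap()
print(len(saved), saved[0], saved[-1])  # 20 frame-80 frame-99
```

A `deque` with `maxlen` discards the oldest item on each append, which is exactly the “moment that just passed” behavior: the tap never records forward, only backward.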
The idea of having one’s devices automatically upload data to a Web site is much hoped for among QS readers. Here’s one take on solving the problem, presented at the QS Show&Tell II by Brandon from A&D Weighing, one of the leading manufacturers of medical scales. (Brandon was a good sport in dealing with the comments he received when he revealed that, no, the data was not portable.)
This post makes me happy! One of the most fun things about QS so far has been the sense of optimism and possibility emanating from the frontiers of self-tracking. There is something so obvious about applying basic methods of rational data gathering and analysis to daily life that each little experiment, however simple, hints at bigger themes.
At the last QS Show&Tell, Alex Rossi showed his Twitter apps Tweet What You Eat and Tweet What You Spend. Since then, Nathan Yau of Flowing Data has posted about this attempt to track his eating through Twitter. Nathan wrote a little bot to collect Twitter messages about what he’s eating and how much he weighs and stick them in a database, which he can then use to chart his progress. He asked on his blog whether, if he made this public, people would be interested in using it.
Alex Rossi’s experience suggests that, yes, people would be interested. Tweet What You Eat and Tweet What You Spend are free apps Rossi wrote that do similar work: take SMS messages and post them to a database. Rossi has added some good tricks, such as crowd-sourcing the calorie count, so that suggested values are quickly available. But what I enjoyed most of Alex’s presentation was how clearly he outlined the power of this simple tool. I had just come from the Health 2.0 conference, where there was discussion of all kinds of complex mechanisms for gathering and presenting patient data. Devices, networks, payment systems, regulations – who was going to solve the puzzle? And then down to the QS Show&Tell, where one intelligent person, using a pared down protocol and an extremely simple social networking platform, hinted at a solution that is just around the corner, and that can’t be seen from the perspective of “health care.”
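The pared-down protocol both Rossi and Yau use – a short free-text message parsed into a database row – can be sketched as follows. The message format and schema here are my assumptions for illustration, not either of their actual implementations.

```python
import re
import sqlite3

# Hypothetical sketch of the Tweet What You Eat pipeline: parse a
# short message like "apple 95" into an (item, calories) row.
# Format and schema are assumptions, not the real apps' internals.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE food (item TEXT, calories INTEGER)")

def log_message(text):
    # The trailing signed integer is the calorie count; negative
    # values let users debit exercise from the food diary.
    m = re.match(r"(.+?)\s+(-?\d+)$", text.strip())
    if not m:
        return None
    item, cal = m.group(1), int(m.group(2))
    db.execute("INSERT INTO food VALUES (?, ?)", (item, cal))
    return item, cal

for msg in ["apple 95", "burrito 650", "run 30 min -300"]:
    log_message(msg)

total = db.execute("SELECT SUM(calories) FROM food").fetchone()[0]
print(total)  # 445
```

The whole trick is that the hard part (entry) rides on a channel people already use, while the database and charting stay trivially simple.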
Anyway, here’s the video. My favorite quote: “I noticed people would debit exercise from their food diary. I was like, I didn’t even know I supported negative values!”
Here is a great presentation by Tim Lundeen from the recent QS Show&Tell. Tim is running some interesting self-experiments on diet and cognition.
Diet and cognition is a topic of such obvious interest that it regularly breaks through into the popular press and the science blogs. For instance, eating blueberries and walnuts makes rats smarter.
Tim was specifically inspired by the many posts Seth Roberts has made about the benefits of omega-3 fatty acids. Tim uses a simple test of cognitive function as his dependent variable: he gives himself 100 very simple math problems and records the time it takes to complete them.
Here is one of the graphs Tim made in his self-study. The y-axis is the time it takes to complete his 100 problems. On about day 80, he upped his dose of DHA from fish oil.
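A test like Tim’s is easy to replicate. Here is a minimal sketch of a timed arithmetic harness; the multiplication-problem format is an assumption, since the talk only specifies “100 very simple math problems” with completion time as the dependent variable.

```python
import random
import time

# Minimal sketch of a Lundeen-style cognition test: time how long it
# takes to answer 100 simple arithmetic problems. The exact problem
# format is an assumption.
def run_test(ask, n_problems=100, seed=0):
    """Time `n_problems` multiplication questions.

    `ask` takes a prompt like "7 x 3 = " and returns an int answer;
    when run by hand it could simply wrap input().
    """
    rng = random.Random(seed)
    start = time.perf_counter()
    correct = 0
    for _ in range(n_problems):
        a, b = rng.randint(2, 9), rng.randint(2, 9)
        if ask(f"{a} x {b} = ") == a * b:
            correct += 1
    elapsed = time.perf_counter() - start
    return elapsed, correct

# Simulated run with a perfect answerer, just to exercise the harness.
answer = lambda prompt: int(prompt.split()[0]) * int(prompt.split()[2])
elapsed, correct = run_test(answer)
print(correct)  # 100
```

Logging `elapsed` each day gives exactly the y-axis of Tim’s graph.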
Last Thursday’s Quantified Self Show & Tell saw some great presentations, with great questions and discussion – or rather the beginning of what could have been much longer discussions that we cut off every time out of enthusiasm for the next person’s show & tell. Average presentation time was a little under ten minutes, average discussion time was also under ten minutes, which allowed us to hear from 8 people in two hours. We cleared the room at 10 p.m. in order not to further abuse the extremely gracious hospitality of our hosts at The Institute for the Future, but as I was helping carry video equipment to the parking lot I noticed that the conversation had moved itself outside and didn’t show any sign of diminishing.
Below is a list of who presented, along with a single sentence about the topic. Fortunately, Paul Lundahl of eMotion studios, a member of the QS Show&Tell gang, was at the meeting with his digital video setup, and over the next few weeks we will be publishing short videos of some of the presentations.
Thanks to all who presented. We’re going to try to make the QS Show&Tell a monthly event, using different interesting venues. (If you’d like to hear about them, you can sign up here: Quantified Self Show&Tell Meetup.)
Here’s a brief summary of what happened:
Faren talked about her quest for a self-cure and tracking more than 40 biometric indicators.
Steve gave a short presentation raising the possibility that Bayesian analysis could help self-quantifiers who have trouble keeping testing conditions under rigid control (i.e., all of us).
Break here to say that enthusiasm for Bayesian analysis has broken out at both Quantified Self Show&Tells, raising the happiness of some and leaving others baffled. This will be a topic for future posts, but here are two quick links. The first is a wikipedia entry with links to many subtopics. The second is a beautiful, lengthy conceptual tutorial that is pretty accessible.
Wikipedia on Bayesian stuff.
Eliezer S. Yudkowsky, “An Intuitive Explanation of Bayes’ Theorem.”
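For the baffled camp, here is a toy illustration (not from either talk) of why Bayesian updating suits self-experiments with sloppy testing conditions: instead of demanding a controlled trial, you just revise your confidence a little with each noisy day of data. All the probabilities below are made-up assumptions for the example.

```python
# Toy Bayesian update for a noisy self-experiment: is an intervention
# helping? Two hypotheses (helps / doesn't), revised day by day.
# All numbers are made-up assumptions for illustration.
P_GOOD_DAY_IF_WORKS = 0.7   # chance of a good day if it helps
P_GOOD_DAY_IF_NOT = 0.4     # baseline chance of a good day anyway

def update(p_works, good_day):
    """One step of Bayes' rule over the two hypotheses."""
    like_works = P_GOOD_DAY_IF_WORKS if good_day else 1 - P_GOOD_DAY_IF_WORKS
    like_not = P_GOOD_DAY_IF_NOT if good_day else 1 - P_GOOD_DAY_IF_NOT
    evidence = p_works * like_works + (1 - p_works) * like_not
    return p_works * like_works / evidence

p = 0.5                     # prior: 50/50 that the intervention helps
for good in [True, True, False, True, True, False, True]:
    p = update(p, good)
print(round(p, 3))          # about 0.8 after five good days, two bad
```

No rigid protocol required: a missed day just means one fewer update, and the running probability is always an honest summary of the evidence so far.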
Ryan Grant gave an impassioned and inspiring talk about life-logging with an always-on wearable camera he is developing.
Tim Lundeen talked about tracking cognitive function using a standard, easy to implement test and correlating changes in diet with changes in cognition.
Luke showed a life-logging tool he created using Filemaker, which he uses to correlate life events with other types of timelines, such as world events, and the life events of other family members and friends.
Look for video of some of these presentations in the next few weeks, along with more details!
The second official Quantified Self Show and Tell will take place this Thursday evening, Oct 23.
Our first meeting last month exceeded our expectations, both in the number of people who came and the sophistication of the self-tracking projects that were shared and discussed. It was a real blast. Almost 30 folks showed up. So we decided to do it again. Since the last meeting maxed out the capacity of the location in my studio, with people sitting on the stairs, we are holding this month’s Show&Tell, appropriately, at The Institute for the Future, in Palo Alto, where there is room enough for all. (The Institute for the Future is a general purpose consultancy built around future studies.)
The format will be simple. We will have some extremely brief introductions, list some of the areas of interest, and move on to the Show&Tell. If you are self-tracking in any way – biometrics, mood monitoring, life-logging, DNA sequencing, etc. etc. – please come and consider sharing your methods and results. We’ve got some new things to show also.
For details go to the Quantified Self Meetup page.