A quick post here to highlight some interesting developments in the heart rate tracking space. Tracking and understanding heart rate has been a cornerstone of self-tracking since, well, since someone first put two fingers on their neck and decided to write down how many pulses they felt. We’ve come a long way from that point. If you’re like me, tracking heart rate popped up on your radar when you started training for a sporting event like a marathon or long-distance cycling. Like many who used the pioneering devices from Polar, I found it a bit odd to strap that hard piece of plastic around my chest. Over time, after seeing the benefits of tracking heart rate, it became part of my daily ritual. Yet, for all the great things heart rate monitoring can do for physical training, there have been very few advances toward a truly contact-free method. That is, until now.
Thearn, an enterprising GitHub user and developer, has released an open-source tool that uses your webcam to detect your pulse. The Webcam Pulse Detector is a Python application that uses a variety of tools, such as OpenCV (an open-source computer vision library), to “find the location of the user’s face, then isolate the forehead region. Data is collected from this location over time to estimate the user’s heartbeat frequency. This is done by measuring average optical intensity in the forehead location, in the subimage’s green channel alone.” If you’re interested in the research that made this work possible, check out the amazing work on Eulerian Video Magnification being conducted at MIT. Getting it to work is a bit of a hurdle, but it does appear to be working for those who have the technical expertise. If you get it working, please let us know in the comments. Hopefully someone will come along and provide an easier installation path for those of us who shy away from working in the terminal. Until then, there are actually quite a few mobile applications that use similar technology to detect and track heart rate:
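To give a feel for the idea, the signal-processing heart of this approach (average the forehead’s green channel each frame, then look for a dominant frequency in the plausible heart-rate band) can be sketched in a few lines. To be clear, this is not Thearn’s actual code, just a minimal illustration run on a synthetic signal, assuming you have already collected a series of mean green-channel values at a known frame rate:

```python
import numpy as np

def estimate_bpm(green_means, fps, lo_bpm=50, hi_bpm=180):
    """Estimate heart rate from a time series of mean green-channel values."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()                     # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))              # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)   # frequencies in Hz
    # Only consider frequencies that could plausibly be a human pulse
    band = (freqs >= lo_bpm / 60.0) & (freqs <= hi_bpm / 60.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic stand-in for webcam data: a 72 bpm pulse (1.2 Hz)
# sampled at 30 frames per second for 10 seconds, plus noise
fps = 30
t = np.arange(0, 10, 1.0 / fps)
rng = np.random.default_rng(0)
series = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(len(t))

print(round(estimate_bpm(series, fps)))  # prints 72
```

In the real tool the series would come from OpenCV face detection on live frames rather than a sine wave, but the frequency-analysis step is the same basic trick.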
Let us know if you’ve been tracking your heart rate and what you’ve found out. We would love to explore this space together.
Typically, when Quantified Self-ers talk about using photography and image capture for self-tracking, they’re talking about taking pictures of their food. Pictures are a very powerful way to capture information for better understanding; after all, they are worth a thousand words. On the blog here we’ve also highlighted a few really interesting projects that take the idea of using visual images for tracking and turn the lens around, such as Jeff Harris and his 13 years of self-portraits.
One of the projects that I found super interesting was LifeSlice by Stan James.
For those of you who want to try LifeSlice, Stan has put the code online for you to use and possibly tinker with. As a new user, I can say that it is pretty interesting to see how my facial characteristics map to what I’m doing on the computer. For example, here’s me looking at a new statistical software package for Mac (Wizard).
And here’s me writing this post while attending a conference on health data.
The last project I want to highlight here is the self-portrait project of Noah Kalina. Noah is a photographer who has been taking self-portraits every day for 12.5 years (January 11, 2000 – June 20, 2012). A few months ago he put all 4,514 images together into one amazingly insightful video.
Than Tibbetts was so intrigued by this project that he decided to work some fancy image-processing magic to find out what “Average Noah” looked like, and found this:
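We don’t know the exact pipeline Than used, but the basic idea behind an “average” portrait is simple: align the photos, then take the pixel-wise mean across the whole stack. Here’s a minimal sketch of that averaging step (the hard part, aligning thousands of faces, is assumed to have been done already):

```python
import numpy as np

def average_images(images):
    """Pixel-wise mean of a list of equally sized, pre-aligned images."""
    stack = np.stack([np.asarray(im, dtype=float) for im in images])
    return stack.mean(axis=0).astype(np.uint8)

# Tiny demo with two 2x2 RGB "images": a dark one and a light one
dark = np.full((2, 2, 3), 100, dtype=np.uint8)
light = np.full((2, 2, 3), 200, dtype=np.uint8)

avg = average_images([dark, light])
print(avg[0, 0])  # prints [150 150 150]
```

Run over thousands of aligned portraits, the sharp features that stay put (eyes, nose) survive the averaging, while everything that moves around (hair, background) blurs into a haze.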
I’m sure there are more projects out there that involve individuals turning the camera on themselves. We all have cameras with us in our pockets and on our computers. How are you using those image capture technologies to better understand yourself? If you’re working on something interesting let us know!