
Adam Johnson: QS Bits & Bobs


This is Adam Johnson’s third QS talk. Previously he’s discussed the lifelogging tool he developed and uses, and how he re-learned how to type in order to combat RSI. In this talk, Adam gives an update on his self-tracking, focusing on three areas: tracking a long-distance cycling trip, his streamlined lifelogging process, and how he’s using the Lift app to track his habits.

What Did Adam Do?
In general, Adam is a dedicated lifelogger who’s been tracking what he’s doing for over a year. Adam cycled 990 miles from Land’s End to John o’ Groats with his father and brother over 14 days and tracked it along the way. Because he wasn’t able to “lug around his Mac” to complete his regular lifelogging, he decided to update his custom system to accept photos and notes. Lastly, he added habit tracking to his daily lifelogging by using the Lift app.

How Did He Do It?
Adam tracked his long-distance cycling journey using Google Location History and a Garmin GPS unit. He was able to export data from both sources to get a clear picture of his route as well as interesting data about the trip.
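If you’d like to try something similar with your own data, here’s a minimal sketch of turning a Google Location History export into a rough distance total. It assumes the Takeout-style JSON layout (a `locations` list with `latitudeE7`/`longitudeE7` fields) and is an illustration, not Adam’s actual analysis:

```python
import json
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))  # Earth radius is roughly 3959 miles

# Google Takeout exports location history as JSON with a "locations" list;
# each entry stores coordinates as integers scaled by 1e7.
with open("Location History.json") as f:
    points = json.load(f)["locations"]

total = 0.0
for prev, cur in zip(points, points[1:]):
    total += haversine_miles(
        prev["latitudeE7"] / 1e7, prev["longitudeE7"] / 1e7,
        cur["latitudeE7"] / 1e7, cur["longitudeE7"] / 1e7,
    )

print(f"Straight-line distance between recorded points: {total:.0f} miles")
```

Note that summing straight-line segments between sparse phone location samples will misestimate the true riding distance, which is one reason a dedicated Garmin track makes a useful cross-check.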

He also updated his lifelogging software so that it can accept photos and notes he hand-enters on his phone. The software, available on GitHub, gives him an easy way to track multiple events, such as how often he drinks alcohol and how much he has to use his asthma inhaler.
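Adam’s actual tool is the one on GitHub; purely to illustrate the idea, a hand-entered event log can be as small as a single table with a timestamp, an event kind, and an optional note or photo path. The sketch below (table layout, function names, example entries) is hypothetical, not his code:

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("lifelog.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS events (
           timestamp TEXT NOT NULL,   -- ISO 8601, UTC
           kind      TEXT NOT NULL,   -- e.g. 'alcohol', 'inhaler', 'note', 'photo'
           detail    TEXT             -- free-text note or a path to a photo
       )"""
)

def log_event(kind, detail=None):
    """Append one hand-entered event to the log."""
    conn.execute(
        "INSERT INTO events VALUES (?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), kind, detail),
    )
    conn.commit()

# Example entries of the sort described in the talk.
log_event("inhaler", "used before evening ride")
log_event("alcohol", "one pint")

# Counting events per kind makes "how often?" questions a one-line query.
for kind, count in conn.execute("SELECT kind, COUNT(*) FROM events GROUP BY kind"):
    print(kind, count)
```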

Lastly, Adam used Lift to track the daily habits he wanted to keep up, such as meditating, reading, making three positive observations, and sticking to his diet.

What Did He Learn?
Everything Adam learned is based on his ability to access and export his data for further analysis. From his cycling trip he was able to make a simple map showing how far he traveled, based on his Google Location History (which did have some accuracy issues). He was also able to see that he traveled 1,004 miles, cycled for 90 hours, and burned 52,000 calories, but didn’t lose any weight.
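For a rough idea of how such a map can be drawn, the sketch below plots the latitude/longitude pairs from the same Takeout-style export with matplotlib. It’s an illustration rather than Adam’s script, and a bare track like this will show the accuracy glitches he mentions as stray spikes:

```python
import json
import matplotlib.pyplot as plt

# Re-using the Google Takeout export from the earlier sketch: plot the
# longitude/latitude pairs to get a rough route map (no basemap, just the track).
with open("Location History.json") as f:
    points = json.load(f)["locations"]

lons = [p["longitudeE7"] / 1e7 for p in points]
lats = [p["latitudeE7"] / 1e7 for p in points]

plt.figure(figsize=(4, 8))
plt.plot(lons, lats, linewidth=0.5)
plt.xlabel("Longitude")
plt.ylabel("Latitude")
plt.title("Recorded track")
plt.savefig("route.png", dpi=150)
```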

Using his updated lifelogging system, he was able to explore his inhaler use, and after a visit to the doctor he was able to “find out a boring correlation”: the preventative inhaler works, and his exercise-induced inhaler usage dropped to almost zero.

Finally, because Lift supports a robust data export, Adam was able to analyze his habit data and begin answering questions he was interested in but that the native app doesn’t surface. He found that seeing his streaks visualized as a cumulative graph was inspiring and motivating. He also explored his failures and found that Saturdays, Sundays, and Mondays were the days he was most likely to miss at least one of his habits.
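To give a flavour of that kind of analysis, here’s a short pandas sketch. The file name and column layout (`date`, `habit`, `checked_in`) are hypothetical placeholders, not Lift’s actual export schema:

```python
import pandas as pd

# Assumed layout: one row per habit per day, with a boolean check-in flag.
df = pd.read_csv("lift_export.csv", parse_dates=["date"])

# Cumulative check-ins per habit: an ever-rising line reads like a streak graph.
cumulative = (
    df[df["checked_in"]]
    .groupby(["habit", "date"]).size()
    .groupby(level="habit").cumsum()
)
print(cumulative.tail())

# Misses by weekday: which days are most likely to break a habit?
misses = df[~df["checked_in"]].copy()
misses["weekday"] = misses["date"].dt.day_name()
print(misses["weekday"].value_counts())
```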

Slides from this talk are available on Adam’s GitHub page.

Tools
Google Location History, Garmin GPS, Lifelogger, Lift, Photos, Notes


Hands-Free Heart Rate Tracking

A quick post here to highlight some interesting developments in the heart rate tracking space. Tracking and understanding heart rate has been a cornerstone of self-tracking since, well, since someone first put two fingers on their neck and decided to write down how many pulses they felt. We’ve come a long way from that point. If you’re like me, tracking heart rate popped up on your radar when you started training for a sporting event like a marathon or a long-distance ride. Like many who used the pioneering devices from Polar, it felt a bit odd to strap that hard piece of plastic around my chest. Over time, and after seeing the benefits of tracking heart rate, it became part of my daily ritual. Yet, for all the great things heart rate monitoring can do for physical training, there have been very few advances toward a truly noninvasive method. That is, until now.


Thearn, an enterprising GitHub user and developer, has released an open-source tool that uses your webcam to detect your pulse. The Webcam Pulse Detector is a Python application that uses a variety of tools, such as OpenCV (an open-source computer vision library), to “find the location of the user’s face, then isolate the forehead region. Data is collected from this location over time to estimate the user’s heartbeat frequency. This is done by measuring average optical intensity in the forehead location, in the subimage’s green channel alone.” If you’re interested in the research that made this work possible, check out the amazing work on Eulerian Video Magnification being conducted at MIT. Getting the tool running is a bit of a hurdle, but it does appear to work for those who have the technical expertise. If you get it working, please let us know in the comments. Hopefully someone will come along and provide an easier installation path for those of us who shy away from the terminal. Until then, there are quite a few mobile applications that use similar technology to detect and track heart rate.
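If you’re curious what that pipeline looks like in code, here is a deliberately simplified sketch of the same idea using OpenCV and NumPy: sample the mean green-channel intensity of a forehead region for a few seconds, then look for the dominant frequency in the plausible heart-rate band. It is an illustration of the approach, not thearn’s implementation; in particular it uses a fixed region of interest instead of face detection, so you have to sit still and centred in the frame:

```python
import time
import cv2
import numpy as np

CAPTURE_SECONDS = 15          # longer windows give a steadier estimate
cap = cv2.VideoCapture(0)     # default webcam

samples, timestamps = [], []
start = time.time()
while time.time() - start < CAPTURE_SECONDS:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    # Crude stand-in for face detection: assume the forehead sits in a small
    # box near the top-centre of the frame.
    roi = frame[h // 8 : h // 4, 2 * w // 5 : 3 * w // 5]
    samples.append(roi[:, :, 1].mean())   # green channel only (OpenCV uses BGR)
    timestamps.append(time.time() - start)
cap.release()

# Estimate the dominant frequency within a plausible heart-rate band.
signal = np.array(samples) - np.mean(samples)
fps = len(samples) / timestamps[-1]                  # approximate sample rate
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
power = np.abs(np.fft.rfft(signal))
band = (freqs > 0.75) & (freqs < 4.0)                # 45-240 beats per minute
bpm = 60 * freqs[band][np.argmax(power[band])]
print(f"Estimated pulse: {bpm:.0f} bpm")
```

Even in this toy form you can see why lighting, motion, and camera frame rate matter so much to the real tool’s accuracy.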

Let us know if you’ve been tracking your heart rate and what you’ve found out. We would love to explore this space together.
