Topic Archives: Personal Projects
In honor of today being the last day of the existence of Moves, the app from which so many Quantified Self projects drew their location data, I thought I’d post this artwork by Sebastian Meier and Katrin Glinka, who constructed city models based on connecting Moves data with their memories.
For discussion of the demise of Moves, exporting, and alternatives, see this topic in the QS Forum: Moves shutting down? Oh, no! But note, today is the last day.
We are happy to welcome this guest post on a community tool by Bastian Greshake Tzovaras. Bastian is the director of research at the Open Humans project. He can be found online at @gedankenstuecke. -Steven
I’ve built a Twitter analysis web application that’s open to everyone to use and learn from. Often the best data for learning something about yourself are data you’ve already collected, sometimes without even being explicitly aware of collecting it. Social media activity is a good example. We often send off Facebook posts or tweets with very little thought about the metadata we generate in doing so. Where was I when I made that post? What time was it? What type of content did it contain? Did I retweet or reply to another person’s post? And, of course, what did my post contain?
This data can be extremely powerful – for others. The language you use in your tweets can be used to predict your age as well as your income. Twitter uses the data to gather information about your likes, dislikes, and possessions – among other topics. But what if you want to learn about yourself with your own Twitter data?
The tool I created allows anybody to explore their own Twitter archive in detail. First, you’ll want to request your archive from Twitter. It will contain all the tweets you have ever sent, with not only the text but all the metadata as well. To look at these metadata, go to my small web application called TwArχiv (pronounced tw-archive), which allows you to upload your data and explore it using interactive graphs.
For instance, you can see how the nature of the tweets you send changes over time. Are you replying more to people than you used to, or is it all just retweets by now? For my own data it seems that finishing up my PhD work had quite an impact, starting in late 2016. With less procrastination I wrote fewer unprompted tweets. Instead, replying to people became more central to my Twitter experience.
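This kind of over-time breakdown can be sketched with a few lines of Python. The sketch below is not TwArχiv’s actual code; the field names (`created_at`, `in_reply_to_status_id`, `full_text`) follow the Twitter API’s tweet object, and the sample tweets are invented for illustration.

```python
from collections import Counter
from datetime import datetime

def classify_tweet(tweet):
    """Classify a tweet dict as 'reply', 'retweet', or 'original'.
    Field names follow the Twitter API; archive exports may differ."""
    if tweet.get("in_reply_to_status_id"):
        return "reply"
    if tweet.get("full_text", tweet.get("text", "")).startswith("RT @"):
        return "retweet"
    return "original"

def tally_by_year(tweets):
    """Count tweets per (year, type) pair, e.g. (2016, 'reply')."""
    counts = Counter()
    for t in tweets:
        year = datetime.strptime(
            t["created_at"], "%a %b %d %H:%M:%S %z %Y").year
        counts[(year, classify_tweet(t))] += 1
    return counts

# Invented sample data, one tweet of each type:
tweets = [
    {"created_at": "Mon Oct 03 10:00:00 +0000 2016",
     "in_reply_to_status_id": "123", "full_text": "@friend hi!"},
    {"created_at": "Tue Oct 04 11:00:00 +0000 2016",
     "full_text": "RT @someone: interesting"},
    {"created_at": "Wed Oct 05 12:00:00 +0000 2016",
     "full_text": "an unprompted tweet"},
]
print(tally_by_year(tweets))
```

Plotting those yearly tallies as stacked bars gives roughly the reply/retweet/original view described above.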
There is also plenty of research on gender bias in social media usage and whose voices are being amplified, with men being overwhelmingly favored. TwArχiv allows one to do some soul searching on this. It tries to predict the gender of the people you interact with based on their first names and shows you whether your reply and retweet behaviour is gender-balanced.
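A name-based inference like this can be approximated with a first-name lookup table. This is a minimal sketch, not TwArχiv’s implementation: the entries in `NAME_GENDER` and the sample names are illustrative only, and a real analysis would use a much larger name/gender dataset.

```python
from collections import Counter

# Toy lookup table -- illustrative only; real tools use large
# name/gender datasets and report a confidence per name.
NAME_GENDER = {"anna": "f", "maria": "f", "james": "m", "peter": "m"}

def gender_balance(interactions):
    """Count interactions (replies/retweets) by the predicted gender
    of the other account's first name; unknown names are skipped."""
    counts = Counter()
    for full_name in interactions:
        first = full_name.split()[0].lower()
        gender = NAME_GENDER.get(first)
        if gender:
            counts[gender] += 1
    return counts

print(gender_balance(["Anna Smith", "James Lee", "Peter Pan", "X Æ"]))
```

Note the built-in limitations: names that aren’t in the table are silently dropped, and first names are an imperfect proxy for gender, which is why such charts are best read as rough soul-searching aids rather than precise measurements.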
My own graphs show that I had (and have) a good way to go here. 2010 in particular is wildly off when it comes to the gender representation in my Twitter interactions. What happened during that time? I was politically active in the German Pirate Party, which was infamous for being a “boys club”.
If you have geolocation enabled on your tweets, you can get an idea of where you tweet. With a fully zoomable map, TwArχiv allows you to explore the globe on all scales to see the broader picture as well as street-level tweet distributions. As a first attempt at seeing movement patterns, you can also get a time-stamped version of the map that highlights locations one tweet at a time.
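Even without a map library, you can get a first look at where your geotagged tweets cluster by snapping coordinates to a coarse grid. The coordinates below are invented for illustration; a real run would read them from the tweets’ geo metadata.

```python
from collections import Counter

# Invented sample coordinates (two near Berlin, one in New York):
coords = [(52.52, 13.40), (52.53, 13.41), (40.71, -74.01)]

def grid_counts(coords, cell=0.1):
    """Snap each (lat, lon) to a coarse grid cell and count tweets
    per cell -- a crude, map-free way to see where tweets cluster."""
    return Counter((round(lat / cell) * cell, round(lon / cell) * cell)
                   for lat, lon in coords)

busiest_cell, n = grid_counts(coords).most_common(1)[0]
print(n)  # 2 tweets share the busiest 0.1-degree cell
```

The cell size trades off the “broader picture” against street-level detail, which is exactly the zoom trade-off an interactive map handles for you.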
If you want to give it a try with your own archive, you can head to TwArχiv.org. The data storage is handled by Open Humans and by default your archive and the resulting visualizations will be private. (You can choose to make them public, though, to share them with your friends and followers – mine are here!)
A note: The Twitter archive does not contain any direct messages but only your tweets, so if you have a public Twitter account the archive is basically all your “public Twitter interactions”.
If you have ideas on how to extend the functionality of TwArχiv or you want to code your own Twitter archive analysis, you could even get funding to do so: the Open Humans mini-grants of USD 5,000 for projects that enrich the Open Humans ecosystem are a perfect fit for this kind of data visualization and analysis.
“When I see someone driving towards me with their face buried in their phone, I get gloriously indignant about it.”
Robby Macdonell has given great talks on transportation logging and time-tracking. Here, he combined those two data streams, using Automatic and RescueTime, to prove that he does not use his phone while driving nearly as often as other drivers.
Only the data didn’t agree.
Watch how Robby confronts the realization that he is more distracted than he thought and the changes he made because of it.
In this fascinating talk Rocio Chongtay shares her novel and thoughtfully designed experiments in using music to adjust her concentration and relaxation depending on what she’s doing. Using a consumer EEG device from Neurosky, Rocio tried different types of music while tracking the relaxation and concentration dimensions identified by the Neurosky algorithm. She had experience experimenting with Neurosky in her lab, and then turned these techniques toward understanding something about her own mind.
Kouris Kalligas, a long-time participant and contributor at Quantified Self meetings, is the creator of the easy-to-use data aggregation service AddApp. AddApp is an iPhone app that makes it simple to gain insights from data gathered on dozens of different devices. While running his startup, Kouris has also been doing ongoing self-tracking experiments. At QS Europe 2014, he gave an excellent show & tell talk about his sleep, diet, and exercise data. In the talk below, he discusses using mood data in combination with calendar data to reflect on the relationship between emotion, experience, and self-image.
Let’s start 2016 with a very interesting talk by Randy Sargent about how to visualize the very large data sets produced by some kinds of self-tracking. Randy’s idea about using spectrograms, normally used for audio signals, to create a portrait of your own time series data, is completely novel as far as I know. If you have tried something similar, please get in touch.
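The idea behind a spectrogram is a short-time Fourier transform: slide a window along the series and compute the magnitude spectrum of each window. The sketch below is a tiny stand-in for that, applied to hypothetical hourly step counts (the data and function names are invented for illustration, not taken from Randy’s talk); real analyses would use an FFT library rather than this direct DFT.

```python
import cmath
import math

def spectrogram_row(series, window):
    """Magnitude spectrum of one window of a time series: a single
    row of a spectrogram, computed with a direct (slow) DFT."""
    n = window
    segment = series[:n]
    mean = sum(segment) / n
    centered = [x - mean for x in segment]  # remove the DC offset
    return [abs(sum(centered[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2 + 1)]

# Hypothetical data: hourly step counts with a clean 24-hour rhythm.
steps = [500 + 400 * math.sin(2 * math.pi * h / 24) for h in range(240)]
row = spectrogram_row(steps, window=240)
# With a 240-sample window, a 24-hour cycle lands in frequency bin 10
# (240 / 24 = 10 cycles per window).
print(row.index(max(row)))  # 10
```

Stacking such rows for overlapping windows, and rendering magnitudes as color, produces the spectrogram-style portrait of a time series that Randy describes; daily and weekly rhythms in self-tracking data show up as horizontal bands.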
In this fascinating short talk by geneticist Jim McCarter, we see detailed data about the effects of a ketogenic diet: lower blood pressure, better cholesterol numbers, and vastly improved daily well-being. Jim also describes the mid-course adjustments he made to reduce side effects such as muscle cramps and increased sensitivity to cold.
Jim begins: “When I tell my friends I’ve given up sugar and starch and get 80% of my calories from fat, the first question I get is: Why?”
The rest of the talk is his very clear answer.
We recently released our QS Access app, which allows you to see HealthKit data in tabular format. Not very many tools feed data into HealthKit yet, but Apple’s platform does pick up step data gathered by the iPhone itself. I have step data on HealthKit going back about two weeks. When Ernesto Ramirez and I were playing around with QS Access, loading the data into Excel and looking at some simple charts, I learned something: Even when I’m active, I’m sedentary.
My daily step totals ranged from a depressing 3334 steps on Thursday, September 18 to an inspiring 21,634 steps on Friday, September 25, but – as these charts clearly show – even on the extreme days my activity was concentrated into relatively short periods when I got up from my desk and went out to do something. Most hours, every day, were spent with hardly any movement at all. I’m sitting at my desk, and sitting at my desk some more, and sitting at my desk still more. That’s probably not good. No, not good at all.
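You can reproduce this kind of check on a tabular export with a few lines of Python. The column names and values below are hypothetical, a sketch of an hourly QS Access-style export rather than its exact format.

```python
import csv
import io
from collections import Counter

# Hypothetical hourly export; real column names may differ.
csv_text = """start,finish,steps
2014-09-18 09:00,2014-09-18 10:00,120
2014-09-18 10:00,2014-09-18 11:00,80
2014-09-18 12:00,2014-09-18 13:00,2900
"""

def active_hours(csv_text, threshold=250):
    """Count hours at or above a step threshold vs. total recorded
    hours -- a quick way to see how concentrated activity is."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    active = sum(1 for r in rows if int(r["steps"]) >= threshold)
    return active, len(rows)

print(active_hours(csv_text))  # (1, 3): one active hour out of three
```

Even a high-step day can come out as mostly sedentary by this measure, which is exactly the pattern the hourly charts revealed.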
Pulling my data out of HealthKit and seeing a few simple charts gave me a bit of insight that I hope will lead to a change in how much I sit. It was great to be able to easily do some simple analysis of my data. I hope you’ll find QS Access useful also (you can learn more about it here). Please share what you learn in the QS Access thread in the QS Forum or by emailing us about your projects: email@example.com.
NOTE: The methods detailed in this post no longer work. The post is kept here for archival purposes, but unfortunately you will not be able to access your Fitbit data by following the steps.
Earlier this week we posted an update to our How To instructions for downloading your Fitbit data to Google Spreadsheets. This has been one of our most popular posts over the past few years. One of the most common requests we’ve received is to publish a guide to help people download and store their minute-by-minute level step and activity data. Today we’re happy to finally get that up.
The ability to access and download the minute-by-minute level (what Fitbit calls “intraday”) data requires one more step than what we’ve covered previously for downloading your daily aggregate data. Access to the intraday data is restricted to individuals and developers with access to the “Partner API.” In order to use the Partner API you must email the API team at Fitbit to request access and let them know what you intend to do with that data. Please note that they appear to encourage and welcome these types of requests. From their developer documentation:
Fitbit is very supportive of non-profit research and personal projects. Commercial applications require additional review and are subject to additional requirements. To request access, email api at fitbit.com.
In the video and instructions below I’ll walk you through setting up and using the Intraday Script to access and download your minute-by-minute Fitbit Data.
1. Set up your Fitbit developer account and register an app.
   - Go to dev.fitbit.com and sign in using your Fitbit credentials.
   - Click on “Register an App” at the top right corner of the page.
   - Fill in your application information. You can call it whatever you want.
   - Make sure to select “Browser” for the Application Type and “Read Only” for the Default Access Type fields.
   - Read the terms of service and, if you agree, check the box and click “Register.”
2. Request access to the Partner API.
   - Email the API team at Fitbit.
   - They should email you back within a day or two with a response.
3. Copy the API keys for the app you registered in Step 1.
   - Go to dev.fitbit.com and sign in using your Fitbit credentials.
   - Click on “Manage My Apps” at the top right corner of the page.
   - Click on the app you created in Step 1.
   - Copy the Consumer Key.
   - Copy the Consumer Secret.
   - You can save these to a text file, but they are also available anytime you return to dev.fitbit.com by clicking on the “Manage My Apps” tab.
4. Set up your Google spreadsheet and script.
   - Open your Google Drive.
   - Create a new Google spreadsheet.
   - Go to Tools -> Script editor.
   - Download this script, copy its contents, and paste them into the script editor window. Make sure to delete all text in the editor before pasting. You can then follow along with the instructions below.
   - Select “renderConfigurationDialog” in the Run drop-down menu. Click run (the right-facing triangle).
   - Authorize the script to interact with your spreadsheet.
   - Navigate to the spreadsheet. You will see a dialog box open in your spreadsheet.
   - In that dialog, paste the Consumer Key and Consumer Secret that you copied from your application on dev.fitbit.com. Click “Save.”
   - Navigate back to the script editor window.
   - Select “authorize” in the Run drop-down menu and click run (the right-facing triangle). This will open a dialog box in your spreadsheet. Click yes.
   - A new browser window will open and ask you to authorize the application to look at your Fitbit data. Click allow to authorize the spreadsheet script.
5. Download your Fitbit data.
   - Go back to your script editor window.
   - Edit the DateBegin and DateEnd variables with the date period you’d like to download. Remember, this script will only allow 3 to 4 days to be downloaded at a time.
   - Select “refreshTimeSeries” in the Run drop-down menu. Click run (the right-facing triangle).
   - Your data should be populating the spreadsheet!
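Because the script only downloads 3 to 4 days at a time, a longer history means splitting your date range into small chunks and repeating the last step for each. A minimal sketch of that chunking (the helper name is mine, not part of the script):

```python
from datetime import date, timedelta

def chunk_dates(begin, end, days_per_chunk=3):
    """Split the inclusive range [begin, end] into consecutive
    sub-ranges of at most `days_per_chunk` days each, matching the
    script's per-run download limit."""
    chunks = []
    cur = begin
    while cur <= end:
        stop = min(cur + timedelta(days=days_per_chunk - 1), end)
        chunks.append((cur, stop))
        cur = stop + timedelta(days=1)
    return chunks

# Ten days split into 3-day runs: 9/1-9/3, 9/4-9/6, 9/7-9/9, 9/10.
for a, b in chunk_dates(date(2014, 9, 1), date(2014, 9, 10)):
    print(a, "->", b)
```

Each printed pair corresponds to one DateBegin/DateEnd setting followed by one run of “refreshTimeSeries.”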
If you’re a developer or have scripting skills, we welcome your help improving this intraday data script. Feel free to check out the repo on GitHub!