Tag Archives: biosensors
Emily Singer, a journalist with MIT’s Technology Review, has an extensive series of articles and interviews on “The Measured Life”. She was at the Quantified Self Conference a month ago, seems to have talked with everyone, and has since been writing up a storm.
There are also video interviews. Kyle Machulis describes his efforts to hack tracking devices so everyone can access their own data. David Marvit talks about Fujitsu’s Sprout project and the importance of obtaining biometrics in real-world conditions. And Rajiv Mehta talks about the potential for personal science to make a significant impact on healthcare and medical science, and demos Tonic.
And there are posts on social networking and games in self-tracking technologies, on astronauts measuring sleep, a physician’s perspective, the new Health Graph effort, and a wristwatch that continuously monitors blood pressure.
From the Amsterdam QS Show&Tell group: Matt Cottam talks about many of the cool personal informatics and biosensing projects designed by his company, Tellart. Some prototype projects include creative ways to encourage people to take breaks at work, remote teddy bears to connect the elderly with their families, a breath alcohol sensor for the iPhone, and online gaming to combat childhood obesity. Matt has expertise in industrial, experience, and web design, with a detour in emergency medicine. Watch the video below to see a whirlwind tour of his inspiring projects over the past few years.
Researchers at Concordia University and the University of London have created ‘smart’ clothing, with embedded wireless biosensors that detect your mood and play voices and videos of people you want to hear when you’re feeling sad, upset, excited, or lonely.
From the review in TechNewsDaily:
The new “smart” clothing contains wireless biosensors that measure heart rate and temperature (among other physiological indicators), small speakers, and other electronics that wirelessly connect to a handheld smartphone or PDA. Data from the sensors is sent to the handheld where it is converted into one of 16 emotional states, which cues a previously setup database to send the wearer some inspirational message.
These “mood memos” could be a text message, which scrolls on a display on the garment’s sleeve, a video or photograph displayed on the handheld device, or a sound that comes through the embedded speakers.
The researchers have made two prototype garments so far, a male and a female version, and plan to display them at museums over the next two years. They are also looking at medical and fashion applications.
The sounds, photos and videos sent to the wearer aren’t arbitrary. Instead, the messages are spoken by a friend or loved one.
“When you first wear the garment, you turn on the device and you tell it what person you want to channel that day,” said Barbara Layne, professor at Concordia University and co-developer of the garments. “That could be your lover who’s away, it could be your deceased parent, your best friend, whoever you want to be with that day.”
The multimedia is pre-loaded into a database for each person the wearer wants to virtually hang out with.
“[At] multiple times during the day, you can set it for as many times as you want, [the garment] will take your biometric readings, your bio-sensing data, analyze it on that emotional map and then go up to the Internet, to the database that relates that emotional state, and bring you back something that you need,” Layne said.
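The pipeline the article describes (biometric readings → one of 16 emotional states → a lookup in a per-person media database) can be sketched roughly as follows. This is a hypothetical illustration only: the 4×4 arousal/valence grid, the threshold values, and the `classify_emotion` / `fetch_memo` names are all assumptions, not the researchers’ actual model.

```python
# Hypothetical sketch of the garment's data flow, NOT the actual system:
# sensor readings -> one of 16 emotional states -> "mood memo" lookup.

def classify_emotion(heart_rate: float, skin_temp: float) -> int:
    """Map two readings onto an assumed 4x4 grid of 16 states (0-15)."""
    # Quantize heart rate (a crude arousal proxy) into 4 bins over 50-110 bpm.
    arousal = min(3, max(0, int((heart_rate - 50) / 15)))
    # Quantize skin temperature (a crude valence proxy) into 4 bins over 30-38 C.
    valence = min(3, max(0, int((skin_temp - 30) / 2)))
    return arousal * 4 + valence

# Per-person media database: (person, emotional state) -> pre-loaded memo.
media_db = {
    ("mom", 14): "video: mom_encouragement.mp4",
    ("mom", 5): "text: 'Thinking of you!'",
}

def fetch_memo(person: str, heart_rate: float, skin_temp: float) -> str:
    """Classify the wearer's state, then retrieve that person's memo."""
    state = classify_emotion(heart_rate, skin_temp)
    return media_db.get((person, state), "text: default greeting")
```

In practice the real system presumably uses a far richer emotional model and fetches the media over the Internet, as Layne describes; the sketch only shows the classify-then-lookup shape of the design.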
Thanks to Lyn Jeffery at IFTF for the pointer.