
QSEU14 Breakout: Lifelogging with Photos

Today’s post comes to us from Cathal Gurrin, Rami Albatal, Dan Berglund, and Daniel Hamngren. Cathal and Rami are researchers at Dublin City University and Daniel and Dan work at Narrative, makers of a small lifelogging camera and application. Together, they led an interesting breakout session at the 2014 Quantified Self Europe Conference on photo lifelogging and how to use analytics and computer vision techniques to make sense of vast amounts of photos. You’re invited to read their description of the session below and then join the discussion on the QS Forum.

Photo Lifelogging as Context for QS Practice

by Cathal Gurrin, Rami Albatal, Dan Berglund, and Daniel Hamngren

Thank you to those who came to the breakout session! It was a lively and excellent session with plenty of audience interaction. There were about fifteen participants with an interest in photo lifelogging.

The session started with a presentation by the session chairs. The Narrative representatives discussed the Narrative clip and their plans for supporting photo lifelogging. This was followed by the DCU team giving an overview of what is possible with photo lifelogging, covering the technical possibilities of what is reasonable to achieve today.

What came across from these presentations is that capturing a photo lifelog is not difficult, but the computer analytics needed to mine it and extract meaning and knowledge certainly are: even state-of-the-art computer vision techniques can often fail to identify the valuable content of photos.

There were a number of core points of discussion, and these were:

Food and Diet. It seemed to the panel and the audience that food and diet monitoring was a key requirement for photo lifelogging and should be the key challenge to address. It was accepted that this is challenging to do, but recent academic findings suggest that it is possible to achieve in some circumstances. The most promising technologies require a significant investment of time in labelling photos of food and eating, and there were a number of willing volunteers to help with this activity. If a dataset of such photos can be released, the QS community will be able to help label the data and build a large amount of training material for machine learning techniques to utilise in building better food and diet monitoring tools. The organisers have taken this point on board and will return at the next global meet-up with a plan.
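To give a sense of how community labelling could be turned into training material, here is a minimal sketch in Python. It is purely illustrative: the photo IDs and food labels are invented, and a real pipeline would need many more labels and an actual image classifier. The idea is simply that several volunteer labels per photo can be merged by majority vote into a single training label.

```python
from collections import Counter

# Hypothetical crowd-sourced labels: each photo gets votes from
# several volunteers. Photo IDs and label names are invented.
crowd_labels = {
    "photo_001": ["pasta", "pasta", "noodles"],
    "photo_002": ["salad", "salad", "salad"],
    "photo_003": ["soup", "stew", "soup"],
}

def majority_label(votes):
    """Return the most common label among the votes for one photo."""
    label, _count = Counter(votes).most_common(1)[0]
    return label

# Merge the votes into one training label per photo.
training_set = {photo: majority_label(votes)
                for photo, votes in crowd_labels.items()}
```

The resulting `training_set` dictionary is the kind of photo-to-label mapping a supervised food-recognition model would be trained on.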

Behaviour / Lifestyle. Analysing the behaviour of the individual was discussed in terms of data correlations over time and visual day logs. Visual day logs, being the easiest to achieve, are available from the current generation of lifelogging tools, so anyone can begin to explore them manually today. The automatic extraction of temporal patterns of behaviour was suggested as a valuable tool with which to begin this analysis.
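One simple way to start on the automatic extraction of temporal patterns is to histogram photo timestamps by hour of day across many days: hours that recur daily often correspond to routine activities. The sketch below uses invented timestamps and only the Python standard library; it is an illustration of the idea, not any team's actual method.

```python
from collections import Counter
from datetime import datetime

# Invented lifelog photo timestamps spanning three days.
timestamps = [
    "2014-05-10 08:05", "2014-05-10 08:20", "2014-05-10 13:00",
    "2014-05-11 08:10", "2014-05-11 13:05", "2014-05-12 08:00",
]

def hourly_histogram(stamps):
    """Count photos per hour of day, aggregated across all days."""
    hours = [datetime.strptime(s, "%Y-%m-%d %H:%M").hour for s in stamps]
    return Counter(hours)

hist = hourly_histogram(timestamps)
# Hours 8 and 13 recur every day, hinting at routine activities
# (e.g. a commute and a lunch break).
```

A real analysis would layer on clustering or visual similarity between photos, but even this crude histogram surfaces daily rhythms from raw capture times.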

Media Consumption Analytics. It was suggested that analysing the media that a lifelogger consumed could be very valuable both for organisations and as a context source for better quality search. Once again, the discussion came to the conclusion that this was also difficult to achieve, but that it is a worthy goal for the research teams.

Other discussion points included support for and appropriateness of sharing in real-time. Past experiences were shared of when this can work and when it can go wrong. It was also suggested that a ‘loved-one’ reminder tool could be developed as a form of ‘remembering future intentions’, which was pointed out in the lifelogging talk earlier that day as one of the five use-cases for photo lifelogging.

The session ended with the organisers thanking the attendees; post-session discussions then began, continued for thirty minutes, and in some cases continue to this day. In summary, we found that food / diet and behaviour / lifestyle were the most important QS-based automatic monitoring tools, and that these should be refined and made available to the QS community.

If you’re interested in photo lifelogging we invite you to join the discussion on the QS Forum.


Cathal Gurrin: Seven Years of Lifelogging

Conference Preview

Lifelogging is somewhat of a hot topic these days. With the soft release of Google Glass, the crowdfunding success of personal logging cameras like Memoto, and the release of numerous technology-enabled auto diaries, it should be no surprise that lifelogging is one of the core themes of our upcoming Quantified Self Europe Conference. We’re looking forward to collaboratively exploring how lifelogging fits into our personal and social contexts, and we’re excited to welcome an excellent group of speakers on this topic.

Cathal Gurrin is a lecturer at the School of Computing at Dublin City University, Ireland, and an investigator at the CLARITY Centre for Sensor Web Technologies. Cathal is very much a ‘hands-on’ researcher: since June 2006 he has ‘lived his research’, wearing various sensing devices during waking hours. He has amassed a huge archive of 14 million wearable camera photos, weeks of video, sound samples, and data from various other sensors covering location, movement, and nearby people. His research team is exploring how quantified self and lifelogging technologies can deliver positive benefits in the real world, with an initial focus on personalised healthcare and digital diaries.


One example of this work is the ‘Colour of Life’ wall. The Colour of Life wall is a touchscreen visualisation that plots a two-dimensional view of a person’s life experience, in terms of the colours encountered (imagine a 1-pixel camera), on a large video display wall. The underlying data is captured by wearable cameras configured to take about 2 photos per minute. The interface allows clustering of life events across weeks, months, or even years. The colours displayed have a unique meaning to the camera wearer; for example, a glance at the wall can show time periods when the wearer spent too long in the office or driving to work.
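The ‘1-pixel camera’ idea above can be sketched very simply: reduce each photo to its average RGB colour, then lay those colours out in chronological order to form the wall. The sketch below is an assumption-laden illustration of that reduction step only, with invented pixel values standing in for real photos.

```python
# Illustrative sketch of the "1-pixel camera" reduction behind a
# Colour of Life-style visualisation. Pixel values are invented.

def average_colour(pixels):
    """Collapse a list of (r, g, b) pixels to one mean colour."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) // n
    g = sum(p[1] for p in pixels) // n
    b = sum(p[2] for p in pixels) // n
    return (r, g, b)

# Two tiny "photos": a greyish office scene and a green outdoor scene.
office = [(120, 120, 120), (130, 130, 130)]
park = [(40, 180, 60), (60, 200, 80)]

# Each photo becomes one coloured cell on the timeline/wall.
timeline = [average_colour(office), average_colour(park)]
```

At roughly 2 photos per minute, a day yields a few thousand such colour cells, which is why runs of office-grey or road-grey become visible at a glance when the cells are tiled on a large display.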

We’re excited to have Cathal at the conference, where he will share what he has learned during the last seven years of his personal lifelogging experiment. He will also show some of the new technologies his team is working on, and will share his view of the pathways lifelogging will likely need to take in order to reach widespread use.

The Quantified Self European Conference will be held in Amsterdam on May 11th & 12th. Registration is now open. As with all our conferences our speakers are members of the community. We hope to see you there!
