“Even in San Francisco, a dude wearing Google Glass looks like a dick,” a friend observed on Facebook last weekend. That neatly sums up one of the barriers to wearable technology and all that it implies: are we ready to get so intimate with technology that we’re prepared to wear it?
Justin Rattner, Intel’s chief technology officer, says we should “approach these things from the point of view of what technology needs to be invented and made production-worthy, as opposed to a great idea for a pair of glasses”.
While Google Glass is a proof-of-concept device, it points the way to a paradigm that will become an increasing part of our lives. In effect, Google Glass is a personal digital assistant that you wear – taking photographs, pointing the way to the pub where you’re meeting a friend, overlaying the view of an unfamiliar city with an augmented-reality layer of information about the shops, hotels, places to eat, transport options and cultural attractions. It is in many ways a clumsy first step into a world that, like it or not, is going to collect, use and exploit the data we all generate all day long.
If you believe the vision of the future enthusiastically set out by Intel at its annual Research event in San Francisco, data is going to play a much bigger part in our lives via technology that we wear and which is connected not only to the web, but to other devices.
Rattner has a point: existing wearable tech devices, such as Google Glass and the Pebble watch, are interesting riffs on devices we already have. But Intel and its team of futurologists and anthropologists have a vision of a world where the technology is not an adjunct (as the mobile phone or the tablet is now) but embedded in our lives, generating and mining data in a way that’s functional and useful to us.
Viewed through Intel’s crystal ball, in the future we’ll have devices that second-guess us, or make intelligent connections on our behalf. One narrative constructed to exemplify this is that of a glossy middle-class thirtysomething woman, whose personal device deduces from her existing music collection that she would like another band that’s coming to town, and proactively buys tickets for a forthcoming gig – calculating that the tickets are already selling like hot cakes, so if she isn’t pleased with the decision to snap them up, those tickets will find a willing buyer.
Creepy? Perhaps. But it’s also a logical extension of things we already take for granted, such as the way Amazon makes recommendations based on our previous purchases that can be surprisingly useful (or stupid – the algorithm can’t distinguish a One Direction CD bought for a teenage niece from one’s taste in general).
Intel’s story goes on to suggest a further chapter in the narrative of the woman whose device has bought gig tickets: it realises that the gig is in a town that’s a little difficult to reach, so it mines the connections between her friends and comes up with a suggestion to connect with a friend of a friend who could give her a lift to the gig. Again, that seems unsettling, yet it is just a logical extrapolation of the connections Facebook already uses when it suggests we “friend” people we have in common with our existing Facebook friends. (It’s also similar to some of the scenarios that Bill Gates was setting out as long ago as 2000, when he unveiled Microsoft’s cloud-based .Net vision.)
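The friend-of-a-friend logic behind that kind of suggestion can be sketched as counting mutual connections in a social graph. The names and graph below are invented for illustration; real systems weigh in far more signals than mutual-friend counts:

```python
from collections import Counter

# Toy social graph: each person maps to their set of direct friends (invented data)
friends = {
    "amy":  {"ben", "cara", "dev"},
    "ben":  {"amy", "cara", "eli"},
    "cara": {"amy", "ben", "eli", "fay"},
    "dev":  {"amy", "fay"},
    "eli":  {"ben", "cara"},
    "fay":  {"cara", "dev"},
}

def suggest_friends(person, graph):
    """Rank people `person` is not yet connected to by shared mutual friends."""
    mutual = Counter()
    for friend in graph[person]:
        for fof in graph[friend]:
            # Only count people who aren't already friends (and aren't `person`)
            if fof != person and fof not in graph[person]:
                mutual[fof] += 1
    return mutual.most_common()

print(suggest_friends("amy", friends))
```

Here both “eli” and “fay” surface with two mutual friends each; a production system would go on to rank such candidates by other signals, such as shared location or interests.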
At the moment, the benefit from the data we create every day flows largely in favour of the companies who use it to serve us adverts based on the demographic profile we give them. But Steve Brown, Intel’s futurist, says it’s “the individual [who] should benefit – it’s your data”.
He explains: “That data is valuable to you because it’s personal, private, and can be used to give you useful services. So what are the ways we could give people to control their own data stored on their devices? How can we empower them to have negotiations with that service provider?”
In a world where we increasingly actively generate data about ourselves – by geotagging, by uploading images to social media services, by collecting information about how many steps we take each day or how far we run or ride a bike, or about our sleep patterns – we should be able to negotiate the value of that data with the people who want to use it, says Brown.
That’s in addition to the data passively generated by our actions – the trail of “digital breadcrumbs” we leave behind us every day as we send texts, pick up our emails, check in with Facebook or Foursquare.
The data we actively create feeds into the notion of the quantified self: how we collect information to find out more about ourselves, or to define ourselves through the digital stories we tell via our tweets, status updates, Instagram posts and measurements of our fitness.
Sean Koehl, another Intel person with the slightly uncomfortable job title of “evangelist”, described the benefit he found in wearing three different sensors for a week recently. They measured his heart rate, the number of steps he took and his “galvanic skin response” – a measure of stress which, he says, is “essentially a lie detector”.
When he meditated, the sensor revealed – as you’d expect – that the output went from a “jagged, rough response to a low, smooth curve representing relaxation”. But he was surprised to see the same relaxed profile output from the sensor when he went for a run – and when he listened to the 90s trip-hop band Portishead.
“I realised that different activities produced a relaxed state in me that I hadn’t considered as a way to relax,” he said – and so he learned something about himself.
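A “jagged, rough response” settling into a “low, smooth curve” is the sort of pattern a simple moving average makes visible in raw sensor data. This is a generic smoothing sketch with invented readings, not Intel’s actual signal processing:

```python
def moving_average(samples, window):
    """Smooth a noisy sensor trace by averaging each point with its neighbours."""
    smoothed = []
    for i in range(len(samples) - window + 1):
        smoothed.append(sum(samples[i:i + window]) / window)
    return smoothed

# Invented galvanic-skin-response-style readings: noisy at first, then settling
readings = [8, 2, 9, 1, 7, 3, 4, 4, 4, 4]
print(moving_average(readings, 4))
```

The averaged trace damps the early spikes and flattens out once the readings steady, which is roughly how a stressed-then-relaxed session would present.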
Datasets that reveal things about the world we live in aren’t new, though – it’s the technology implementations of them that are more recent. Genevieve Bell, an anthropologist by training who moved to Intel in 1998, points out that the Domesday Book can be seen as just another dataset which helped William the Conqueror quantify exactly what he’d conquered. His aim, though, was to work out how to use what he’d conquered to raise money via taxation.
So datasets – whether a meticulous record of who owned which strip of land and how much it was worth, or a collection of how many runs you’ve done in the past six months – are part of our lives, and we generate more of them every day.
The point, says Bell, is to ask “what sense-making activity are you putting on that data?” That’s the question that technologies – whether they’re wearable, or whether they’re measuring something or telling the world something about ourselves – are starting to grapple with. And it’s by getting intimate with technology – or more intimate – that we might start to learn more about ourselves.
guardian.co.uk © Guardian News & Media Limited 2010