Baking Behavioral Nudges Into The Products We Own

Intel’s Senior User Experience Researcher discusses the Real World Web and how internet-enabled sensors will create new kinds of intimacies and engagements.

For better or worse, human beings tend to fall into predictable patterns, though we all like to think that we still have some semblance of power over the daily decisions that make up our lives.

Sure, we may be able to choose the salad over the burger and fries at lunch, but so much of what constitutes our days consists of unconscious acts and comfortable behaviors that are far more difficult to change.

Short of having our mothers by our side to constantly remind us to sit up straight or take our daily vitamins, most of us don’t have any reliable way to stay on top of these small but important behaviors. Enter technology and the Real World Web.

An emerging class of sensors and devices is beginning to passively monitor our activities to understand our patterns and intercede at key moments, providing reminders about our goals and even helping us curb unhealthy routines we may not be aware we’ve fallen into.

In this trend, which PSFK identified in our recent report as Behavioral Nudge, we look at the range of internet-enabled products and related services that are helping people enact positive change throughout their days.

To dive deeper into this trend, we caught up with Maria Bezaitis, PhD and Principal Engineer at Intel’s User Experience Ethnographic Research Lab, to ask whether the proliferation of sensors within everyday products means we should expect these types of ‘nudging’ devices to become ever more present in our lives.

Can these systems learn from our behavior to provide tailored recommendations and does this advice become more personalized over time?

“The integration of sensors in products is already careening down the path of more personalized feedback for users,” said Bezaitis.

“The challenge here, however, is how to work towards personalized feedback that doesn’t isolate people from one another and instead seeks to promote the social relationships that are so fundamental to the things that we do in our daily lives.

“A sensor in a thing (shoes, toasters, thermostats) changes the nature of that thing and it does so not just because it gives that thing the capacity to collect data. The fact of sensors in things gives those things the opportunity to start to share the data they collect. Sensors in things give things the opportunity to become social.

“I like to think of things that have sensors in them as participants in the world. Things are no longer just products; they participate in the world in fundamentally new ways.

“Instrumented things [things that have sensors in them] are interesting because they turn products into objects that can instrument new social groupings for us. Nike is a great example of a company doing this with shoes through Nike Plus. They realize that running doesn’t have to be an activity about an individual alone. How might a toaster share data? With whom? How might that change what toast means to people, what people do with toast, how people think about bread, bread alone, bread in concert with other things?

“Technology innovation has to stop imagining people [and things] as isolated beings. Sensors in things give us a way to be less alone in the world, to participate in groups that are dynamically organized and to utilize the everyday things or products in our lives as the new means for creating those new kinds of social groupings, or collectivities.”
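
To make the notion of ‘social things’ concrete, here is a minimal sketch in Python of a device that shares each reading with a group feed the moment it collects it. Every name in it (SensorReading, GroupFeed, InstrumentedThing) is a hypothetical illustration, not a real Intel or Nike API.

```python
# A minimal, hypothetical sketch: an "instrumented thing" that does not just
# collect data but shares it with a social group the moment it is collected.
# SensorReading, GroupFeed and InstrumentedThing are illustrative names only.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class SensorReading:
    device: str     # which thing produced the reading, e.g. "shoes"
    metric: str     # what was measured, e.g. "steps"
    value: float
    taken_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class GroupFeed:
    """An in-memory stand-in for the shared feed a group of runners
    (or toaster owners) might subscribe to."""

    def __init__(self, name: str):
        self.name = name
        self.readings: list[SensorReading] = []

    def publish(self, reading: SensorReading) -> None:
        # A real service would notify the other members; here we just store it.
        self.readings.append(reading)


class InstrumentedThing:
    """A product with a sensor: it can both collect data and share it."""

    def __init__(self, device: str, feed: GroupFeed):
        self.device = device
        self.feed = feed

    def record(self, metric: str, value: float) -> SensorReading:
        reading = SensorReading(self.device, metric, value)
        self.feed.publish(reading)  # collecting and sharing happen in one gesture
        return reading


# Usage: a pair of running shoes participating in a shared "morning run" group.
morning_run = GroupFeed("morning-run-club")
shoes = InstrumentedThing("shoes", morning_run)
shoes.record("steps", 5230)
print(len(morning_run.readings))  # 1 reading, now visible to the whole group
```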

[Image: Nest Learning Thermostat]

Think back to even the simple ways you programmed a remote control for your TV or set basic preferences on your phone. Baked into this scenario is the idea that our devices only become ‘ours’ when we understand how to exert control over them.

But inside the internet of things, our interactions with these same technologies are increasingly defined by a give and take as our devices become more personal and complex.

This underlines the need for a level of ‘open dialogue’ with our devices as they take on a greater role in our lives.

“People expect their interactions with these technologies to be clear and transparent,” said Bezaitis.

“If a system is going to share information or recommend a direction, people want to understand how it came to that conclusion.”

Bezaitis said people will want to know:

  • How does that system know what it knows?
  • Why is it proposing a certain direction to me?
  • How did it use my data to do that?
  • What other data helped it come to that conclusion?
  • What data does it not have access to that I might need?

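One way to read these questions is as a design requirement: a recommendation should travel with its own provenance. The sketch below, with field names that are illustrative assumptions rather than any real product’s schema, shows a ‘nudge’ object built to answer each of the questions above in plain language.

```python
# A minimal, hypothetical sketch: a recommendation that carries its own
# provenance, so a person can see how the system came to its conclusion.
# The field names are illustrative assumptions, not a real product schema.
from dataclasses import dataclass, field


@dataclass
class ExplainedRecommendation:
    suggestion: str                        # the direction being proposed
    rationale: str                         # why the system is proposing it
    personal_data_used: list[str]          # which of my data it used to do that
    other_data_used: list[str] = field(default_factory=list)     # other data that helped
    data_not_available: list[str] = field(default_factory=list)  # gaps I might need to know about

    def explain(self) -> str:
        """Answer the questions above in plain language."""
        return "\n".join([
            f"Suggestion: {self.suggestion}",
            f"Why: {self.rationale}",
            f"Your data used: {', '.join(self.personal_data_used) or 'none'}",
            f"Other data used: {', '.join(self.other_data_used) or 'none'}",
            f"Data it did not have: {', '.join(self.data_not_available) or 'none'}",
        ])


# Usage: a nudge to stand up, with its reasoning exposed rather than hidden.
nudge = ExplainedRecommendation(
    suggestion="Stand up and stretch for two minutes",
    rationale="You have been seated for 90 minutes, longer than your usual pattern",
    personal_data_used=["chair sensor sessions", "stated goal: move every hour"],
    other_data_used=["general guidance on sedentary time"],
    data_not_available=["calendar context (you may be in a meeting)"],
)
print(nudge.explain())
```
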
“Think about how people do this with other people,” said Bezaitis. “When we accept feedback or guidance from our peers, we have some working understanding of what motivates the guidance that they give us. We trust what our peers have to tell us, or not, in part because we understand how their biases, thought processes, worldviews work. We have a working sense of their experiences.

“Even if we disagree with their conclusions, the understanding we have of how our peers think and how their experiences shape what they think, makes it possible for us to trust them. Trust is not a function of agreement. It’s a function, at least in part, of understanding produced through interaction over time. In the absence of this understanding of how someone gets to a certain conclusion, it’s harder to trust.”

[Image: toaster set]

Transparency about how these devices draw their conclusions may prove crucial to their mainstream adoption.

However, the question of privacy looms over any conversation about these technologies.

To what extent are we comfortable with our technology monitoring us to offer more personalized feedback?

“I’m not so sure that highly personalized feedback is always what people are looking for,” said Bezaitis.

“Sometimes people just want a way to think about what’s happening to them. They want better, richer context for a situation they find themselves in that they may not really understand or have access to, a situation that may feel muddled or aggravated or simply uncertain. They don’t necessarily want a product or system telling them what to do, or dealing with that uncertainty for them.

“People want to make their own decisions – this sense of agency is fundamental to our sense of humanity. The goal of a product or system that can know things that are relevant to us is not necessarily to tell us what to do. Sometimes it’s simply about giving better, more comprehensive information – producing the context – so that we can make the decision that’s right for us.

“People are likely to perceive products or systems that aren’t clear about how they know what they know as creepy. In the last few years, we have definitely seen a rise of creepiness. We see it used in relation to things and services that behave in ways that make people feel out of control.

“For example, a geo-location service that reveals it knows where you’ve been and that it is sharing information is creepy. A geo-location service that makes you aware that it’s collecting this information, as it’s doing it, and that it can use that information effectively to tell you about a fantastic way to spend your free hour is in a better position to build trust.

“Transparency and trust are linked from a design perspective. Commitment and engagement are in part the effects of ongoing relations with products or systems that we trust. Trust, commitment and engagement are always negotiated with systems over time.”


Beyond providing detailed metrics on health and performance, could these technologies help us make positive changes in other areas of our lives?

The challenge for user experience designers lies in helping people identify areas of ‘change’ and then creating the right framework to help them meet their goals.

In other words, what are the keys to designing a successful feedback system that ensures people remain committed and engaged?

“Commitment and engagement are really powerful sentiments,” said Bezaitis. “They get to the heart of what’s important about our social relations – that we can experience commitment and engagement and the associated positive notions of dependency and obligation and loyalty. In our closest, most important social ties, these are the values that are important to us.

“Today’s technologies – instrumented things, sensor networks, data – have the opportunity to deepen social relationships, to bring us important new kinds of social relationships that we don’t already have and to participate directly in those relations. When we start to think about our technologies as not simply providing incremental value – good recommendations or metrics for this or that problem – we give them room to grow.

“This is the future of smart. It’s no longer simply about speed, accuracy and connectedness, but about new kinds of intimacies, commitments and engagements with technologies and other people.”

In The Real World Web, iQ by Intel and PSFK Labs explore the role internet-enabled technologies will play in connected ecosystems of the future. This series, based on PSFK’s recent Real World Web report, looks at the rise of the internet of things and its impact on consumer lifestyles.
