Title: Nest’s Smoke Alarm Stumble Is a UI Lesson for Everybody
Context: When every gesture can be interpreted as a command, how can we tell which gestures are actually commands and which are simply gestures?
Synopsis: The Nest line of home-control devices has been met with near-universal acclaim for intelligently interpreting human activity to inform its own behavior, rather than requiring users to spend hours poring over dense technical manuals and entering reams of data by hand. Less cognitive burden on the end user is always a good thing; we can all agree on that. But translating large volumes of data through a system’s interpretation of human activity is bound to leave a few gaps between the two. That isn’t a fatal error when you’re controlling the temperature of your home, but when your smoke alarm interprets the act of running from a fire as the gestural equivalent of “turn off my smoke alarm,” the term “fatal error” takes on a whole new level of dangerous literalism. Of course, the ability to tell the difference between those two modes is hard-wired into the human brain, which is one of the things that still separates us from the machines we seem so eager to hand control of our environments, and indeed our lives, over to. The uncanny valley may well be a two-way interpretive gap.
Best Bit: “The hard work may not be getting them to listen to us. Instead, we’ll need to figure out how to train them to know when not to listen to us, to learn, in some cases, to ignore gestures or commands.”
via wired.com