
Just because you can design an AI wearable doesn’t mean you should

Humane’s Ai Pin proves that we’re not even close to being ready for the era of wearable AI.

[Source photo: FC]

Artificial intelligence is seeping into our lives like rising groundwater, yet apparently it must be heralded by a new generation of gadgets before it can be said to have truly arrived.

But does every new computing paradigm require sleek new hardware?

That seems to be the conclusion given all of the hype surrounding Humane’s launch of the Ai Pin and the Apple Vision Pro that preceded it. The masters of AI insist that artificial intelligence be with us always and everywhere in order to be both useful and user friendly. They are even selling it as a way to take greater control over our screen-obsessed lives. And given the current state of heightened anxiety around AI, it had better arrive in a “consumer-friendly” and reassuring fashion, with rounded corners and a sleek, high-gloss white case. To succeed, it must be packaged as a lifestyle choice and, apparently, it also must be wearable.

[Photo: Humane]

Taking AI out into the world may be a compelling vision for the tech-utopians counting their paper billions. And it certainly offers incredible business potential by radically increasing the amount of data that can be captured and made available to train these systems. But fear of AI is also rampant, not just here in the U.S. but all over the world. As the first dedicated commercial AI device out of the gate, Humane is offering something small, charming, and relatively unassuming in its Pin.

Yet the ambition is anything but, as its slick demo video makes clear. For $24 per month we will be able to “take the full power of AI everywhere” and have it “weave seamlessly into our lives.” This is not a marriage of design but one of power, with the central question being one of control. Specifically, how do the design choices help or undermine our sense of agency? Are the design patterns reassuring or deceptive, and is that a good thing or an illusion?

Some of this comes down to choosing the right metaphor, as Cliff Kuang and I described in our book User Friendly. Handheld computing finally broke through (after the spectacular failure of the Apple Newton) in part due to technical advances, but also because of a very appealing metaphor. Jeff Hawkins, the founder of Palm and Handspring, famously spent over a year walking around with a small wood block that he carefully carved and sanded until it was just the right size to be pocketable, like a deck of cards (a metaphor which carried through explicitly into the Palm interface).

This same basic product archetype has governed the entire history of touchscreen-based handhelds over the last 30 years, all the way through to the smartphone you undoubtedly carry in your pocket. On the one hand, this little, pocketable form factor encodes a sense of control into our relationship with a handheld computer, as we have the agency to tuck it away and bring it out whenever we like. On the other hand, this has proved to be an illusion, as our devices now seem to have taken control over our lives. Can a clever marriage of AI, voice, and projection shift these power dynamics, or will it also turn out to be an illusion?

[Photo: Humane]

DESIGN AS A MEDIUM FOR CONTROL

The Pin’s most distinctive physical feature is a sort of fold or eyebrow that tilts slightly forward along the top of the device and is marked with a blinking yellow eye. This design element serves a very practical purpose for the wearer, as it houses the LEDs that provide critical feedback: the “Trust Light” that communicates when the device is on and listening, and the “Beacon,” which alerts the wearer to incoming messages.

These features are essential for the wearer, but for the rest of us, they create the distinct impression that the device is always paying attention, leaning forward and listening (even when technically it might not be) like the cocked ear of a border collie. The overall effect is something like a wearable home automation device or smart badge. As a metaphor, smart badges have strong associations with control, surveillance, and enforcement. We use them to be identified and to gain access to physical spaces, and rarely by choice. We even attach these badges to our luggage and pets. Unwittingly, the designers at Humane are encoding those associations into the product despite the reassuring iPod styling.

This language of control is reinforced by the Pin’s most unique mode of interaction: an embedded laser projection and gestural interface that combine to create a Minority Report-style open-air controller. I have no doubt that this feature is incredibly compelling to play with, even if it might take a few generations to fully work out all of the kinks. But while it conveys a growing sense of power and control as users manipulate information in the air like a conductor, ultimately they are controlling just the surface, trivial elements of the choice architecture, like skipping tracks in a playlist or queuing up the next text message.

Any sense of control we might feel is actually an illusion, with all of the computing intelligence happening in the cloud, totally inaccessible to us. The Pin claims to offer “perpetual” power via its hot-swappable batteries, giving it the technical capability to be an always-on surveillance device that continually records and streams our personal data into a massive AI engine. It makes me wonder how these gestures could be repurposed to address this power asymmetry directly and provide the user with a more tangible and fundamental form of control over their relationship to AI.

For example, could you swipe your hands to erase all of the user data that has been sent to Humane’s system over the last hour? Or wag your finger to turn off location tracking? Of course, the folks at Humane would argue this would only make it harder for the system to leverage its intelligent platform in the most relevant manner. But shouldn’t that choice be left to us, and not decided for us by the masters of AI?

There is a growing awareness of deceptive design patterns that coerce users into making choices that platform or app providers want them to make. These patterns are actively being captured, documented, and shared by designers in the world of software UX. But deceptive design may take quite a different form when it comes to the conventional styling of digital products, masquerading as an appealing lifestyle choice.

Rounded corners, glossy white surfaces, and even slick, environmentally conscious packaging have become effective and expected ways to introduce new control and surveillance mechanisms into our daily lives—they reassure us that products like the Ai Pin will fit seamlessly alongside our AirPods and other stylish smart accessories (even though the Pin is decidedly not an accessory). After all, it is no surprise that Jony Ive began his career designing innocent porcelain bathroom fixtures before joining Apple.

While deception may not be the intent of Humane, the basic architecture of control and surveillance is built into the design of its first product. Hardware businesses are resource-intensive, particularly billion-dollar ones, with investors behind them pulling strings. What happens when Humane is acquired by Google or Amazon? Are these companies we would trust to retain the same humanistic values? One need only look at the cops cruising around airports on Segways to appreciate how this may all turn out in the end if consumer adoption is tepid.

It’s hard to imagine a world in which the broad adoption of chic little smart, wearable, connected cameras and microphones is a positive step forward for society at large—one that will leave us feeling more in control of our lives, not less.

This is exactly the sort of dystopian future that Dave Eggers described in The Circle more than a decade ago. My fear is that Humane has taken the next step in bringing this sort of (in)humane vision to reality under the guise of desirability and choice. As designers, we have made glossy screens and addictive media ubiquitous. So, our solution to a massive problem we helped to create is to attach AI-powered surveillance technology to everyone’s lapel. Really? As my dear friend Allan Chochinov is constantly reminding his product design students: Just because you can design something doesn’t mean you should.
