The future of AI gadgets is just phones



Inside my illuminating and incredibly dumb quest to create an AI wearable from phones I had lying about.


Google Pixel 8 and Pixel 8 Pro on pink and blue backgrounds showing home screens with mineral wallpaper.

It’s phones, y’all. Photo by Allison Johnson / The Verge

At any given time, there are between five and eight phones on my desk. And by “my desk,” I mean any combination of tables and countertops throughout my house. So when I watched the Humane AI Pin reviews start pouring in last week, I did what any logical person would do: grab the closest phone and try to turn it into my own AI wearable.

Humane would like you to believe that its AI Pin represents consumer tech at its most cutting edge. The reviews and the guts of the pin say otherwise: it uses a Snapdragon processor from four years ago and seems to run a custom version of Android 12.

“It’s a midrange Android phone!” I declared at our next team meeting, waving around a midrange Android phone for effect. “You could just download Gemini and stick this to your shirt!” Simple. Trivial. Give me 10 minutes, and I’ll have a more powerful AI gadget whipped up, I said.

Hardware is hard, y’all.

Ideally, I wanted an outward-facing camera and a decent voice assistant I could use hands-free. An iPhone in a shirt pocket was an intriguing solution but a nonstarter because a) none of my shirts have pockets, and b) Siri is just not that smart. Thus, my earliest prototype was a Motorola Razr Plus clamped to the neckline of my shirt. This, unsurprisingly, did not work, but for reasons I did not anticipate.

First off, you can’t download Gemini from the Play Store on a folding phone. That was news to me. But even once I’d sideloaded it and set it as the default assistant, I ran into another barrier: it’s really hard to use a voice assistant from the cover screen of a flip phone. The Razr wants you to flip the phone open before you can do anything aside from get its attention with “Hey Google.” 

Photo of Razr Plus on a shirt showing Gemini on the cover screen.

The things we do for content. Photo by Allison Johnson / The Verge

Running Gemini in Chrome on the cover screen actually got me closer to what I was looking for. But trying to tap buttons on the screen to trigger the assistant wasn’t working very well, and neither was operating Google Lens out of the corner of my eye. Also, Gemini misread “recycle” on a tube of toothpaste as “becicle,” which it confidently told me was an old-timey word for eyeglasses. It is not!

Prototype two was the same Razr flip phone running ChatGPT in conversation mode on the cover screen. This meant the app was constantly running and always listening, so it wasn’t practical. But I gave it a shot anyway, and it was a strange experience talking to an AI chatbot that I couldn’t see. 

ChatGPT is a decent conversationalist, but we ran out of things to talk about pretty quickly once I’d exhausted my chatbot go-tos: dinner recipes and plant care tips. I want an AI that can do things for me, not just brainstorm stir-fry ingredients.

I ditched the foldable concept and picked up a Pixel 8 and a Pixel Watch 2 instead. I set up Gemini as the default assistant on the phone and figured that would somehow apply to the watch, too. Wrong. I had one more card to play, though: a good old pair of wireless earbuds. Life on the cutting edge of technology, baby.

The yellow Pixel Buds Pro wireless earbuds sitting in their charging case with the lid open, resting on an orange table beside a plastic cup of colorful lemonade.

Honestly, earbuds might be the AI wearable of the future. Photo by Chris Welch / The Verge

You know what, though? It kind of worked. I had to leave Gemini open and running on my phone since Google doesn’t fully support Gemini Assistant on headphones. But I took a picture of a Blue Apron recipe I was making for dinner, told Gemini to remember it, and left my phone on the counter. As I moved around the kitchen, I asked Gemini questions I’d normally have to peek back at the recipe to answer, like “How long do I roast the vegetables for?” and “How do I prep the fish?” It gave me the right answers every time.

What was more impressive was that I could ask it tangential questions. It helped me use pantry ingredients to recreate a seasoning mix I didn’t have on hand. I asked why the recipe might have me divide the sauce into two portions, and it gave me a plausible answer. And it did something the Humane AI Pin can’t do yet: set a timer.

It wasn’t perfect. First, I had to unplug the Google Home puck sitting on the counter because it kept trying to butt in. Gemini also told me that it couldn’t play an album on Spotify, something that very Google Home speaker has been doing for the better part of a decade. The watch came in handy for that, at least.

What started as a goofy stunt has convinced me of two things: I really do think we’re going to use AI to get more things done in the future, and also, the future of AI gadgets is just phones. It’s phones! 

I love a gadget, but guys, I lived through the era of camera companies trying to convince us that we all needed to carry a compact camera and our phones everywhere. Phones won. Phones already come with powerful processors, decent heat dissipation, and sophisticated wireless connectivity. An AI gadget that operates independently from your phone has to figure all of that out.

And you know what looks a lot less doofy than a pin with a laser on your chest? Earbuds. People willingly wear them throughout the day right now. And the doofy factor definitely matters when it comes to wearables. I’m having a hard time seeing how a separate gadget can beat the humble phone plus a pair of earbuds or something like the Meta Ray-Bans. Maybe there’s room in our lives and our pockets for dedicated AI hardware — the gadget lover in me is all for it. But I think it’s more likely that we have all of the ingredients we need to make good AI hardware right in front of us.

