Hey! Ok! Understanding Language With Desktop AI


Sentient: we define this word as “having the power of perception by the senses; conscious.” Fans of the Terminator movie series will know this word and its implications (Skynet, cough cough). And, we can’t forget HAL 9000 from 2001: A Space Odyssey.

Which desktop AI system is better at understanding language?

But, in this article, we’re focusing on the seemingly sentient Google Home and Assistant and how language usage affects their performance. In the same desktop AI race, there’s the ubiquitous Echo/Alexa from Amazon as well. The Echo device comes in multiple form factors, and Alexa executes, more or less, the same type of requests that its Google counterpart does. (Apple is just now getting into the race, too, with its HomePod + Siri entry.)

These tech gizmos are constantly evolving—what they are capable of now won’t come close to their limits a year from today. For example, as of this writing, Alexa can’t string commands together. CNET says “saying ‘Alexa turn on the lights, play the Evening Chill playlist on Spotify, and turn the temperature up,’ won’t work as intended.” On the other hand, you can string two commands together for Google Home, as in “what is the temperature outside and who was the first president of the United States.”

Does this mean Google’s device is superior to Amazon’s? Not at all; it just means one hasn’t caught up to the other in that respect. It’s a constant race, and there is no clear finish line!

There’s an amazing amount of tech packed into these tiny devices, and the “wow factor” can easily make one forget that these gizmos are only as good as the language input you provide them. Once you say the trigger word, you have to speak clearly and articulately for the device to make sense of your request. As the old adage goes, “garbage in, garbage out.”

Why did Google Home choose Hey and OK?

To interact with the cylindrical Home or its squat, doughnut-shaped Mini cousin, you have to address it with a command like “Hey Google,” or “OK Google.” A command is defined as “to direct with specific authority or prerogative; order.” You have to tell this little guy what you want! (“Hey Goo Goo” also works, as this Italian grandma discovered . . . just not as well.) Recent stories indicate that soon you’ll be able to create your own custom trigger words for the Home units. “Hey, knucklehead!”

Just how did Google choose OK and Hey for its commands, though? Dictionary.com got the techy backstory from Jason Cipriani, a freelance tech writer for CNET, ZDNet, and others. “(I’m) pretty sure the backstory to ‘OK Google’ comes from (the ill-fated) Google Glass. They were trying to figure out a way to trigger it with a phrase that was unique, but not weird. ‘Hey’ was thrown in more recently (Past six months, or so).”

Does desktop AI help us learn language?

As of this writing, the Home family of devices can translate words or phrases into multiple languages, including Russian, Mandarin Chinese, French, German, Japanese, and others. It’s become a learning tool.

And, it’s a learning tool outside of foreign language learning, as well. Mashable writes about how a young boy with “significant receptive and expressive language deficits” finally spoke his first-ever word: Google. Patrick Crispen says of his 19-month-old son, “Typically, by the time a toddler reaches 18 months of age, he or she has a vocabulary of at least 10 to 15 words. Our son’s vocabulary was literally zero. Hence the concern.” Some cynics thought it might be just a crass PR stunt designed to move merchandise, but Crispen says no. “A child’s first word is the name of a giant, multinational corporation? To some, that’s understandably depressing,” he says. “To me, any tool or activity that demonstrably promotes verbal language development in a language-delayed toddler is, by definition, neither dystopian nor disturbing.”

What are “hot words”?

Basically, OK Google and Hey Google are the “hot words.” According to Jason Cipriani (from above) “When you have a Home/Google Assistant enabled device around, it’s constantly listening but only for the hot words. Once it hears it, it starts transmitting the next phrase or sentence to Google so it can return results.”
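To make that gating idea concrete, here’s a toy sketch in Python of how hot-word filtering might work. This is purely illustrative, based on Cipriani’s description above, not Google’s actual implementation; the `HOT_WORDS` list and `extract_request` function are our own hypothetical names.

```python
from typing import Optional

# Illustrative hot words (an assumption for this sketch, mirroring the
# phrases described above -- not Google's real detection pipeline).
HOT_WORDS = ("ok google", "hey google")

def extract_request(utterance: str) -> Optional[str]:
    """Return the phrase following a hot word, or None if no hot word leads.

    Models the behavior described above: the device 'hears' everything,
    but only speech that follows a hot word gets passed along for results.
    """
    lowered = utterance.lower()
    for hot_word in HOT_WORDS:
        if lowered.startswith(hot_word):
            # Only this trailing phrase would be transmitted for processing.
            return utterance[len(hot_word):].lstrip(" ,")
    return None  # anything else is ignored entirely

print(extract_request("OK Google, what is the temperature outside?"))
print(extract_request("Just chatting in the room"))
```

In this sketch, ordinary room chatter yields `None` and never leaves the device; only the phrase trailing a hot word is handed off, which is the essence of the always-listening-but-rarely-acting design.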

Amazon calls them “wake words,” and they work the same way, except you can select from four different options: Alexa, Amazon, Echo, or Computer. (Trekkies will recall Mr. Scott using that last one in Star Trek IV: The Voyage Home.)

What does the future hold for desktop AI?

While these devices are fun and can serve a useful purpose, it’s also fair to consider how much power they will eventually have in our everyday lives. These AI devices are still in their infancy, and their full potential is unclear. Sure, it’s grand to be able to order a pizza with that little thing on your desk, but it’s also worth asking how far we’re going to allow these AI devices to manage our day-to-day lives.

Earlier, we mentioned sentience and the fictional world of “Skynet.” Will these little devices get to the point where they’re learning on their own, rather than running off of reactive programming? Will we be comfortable with that? Are we heading toward an Orwellian 1984 scenario here? It’s unclear at this time, but what is clear is that these devices are here to stay, and they’re making a pretty big impact on our language and our lives.

When George Jetson yelled “Jane, stop this crazy thing,” he was in the future. But, the future is here now, and it’s ready to dial us up a double pepperoni (with extra cheese).
