Artificial intelligence is the latest buzzword floating around the tech industry. (Sorry, blockchain and NFTs, your 15 minutes are up.) And though Apple already leverages machine learning to power up its technologies in plenty of ways, it’s hard to deny that there are some places where the company could still benefit from jumping on this latest bandwagon.
So it’s interesting to hear a report in the New York Times that Apple engineers are actively looking into language-generating AI, similar to the systems that underlie chatbots like ChatGPT, for a number of applications, as well as a follow-up report from 9to5Mac confirming that the tech, code-named “Bobcat,” is already being tested in the tvOS 16.4 beta.
How could this technology be used in Apple’s products? Well, as it happens, I can think of a few ways that it might be deployed, not all of which involve simply creating a chatbot.
Help me if you can

Computers have by and large gotten easier to use over the past several decades, but that doesn’t mean they aren’t still mystifying at times. Anyone who’s ever tried to troubleshoot a loved one’s haywire device and ended up wanting to break it into a million pieces can probably vouch for that fact.
Apple has long tried to give people ways to find the solutions to their own problems, whether it’s via the company’s online knowledge base or tools like macOS’s universal Help menu. But those systems can be tricky to navigate in their own right, and they don’t always have the most up-to-date information.
That’s one place an AI chatbot could potentially help users. If you could simply type a query about some functionality into a search box and be shown precisely the steps you need to take, that would go a long way toward delivering on the promise of easy-to-use technology. And it’s hardly out of reach: I already have one friend who regularly uses ChatGPT to help him figure out the right configuration details for arcane command-line tools.
Search and ye shall find
Apple’s never attempted to compete in the internet search engine market with the likes of Microsoft or Google, but that doesn’t mean search as a concept isn’t important to the company. Whether it’s Spotlight or Siri, people use Apple’s search to look for all sorts of information, both on their computers and on the web.
But the more data there is, the harder it’s gotten to sift through all the noise to find what someone is actually looking for, as Google and Microsoft are learning. Apple’s done a reasonable job with Spotlight, but it certainly feels as though there are cases where being able to interact with it via a chat interface might be more useful. Imagine simply asking it to “show me all the spreadsheets I’ve edited in the last month.”
Likewise, we’ve all dealt with the dreaded attempt to search for information via Siri on our Apple Watch or HomePod, only to be told that relevant results have been sent to our phone, or simply to be given a list of websites that may or may not answer the question. What if, instead, an answer could be provided, with more detail sent to your phone? To be fair, Siri has gotten better at this, but the addition of an AI-based chat interface could allow for more power and flexibility than Apple has provided so far.
Time to get Siri-ous
And so we come to the elephant in the room: yes, Siri. It’s already the closest thing Apple has to a chatbot, but anybody who’s spent much time with it has quickly realized that the illusion of intelligence doesn’t extend very far. It mostly feels like you’re dealing with canned, randomized responses that don’t offer a huge improvement over the voice command interface Apple was shipping back in the Classic Mac OS days.
This also means that Siri is ripe for a leap in capability. Per the 9to5Mac report, the aforementioned language-generation model is already being deployed in the upcoming tvOS 16.4 update, albeit in a limited fashion, both in terms of what aspects it will affect and the fact that it’s being used to enhance the existing virtual assistant rather than replace it outright.
While chatbots have their limitations, it’s clear that they’re a step up from conventional voice interfaces, potentially at last living up to the promise of a truly virtual assistant that can handle complex concepts. That would be a far cry from today, when I have to tell my HomePod mini in very precise terms to turn on my living room lights or risk having lights turned on (or off) somewhere else in my house.
Siri was an impressive piece of tech at its debut, but over the past decade it has clearly stagnated, improving only in meager fashion. I’ve been arguing for a Siri 2.0 for more years than I can remember, and that future, one where our virtual assistant adapts to us instead of the other way around, may finally be within reach.