Exploring Apple Intelligence: Features, Models, and Partnerships

Okay, I’m gonna take a few videos to talk about Apple Intelligence here. This video is gonna be about how it’s generally laid out, how it’s gonna work, and how that differs from other companies. Other videos are going to cover some of the features it’s going to enable. But before we get into that, let’s just talk about the name, which is Apple Intelligence. There’s been a little bit of criticism about this, partially just because they’re utilizing the fact that Apple starts with an A, so they can say AI, Apple Intelligence, as opposed to artificial intelligence. But in case it’s not clear, because apparently it’s not for some people, they don’t intend for you to walk around in day-to-day life saying, oh, hello, Apple Intelligence. Instead, it’s the same exact thing they do with, say, Continuity. Apple Intelligence is just a name to describe a group of features that fall under a similar umbrella. Just like you don’t go around day-to-day life saying, oh yeah, just use Continuity for it, and instead refer to individual features like Continuity Camera or Universal Clipboard or Universal Control, you aren’t expected to go around using Apple Intelligence as part of your day-to-day life. You’re gonna use individual features like Siri and the other ones we’re gonna get to in the next videos. Despite me saying that, I’m gonna be using the term Apple Intelligence a lot here, because I am referring to that umbrella of features.

Okay, so let’s talk about the three-tiered approach, if you will. Apple’s own models are contained within the first two tiers. The first and lowest tier is the models that run locally on your device. Apple has put a lot of emphasis on processing data locally rather than in the cloud, because it’s much more secure that way, and often even they can’t get access to it if it’s end-to-end encrypted. The same thing applies to Apple Intelligence: they want to do basically as much as they can locally on your device. The next tier up, though, is for the things that simply cannot be done locally on your device currently, or at least not with a good user experience. Those sorts of tasks are delegated to the cloud, in the form of what they’re calling Private Cloud Compute. Basically, this is their own special sauce, their own special way of doing their servers, intentionally designed to get as close as possible to the security benefits of doing it locally on the device, while still getting the performance benefits of doing it in the cloud rather than on your small mobile device.
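Just to make that layout a bit more concrete, here’s a tiny Swift sketch of the two tiers Apple’s own models live in. To be clear, this type is entirely made up for illustration; Apple hasn’t published an API that looks like this.

```swift
// Purely illustrative: the two tiers that Apple's own models occupy,
// as described above. This is not a real Apple API, just a sketch.

enum AppleModelTier {
    /// Lowest tier: Apple's models running entirely on your device.
    /// Preferred wherever possible, since the data never has to leave the device.
    case onDevice

    /// Apple's own servers ("Private Cloud Compute"), used only for requests the
    /// on-device models can't handle with a good user experience, and designed to
    /// get as close as possible to the security of staying on device.
    case privateCloudCompute
}
```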

Now, before we get into that third tier, let’s talk a little bit about what Apple’s AI models are specialized in, because that plays into the third tier. What you might be used to when you think of AI chatbots right now is the ability to log into a website, enter a query, and have it basically search the internet for you and then summarize the results it found pertaining to your query. This is not what Apple’s AI is designed to do. However, what it is designed to do is actually very similar. It takes in a query, goes into a database and pulls in the required information live on the spot, and then summarizes that information in a way you can act on and gives it back to you. However, instead of pulling information from the internet, it pulls from all of the local information on your device: think your emails, calendar events, reminders, messages, all of that stuff. Much like the other models when they search the internet, you give it a query, it figures out what it needs, and then it goes and gets just what it needs. But instead of getting it from a website after a search query, it says, oh, I need a message from a week ago, and it pulls in just the items it needs to actually fulfill that request.
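As a rough way to picture that retrieve-then-summarize flow, here’s a toy Swift sketch. The store, the keyword matching, and the “summary” are all stand-ins I’ve invented; the real system presumably uses a semantic index and an actual language model, and none of this reflects a published Apple API.

```swift
import Foundation

// Toy sketch of "pull only what the query needs from local data, then summarize it."
// Entirely invented for illustration; not how Apple's pipeline is actually implemented.

struct PersonalContextItem {
    let source: String   // e.g. "Messages", "Mail", "Calendar"
    let text: String
}

struct PersonalContextStore {
    let items: [PersonalContextItem]

    // Naive keyword lookup standing in for whatever retrieval Apple actually does.
    func retrieve(matching keywords: [String]) -> [PersonalContextItem] {
        items.filter { item in
            keywords.contains { item.text.localizedCaseInsensitiveContains($0) }
        }
    }
}

func answer(_ query: String, using store: PersonalContextStore) -> String {
    // 1. Work out what the query needs (here, crudely, its longer words).
    let keywords = query.split(separator: " ").map(String.init).filter { $0.count > 3 }
    // 2. Pull only the matching items from the device, not the whole corpus.
    let relevant = store.retrieve(matching: keywords)
    // 3. "Summarize" them (a real model would generate prose here).
    return relevant.map { "\($0.source): \($0.text)" }.joined(separator: "\n")
}

let store = PersonalContextStore(items: [
    PersonalContextItem(source: "Messages", text: "Sam: lunch Friday at noon at Cafe Rio"),
    PersonalContextItem(source: "Calendar", text: "Dentist appointment next Tuesday")
])

print(answer("Where is lunch with Sam", using: store))
// Messages: Sam: lunch Friday at noon at Cafe Rio
```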

This means a few things. The first is that it is supposed to be really good at pulling up information from your day-to-day life. Apple gave an example where they asked, hey, when does my mother’s flight land? It was able to pluck out who your mother is, previous communications with your mother that mentioned a flight, and the flight number from those communications. Then it actually pulled the tracking information for that flight and presented it to you in a way where you can simply view the flight’s track. From there you could follow up, as they did in the presentation, and ask, what are our dinner plans? It understood that this was still referring to the mother, and to after the flight comes in, so it goes and finds where you were talking about dinner plans with your mother and actually pulls in what you two decided your dinner plans were. That’s all cool.

The second thing it means, though, is that it’s really not designed to go out onto the internet like those other models are. This means you might be a bit more limited in what you can ask it, which we’ll get to in a second. Say you ask Siri for today’s news. Rather than going and finding an article or articles online and pulling news headlines directly from the internet, it might have to look locally on your phone and find what news headlines pop up in the News app, for example. It remains to be seen how this actually translates into day-to-day use, but it could be a potential limitation.

The third thing this all means, and a potential benefit, is about hallucination. A lot of the hallucination that you see, not all of it, but a lot of it, comes from conflicting information that the models are pulling from the internet. Your personal context on your phone can be seen as much closer to a set of gold standards: everything in there is the ground truth, right? You have a calendar event, and you’ve only got that one calendar event; it’s not gonna pull in conflicting information. This means the whole hallucination problem should hopefully be much less of an issue.

Now let’s talk about that third tier I’ve been alluding to. This is what you might have heard about previously: the quote-unquote “partnership” with OpenAI and ChatGPT. This third tier is an acknowledgment that Siri, with these models, can’t do everything people might expect it to do given what they’ve been doing with existing models like Gemini, ChatGPT, etc. All it is is a way for third parties to integrate into iOS, macOS, etc., and allow you to summon whatever model you choose to ask queries that Siri simply can’t really handle. It is not technically exclusive to ChatGPT; that just happens to be the first model Apple was able to secure as one of the options. They’ve already been openly saying, hey, look, we’re looking to get Gemini in there and also other models further down the road. So you will have the option to choose which model you want to use, much like you can now change the default search engine in your web browser. To maintain the privacy angle, you are, of course, going to be asked any time it wants to do this whether it’s okay to send whatever information the Apple model collected to these services to fulfill your request. In fact, Apple has also said that these integrations aren’t going to be enabled by default.
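Here’s a hedged sketch of what that per-request consent gate might look like in principle. The protocol, the ChatGPT stub, and the prompt mechanism are all invented for this example; the real integration details aren’t public, and this is not Apple’s or OpenAI’s API.

```swift
// Illustrative only: nothing leaves the device unless the user explicitly says yes.
// Assumes a context that allows top-level await (e.g. a main.swift file).

protocol ExternalModelProvider {
    var name: String { get }
    func complete(_ prompt: String) async -> String
}

struct ChatGPTProvider: ExternalModelProvider {
    let name = "ChatGPT"
    func complete(_ prompt: String) async -> String {
        // A real integration would call the provider's service here.
        "(response from \(name) to: \(prompt))"
    }
}

func delegateIfAllowed(
    _ prompt: String,
    to provider: ExternalModelProvider,
    askUser: (String) -> Bool
) async -> String? {
    // Per-request consent: the query is sent only if the user agrees.
    guard askUser("Send this request to \(provider.name)?") else { return nil }
    return await provider.complete(prompt)
}

// Example: the user declines, so nothing is sent at all.
let reply = await delegateIfAllowed("What's a good pasta recipe?", to: ChatGPTProvider()) { question in
    print(question)   // "Send this request to ChatGPT?"
    return false
}
print(reply ?? "Not sent; the request stayed on the device.")
```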

Once you actually ask a query that requires a model like this, it’ll ask you if you want to turn on the ChatGPT integration, or maybe later down the road it’ll ask you which model you want to use. You have the option there to say, yes, I want to start doing this, or no, actually I don’t. It’s not exactly clear what happens if you say no, whether it just starts saying, sorry, I can’t do that, or something else happens. Now, how much are you expected to actually be using each of these tiers?

Looking at this from where we are with these AI models today, you might assume that almost all of your requests are actually going to be handled by ChatGPT, because what we’re used to doing with these models is the stuff they’re better at. However, what Apple thinks is that in reality you’re going to be doing what their models are actually designed to do, which is the stuff that’s going to be handled primarily by the models running locally on your device, with a relatively small subset of those things needing to be delegated to their Private Cloud Compute.

Now, I had a conversation with an Apple spokesperson in which I asked whether certain types of tasks will always be delegated to Private Cloud Compute, or whether it’s based more on the intensity, if you will, of an individual task regardless of its type. They pointed out that whether a task is sent to Private Cloud Compute versus done locally on device doesn’t depend on the type of task it is so much as on how intensive that specific task is. So you could ask a really complicated question about stuff that’s locally on your device, and it might need to rely on the cloud to do that. Or you could ask it to do something that might seem intensive, but if your specific query is actually easy enough, it’ll say, actually, you know what, this time I’m gonna do it locally on the device rather than relying on the cloud. There’s a rough sketch of that routing idea below. Okay, I’m running out of time here. In the next video, let’s talk about some of the features that this is all actually going to enable.
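Here’s that sketch: the routing decision keys off how heavy the specific request is, not what category it falls into. The types, the cost estimate, and the threshold are all invented for illustration; this is not how Apple actually implements it.

```swift
// Illustrative only: the same *type* of task can be routed differently
// depending on how intensive the specific request is. Not a real Apple API.

enum Tier { case onDevice, privateCloudCompute }

struct Request {
    let kind: String          // e.g. "summarize" — the type of task
    let estimatedCost: Int    // invented stand-in for how heavy this specific request is
}

func route(_ request: Request, onDeviceBudget: Int = 50) -> Tier {
    // The decision ignores `kind` entirely; only the estimated cost matters.
    request.estimatedCost <= onDeviceBudget ? .onDevice : .privateCloudCompute
}

// Two requests of the same type end up in different tiers:
let shortSummary = Request(kind: "summarize", estimatedCost: 10)   // a two-line email
let longSummary  = Request(kind: "summarize", estimatedCost: 90)   // a very long document
print(route(shortSummary))  // onDevice
print(route(longSummary))   // privateCloudCompute
```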