Apple Intelligence is a new set of AI features arriving in iOS 18, and it stands to change how people interact with their apps.
Today, regulators keep challenging Apple’s long-standing App Store strategy. At the same time, people can already get a lot done by asking an AI assistant like ChatGPT fairly simple questions. Some believe AI could displace apps as the main way we find answers, get work done, and try out new ideas.
What does that mean for apps and the services they provide, which brought in more than $6 billion for Apple last quarter?
The answer gets to the heart of Apple’s plan for AI.
Right out of the box, Apple Intelligence offers only a handful of features: writing tools, summarization, image creation, and other basics.
That said, at its Worldwide Developers Conference (WWDC) this June, Apple previewed features that will let developers’ apps connect with Siri and Apple Intelligence in deeper ways.
As the assistant improves, Siri will be able to invoke any item from an app’s menus without the developer doing any extra work. That means people could, say, ask Siri to “show me my presenter notes” in a slide deck, and Siri would know what to do. Siri will also be able to see any text displayed on the page, so people can use what’s on their screen to look something up or take an action.
In other words, if you were looking at a note reminding you to wish a family member a happy birthday, you could tell Siri to “FaceTime him,” and it would know what to do.
Compared with what Siri can do today, that’s already an improvement, but it gets better: Apple is also giving developers the tools to bring Apple Intelligence into their own apps. At WWDC, Apple said Apple Intelligence would first come to certain categories of apps, such as books, browsers, cameras, document readers, file managers, journals, mail, photos, presentations, spreadsheets, whiteboards, and word processors. Over time, Apple will likely open these capabilities to all developers on the App Store.
The AI features will be built on top of the App Intents framework, which is gaining more intents for developers to adopt. The result is that users will be able to do more with Siri than just open their apps; they will be able to operate them.
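For a rough sense of what that foundation looks like from the developer’s side, here is a minimal App Intent in Swift. The intent name and its behavior are hypothetical placeholders; the AppIntent protocol, the title requirement, and the perform() method are the framework’s standard surface.

```swift
import AppIntents

// A hypothetical intent exposing one app action to Siri and Shortcuts.
struct ShowPresenterNotesIntent: AppIntent {
    // The human-readable name the system displays for this action.
    static var title: LocalizedStringResource = "Show Presenter Notes"

    // Called by the system when the user asks Siri to perform the action.
    func perform() async throws -> some IntentResult {
        // A real app would navigate to its presenter-notes view here.
        return .result()
    }
}
```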
Users wouldn’t have to look through an app’s options to find the feature they need to get something done. They could just ask Siri.
Users could also make these requests conversationally, as they would in a chat, and reference details personal to them.
For example, you could tell a photo-editing app like Darkroom to “apply a cinematic effect to the picture I took of Ian yesterday.” Today’s Siri would not handle that request, but a Siri powered by Apple Intelligence would know to invoke the app’s Apply Filter intent and would know which picture you mean.
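Here is a sketch of how a request like that could map onto an intent’s parameters, assuming a hypothetical ApplyFilterIntent for a Darkroom-style editor. The parameter names and dialog text are invented; @Parameter, IntentFile, and the dialog result type are standard App Intents API, and Siri would be responsible for resolving values such as which photo the user means.

```swift
import AppIntents

// A hypothetical "Apply Filter" intent. Siri would fill in these parameters
// from conversational context ("the picture I took of Ian yesterday").
struct ApplyFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Filter"

    // The photo the user is referring to, resolved by the system.
    @Parameter(title: "Photo")
    var photo: IntentFile

    // The name of the filter to apply, e.g. "Cinematic".
    @Parameter(title: "Filter")
    var filterName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would load the image data and apply the filter here.
        return .result(dialog: "Applied \(filterName) to your photo.")
    }
}
```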
Apple says Siri will be able to carry out your request even if you stumble over your words or refer back to something from earlier in the conversation.
Actions could also span multiple apps. After editing a photo, for instance, you could ask Siri to move it into another app, like Notes, without touching the screen.
Spotlight, the iPhone’s search feature, will also be able to search data inside apps by adding app entities to its index, alongside Apple Intelligence’s semantic understanding of things like photos, messages, files, calendar events, and more.
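Apps describe the items Siri and Spotlight can reason about as app entities. Below is a minimal sketch of a hypothetical PhotoEntity; the type, its properties, and the query are invented, while the AppEntity and EntityQuery protocols are the existing App Intents API. (Apple also previewed an iOS 18 mechanism for donating such entities to Spotlight’s index, which is not shown here.)

```swift
import AppIntents

// A hypothetical photo entity an app could expose so Siri and Spotlight
// can reference individual items ("the picture I took of Ian yesterday").
struct PhotoEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Photo"
    static var defaultQuery = PhotoEntityQuery()

    var id: UUID
    var caption: String
    var dateTaken: Date

    // How the entity is shown in Siri, Shortcuts, and search results.
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(caption)")
    }
}

// Lets the system look up entities by identifier when resolving a request.
struct PhotoEntityQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [PhotoEntity] {
        // A real app would fetch matching photos from its data store here.
        return []
    }
}
```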
Of course, developers have to buy into this subtler use case for AI. Apple’s revenue-sharing rules, which let the company keep up to 30% of sales of goods and services bought through an app, have alienated some of its larger developers, and even some smaller ones, over the years. But developers may find new motivation if Siri can surface apps otherwise buried in the back of the phone’s App Library through simple voice commands.
Forget tedious onboarding screens that teach users how to navigate an app. Instead, developers could focus on making sure Siri understands how their app works and how users are likely to ask for things. If that happens, Siri users could speak to an app, or type commands into it, much as they do today with an AI chatbot like ChatGPT.
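One existing way to tell Siri how users might phrase requests is an App Shortcuts provider, which pairs an intent with natural-language trigger phrases. A minimal sketch follows, reusing the hypothetical ApplyFilterIntent from the earlier example; the phrases, title, and symbol name are purely illustrative.

```swift
import AppIntents

// Registers spoken or typed phrases that map to the hypothetical ApplyFilterIntent,
// so Siri can surface the action even if the app sits deep in the App Library.
struct DarkroomAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ApplyFilterIntent(),
            phrases: [
                "Apply a filter in \(.applicationName)",
                "Add a cinematic effect with \(.applicationName)"
            ],
            shortTitle: "Apply Filter",
            systemImageName: "camera.filters"
        )
    }
}
```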
Apple’s new AI architecture will help third-party developers in other ways, too.
Through its partnership with OpenAI, Siri will be able to hand questions off to ChatGPT when it doesn’t know the answer itself. And with the iPhone 16 line, Apple is adding a visual search feature: tapping the new Camera Control button on the side lets users query OpenAI’s chatbot or Google Search, turning whatever is in the camera’s viewfinder into a question that can be answered.
Because developers will likely adopt these changes at different rates, the shift won’t feel as immediately revolutionary as ChatGPT’s debut.
These plans also appear to be some way off. In the most recent iOS 18 betas, the features don’t yet fully work. Spending time with the new Siri, I was as often puzzled by what it couldn’t do as impressed by what it could, and that includes Apple’s own apps. In the Photos app, for instance, you can ask Siri to send a photo to someone, but not to do something more involved, like turn the picture into a sticker. Until Siri stops running into these kinds of limitations, the feature could be frustrating to use.