Apple Intelligence will improve Siri in 2024, but don’t expect most updates until 2025

Screenshot by Nina Raemont

If you’ve been hoping that Siri will someday work the way it should, you may finally get your wish with iOS 18. However, the improvements likely won’t come all at once. In the latest edition of his Power On newsletter for Bloomberg, tech reporter and Apple watcher Mark Gurman discussed which advancements are likely to hit Siri in 2024 and which ones won’t pop up until 2025.

At its WWDC 2024 conference a week ago, Apple touted several of the changes in store for Siri, many of them courtesy of the company’s Apple Intelligence technology. In essence, Apple’s voice assistant should eventually be able to better understand you using natural language, remember the context of your conversations, be aware of your current screen, and perform tasks across different apps.

Also: Siri has become Apple’s Bing. It’s time for an AI-inspired change

Apple didn’t disclose exactly which features would arrive when, but Gurman gives us some clues.

With iOS 18 due to officially launch this fall, Siri will sport certain “bells and whistles” this year, according to Gurman. The interface is one item on the list. The familiar swirling ball at the bottom of the screen will be replaced by a glowing background light that wraps around the edge of your screen.

Next, Siri will be able to better understand you, even if you make a mistake. At WWDC, Apple demoed this aspect by intentionally making and then correcting a mistake in a voice request. Siri picked up the mistake and correction, helping it properly answer the question.


Siri should also be able to carry on a more natural conversation. For example, you wouldn’t have to say “Hey Siri” or “Siri” for each question or request you speak in a back-and-forth conversation. Further, Siri will have a greater understanding of Apple products, presumably so that it can reference and work with your devices.

Finally, the voice assistant will offer a “Type to Siri” option. This should come in handy if you’re in a public place or noisy area and would rather type your request than speak it.

That sounds like a healthy mix of features on tap for this year. If the updates all work as planned, they would go a long way toward making Siri more usable and less frustrating. That still leaves a laundry list of items that likely won’t surface until next year, Gurman said, explaining that “several employees involved in Apple Intelligence’s development have told Power On that a subset of the service won’t launch until next year.”

Also: Everything to know about Apple’s AI features coming to iPhones, Macs, and iPads

One Siri update destined for 2025 is the ability to find and act on items on your device based on their context. As one example demoed by Apple at WWDC, you would be able to ask Siri when your mom’s flight is landing. Siri would serve up the right information based on prior emails and texts you exchanged with your mom. In other examples, you might issue a simple command asking Siri to retrieve a podcast shared by your spouse or a document emailed by a colleague.


Another feature known as semantic indexing would help Siri better grasp the context of any content and personal data on your device. In this instance, you could ask questions or issue requests that deal directly with specific data.

Also: Here’s how Apple’s keeping your cloud-processed AI data safe (and why it matters)

Siri will also one day be able to control your device and apps via several actions in one request. You might ask Siri to show photos of a specific friend wearing a red jacket, then ask it to edit them, and finally tell it to send the photos in an email.

One more item that Gurman expects to pop up in 2025 is on-screen awareness. With this update, Siri would be aware of your current screen and what you’re doing. For example, maybe you’re texting with a friend about LeBron James. You could ask Siri how many points the basketball star scored last night, and it would answer the question based on the information on your screen.

Access to Siri’s new AI-infused features will also be limited at first, according to Gurman. When the Apple Intelligence features roll out in the fall, they’ll arrive in preview mode, a sign that they’re not quite fully baked. The AI skills will work only on certain Apple devices and only in US English. Gurman even suggested that you may have to join a waitlist to tap into the new features.

Also: Why Apple’s best new AI features from WWDC will be boring (and I’m glad)

Why dole out the new Siri and Apple Intelligence skills on such a piecemeal basis? Gurman cited several reasons.


The staged approach gives Apple the time required to work on certain features, release them when they’re finished, and then move on to others. By starting with US English, Apple can take more time to train its AI and Siri on additional languages. Further, Apple will have more time to build out its cloud infrastructure to handle the AI and Siri services.

Speaking as an Apple user, I’m more than willing to wait for the company to gradually roll out these new features. My only hope is that they’re worth the wait and that Siri finally becomes the voice assistant it was meant to be.
