Five things we hope Apple will announce at WWDC – from the new Siri in iOS 18 to a more advanced iPadOS
We’re just a few days away from Apple’s opening keynote at WWDC 2024, and we expect a lot of focus on software at the event. Everything starts at 10am PT / 1pm ET / 6pm BST on June 10 (3am AEST, June 11), and luckily the special event will be streamed live so you can watch along.
While Ny Breaking will be on site, I surveyed some of my colleagues ahead of the developer conference to get a sense of what they would like Apple to talk about and even reveal. These are some of the most commonly requested items, and they aren’t tied to one specific platform or product.
Read on for five things we hope Apple will announce at WWDC, from a smarter Siri and a more advanced iPadOS to AI-powered photo editing and useful iPhone features, all of which we hope Apple’s Tim Cook will show off on stage.
5. A smarter keyboard for iPhone
With the expected wealth of AI features, including everything from summarizing notes to letting Siri control apps, I hope we’ll see some improvements to the keyboard on the iPhone and – why not – the iPad too.
We already get context-aware word and phrase suggestions at the top of the keyboard. However, bringing in AI that understands the message, what has already been said, or even the steps leading up to when the keyboard appeared could make it even more practical. There are already rumors of suggested replies for Messages and Mail, and building that extra intelligence into the keyboard itself seems to go hand in hand.
Building in system-wide grammar, spelling, and punctuation checks would also help make messages a little cleaner, and could cut down on the times you have to edit something yourself or retype it from scratch.
4. Smarter photo editing in Photos
Between third-party apps like Snapseed and Pixelmator on the iPhone, iPad, and Mac, and competing phones like Google’s Pixel with Magic Eraser and Samsung’s Galaxy AI suite, it’s time for Apple to step up the editing game within Photos.
Whether on the iPhone, iPad, or Mac, I would like the ability to intelligently remove a person or object from the background of an image. Some kind of super button that goes beyond just adjusting white balance or contrast and also brings in smart crops and other intelligent photo tools could make the whole process a lot simpler, and would be a way for Apple to show off its generative AI chops.
3. Smarter battery management on iPhone
If you’ve got a lot going on and your iPhone’s battery is running low, chances are you’ve dug through Settings for Low Power Mode. But just as Focus modes (essentially customized versions of Do Not Disturb for whatever you’re doing) can be enabled automatically, Apple should add some intelligence to battery management.
So if you’re far from home and the battery is at 50% or lower, why not automatically turn on Low Power Mode (or at least prompt you to turn it on) and get your iPhone through the rest of the day?
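For the curious, here’s a rough Swift sketch of the kind of rule I have in mind. It only reads the battery signals iOS already exposes; the isFarFromHome check is a hypothetical placeholder for a location check, and in practice only Apple could flip the Low Power Mode switch system-wide.

```swift
import UIKit

// Sketch of the suggested rule: away from home, at or below 50% charge,
// and Low Power Mode not already on is the moment to prompt the user.
// isFarFromHome stands in for a location check done elsewhere.
func shouldSuggestLowPowerMode(isFarFromHome: Bool) -> Bool {
    UIDevice.current.isBatteryMonitoringEnabled = true
    let level = UIDevice.current.batteryLevel            // 0.0–1.0, or -1 if unknown
    let alreadyOn = ProcessInfo.processInfo.isLowPowerModeEnabled
    return isFarFromHome && level >= 0 && level <= 0.5 && !alreadyOn
}
```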
2. A more advanced iPadOS
We’ve heard a lot of reports and rumors about iOS 18: summaries of notes, emails, and web pages, a Siri that can control apps, and a custom emoji maker. Chances are that many of these will appear on the iPad thanks to iPadOS 18, but since we’re fresh off the release of the iPad Pro with the M4 chip and the new iPad Air, power users of Apple’s tablets will still want more features – myself included.
One idea is a further extension of Stage Manager, the advanced multitasking experience that lets you place and use multiple app windows on the same screen. It’s also available on macOS, but when paired with the Magic Keyboard on iPad, it takes on more of a laptop-like feel. It would be nice if using the iPad in the Magic Keyboard unlocked a more desktop-like mode, with more freedom to place open apps and even icons or widgets on the home screen. The key here, however, is to stay touch-first, while letting users expand the controls with a trackpad, keyboard, or Apple Pencil.
1. A smarter, more intuitive Siri
Like everyone else, I want Siri to do more and be more helpful on every device where I have access to the virtual assistant. The idea of a Siri that can control specific functions of an app, and even stack those actions, is great and could be really useful. I wouldn’t have to dig through multiple apps to copy a photo, put it in Files, and then share it with a colleague… it could just be a single voice request. This would also be an opportunity for Apple to expand its in-house large language model technology, since it would let Siri understand what’s on the screen, what came before, and what comes next; it’s all about being context-aware.
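That kind of app control is the sort of thing developers already expose through Apple’s App Intents framework, which Siri and Shortcuts can call into today. Here’s a minimal, illustrative Swift sketch; SharePhotoIntent and its parameter are made up for the example, not a real Apple API.

```swift
import AppIntents

// Illustrative intent: the kind of action a smarter Siri could chain together,
// e.g. "share my latest photo with Alex". The intent name and parameter are
// hypothetical; a real app would implement the actual fetch-and-share work.
struct SharePhotoIntent: AppIntent {
    static var title: LocalizedStringResource = "Share Latest Photo"

    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Real code would grab the photo, save it to Files, and present a
        // share sheet; this stub just returns a confirmation Siri could speak.
        return .result(dialog: "Shared your latest photo with \(recipient).")
    }
}
```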
The answer seems to be giving Siri the equivalent of a new brain – or rather, integrating what makes AI chatbots so interesting, an entirely new large language model, to reimagine Apple’s virtual assistant. That model could be built in-house, but rumors also point to Apple working with OpenAI, which could inject a bit of ChatGPT into Siri – though it’s all speculation for now. It also raises the question of on-device processing versus sending the request to the cloud; the latter can be a privacy issue and can also increase the time it takes to get a response.
Ultimately, I hope Siri will be immediately helpful with requests or questions, but also smarter around the home, on Apple TV, on the wrist with an Apple Watch, and even on the Mac.