This will be the first year Google launches new Pixel phones before Apple’s next-generation iPhones hit stores. The Pixel 9 event is scheduled for August 13th, a bold move for the company. It signals that Google believes the Pixel 9 lineup is strong enough to make an impression on potential buyers more than a month before the iPhone 16 hits stores.
As with any of Google’s announcements since ChatGPT arrived on the scene, you should expect the Pixel 9 event to feature new AI experiences that Google has tailored for its handsets. Rumors going back to last December teased an AI assistant coming to the next-gen Pixel 9, bringing Google’s Gemini AI to the phones.
We already learned at I/O 2024 that a multimodal Gemini Assistant similar to ChatGPT (GPT-4o) would be available on Android later this year. At the time, I assumed this could be an exclusive Pixel 9 feature, at least initially.
Fast-forward to early July, and some of Google’s purported new AI features for the Pixel 9 might have leaked, but something seems to be missing. They don’t sound as impressive as the Apple Intelligence experience that Apple introduced about a month ago.
There’s also some potentially bad news: one of the new AI tricks coming to Pixel 9 phones might resemble Microsoft’s controversial Windows 11 Recall feature, which has since been recalled. However, there are key differences between the two, and Google is taking an approach that seems much smarter.
Google AI is coming
A source inside Google reportedly revealed the Pixel 9’s AI features to Android Authority’s Kamila Wojciechowska. Google is apparently going with “Google AI” branding for the experience, which is in line with its rivals: Samsung debuted Galaxy AI earlier this year, and Apple came out with Apple Intelligence, a clever way to spin the “AI” abbreviation into an Apple thing.
According to a leaked screenshot, “Google AI at its best” will include some interesting new experiences. Gemini Assistant and Circle to Search will be part of the package on Pixel 9 phones. Then, there are three new AI features for the upcoming handsets: Add Me, Screenshots, and Studio.
Add Me seems to be a photo feature that lets you “make sure everyone’s included in a group photo.” I certainly hope this isn’t an editing feature that lets you add people to photos even if they weren’t there when the shot was taken. But it sure sounds like that’s what it is. The feature would build on the existing Magic Editor and Best Take AI features available on Pixel 8 devices.
Studio might be related to a Creative Assistant app that leaked recently. At the time, we heard that Google wants to let Android 15 users create custom stickers and potentially emojis just like Apple does in iOS 18. It wouldn’t be surprising to hear that Pixel 9 phones will be the first to get it.
Pixel Screenshots
Finally, Pixel Screenshots is the Google AI feature that seems to resemble Microsoft’s Recall. But there are key differences here. First, the AI only looks at the screenshots you save, while Recall takes screenshots of everything you do.
Pixel Screenshots is the new home for all your screenshots. When you turn on the app’s AI processing, it will save and process helpful details to become a searchable library to help you find anything in your screenshots.
Second, Screenshots will answer questions about the information in those screenshots, which is information you might actually need. It won’t show you what you did at a certain time in the past on your handset:
If you turn on AI processing in Pixel Screenshots, the app will use AI to summarize your new and existing screenshots and answer your questions about the information in them. When you turn on this feature, new screenshots will also save metadata like web links, app names, and when the screenshot was taken, so that you can search these details later.
Finally, unlike Recall, you’ll be able to turn off the feature easily.
Pixel Screenshots data will be processed on-device, so Google won’t see your screenshots. And unlike Recall, it should also be more difficult for hackers to access them.
The real Pixel 9 magic
What’s missing from this picture is the magic. The connections between these features aren’t evident yet, so we don’t have a full sense of what the Google AI experience will be like. But I think the multimodal Gemini Assistant that Google demoed at I/O will be a key part of it all, even though the report doesn’t focus on that.
Google AI will be able to hear what you hear and see what you see, whether that’s what’s on your screen or what’s in your surroundings. Add the Screenshots feature and Gemini access inside Google apps, and Google AI starts to look a lot more like Siri in Apple Intelligence.
I’m speculating here, but if that’s the case, we’ll all understand why Google might have wanted to host this Pixel 9 event before the iPhone 16 models come out. Google might want Google AI to be the first of these next-gen AI assistants on phones and beat Apple Intelligence’s Siri to market.