
Apple iPhone 16 Pro and iPhone 16 Pro Max review: smarter iPhones

Summarizing seems to be something everyone wants AI to do, and Apple Intelligence is ready to oblige. You can have it summarize your emails, messages, and even notifications from third-party apps. Some of this can be useful, like when the Mail app surfaces an urgent-sounding email in its summary that I would have missed if I'd just glanced at the massive pile in my inbox. But more often than not, I swipe away the summary and dive into all the notifications anyway.

Speaking of summarizing, there is a summarization feature built into Safari, but you have to put the web page in Reader mode first. It's things like this that make these clever features hard to find and easy to forget. I was at least able to summarize an 11,000-word story and get the gist of it when I didn't have time to sit down and read it. (Sorry.) I'll forgive you if you summarize this review.

Probably the most useful Apple Intelligence features for me, as a journalist who attends multiple briefings per month, are the new transcription tools in Notes, Voice Memos, and even the Phone app. Hit record in Voice Memos or Notes and the apps transcribe conversations in real time. When you're on a phone call, just tap the record button; after both parties are notified, the call is recorded and a transcript is saved to your Notes app.

For all of these, a lot depends on the quality of the other person's microphone. Either way, it's certainly better than no transcription at all. It's a shame there are no speaker labels, like in Google's Recorder app. You also can't search these recordings to find a specific quote. (Technically, you can if you add the transcript to a note in the Notes app, but that's an extra step.)

The Photos app is also getting an infusion of Apple Intelligence, and the highlight here is the Clean Up feature. Much like Google’s Pixel phones, which introduced Magic Eraser over three years ago, you can now remove unwanted objects from the background of your iPhone photos. It works pretty well in my experience, though I’m a little surprised that Apple gives you so much freedom to erase something. I completely erased my eye from existence in a selfie. I erased all of my fingers from my hand. (Google’s feature doesn’t let you erase parts of someone’s face.)

Video: Julian Chokkattu

Next, I erased my mug, which was in front of my face when I took a sip, and Clean Up attempted to generate the part of my face that had been hidden, with horrible results. (For what it's worth, I tried this on the Pixel 9 and the results were just as bad, though Google did give me more options.) As my colleague said on Slack, "They both seem to have been trained on images of Bugs Bunny."

There's more to come in Apple Intelligence. Image Playground will let you generate images. Genmoji will let you create new kinds of emoji that currently exist only in your head. Siri will get better at delivering contextually relevant information. But I'll dive into Apple Intelligence again when those features roll out later this year. As a reminder, Apple Intelligence arrives in the upcoming iOS 18 update, but it's only available on select devices: the iPhone 15 Pro, 15 Pro Max, and the entire iPhone 16 lineup.