Apple Intelligence is here, but there is still a lot to learn

Apple Intelligence has finally arrived, and like most smartphone AI so far, it’s mostly disappointing.

Apple Intelligence’s debut features are all very familiar: there are glowing gradients and twinkling icons that indicate the presence of AI; writing tools that make your emails sound more professional; and an AI eraser in Photos that removes distractions. It’s all there, and it all works well. But none of that even comes close to the time-saving computing platform shift we’ve been promised.

Essentially, there are two Apple Intelligences: the one that’s here now and the one we might see in the future. Even in today’s launch announcement, Apple is busy teasing the features that haven’t launched yet. What’s here today is a handful of tools that broadly share a common theme: helping you eliminate distractions and find the signal in the noise. That’s the theory anyway.

Apple uses AI to summarize groups of notifications so you can catch up on what you missed more quickly. You can summarize long emails and use a new focus mode that filters out unnecessary distractions. In practice these things work, but after a week of use I don’t feel like I’ve saved much time or energy.

In the Mail app, AI summaries replace the first line of each message in your inbox view; there's also an option to summarize individual emails. Maybe it reflects how useless email has become, but I didn't find any of these features very useful. You know what already summarizes an email pretty well? The subject line. At least that's true for most of the emails I receive; they're usually short and to the point. Maybe Tim Cook saves himself a lot of time reading long emails, but personally I could live without a little summary of every email the DNC sends me asking for three dollars before midnight.

Notification summaries seem a bit more promising to me. It's at least pretty funny to watch AI try to summarize a string of gossip texts or a pile of notifications from your doorbell. But one summary also surfaced important information from a series of text messages from a friend, and if I hadn't seen it when I checked my phone, I might not have read those messages until much later. That was genuinely helpful.

In Photos, you'll find the new Clean Up tool among the editing options. It's designed to quickly remove objects from a scene: you can tap something the tool automatically highlights, or sketch over something you want removed yourself. It runs on-device, so after a few moments you'll see the selected object (mostly) disappear.

I erased a table behind this kid using Google’s older Magic Eraser tool in Google Photos.

Apple’s Clean Up does a better job of removing the table, but isn’t exactly miles ahead of Google’s tool.

The tool does a good enough job, especially on smaller objects in the background. But it's only about as good as Google's older Magic Eraser tool in Google Photos: occasionally it's better, but it isn't as good as Google's Magic Editor, which uses generative AI for remarkably convincing object removal. That tool runs in the cloud, so it's a bit of an apples-to-oranges comparison, but still: I can use Magic Eraser on my four-year-old iPhone 12 Mini, and the results are pretty close to what I get with Clean Up on the iPhone 16. That's not a great argument for the AI phone upgrade cycle.

There is, of course, also an improved Siri. Sure, it looks different, and typed queries are a handy addition, but you don't have to use it for long to realize it's basically the same old Siri with a new coat of paint. It handles natural language better and has deeper product knowledge to help you find settings on your iPhone, but that's about it. Apple has promised major updates to Siri, with features like a ChatGPT extension expected by the end of the year. But the big things, like contextual awareness and the ability to take action in apps, are all planned for 2025.

Other features, like AI-generated photo memories and smart replies, do what they're supposed to do but lack a certain human touch. I didn't send any of the AI-suggested replies in my messages, even when they conveyed the right sentiment. If I'm going to take the time to respond to a text, I might as well write "That's hard" myself instead of letting AI do it, you know? Isn't that the point of texting someone? I also asked Photos to create a memory of moments with my child, which it did, but it gave the result the eerily impersonal title "Joyous Moments with Child."

Apple is catching up a bit.

To be clear, criticizing Apple Intelligence is not an endorsement of other phones' AI; they're all useless to varying degrees at the moment. If you want to make it look like a helicopter crashed into an empty meadow, there's definitely AI for that. But if you want help getting things done? That's not quite here yet.

And honestly, this is v1, and Apple has made it pretty clear that its more impressive Intelligence features are coming over the next year. But Apple has also put a big, bright "Built for Apple Intelligence" bow on every new iPhone, iPad, and Mac it sells, suggesting we'd regret buying an Apple device that can't handle AI. If Apple Intelligence feels like a letdown right now, it's because Apple built it up to impossible heights.

There’s more to come, and some of it looks promising. With this first wave of AI features, Apple is playing catch-up to Google and Samsung. But no phone maker has yet developed a coherent set of time-saving AI tools. Apple may be late, but the game has only just begun.