
Apple Intelligence gets even more powerful with new capabilities across Apple devices
Apple today announced new Apple Intelligence features that
elevate the user experience across iPhone, iPad, Mac, Apple Watch, and Apple
Vision Pro. Apple Intelligence unlocks new ways for users to communicate with
features like Live Translation; do more with what’s on their screen with
updates to visual intelligence; and express themselves with enhancements to
Image Playground and Genmoji.1 Additionally, Shortcuts can now
tap into Apple Intelligence directly, and developers will be able to access the
on-device large language model at the core of Apple Intelligence, giving them
direct access to intelligence that is powerful, fast, built with privacy, and
available even when users are offline. These Apple Intelligence features are
available for testing starting today and will roll out to users with supported
devices set to a supported language this fall.
“Last year, we took the first steps on a journey to bring
users intelligence that’s helpful, relevant, easy to use, and right where users
need it, all while protecting their privacy. Now, the models that power Apple
Intelligence are becoming more capable and efficient, and we’re integrating
features in even more places across each of our operating systems,” said Craig
Federighi, Apple’s senior vice president of Software Engineering. “We’re also
taking the huge step of giving developers direct access to the on-device
foundation model powering Apple Intelligence, allowing them to tap into
intelligence that is powerful, fast, built with privacy, and available even
when users are offline. We think this will ignite a whole new wave of
intelligent experiences in the apps users rely on every day. We can’t wait to
see what developers create.”
Apple Intelligence features will be coming to eight more
languages by the end of the year: Danish, Dutch, Norwegian, Portuguese
(Portugal), Swedish, Turkish, Chinese (traditional), and Vietnamese.
Live Translation Breaks Down Language Barriers
For those moments when a language barrier gets in the way,
Live Translation can help users communicate across languages when messaging or
speaking. The experience is integrated into Messages, FaceTime, and Phone, and
enabled by Apple-built models that run entirely on device, so users’ personal
conversations stay personal.
In Messages, Live Translation can automatically translate
messages. If a user is making plans with new friends while traveling abroad,
their messages can be translated as they type and delivered in the recipient’s
preferred language; when a response arrives, each incoming message can be
instantly translated.2 On FaceTime calls, a user can follow along with
translated live captions while still hearing the speaker’s voice. And when on a
phone call, the translation is spoken aloud throughout the conversation.3
New Ways to Explore Creativity with Updates to Genmoji
and Image Playground
Genmoji and Image Playground provide users with even more
ways to express themselves. In addition to turning a text description into a
Genmoji, users can now mix together emoji and combine them with descriptions to
create something new. When users make images inspired by family and friends
using Genmoji and Image Playground, they can change expressions or adjust
personal attributes, like hairstyle, to match their friend’s latest look.
In Image Playground, users can tap into brand-new styles
with ChatGPT, like an oil painting style or vector art. For moments when users
have a specific idea in mind, they can tap Any Style and describe what they
want. Image Playground sends a user’s description or photo to ChatGPT and
creates a unique image. Users are always in control, and nothing is shared with
ChatGPT without their permission.
Visual Intelligence Helps Users Search and Take Action
Building on Apple Intelligence, visual intelligence extends
to a user’s iPhone screen so they can search and take action on anything
they’re viewing across their apps.
Visual intelligence already helps users learn about objects
and places around them using their iPhone camera, and it now enables users to
do more, faster, with the content on their iPhone screen. Users can ask ChatGPT
questions about what they’re looking at on their screen to learn more, as well
as search Google, Etsy, or other supported apps to find similar images and
products. If there’s an object a user is especially interested in, like a lamp,
they can highlight it to search for that specific item or similar objects
online.
Visual intelligence also recognizes when a user is looking
at an event and suggests adding it to their calendar.4 Apple
Intelligence then extracts the date, time, and location to prepopulate these
key details into an event.
Users can access visual intelligence for what’s on their
screen by simply pressing the same buttons used to take a screenshot. Users
will have the choice to save or share their screenshot, or explore more with
visual intelligence.
Apple Intelligence Expands to Fitness on Apple Watch
Workout Buddy is a first-of-its-kind workout experience on
Apple Watch with Apple Intelligence that incorporates a user’s workout data and
fitness history to generate personalized, motivational insights during their
session.5
To offer meaningful inspiration in real time, Workout Buddy
analyzes data from a user’s current workout along with their fitness history,
based on data like heart rate, pace, distance, Activity rings, personal fitness
milestones, and more. A new text-to-speech model then translates insights into
a dynamic generative voice built using voice data from Fitness+ trainers, so it
has the right energy, style, and tone for a workout. Workout Buddy processes
this data privately and securely with Apple Intelligence.
Workout Buddy will be available on Apple Watch with
Bluetooth headphones, and requires an Apple Intelligence-supported iPhone
nearby. It will be available starting in English, across some of the most
popular workout types: Outdoor and Indoor Run, Outdoor and Indoor Walk, Outdoor
Cycle, HIIT, and Functional and Traditional Strength Training.
Apple Intelligence On-Device Model Now Available to
Developers
Apple is opening up access for any app to tap directly into
the on-device foundation model at the core of Apple Intelligence.
With the Foundation Models framework, app developers will be
able to build on Apple Intelligence to bring users new experiences that are
intelligent and available offline, and that protect their privacy, using AI
inference that is free of cost. For example, an education app can use the
on-device model to generate a personalized quiz from a user’s notes, without
any cloud API costs, or an outdoors app can add natural language search
capabilities that work even when the user is offline.
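To make the quiz example concrete, here is a minimal Swift sketch of what such a call could look like with the Foundation Models framework; the function name, prompt, and notes content are illustrative, and exact API details may differ from the shipping framework.

```swift
import FoundationModels

// Sketch: generate a quiz question from a user's notes, entirely on
// device, with no cloud API key or network access required.
func quizQuestion(from notes: String) async throws -> String {
    // A session wraps access to the on-device foundation model.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Write one short quiz question based on these notes: \(notes)"
    )
    return response.content
}
```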
The framework has native support for Swift, so app
developers can easily access the Apple Intelligence model with as few as three
lines of code. Guided generation, tool calling, and more are all built into the
framework, making it easier than ever to implement generative capabilities
right into a developer’s existing app.
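As a hedged sketch of guided generation, a developer could describe the output they want as a Swift type and let the framework constrain the model’s response to match it; the QuizItem type and its field descriptions below are hypothetical examples, not part of the framework.

```swift
import FoundationModels

// Guided generation sketch: the model's output is constrained to a
// Swift type, so the app receives structured data rather than raw text.
@Generable
struct QuizItem {
    @Guide(description: "One short quiz question drawn from the notes")
    var question: String

    @Guide(description: "The correct answer, in a single sentence")
    var answer: String
}

func makeQuizItem(from notes: String) async throws -> QuizItem {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Create a quiz item from these notes: \(notes)",
        generating: QuizItem.self
    )
    return response.content
}
```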
Shortcuts Get More Intelligent
Shortcuts are now more powerful and intelligent than ever.
Users can tap into intelligent actions, a whole new set of shortcuts enabled by
Apple Intelligence. Users will see dedicated actions for features like
summarizing text with Writing Tools or creating images with Image Playground.
Now users will be able to tap directly into Apple
Intelligence models, either on device or with Private Cloud Compute, to
generate responses that feed into the rest of their shortcut while maintaining
the privacy of the information it uses. For example, a student can build a
shortcut that uses the Apple Intelligence model to compare an audio
transcription of a class lecture to the notes they took, and add any key points
they may have missed. Users can also choose to tap into ChatGPT to provide
responses that feed into their shortcut.
Additional New Features
Apple Intelligence is even more deeply integrated into the
apps and experiences that users rely on every day:
- The
most relevant actions in an email, website, note, or other content can now
be identified and automatically categorized in Reminders.
- Apple
Wallet can now identify and summarize order tracking details from
emails sent from merchants or delivery carriers. This works across all of
a user’s orders, giving them the ability to see their full order details,
progress notifications, and more, all in one place.
- Users
can create a poll for anything in Messages, and with Apple
Intelligence, Messages can detect when a poll might come in handy and
suggest one. In addition, Backgrounds in the Messages app lets a user
personalize their chats with stunning designs, and they can create unique
backgrounds that fit their conversation with Image Playground.
These features build on a wide range of Apple Intelligence
capabilities that are already available to users:
- Writing
Tools can help users rewrite, proofread, and summarize the text
they have written. And with Describe Your Change, users can describe a
specific change they want to apply to their text, like making a dinner
party invite read like a poem.
- Clean
Up in Photos allows users to remove distracting elements while
staying true to the moment as they intended to capture it.
- Visual
intelligence builds on Apple Intelligence and helps users learn
about objects and places around them instantly.
- Genmoji allow
users to create their own emoji by typing a description. And just like
emoji, they can be added inline to messages, or shared as a sticker or
reaction in a Tapback.
- Image
Playground gives users a way to create playful images in moments,
with concepts like themes, costumes, accessories, and places. They can also
add their own text descriptions and create images in the likeness of a
family member or friend using photos from their photo library.
- Image
Wand can transform a rough sketch into a polished image that
complements a user’s notes.
- Mail
summaries give users a way to view key details for an email or
long thread by simply tapping or clicking Summarize.
- Smart
Reply provides users with suggestions for a quick response in
Mail and Messages.
- Siri
is more natural and helpful, with the option to type to Siri and
tap into its product knowledge about the features and
settings on Apple products; Siri can also follow along if a user stumbles
over their words, and maintain context from one request to the next.
- Access
to ChatGPT is integrated in Writing Tools and Siri,
giving users the option to tap into ChatGPT’s image- and
document-understanding capabilities without needing to jump between tools.
- Natural
language search in Photos makes it easier for users to find a
photo or video by simply describing it.
- Users
can create a memory movie in Photos by typing a
description.
- Summaries
of audio transcriptions in Notes are automatically generated to
surface important information at a glance.
- Users
can generate summaries of call transcriptions to
highlight important details.
- Priority
Messages, a section at the top of the inbox in Mail, shows the most
urgent emails, like a same-day invitation to lunch or a boarding pass.
- Priority
Notifications appear at the top of a user’s notifications,
highlighting important notifications that may require immediate attention.
- Notification
summaries give users a way to scan long or stacked notifications
and provide key details right on the Lock Screen.
- Previews
in Mail and Messages show users a brief summary of key
information without needing to open a message.
- The Reduce
Interruptions Focus surfaces only the notifications that might
need immediate attention.
A Breakthrough for Privacy in AI
Designed to protect users’ privacy at every step, Apple
Intelligence uses on-device processing, meaning that many of the models that
power it run entirely on device. For requests that require access to larger
models, Private Cloud Compute extends the privacy and security of iPhone into
the cloud to unlock even more intelligence, so a user’s data is never stored or
shared with Apple; it is used only to fulfill their request. Independent
experts can inspect the code that runs on Apple silicon servers to continuously
verify this privacy promise, and are already doing so. This is an extraordinary
step forward for privacy in AI.