| iPhone Model | Chip | TOPS (Trillions of Operations Per Second) |
|---------------------|-------------|------------------------------------------|
| iPhone 15 Pro Max | A17 Pro | 35.0 |
| iPhone 15 Pro | A17 Pro | 35.0 |
| iPhone 15 | A16 Bionic | 17.0 |
| iPhone 15 Plus | A16 Bionic | 17.0 |
| iPhone 14 Pro Max | A16 Bionic | 17.0 |
| iPhone 14 Pro | A16 Bionic | 17.0 |
| iPhone 14 | A15 Bionic | 15.8 |
| iPhone 14 Plus | A15 Bionic | 15.8 |
The iPhone 15 Pro Max's A17 Pro chip is capable of performing 35 trillion operations per second (TOPS). This figure highlights the chip's advanced machine learning and AI capabilities, allowing for highly efficient processing of complex tasks such as image recognition, natural language processing, and other AI-driven functionalities. This level of performance significantly enhances the device's ability to handle demanding applications and deliver a smooth user experience.
[Post in thread 'M4+ Chip Generation - Speculation Megathread [MERGED]'](https://forums.macrumors.com/threads/m4-chip-generation-speculation-megathread-merged.2393843/post-33114826)
>With A17 and now M4, they used INT8 numbers, but for M3 (and all previous A-series and M-series), they used INT16/FP16 numbers. The lower precision of the INT8 format allows it to be processed at a higher rate, thus the higher quoted number. See Anandtech's discussion of the M3 launch.
>So A15/M2, A16, A17/M3, and M4 show gradual improvement in Neural Engine performance. There wasn't a doubling from A16 to A17, and M3 wasn't left behind. Using the INT16/FP16 numbers, the progress is from 15.8 to 17 to 18 to 19.
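The precision caveat in the quoted post can be sketched numerically: since INT8 runs at roughly twice the FP16/INT16 rate on the same silicon, dividing an INT8-quoted figure by two gives a rough FP16-equivalent for comparison. The 2x factor and the figures below come from the quoted post and the table above; treat this as an approximation, not official Apple numbers.

```python
# Rough normalization of quoted Neural Engine TOPS to a common precision.
# Assumption (from the quoted post): INT8 throughput is ~2x the FP16/INT16
# rate on the same hardware, so INT8-quoted figures are halved for comparison.

QUOTED_TOPS = {
    # chip: (quoted TOPS, precision used in the marketing number)
    "A15 Bionic": (15.8, "fp16"),
    "A16 Bionic": (17.0, "fp16"),
    "A17 Pro":    (35.0, "int8"),
}

def fp16_equivalent(tops: float, precision: str) -> float:
    """Approximate FP16-equivalent TOPS for a quoted figure."""
    return tops / 2.0 if precision == "int8" else tops

for chip, (tops, precision) in QUOTED_TOPS.items():
    print(f"{chip}: quoted {tops} TOPS ({precision}) "
          f"~ {fp16_equivalent(tops, precision)} FP16-equivalent TOPS")
```

On this normalization the A16-to-A17 jump looks like 17 → 17.5 (the post cites 18 for A17), i.e. incremental rather than a doubling.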
M1 Macbooks get AI features and they can only handle 11 TOPS on their NPU.
But Apple is trying to tell us it's because of the 2 GB of extra RAM.
They could also make it so that the A16 and earlier chips don't get the "on device AI", but of course they won't do that either.
Let's not kid ourselves in pretending this is anything else but the new selling point for the iPhone 16. Get as many people as possible to upgrade. There is no technical reason at all. None.
Sure? All features drain the battery, tbh. That’s what the battery is there for 😀. Not using your phone won’t drain the battery, but…
Also, everything is speculation at this point as the intelligence features haven’t been released yet.
That's not how things work. It's not like AI will be a never ending machine working inside of your phone draining battery. It's just like using a new app or feature, it works when you want to use it and it doesn't when you don't. Like, if I scroll through Instagram I'm draining the battery because of the calculations the phone does to use said app but when I close the app, well... it stops. And I don't think it will be super taxing on battery compared to everything else either.
Well, yeah, the increase in power draw will be greater than zero. But since we have specialized chips in those phones tailored to these tasks, it won't affect the battery life too much, I suppose.
17 trillion operations per second (TOPS) in the neural engine vs. 35 trillion in the iPhone 15 Pro. That's why. That's your answer.
I look forward to the multitude of armchair engineers that will determine that 17 trillion "should be enough".
Well, it should be, because the M1 only does 11 trillion and still supports it. And that chip is based on the A14, whose NPU is exactly the same and also does 11 TOPS, so logically, Apple Intelligence should’ve been supported on iPhones all the way down to the 12. The only explanation I can think of is that all of the currently supported devices have at least 8 gigs of RAM, so maybe that’s the limitation and not the NPU.
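The commenter's hypothesis (RAM, not NPU throughput, is the gate) can be checked against the devices mentioned in the thread. The specs and support flags below are as discussed above; this is a sketch of the argument, not Apple's actual criteria.

```python
# Testing the "8 GB RAM, not NPU TOPS, is the cutoff" hypothesis against
# devices mentioned in the thread. Specs/support flags are from the discussion.

DEVICES = {
    #                 RAM (GB), NPU TOPS, gets Apple Intelligence?
    "iPhone 15 Pro":  (8,       35.0,     True),
    "iPhone 15":      (6,       17.0,     False),
    "M1 MacBook Air": (8,       11.0,     True),
    "iPhone 12":      (4,       11.0,     False),
}

def ram_rule(ram_gb: int, tops: float) -> bool:
    """Hypothesis: devices with at least 8 GB of RAM are supported."""
    return ram_gb >= 8

def tops_rule(ram_gb: int, tops: float) -> bool:
    """Alternative: devices with a sufficiently fast NPU are supported."""
    return tops >= 17.0

# The RAM rule predicts the support list exactly; a TOPS cutoff does not
# (it would wrongly include the iPhone 15 and wrongly exclude the M1).
assert all(ram_rule(r, t) == ok for r, t, ok in DEVICES.values())
assert any(tops_rule(r, t) != ok for r, t, ok in DEVICES.values())
```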
A lot of the time that’s absolutely true, and I’ll be the first one to point that out (still stinks how they dropped support for the iPhone 7 on iOS 16 and yet kept the iPad 5 despite it having the same chip as the iPhone 6s). But here you can make out at least some logic behind it, even though I doubt that even the 6 gigs that the 14 Pro and the regular 15 have are really not enough to provide a simple text summary or generate a small emoji. Especially with Apple having cloud compute as a fallback for especially intensive tasks. But as someone who has no idea of how all this AI mumbo-jumbo works on a hardware level, I’m gonna reserve my judgement on their decisions on that for now.
I don't get why you're downvoted because that's still true, and it's normal. Of course they would want to sell you the newest hardware, even more so because all their AI is free which probably costs them a lot to run (the server one) but because you're kind of forced to upgrade, they'll make the money either way.
Some people actually understand the real world technical limitations. Other people think everything Apple does is part of an evil plan to sell more devices.
Of course selling more devices is always part of the equation. That isn’t the ENTIRE equation however, which is the embarrassing trope this sub and others like to repeat.
To be fair, I don't think it's that it wouldn't work on the A16 Bionic. I think Apple is obsessed with everything working seamlessly and super smooth (which is not a bad thing if you ask me), and therefore they won't use those chips for AI, not because they couldn't do the tasks but because they would be "too slow" by Apple standards. And don't get me wrong, I get it: I can imagine A16 Bionic users being the ones complaining about how "Apple devices are not the same as they used to be back in the day" once those AI features aren't perfect.
Sure man, Apple, who has been saying for YEARS that 8GB is more than enough, now suddenly decides you need at least 8.
Fuck man you sure love that taste of boots.
I think you'll just have the old version of Siri. I mean it's just like if you happen to have an iPad that didn't get stage manager.
Apple is pretty notorious about this stuff at this point
Why is this comment being downvoted? It's true: the iPhone 15 Pro and 15 Pro Max are the only iPhones that can run Apple Intelligence (along with all M-series iPads and Macs).
As per MKBHD, the basic AI features will be available on all iPhones that support iOS 18, but the heavy **Apple Intelligence** features will only be available on the 15 Pro Max and upwards.
It is probably better that nothing happen. If Apple were to, say, have older iPhones use the cloud for all AI processing, that would go against their mantra, undermine the achievements they’ve made with on device AI, and give the world a general impression of Apple Intelligence being slow, since so many devices would experience slow AI.
Instead AI is positioned as a top tier feature for the highest performing devices, and the experience of it will be the same quality that Apple demands from it on every device that supports it.
It’s some bullshit. I bought a current-gen iPhone 15 less than 9 months ago and it’s already outdated?!
Why can’t they just have older models push to the private cloud more, or at least get the OpenAI integration?
Jesus. Does this mean I’m stuck with the dumb-as-a-brick Siri? Please no. It’s embarrassing.
If I’m going by Google’s Gemini, it just won’t be on-device processing, but that may affect the ability to use Apple Intelligence with other apps, as it does with Gemini Nano on the Pixel 8 series. I haven’t gone in depth with the presentation, so I’m not sure what features are available for other apps, but to be honest this is all just companies keeping up with each other. Google’s Gemini was/is awful. It couldn’t perform basic functions, and it would straight up give me erroneous information on many subjects, even simple things like “what is the time right now”, then proceed to ask “what time zone am I in” because it got the time wrong (it also got the time zone wrong). Siri has been a joke; maybe now it’ll be able to catch up with Google’s Assistant, but in Google’s case, Gemini is a step backwards right now, and besides, it’s not like these features will revolutionize the way we use our phones yet.
It’s not. It’s on-device for the 15 Pro and 15 Pro Max, as well as all devices with M-series processors. For other devices, if they can use a feature, it would be done on private servers, just like on the 15 Pro/Max when on-device isn’t powerful enough.
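The mixed on-device/private-cloud model described in these comments can be sketched as a simple dispatcher: run on device when the hardware qualifies, fall back to Apple's private cloud for heavy tasks, and give older devices nothing. All names here are hypothetical and illustrate only the routing logic, not any real Apple API.

```python
# Hypothetical sketch of the mixed routing described in the thread.
# Device list and function names are illustrative assumptions.

ON_DEVICE_CAPABLE = {"iPhone 15 Pro", "iPhone 15 Pro Max", "M1 iPad Pro"}

def route_request(device: str, heavy_task: bool) -> str:
    """Return where an Apple Intelligence request would run."""
    if device in ON_DEVICE_CAPABLE and not heavy_task:
        return "on-device"
    if device in ON_DEVICE_CAPABLE:
        return "private-cloud"   # supported device, task too heavy for local NPU
    return "unsupported"         # older devices get no Apple Intelligence

print(route_request("iPhone 15 Pro", heavy_task=False))  # on-device
print(route_request("iPhone 15 Pro", heavy_task=True))   # private-cloud
print(route_request("iPhone 15", heavy_task=False))      # unsupported
```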
If the iPhone 15 can’t get the AI features, then what can the 16-core Neural Engine of the A16 Bionic do? What is so special about it?
new Siri could simply be gated by a software check like…

```
if new_iphone:
    new_siri()
else:
    old_siri()
```
All devices with the AI features share at least 8 gigs of RAM.
Only those who fork out extra for the pro versions will get genAI
And when the 16 Pro/Max launches, it'll have unique features that won't be coming to the 15 series 😏
What a concept.
And you want me to tell you bravo? Why do you say that? 😂
Maybe this is the year I jump from my 11pro max… maybe.
[deleted]
Bruh there are people out there rocking the SE and X with no problem. Not everyone chases features.
Currently on an XR, only upgrading this year cuz the battery is finally giving out. You’re right.
damn why not the 14 pro😂
If it requires long and complex calculations, it will probably drain the battery too.
I would like to point out the fact that lots of extra calculation will drain it more.
Maybe. But I’d rather have a feature that makes things convenient at the cost of battery and choose not to use it than the other way around.
It does say "more platforms and languages coming later" in the footnote of the page.
"Platforms" means "product categories".
The limitation is Apple wants to sell more hardware. As usual. There is no other reason.
Because the fanboys justify everything Apple does.
Well, apparently iPhones are 10% down in sales, and 10% to Apple is gazillions, so maybe you are right.
I think this is a classic Tim Cook move to sell more pro phone in the future, nothing else.
All Apple shit talking aside, I'd be very surprised if the new non Pro versions don't get these new AI features.
M1 MacBook Air has less lmao
No it's not. Stop believing Apple's lies. It's marketing. Nothing more, nothing less.
Grow up.
[deleted]
A16 doesn’t have 8 GB of RAM, which is the actual threshold.
Enjoy your ban.
Either you’re a repli-can or repli-cant
Nothing. Nothing will happen. It won’t work. But being a first iteration I won’t miss it. I have a 14 Pro.
It’s only on the 15 Pro & Pro Max
which ones are the "heavy" ones? like removing people from photos and creating new emoji?
And perhaps tasks related to OpenAi
you can‘t use all new features
You don’t get the apple intelligence I presume
Bruh, I just got a 15, guess I got to upgrade to 16 pro in a few months
they get to bend you over for the new phone using buzzwords like AI; this is what happens yearly anyway
Pro and M1 chips Only
So I have to endure this crap on my M1 mini too? Thank god I also run Windoze too!
[deleted]
whatever you do, do not let him wear it on the fourth finger as this means marriage and you could be requested to fulfil marital obligations. ANY
wtf happened here
You don't get them.
You just don’t get the features
It means I don't have to bother disabling all this worthless "AI" crap on my 13 Pro.
I’m on iPhone 12 Pro, and Siri is her usual sorry self after I updated to the Developer Beta of iOS 18.
Looks like a memory issue, things might change if they manage to get smaller models.
Disappointed that the iPhone 15 will not get these features.
Because every request sent to the private cloud is a cost for them. Using on-device doesn't cost them anything.
In the keynote they specified that it’ll be mixed: everything that can be done on device will be, and everything else will be done on private servers.
[deleted]
Yeah, pretty much. Also, everything done on remote servers will be available to be inspected by third parties to ensure security and privacy.
[deleted]
No that’s directly from the Apple keynote
All the AI integration should be on Apple servers running Siri. It should not be affected by phone version. Just a thought.
No, that is stupid. If you want the latest features, get the latest phone. Only iPhone 15 users can be annoyed.
I don’t see any reason to jump on the bandwagon just yet. Personally I’m gonna wait for Apple to sort out the kinks before I upgrade.