Apple Intelligence memory footprint discussion
by Poster
May 13, 2025
More than a year ago, when I first heard that Apple was equipping the entire iPhone 16 series with 8GB of RAM as standard for Apple Intelligence, I had a question: is the Apple Intelligence process memory-resident?
Even though the whole series ships with 8GB, if a resident AI process always occupies part of that memory, and that part is never reclaimed even under memory pressure, then the benefit of the RAM upgrade is much smaller. I've even thought that if memory still runs short because the AI process is resident, I'd rather turn the AI features off and enjoy the performance gain from the larger RAM.
Especially since there have recently been rumors online that the entire iPhone 17 series will ship with 12GB of RAM as standard, I now have a new question: the 16 series only just moved to larger RAM, so why would the 17 series need another memory upgrade?
My understanding of AI models is shallow, but my guess is that the 17 series will ship larger, more capable model files that need a bigger memory footprint, forcing Apple to upgrade the RAM again. If that's the case, the iPhone 16 series, released only a year earlier, has every reason to receive the same model update. Would that make memory on the 16 series even more stretched?
Replies
-
Anonymous13621 May 13, 2025
Of course it isn't resident, otherwise the heat alone would kill you; the AI is layered on top. When you're running a demanding app and then call up AI, there still has to be enough memory left that the foreground app doesn't get killed. The previous 6GB wasn't enough, so they had to add RAM. The 12GB rumor, on the other hand, is most likely untrue.
-
Poster May 13, 2025
@Anonymous13621 If it isn't resident and has to load the AI model and initialize the personalized settings every time it's invoked, won't the response latency become very high?
-
Anonymous1715 May 13, 2025
@Poster Do you know how many tokens per second a workstation manages these days? If an 8GB phone also had a model resident in the background, you could throw the phone away. What local models do you think it can even run? Be thankful it can recognize a face.
-
Poster May 13, 2025
@Anonymous1715 I don't have a definite conclusion about whether the AI process is resident; I only raised the question of what problems would be hard to solve if it weren't. I don't think I expressed a firm position, so I'm not sure what you are refuting. What is your view, then? Is Apple Intelligence not implemented by loading an AI model? Speaking of which, I have another guess: by design, Apple's AI process could be resident, but the resident process would only preload a tiny language model, not a particularly large model file. After receiving a user request, the tiny model would analyze it, decide which specialty or domain the problem involves, and then load the corresponding expert model to handle the task. Once the task is done, the memory used by the expert model could be reclaimed. Performance-wise, such a process could respond promptly, even though the actual processing might take a while. These are still just guesses, and I hope someone knowledgeable can offer some solid conclusions or discussion.
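A minimal Python sketch of the "tiny resident router plus on-demand expert model" idea guessed at above. The class names, keyword routing, and placeholder allocation are hypothetical stand-ins for illustration only, not a description of how Apple Intelligence actually works:

import gc

class ExpertModel:
    """Stand-in for a large domain-specific model loaded on demand (hypothetical)."""
    def __init__(self, domain: str):
        self.domain = domain
        self.weights = bytearray(1024)  # placeholder for a large weight allocation

    def run(self, request: str) -> str:
        return f"[{self.domain} expert] handled: {request}"

class TinyRouter:
    """Stand-in for a small, always-resident model that only classifies requests."""
    def classify(self, request: str) -> str:
        # Trivial keyword routing; a real router would be a small language model.
        return "writing" if "rewrite" in request.lower() else "general"

router = TinyRouter()  # resident: tiny, kept in memory at all times

def handle(request: str) -> str:
    domain = router.classify(request)  # fast path through the resident router
    expert = ExpertModel(domain)       # load the large expert model on demand
    try:
        return expert.run(request)     # slower, but its memory is only transient
    finally:
        del expert                     # drop the reference so the expert's memory
        gc.collect()                   # can be reclaimed once the task is done

print(handle("Please rewrite this email more politely"))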
-
Anonymous1715 May 13, 2025
@Poster As far as iOS memory management goes, you're overthinking this. If a cold start were really unacceptable, ask anyone on Earth how many times you've actually noticed one. Has studying made you stupid?
-
Anonymous1715 May 13, 2025
In the first place, this AI needs the server to respond most of the time, so how much impact do you think pulling up a local model really has? Local models mostly live inside individual apps. A resident model plus occasionally turning on the camera is not the same as watching the Earth around the clock. If anything is kept resident in the background, it's basically the AI service that needs to start the camera quickly.
-
Anonymous448 May 13, 2025
How could it be resident? Never mind the memory, the battery couldn't take it.
-
Anonymous13671 May 14, 2025
Why is the guy above so aggressive? The OP isn't talking nonsense; he's just discussing something he doesn't fully understand. If you can explain it, then explain it and answer him. If you think the OP's understanding is wrong, say so directly, and it's even better if you can provide some evidence, but what's the point of cursing at people while you talk? Is that really how little class you have? I'm too lazy to @ him; it would just be bad luck.
-
Anonymous1901 May 14, 2025
I really don't feel like explaining it; go look up the analysis videos on Apple Intelligence yourself. I'll just mention one thing: why do you think Apple would give you the freedom to turn AI off and hand the space to other apps? Don't you know how many background tasks iOS runs? Your phone keeps running tasks every night while you sleep.
-
Anonymous4085 May 14, 2025
@Anonymous13671 I went through that guy's post history, and it's all in the same lecturing tone...
-
Anonymous6511 May 14, 2025
Both approaches exist, depending on the need. A cold start can usually be done within 3 s, or even under 1 s, and none of the AI features currently available on the iPhone demand a lower response latency than that. Given the memory pressure on the system, my guess is they are mostly cold-started. There should be dozens of models on the device by now (maybe some are resident, but none of them are the kind of LLM people usually mean).
-
Anonymous2013 May 14, 2025
In fact, even a cold start isn't slow. The time you spend interacting with the AI feature and typing in your request is already enough to start the model in the background. Besides, not every task runs on the local model: even before Apple pushed AI, Siri could handle some simple tasks without an internet connection. For a given task, I think it should be possible to tell whether it was executed by the local model or the cloud model.
-
Anonymous94 May 14, 2025
Apple Intelligence is fine to use on the Mac. Don't expect much on phones in the short term.
-
Anonymous3278 May 14, 2025
@Anonymous13671 Zero manners, and he thinks he's impressive.
-
Anonymous13914 May 14, 2025
I'm very disappointed with the battery life of the iPhone 16 Pro. When I first got my iPhone 11, it lasted three days; the 16 Pro basically needs charging once a day. I don't know whether it has something to do with Apple Intelligence, or whether iOS just has too many other features now that consume too much power. The chip process has improved, but battery life hasn't.
-
Anonymous1982 May 14, 2025
@Anonymous13914 #15 An iPhone 11 lasting three days on a charge sounds like extremely light use of the phone 🤣
-
Anonymous13914 May 14, 2025
@Anonymous1982 I just feel the iPhone used to last. Many features have been added since, but the heat and power consumption have gone up with them.
-
Anonymous6513 May 14, 2025
The newly released Qwen3 0.6B is only about 480 MB after quantization, so it can be loaded on demand practically in real time. With MLX acceleration, local token generation is also very fast, and even pure CPU can do about 37 tokens/s. Most of what's said above is basically outdated experience.
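For anyone who wants to try this locally, a rough sketch using the mlx-lm Python package on Apple Silicon. The 4-bit repo name is an assumed mlx-community conversion, and the exact API may differ between mlx-lm versions:

from mlx_lm import load, generate

# Assumed community 4-bit conversion of Qwen3 0.6B; substitute whatever repo actually exists.
model, tokenizer = load("mlx-community/Qwen3-0.6B-4bit")

prompt = "In one sentence, why are small on-device language models attractive?"
# verbose=True prints the generated text along with tokens-per-second stats.
text = generate(model, tokenizer, prompt=prompt, max_tokens=64, verbose=True)
print(text)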
-
Anonymous13684 May 15, 2025
Even without LLMs, it's unreasonable for a phone not to have 12GB in 2025.
-
Poster May 15, 2025
@Anonymous13684 My iPad Air 6 has 8GB of RAM. I can open a video in bilibili, play it for a while and pause, then play Genshin Impact for an hour, and when I switch back to bilibili the playback page is still there. I think that's already pretty good.