It looks like Samsung is making some major changes, inside and out, for the next Galaxy S flagship. So far, leaked renders have imagined a sharper-looking Galaxy S25 Ultra with slimmer bezels, cleaner lines, and a more boxy design.
Now, according to reliable leakster UniverseIce, the Galaxy S25 Ultra will come fitted with 16GB of RAM. For comparison, the Galaxy S24 Ultra offers 12GB.
Now, Samsung won’t exactly be setting any new industry standards with the Galaxy S25 Ultra and its 16GB of RAM, given that some Android rivals already ship phones with as much as 24GB.
The S25 Ultra will definitely have a 16GB RAM version, this is 100% confirmed, don't worry. — ICE UNIVERSE (@UniverseIce) September 27, 2024
Of course, you don’t need 24GB on any “normal” smartphone at this moment, unless you need it for cracking those obvious “my phone has more RAM than your laptop” jokes.
But jumping from 12GB to 16GB can have some practical benefits, apart from just keeping more apps running in the background and an extra boost in gaming.
It’s all about AI, probably
The real beneficiary is AI, and specifically, the on-device flavor where tasks are executed by a local AI processing unit rather than offloaded to the cloud. Remember the hoopla around Google limiting Gemini Nano chops to the Pixel 8 Pro and not extending them to the vanilla Pixel 8?
Well, Android VP and general manager Seang Chau later confirmed that the 12GB of RAM on the Pixel 8 Pro was what allowed it to keep Gemini Nano resident in memory, something the 8GB Pixel 8 initially couldn’t manage.
Evidently, four gigs of added memory can make a lot of difference, though the performance gulf may not be as evident when jumping from an already good enough 12GB to 16GB.
But at the same time, we are seeing AI models, including Google’s own Gemini kit and Samsung’s Galaxy AI bundle, getting more sophisticated and expanding their token bandwidth, so it’s always a safe choice to pack in more RAM.
Or, as a certain Obi-Wan Kenobi would concur, having the high ground is advantageous.
“While current basic AI features use around 100MB of memory on mobile devices, LLM-based features could require up to 7GB of additional RAM.”
For example, the Gemini Nano model handling AI chores on the Galaxy S24 series eats up 2GB of memory, as per estimates, but to effectively handle AI models in the 7-billion-parameter range, 16GB of RAM is reckoned to be the sweet spot.
In fact, experts estimate that advanced tasks based on LLMs (Large Language Models) could require an additional 7GB of RAM.
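To see where estimates like these come from, here is a rough back-of-the-envelope sketch. The assumption (ours, not the experts’) is the standard shortcut of multiplying parameter count by bytes per weight, ignoring activation and cache overhead:

```python
def model_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Rough weight-only memory estimate: parameters x bytes per weight."""
    return params_billion * 1e9 * bytes_per_param / (1024 ** 3)

# A 7-billion-parameter model at different weight precisions:
for label, bpp in [("FP16", 2.0), ("INT8", 1.0), ("INT4", 0.5)]:
    print(f"{label}: {model_memory_gb(7, bpp):.1f} GB")
```

At 8-bit precision a 7B model lands around 6.5GB for the weights alone, which is in the same ballpark as the 7GB-of-extra-RAM figure quoted above; at 16-bit it balloons to roughly 13GB, which is why 16GB phones start to make sense.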
How fast data is processed and whether parameters are handled efficiently decides how quickly the AI outputs are received. An insufficient amount of RAM can bottleneck that pipeline, forcing the system to shuffle the model in and out of memory and slowing responses to a crawl.
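A crude way to picture the speed side of this: generating each token of an on-device LLM response requires streaming roughly all of the model’s weights through memory once, so memory bandwidth caps throughput. The figures below are hypothetical, illustrative numbers, not specs of any Galaxy device:

```python
def est_decode_tokens_per_sec(mem_bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper-bound estimate for memory-bound autoregressive decoding:
    each generated token reads (roughly) the full set of weights once,
    so tokens/s is capped near bandwidth divided by model size."""
    return mem_bandwidth_gb_s / model_size_gb

# Hypothetical: ~77 GB/s of LPDDR5X bandwidth, a 6.5GB quantized 7B model
print(f"~{est_decode_tokens_per_sec(77, 6.5):.1f} tokens/s upper bound")
```

If the model doesn’t fit in RAM in the first place, this best case goes out the window, since the system has to reload weights from much slower storage mid-generation.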