You get a priority scheduling algorithm that allocates resources based on actual requirements to improve smoothness, a rapid dynamic effect engine serving the same purpose, and new system animations, headlined by a new Origin animation that “incorporates principles from human factors research and integrates the natural laws of the physical world into the dynamic effect design”.
There are many new wallpapers, both static and immersive, and you can now set local videos as live wallpapers. You can have large folders on your home screens, and there are new clock widgets. Ultra Game Mode comes with multiple quick setting options, screenshot capture is improved, and the Smart sidebar now has access to Google Lens for quick translations of on-screen content. Notes gains more text editing tools and the ability to export in Word format, while Settings comes with better categorization and hierarchy.
You can drag a split-screen app to the middle of the screen to switch it to a small window, and drag a small window to the edge of an app to switch back to split-screen. The startup gestures for split-screen and small window modes have also been streamlined: swiping up from the bottom toward the upper left or right corner quickly enters split-screen or small window mode.
Now it’s time for AI. Live Transcribe is present, as is Circle to Search, and the Albums app gets cleanup suggestions plus AI erase to remove unwanted objects or people from your photos.
I don’t know what everybody sees in Circle to Search. Isn’t it just Google Lens, which you can get on any other phone? I can take a screenshot and share it with Google Lens to get the same result, right?
I don’t get it either. They’re pushing it so hard, yet it feels like a gimmick. My phone has shown me how to circle-to-search half a dozen times now whenever I accidentally trigger the gesture. Thank god you can disable the assistant overlay in settings.