Besides the background, the clock (when there are no notifications) is the biggest attraction, and it uses Dynamic Color to adapt to the wallpaper you've set. Designer Philip Chang (Twitter + Instagram) took inspiration from iOS 16 to imagine what the Android lockscreen could look like in the future. Namely, depth is applied so that the clock also adapts to what's in the actual background image. In the example seen above, the "10" is displayed behind the rock formations, while the seagulls are clearly layered above the "12." Also note how the hours and minutes use different colors.
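As a rough illustration of the idea (not Android's actual Material You pipeline, whose color extraction is far more sophisticated), a clock theme could be derived from the wallpaper by pulling out one representative color and generating related tints for the hours and minutes. A toy Python sketch:

```python
import numpy as np

def representative_color(wallpaper):
    """Crude stand-in for wallpaper color extraction: the mean RGB.
    (Material You's real algorithm clusters and scores hues; this
    toy version just averages every pixel.)"""
    return wallpaper.reshape(-1, 3).mean(axis=0).astype(int)

def clock_tints(base, spread=60):
    """Derive two related tints from one base color, echoing how
    the concept renders hours and minutes in different shades."""
    lighter = np.clip(base + spread, 0, 255)
    darker = np.clip(base - spread, 0, 255)
    return lighter, darker

# A flat mid-gray "wallpaper" yields a gray base and two gray tints.
wallpaper = np.full((8, 8, 3), 120, dtype=np.uint8)
base = representative_color(wallpaper)
hours_tint, minutes_tint = clock_tints(base)
```

On a real device, the equivalent starting point would be Android's wallpaper color APIs rather than raw pixel averaging.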
One of Google’s most recognizable examples of Material You is the double-line clock in Android 12. One concept design imagines how the Android and Pixel lockscreen could evolve with the inclusion of depth, following the major emphasis placed on depth in iOS 16. Today, the day and date are displayed at the top left of the Android lockscreen on Pixel phones, with the weather (condition and temperature) appearing below. Android’s status bar sits above this At a Glance widget, showing the carrier on the left and the battery level and connection statuses on the right. You can access Google Pay (soon Google Wallet) and smart home controls via shortcuts at the bottom of the lockscreen.
Other examples show the time as if it were appearing on the other side of a bridge, behind a mountain’s peak, and against a waterfall. My favorite example has the hour appearing behind clouds, while the minutes are partially submerged beneath the waves in a way that doesn’t hurt readability. Meanwhile, friend of the site RKBDI also imagined different font styles on the Android lockscreen:

This depth effect will certainly be popularized by iOS 16, but it dates back to watchOS 8 and the introduction of the Portraits watch face:
The Portraits watch face uses Portrait mode photos from your iPhone to create a multilayered watch face with depth. You can choose from three different type styles and select up to 24 photos. On the Apple Watch, a photo with depth data is required, but iOS 16 instead leverages machine learning for a more scalable solution that also works with non-people backgrounds. A hypothetical Android lockscreen could take the same approach. Google certainly has the depth know-how, as seen with Cinematic photos in Google Photos, where ML predicts an image’s depth and produces a 3D representation of the scene.

Another thing for Google to consider is making new live wallpapers that feature a depth effect for the clock. By curating the experience, the company could guarantee that the readability of the time is never impacted while still allowing for motion.
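To make the layering concrete, here is a minimal sketch (Python with NumPy, purely illustrative — not Apple's or Google's implementation) of how a clock can be composited "into" a scene once a per-pixel depth map exists: digit pixels are drawn only where the scene is farther away than the clock's depth plane, so foreground subjects occlude the time.

```python
import numpy as np

def layer_clock(scene, depth, clock, clock_mask, clock_depth):
    """Composite a clock layer into a scene at a given depth.

    scene: (H, W, 3) image; depth: (H, W) map where smaller values
    are closer to the camera; clock: (H, W, 3) rendered digits;
    clock_mask: (H, W) boolean marking digit pixels; clock_depth:
    scalar depth plane where the clock "sits" in the scene.
    """
    out = scene.copy()
    # Draw a digit pixel only where the mask is set AND the scene
    # content at that pixel is farther away than the clock plane.
    visible = clock_mask & (depth > clock_depth)
    out[visible] = clock[visible]
    return out

# Tiny synthetic example: the top-left pixel is a foreground subject
# (depth 0.2), so it occludes the clock placed at depth 0.5.
scene = np.zeros((2, 2, 3), dtype=np.uint8)
depth = np.array([[0.2, 0.9], [0.9, 0.9]])
clock = np.full((2, 2, 3), 255, dtype=np.uint8)
mask = np.ones((2, 2), dtype=bool)
out = layer_clock(scene, depth, clock, mask, clock_depth=0.5)
```

The hard part in practice is producing the depth map itself, which is exactly what the ML models behind Portrait mode and Cinematic photos provide.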