The bulk of the iPhone 13 and 13 mini’s upgrades are around photography and video. Apple improved both rear sensors, and the ultrawide lens is supposed to let in more light than before. Sadly, many of the notable additions are contingent on the A15 chipset, meaning things like Cinematic Mode won’t be coming to older iPhones.
For example, the faster image signal processor (ISP) on the chip means nighttime photos won’t take as long. I certainly didn’t have to hold the iPhone 13 still for as long as the iPhone 12 when I used both to shoot a candlelit globe in a very dark room. The difference was probably about a second, which sounds insignificant but can feel like forever when you’re struggling to remain motionless.
The ultrawide photos I shot with the new phone were actually darker than the iPhone 12’s, but they were better exposed overall. Buildings against the night sky had cleaner lines, less noise and a more neutral tone than those from the iPhone 12. Google’s Night Sight on the Pixel 5 still rendered more details in the shadows, though, and I preferred the cooler images it produced.
I used to prefer photos from Pixels because Apple’s pictures had a yellowish tinge. But with the iPhone 13s, Apple is introducing Photographic Styles, a way to better match users’ individual preferences. It lets you choose from one of five profiles: Standard, Rich Contrast, Vibrant, Warm and Cool, which differ in contrast level and color temperature.
You can tweak these modes to your preference, too. But at their original settings, my favorite Style was Vibrant. Unlike filters, this felt more like a set-and-forget kind of thing — nice for people like me who have never been into Apple’s default treatment. Overall, the iPhone 13 took colorful and crisp shots, though compared to Google’s images they were unnecessarily bright, with obvious HDR effects.
In addition to the hardware and software improvements I’ve already mentioned, the company also updated its HDR algorithm to better accommodate every person in the scene. It also worked to enhance video quality, promising better dynamic range, details and highlights. Plus, you can now record in Dolby Vision in 4K resolution at up to 60 frames per second.
But the most intriguing new video feature (and arguably the most intriguing of all the camera updates) is Cinematic Mode. Using the A15 chip’s neural engine, the iPhone 13 can create a Portrait mode-like effect in your clips, keeping your selected subjects in focus while blurring out the rest of the scene. You can tap on parts of your viewfinder to change focal points as you shoot, or let the iPhone decide for you by analyzing who and what’s in the scene.
On its own, Apple’s system is pretty clever. The iPhone 13 did a great job of identifying faces (both human and canine) in my shots, and yellow or white boxes appeared to indicate potential things to focus on. As my subjects turned toward and away from the camera, they became clearer and blurrier, respectively. But when I tried to exert more control and adjust the focal point, the system struggled. Sometimes, my intended subject remained blurry even after I tapped on its rectangle. Other times, the iPhone didn’t follow the person I selected after they walked behind an obstruction, though that’s an understandably tricky situation.
When it did perform as expected, Cinematic Mode produced a pleasant effect that gave videos a professional air. But at the default intensity, the blur looked strange and artificial. The outline of my colleague’s head was stark against the softened background, and I had to raise the f-stop to its highest setting (f/16) to get a more natural feel.
It’s worth noting that Cinematic Mode only works in 1080p at 30 frames per second, even if you’ve set your camera to record at a higher quality.
Cinematic Mode is also available via the 12-megapixel selfie camera, which offers Photographic Styles too, and both features were just as effective via the front sensor as through the rear.
Gallery: iPhone 13 camera sample photos | 16 Photos
In low light, the iPhone 13 took selfies that were slightly blurry compared to the Pixel 5 and Galaxy S21, but when I was well lit, Apple’s camera delivered images that were just as sharp as the competition. It even had a more neutral tone than the other two, with a more accurate white balance (though Samsung was pretty close).
I covered most of the changes coming via iOS 15 when I tested the beta, including things like Focus modes and SharePlay. Focus modes, which let you set custom home screens and notification profiles based on your location or the time of day, are still among the most useful new features on any smartphone platform in recent years. Meanwhile, SharePlay won’t be available until a later release.
Each time you open a relevant app, like Photos or Tips, Apple shows you what’s new this time around — like Memories set to tunes from the company’s Music library. Safari also had a redesign (and a few tweaks during the beta window), primarily making it easier to browse and organize your tabs.
I’ve never been a big Safari user, preferring Chrome for its convenience, but it’s nice to see Apple update its interface for easier navigation with one hand. Chrome and Safari are pretty similar on iOS, although Google sadly still has its address and search bar at the top of the screen. If you prefer, you can also go back to the traditional layout in Safari.
Other noteworthy iOS updates include Live Text in Photos, which makes finding specific pictures from the Spotlight search much easier. The Maps and Weather apps also received a refresh, while Shared With You in Messages makes it slightly easier to find things you and your friends chatted about. Since most of these will be coming to older iPhones, though, iOS 15 features are unlikely to sway your decision on whether to upgrade this year.
We’ll have a more in-depth review of Apple’s latest OS soon, but for now, I’m pleased with the level of control iOS 15 offers and look forward to testing out a stable version of SharePlay.