I’m torn on the iPhone 16’s camera controls – it’s useful but still unfinished

If you’ve read my previous thoughts on iPhones here on Ny Breaking and its sister site Tom’s Guide, you’ll know I have pretty strong opinions about Apple’s smartphones.

Since switching from Android to iPhone in late 2021, I haven’t gone back to the platform Google built, despite trying some of the best Android phones. The convenience of iOS won me over: I love the titanium construction, I’ve found the Ceramic Shield glass to be a quiet game changer, I enjoy the Action button, and the cameras almost never let me down.

But for once, I’m on the fence.

What I’m mulling over is the Camera Control button. In some ways it’s a cool new feature that puts haptics to good use. In other ways it’s redundant and not yet fully featured.

I’ve been trying out the iPhone 16 Pro Max for a few weeks now, and when it comes to taking a photo, I try to use Camera Control as much as possible. Being 37 and a millennial, I still enjoy shooting in landscape orientation on my phone, so having a physical button right where my finger naturally sits lets me take a photo without breaking the framing by tapping the screen or fumbling for the Action button – which I’ve mapped to the torch anyway, a surprisingly useful shortcut.

I also like being able to cycle through zoom ranges with a swipe on Camera Control rather than tapping small on-screen icons. The exposure control is pretty cool too, although switching between the functions Camera Control can adjust doesn’t feel entirely intuitive to me yet, and my taps often cause me to lose the precise framing of a scene.

So yes, Camera Control is interesting. But…

Did anyone really ask for it? It feels like a feature built to give Apple’s executives something new to talk about at the September Apple event. It’s a ‘nice to have’, but it’s hardly a game changer for phone photography.

Not my pace

(Image credit: Future/Lance Ulanoff)

Maybe over time I’ll warm up to it. But the bigger problem is the lack of AI tools for Camera Control at launch. Apple is actively promoting AI features for Camera Control that can smartly identify what the cameras are pointed at and serve up all kinds of information. That hasn’t happened yet, as the rollout will come post-launch when Apple Intelligence is fully available; there is a beta option, but I don’t want to run it on my main phone.

I still can’t quite get my head around that. Sure, other phone makers have touted AI features that arrive after their phones launch and may be limited to certain regions at first, but at least those phones ship with some of the promised AI suite. The iPhone 16 series launched without any Apple Intelligence features at all.

This isn’t what I expected from Apple, a company known for not adopting new technology until it’s refined and ready for prime time. So it’s baffling to see its smartphones launch without the latest generation of smarts. It’s also the main reason I feel torn about Camera Control: if it had Google Lens-like capabilities baked into a hardware format at launch, I could see myself being much more positive about it.

Of course, Apple shipping a camera button like this will undoubtedly lead other phone makers to follow suit. I just hope they don’t skimp on features when their phones launch.

As for Camera Control in the here and now, I’ll keep an open mind and continue to use it; I’m crossing my fingers that it becomes seriously useful once it gets its prescribed dose of AI smarts.
