The iPhone 14 Pro adds an always-on display, a 48 MP camera and a new user interface paradigm to an already outstanding smartphone lineup, but the still-excellent iPhone 13 Pro looms large.
Apple’s annual refresh of its flagship smartphone saw the debut of the iPhone 14 Pro, the smaller of the two Pro-tier smartphones for 2022. As usual, Apple brings several changes to the product lineup, including quite a few frequently rumoured and much-needed additions.
The iPhone 14 and iPhone 14 Plus are discussions for another day. The iPhone 14 Pro is where Apple’s annual update cycle kicks off properly.
Similar appearance, different inside
From the outside, the iPhone 14 Pro does not give many clues that it is a new smartphone. Apple has not made any major changes to the design of the smartphone. The iPhone 14 Pro looks similar to the iPhone 13 Pro, which in turn borrowed its design from the iPhone 12 Pro.
That means you get a stainless steel chassis between the front and back glass, protected with Ceramic Shield glass. The rear glass is frosted, like the 2021 models, instead of glossy.
iPhone 14 Pro Max in Deep Purple
This time around, buyers will choose from four colours: Space Black, Silver, Gold, and the new Deep Purple.
The few changes that are visible from the outside are also extremely minimal. At 5.81 inches tall and 2.81 inches wide, it’s a bit taller and slimmer than its predecessor, but also a bit thicker at 0.31 inches.
It has also gained a bit of weight, inching up to 7.27 ounces. But again, this isn’t a massive change, and most people won’t even notice it.
Apple still rates the iPhone 14 Pro as IP68 for water and dust resistance, so it can survive a depth of 6 meters for up to 30 minutes.
Screen always on
The front of the iPhone 14 Pro offers the first obvious sign that this model is different from its predecessors: an always-on display.
When not in use, instead of going dark, the screen becomes dimmer and duller in appearance, but it doesn’t go completely black unless you turn the iPhone off. Using a combination of ProMotion’s 1Hz minimum refresh rate, automated settings for the new lock screen, and other factors, the screen can now stay on constantly.
It does dim, and how much it dims seems to depend on how bright the screen was before the phone went to sleep. The difference between the maximum and minimum screen brightness isn’t huge, but it’s noticeable.
Screen always dimmed
Like the always-on display on the Apple Watch, the iPhone display is handy for information at a glance. You can not only see the time, but also a selection of widgets.
And speaking of that Apple Watch: if you’re wearing yours and you walk away from your phone for a few minutes, the screen turns off, mostly. In a house-sized environment, we had to be at the opposite end of the house from the phone for it to work reliably.
Leaving home also turns the screen off, but in that scenario, you probably have your phone with you anyway.
And yes, you can disable this in the Display & Brightness control panel if it’s an issue for you. There’s nothing else you can do with it in settings at the moment other than turning it on or off, and we hope there will be more granular settings as iOS 16 evolves, or as more models get the feature.
The display remains a 6.1-inch OLED-based Super Retina XDR display, with a 2 million:1 contrast ratio, Wide Color (P3) support, HDR capabilities, and the 120Hz adaptive refresh rate of ProMotion. Surprisingly, though, there’s more here.
For typical content, it still has a maximum brightness of 1,000 nits, but peak HDR brightness rises from 1,200 nits to 1,600 nits. Outdoors, the screen can even hit 2,000 nits as it tries to counteract the glare of the sun.
We tried it several times, and the change is almost instant. If you’re using your phone while walking outside, you can see it happen, and it makes the phone more usable.
It also makes the iPhone a hair hotter. It’s hard to control the variables to measure how much hotter, given that you’re already outside with the sun hitting the device.
However, this will affect battery life. We’ll see over time how useful the feature is and what impact frequent use has.
From notches to islands
The rumours were right about a big front-facing change for 2022: the removal of the notch from the display. The oft-derided design feature isn’t truly gone, though; it’s been repurposed and disguised with some clever UI work.
The screen still has cutouts, now in the form of a pill and a hole, allowing the TrueDepth camera array to function normally. The array is also 31% smaller, with a proximity sensor that works from behind the screen to save space.
However, instead of simply leaving the space around these black cutouts as dead pixels, as usual, Apple incorporated them into the Dynamic Island.
Typically taking the form of a small black oval surrounding the cutouts, Dynamic Island can expand sideways to provide small pop-up prompts or grow larger to provide more detail or control.
This is very clever, and obvious in hindsight. It hides the cutout in plain sight and reframes it from being a permanent patch of wasted display to being part of a more useful notification system.
It’s a great software-based sleight of hand trick. We’ve already seen some games using it and some cool wallpapers.
A little cat sits on the Dynamic Island in the Apollo app
But, at the moment, support is quite scarce. It’s also pretty passive, being used primarily as an info display for an app other than the one you’re using. This is fortunate, because if those interactions were required, it would turn what is barely a one-handed device into a two-handed one, with all the accessibility issues that entails.
With that said, we look forward to seeing how developers can take full advantage of it in the future. And, like pretty much everything else on an iPhone Pro, it will eventually come to other iPhones.
Inside the iPhone 14 Pro, Apple has switched to the A16 Bionic chip, which is hardly surprising given that it happens every year. But since the iPhone 14 reuses the version of the A15 included in the iPhone 13 Pro models, the A16 may well be the chip that the 2023 non-Pro iPhones end up using.
The A16 consists of a six-core CPU with two performance cores and four efficiency cores, plus a five-core GPU. This is the same configuration as the A15, but the new chip packs nearly 16 billion transistors, its CPU is faster, and its GPU has 50% more memory bandwidth.
Then there’s the 16-core Neural Engine, now capable of 17 trillion operations per second, and an advanced image signal processor to handle photo and video tasks.
And as for that speed, you may have already seen some benchmarks floating around from the day of the event. Benchmarking done on the day of an event is run on the sly by attendees, and generally shouldn’t be trusted given the setting.
The good news is that the A16 Bionic is very, very fast, and a sizeable leap over the A15. While the A15 Bionic in the non-Pro iPhone 14 models is slightly faster than the iPhone 13’s A15, the A16 beats them all.
Geekbench 5 on iPhone 14 Pro (left) and iPhone 13 Pro (right)
In the Geekbench 5 benchmark, the iPhone 13 Pro scored 1,723 single-core and 4,658 multi-core. The new iPhone 14 Pro managed 1,880 single-core and 5,317 multi-core, handily beating the previous generation’s chip, with both running iOS 16.
Geekbench also has a graphics test. In that graphics-focused Compute benchmark, the iPhone 14 Pro scored 15,739, up from 14,401 for the iPhone 13 Pro.
Comparing the two devices in AnTuTu’s benchmark, the iPhone 14 Pro scored 897,708 compared to 767,863 for the iPhone 13 Pro. AnTuTu reported the biggest gains in the GPU, memory, and UX categories, while the CPU showed only a modest increase in this particular test.
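For context, the quoted scores work out to the following rough generational gains. This is just quick arithmetic on the numbers above; `pct_gain` is an illustrative helper, not part of any benchmark suite.

```python
# Percentage improvements implied by the iPhone 13 Pro vs 14 Pro scores.
def pct_gain(old, new):
    """Percentage improvement of `new` over `old`."""
    return (new - old) / old * 100

scores = {
    "Geekbench 5 single-core": (1723, 1880),
    "Geekbench 5 multi-core": (4658, 5317),
    "Geekbench 5 Compute": (14401, 15739),
    "AnTuTu overall": (767863, 897708),
}

for name, (old, new) in scores.items():
    print(f"{name}: +{pct_gain(old, new):.1f}%")
```

That comes out to roughly +9% single-core, +14% multi-core, +9% in Compute, and +17% overall in AnTuTu: solid, but evolutionary.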
Of course, these are benchmarks. Your performance mileage may vary.
That said, flagship iPhones have been powerful enough for most of the smartphone-using population for some time, and that’s probably why the non-Pro iPhone 14 has a CPU that’s only slightly better than the iPhone 13’s.
However, much more on that in a few days.
Probably the biggest news about the iPhone 14 Pro is that Apple has finally started to move away from 12-megapixel camera sensors. Well, at least for one of them.
The wide camera, now dubbed ‘main’, has a 48-megapixel sensor, four times the resolution of the 12-megapixel sensors still used in the telephoto and ultra-wide cameras.
The main camera uses a “Quad-Pixel” sensor, in which the pixels are arranged in quads of identical colour-sensing elements. This provides flexibility: each quad can act as one huge pixel for a 12-megapixel image, or its elements can act individually for a 48-megapixel image.
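To illustrate the idea, quad binning amounts to summing each 2x2 block of same-colour photosites into one larger effective pixel, trading resolution for light gathering. This is a toy sketch of the concept, not Apple’s actual pipeline, and the 4x4 grid stands in for the real, far larger sensor array.

```python
def bin_2x2(sensor):
    """Sum each 2x2 quad of a 2D grid into a single binned pixel."""
    h, w = len(sensor), len(sensor[0])
    return [
        [
            sensor[r][c] + sensor[r][c + 1]
            + sensor[r + 1][c] + sensor[r + 1][c + 1]
            for c in range(0, w, 2)
        ]
        for r in range(0, h, 2)
    ]

# Toy 4x4 "sensor": binning halves each dimension, quadrupling the
# signal collected per output pixel.
sensor = [
    [1, 1, 2, 2],
    [1, 1, 2, 2],
    [3, 3, 4, 4],
    [3, 3, 4, 4],
]
print(bin_2x2(sensor))  # → [[4, 8], [12, 16]]
```

This is why the default 12MP output can be cleaner in low light than a straight 48MP capture would be.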
iPhone 14 Pro camera
The main camera also has a seven-element lens, compared to the six-element lenses elsewhere, and a second-generation sensor-shift OIS system. The telephoto camera has OIS, too.
While there are still only three cameras and lenses, the traditional three optical zoom levels of 0.5x, 1x and 3x are joined by a fourth, 2x, thanks to the massive sensor. Using the central 12MP area of the sensor produces the fourth zoom level, and it’s still classed as optical zoom since it’s simply a crop of a larger image.
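Conceptually, that 2x mode is just a centre crop of the full-resolution frame: half the width, half the height, so a quarter of the pixels, which turns a 48MP capture into a 12MP one. A minimal sketch of the idea (an illustrative helper, not Apple’s code):

```python
def center_crop_half(sensor):
    """Return the central region with half the width and half the
    height of the input 2D grid, i.e. a quarter of the pixels."""
    h, w = len(sensor), len(sensor[0])
    return [
        row[w // 4 : w // 4 + w // 2]
        for row in sensor[h // 4 : h // 4 + h // 2]
    ]

# Toy 4x4 frame: the crop keeps only the central 2x2 region.
frame = [
    [0, 1, 2, 3],
    [4, 5, 6, 7],
    [8, 9, 10, 11],
    [12, 13, 14, 15],
]
print(center_crop_half(frame))  # → [[5, 6], [9, 10]]
```

Because every output pixel still comes straight off the sensor rather than being interpolated, the result can reasonably be marketed as optical rather than digital zoom.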
There’s also Apple’s new Photonic Engine, a change to the imaging pipeline that applies the computational photography intelligence of Deep Fusion much earlier in the process, preserving more data in the image.
The upshot, Apple says, is that the iPhone’s main and telephoto cameras are twice as good at producing low-light photos, with the ultra-wide three times better. In the real world, that’s harder to see.
Stills also benefit from the usual collection of features, including Night Mode, Portrait Mode with Portrait Lighting, Photo Styles, Macro and Apple ProRAW.
We will do more detailed work and camera-specific comparisons in the future. There are improvements, and they are visible year over year, but they will be minute for most.
Apple’s software does most of the work, converting those 48MP captures to 12MP images in most cases. You can toggle this in Apple’s Camera app for some shots and instead save massive ProRAW files of approximately 100 megabytes per image. Most users will be fine with the default, but camera apps that use the full 48 megapixels directly are already available from day one.
But those images will come off the phone rather slowly if you rely on a wired connection, as Lightning is still limited to USB 2 speeds. Be patient, or use a good Wi-Fi connection.
What we generally recommend is this: trust Apple’s Photonic Engine and Deep Fusion to begin with. Once you figure out what you like and don’t like, if you need more than just a point-and-shoot experience, you can easily upgrade to a third-party app that gives you more direct control of the entire system.
Cinematic mode looks much better in 4K
The enhancements aren’t limited to photos. You can still shoot 4K video at 60fps, including ProRes at 30fps and HDR, but Cinematic mode now extends to 4K HDR at 30fps.
Slow-motion video stays at 1080p and 240 frames per second, alongside time-lapse video with stabilization, audio zoom, stereo recording and sensor-shift OIS, and the video feature set is now joined by Action mode.
Action mode uses the entire sensor and onboard processing to further smooth the footage being recorded. It’s almost as if you were filming with the iPhone 14 Pro mounted on a gimbal, except you’re shooting freehand.
To test this, we chased after our dog like a wobbly child, iPhone 14 Pro in hand. Despite the uneven terrain, jerky movements, and an incredibly agile dog, the footage came out surprisingly smooth.
A gimbal will remain the go-to solution for those who need the best stabilization, and it won’t demand as much light as Action mode does. For the vast majority of the user base, though, Action mode will be more than enough.
Around the front, the TrueDepth camera gets some improvements too, with a new autofocus system that lets you focus on multiple subjects simultaneously. Apple says that a larger aperture lets in 38% more light, reducing noise and improving low-light shots.
Speaking of low light, the new Adaptive True Tone flash redesigns the usual light-producing element to make it more useful. Using an array of nine LEDs in a variety of patterns, the flash can be fired in different ways, depending on the focal length of the photograph.
The result is a flash that is twice as bright in telephoto images and more even light in ultra-wide shots. Again, this is nice, but like any other new feature in the iPhone camera, what you can get out of it depends largely on the skill of the photographer.
Very soon we will cover this feature in-depth, including how it compares to professional gear.
In case of emergency
An iPhone has communications technology; it’s implied by the “Phone” after the “i”.
Wi-Fi is the same speed as on the iPhone 13 Pro, and while Bluetooth 5.3 is here, there’s practically nothing it does that Bluetooth 5 didn’t already. However, it’s likely to become more important later in the iPhone’s life cycle.
Beyond the simple wireless technologies of Wi-Fi and Bluetooth, there are two main safety-related communication elements in Apple’s release.
First, there’s Emergency SOS via Satellite, which is self-explanatory. If you’re in trouble somewhere in North America without cellular coverage, like an off-grid cabin on a mountain, you can call for help with your iPhone.
This is not Starlink, where satellites effectively mimic a cell tower. This is point-to-point communication from the iPhone to the satellite.
Instead of a phone call or free-form text message, the iPhone asks you questions about the situation and then tells you to point it at a satellite. Once the connection is established, the responses, along with the iPhone’s location, Medical ID, and battery level, are sent to the satellite.
That message is then relayed to an emergency service provider, or, if the provider doesn’t accept text, to an Apple-operated emergency relay centre that will place the call on the user’s behalf.
The physics of this is complex. Even during the short minute the message takes, you’ll have to sweep the phone through a small arc to stay aligned with the satellite. Some napkin orbital math suggests users will need about a 2-degree traverse of the phone as the message is sent, and about a 5-degree traverse in obstructed conditions, such as under a tree canopy.
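The napkin math checks out in order of magnitude. Assuming a Globalstar-like low-Earth orbit at roughly 1,414 km altitude (an assumption on our part; Apple hasn’t detailed the constellation), the satellite’s apparent motion across the sky is fast enough that a transmission lasting several seconds requires a couple of degrees of tracking:

```python
import math

MU = 398_600       # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6_371    # mean Earth radius, km
altitude = 1_414   # assumed Globalstar-like orbital altitude, km

r = R_EARTH + altitude
v = math.sqrt(MU / r)  # circular orbital speed, km/s

# Apparent angular rate for a satellite passing directly overhead,
# ignoring Earth's rotation (a small correction at this scale).
deg_per_s = math.degrees(v / altitude)

print(f"orbital speed ~{v:.1f} km/s, apparent rate ~{deg_per_s:.2f} deg/s")
```

At roughly 0.3 degrees per second overhead, a 2-degree sweep corresponds to a transmission lasting on the order of seven seconds under these assumptions, which fits the short-message scenario described above.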
However, Apple will not have the service operational at launch; it arrives in November. Additionally, Apple says it will be free for two years, though it’s unclear what will happen after that point.
The satellite emergency SOS will arrive in the fall of 2022 in the USA.
This may not be a full satellite call, but since it’s for emergencies and uses very low bandwidth technology, it’s still much better than being stranded in the middle of nowhere with no communication at all.
Continuing on the theme, Crash Detection is a feature that will trigger an Emergency SOS response if it detects a severe car crash. Built-in sensors that detect barometric changes, changes in speed and direction, loud noises, and more will determine whether a collision has occurred, prompting the iPhone to respond.
Again, this is a feature for rare situations. It’s one you hope not to use, but you’ll be glad it exists if you need to.
We expect testimonials soon after the launch about life-saving rescues powered by technology.