How to enable the Pixel Visual Core for HDR+ on Android 8.1 — and what it actually does
It’s time for a photo processing upgrade.
Google’s new Pixel Visual Core co-processor has been sitting dormant inside the Pixel 2 and 2 XL since launch, but with the Android 8.1 Developer Preview 2 (aka Beta 2) release we finally have an early look at what it can do. Well, sort of: it isn’t enabled by default on the phones, and turning it on only gives us a glimpse of what it’s capable of in third-party apps.
But if you know where to go, you can turn on the Pixel Visual Core and see what it does for your photos on the Pixel 2 or Pixel 2 XL. Here’s how.
How to enable Pixel Visual Core processing
The process for enabling the Pixel Visual Core is a bit funky: the setting isn’t in the Camera app itself but in the Developer options. Chances are that won’t be an issue for you if you’re already running beta software on your phone. Provided you’re on the latest Developer Preview / Beta, here are the steps:
- Go into Settings, System, About phone.
- Find the Build number at the bottom of the screen and tap it seven times.
- You’ll also need to confirm your screen lock.
- Go back and tap on the new Developer options menu.
- Scroll down under the “Debugging” subsection and tap the toggle marked Camera HAL HDR+.
- Reboot your phone for the function to be enabled.
What does the Pixel Visual Core do right now?
So here’s the thing: enabling HAL HDR+ doesn’t change anything about the way the Pixel 2’s built-in camera performs. After all, it already does HDR+ on its own without using the Pixel Visual Core. Because this is a beta release, the focus is on opening up the Pixel Visual Core to third-party apps. Once you turn on HAL HDR+ processing, any third-party app that plugs into the standard Android Camera API will have its photos processed by the Pixel Visual Core, giving them the same HDR+ treatment the Google Camera app already applies purely with its own processing software.
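For a sense of why that HDR+ treatment matters: HDR+ is built around burst photography, capturing several underexposed frames and merging them to cut noise and recover dynamic range. As a toy illustration only (this is not Google’s actual pipeline, and has nothing to do with the Camera API itself), here is a sketch of how simply averaging a burst of noisy readings of one pixel reduces error compared to a single shot:

```python
import random
import statistics

def simulate_burst(true_value, noise_sigma, n_frames, rng):
    """Simulate n_frames noisy sensor readings of one pixel's true brightness."""
    return [true_value + rng.gauss(0, noise_sigma) for _ in range(n_frames)]

def merge(frames):
    """Naively 'merge' a burst by averaging (real HDR+ aligns and weights frames)."""
    return sum(frames) / len(frames)

rng = random.Random(42)
true_value, sigma = 100.0, 8.0

# Compare the error of a single frame against an 8-frame merge over many trials.
single_errors, merged_errors = [], []
for _ in range(2000):
    burst = simulate_burst(true_value, sigma, 8, rng)
    single_errors.append(abs(burst[0] - true_value))
    merged_errors.append(abs(merge(burst) - true_value))

print(f"mean error, single frame: {statistics.mean(single_errors):.2f}")
print(f"mean error, 8-frame merge: {statistics.mean(merged_errors):.2f}")
```

Averaging N frames shrinks random noise by roughly a factor of the square root of N, which is part of why burst-merged shots look so much cleaner in low light.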
The change isn’t massive right now, but the future is bright with this dedicated co-processor.
That means when you fire up Instagram or any of the other apps with in-app cameras, the photos you get directly out of those third-party apps will be closer in quality to what you get from the built-in camera app. The goal is to eliminate the big drop-off in camera quality when shooting inside an app versus using the built-in camera and sharing the photo afterward. That’s a win for developers and users alike.
Chances are you won’t notice a huge difference in quality or processing speed just yet. Remember, this is the first time Google has enabled the Pixel Visual Core for consumers (and just the beta testers who turn it on, at that). But the computational capabilities of this co-processor go well beyond the image signal processors (ISPs) in most phones, and there’s a machine learning component to how the Pixel Visual Core works, meaning it has the potential to “learn” and improve as it’s used. Its capabilities could be leveraged far better in the future, both by third-party apps and the built-in camera.
Once you enable the Pixel Visual Core on your Pixel 2 or 2 XL, let us know how you’re finding its capabilities!