Update (Feb 8): Google has clarified that Pixel Visual Core is not used by the Google Camera app, and that third-party applications are getting the first crack at it. The story below has been corrected to reflect this information.
Earlier (Feb 6): Remember Pixel Visual Core, the custom image-processing chip in the Pixel 2 smartphones? It is finally being put to use. Google has announced that it is opening up access to this machine-learning co-processor to third-party apps. Pixel Visual Core will apply its machine learning and computational photography chops in apps like Instagram, Snapchat, and WhatsApp to help capture better photos.
Google had enabled Pixel Visual Core with the Android 8.1 update on the Pixel 2 phones, but it wasn’t being used by any application, including the Google Camera app, which relies on its own enhancements and optimisations to click great images.
“Pixel Visual Core improves image quality in apps that take photos. This means it’ll be easier to shoot and share amazing photos on Instagram, WhatsApp, and Snapchat, along with many other apps which use the Pixel 2 camera. All you need to do is take the photo and Pixel 2 will do the rest. Your photos will be bright, detailed, and clear,” wrote Ofer Shacham, Engineering Manager for Pixel Visual Core at Google, in a blog post.
These changes will be available to Pixel 2 and Pixel 2 XL users as part of the February update, which is rolling out now and will reach all Pixel 2 phones over the next few days.
In addition, Google has revealed that it is bringing new AR Stickers to all Pixel smartphones later this week. These stickers are themed around winter sports, in keeping with the upcoming Winter Olympics.
To take advantage of the Pixel Visual Core improvements, developers will have to ensure their apps use the relevant API. Details can be found on Google Open Source.