Pixel 2 has Google’s First In-House Chipset Called Pixel Visual Core, Aims to Bring HDR+ to More Apps

All of the talk in recent years about Google possibly designing its own in-house chipsets for phones has come true, in a way, on the Pixel 2 and Pixel 2 XL. This morning, Google announced that both phones contain Google’s first custom-designed System on Chip (SoC), called Pixel Visual Core, though this one was made to help with the outstanding camera in each rather than to replace the work of Qualcomm’s Snapdragon line. In the coming weeks and months, Google will enable the Pixel Visual Core to help bring the Pixel camera’s HDR+ magic to third-party camera apps.

Seriously, you are reading that correctly – the Pixel 2 and Pixel 2 XL have a secret Image Processing Unit (IPU) that none of us knew about until now, one that hasn’t even been turned on yet. That’s kind of cool, right?

[Image: Pixel Visual Core]

The goal here is, again, to bring the Pixel 2’s HDR+ smarts to apps outside of the Google Camera app. The SoC will do so by taking advantage of its eight Google-designed custom cores that can deliver “over 3 trillion operations per second on a mobile power budget.” By using the Pixel Visual Core, Google says that HDR+ can run 5x faster and at less than 1/10th the energy compared to running it through the application processor (like the Snapdragon 835 in these phones). There are many more nerd details than that, but they’re over my head, so I’ll leave them to those willing to dive into the press release below.
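For the curious, it’s worth noting what those two claims combine to. The 5x and 1/10th figures come straight from Google; the energy-delay calculation below is just our own back-of-the-envelope arithmetic on top of them:

```python
# Combining Google's two claims about HDR+ on the Pixel Visual Core:
# 5x faster, and less than 1/10th the energy versus the application processor (AP).

speedup = 5            # claimed: HDR+ runs 5x faster on the IPU
energy_ratio = 1 / 10  # claimed: less than 1/10th the energy per shot

# Energy-delay product is a common hardware-efficiency metric (lower is better).
# Relative to the AP, the IPU improves it by speedup / energy_ratio.
edp_improvement = speedup / energy_ratio
print(edp_improvement)  # 50.0 -> roughly a 50x energy-delay improvement
```

In other words, taken together the claims imply around a 50x improvement in the combined speed-and-power picture, which is why a dedicated imaging chip makes sense even with a Snapdragon 835 already on board.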

Here are a couple of examples of the HDR+ capabilities coming to third-party camera apps, thanks to Pixel Visual Core:

[Images: HDR+ before/after example photos]

Of course, since this is Google’s first chip, they won’t stop at opening up HDR+ to others. They already have the “next set of applications” lined up that they want Pixel Visual Core to power.

The Pixel Visual Core will be enabled first as a developer option in the “coming weeks” through an Android Oreo 8.1 (MR1) developer preview. Further down the road, likely when Android 8.1 goes stable, Google will enable it for all third-party apps to use within the Android Camera API.

Press Release


Pixel Visual Core: Image processing and machine learning on Pixel 2

The camera on the new Pixel 2 is packed full of great hardware, software and machine learning (ML) so all you need to do is point and shoot to take amazing photos and videos. One of the technologies that helps you take great photos is HDR+, which makes it possible to get excellent photos of scenes with a large range of brightness levels, from dimly lit landscapes to a very sunny sky.
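The basic idea behind HDR+ is capturing a burst of short exposures and merging them, which cuts noise and lets shadows be brightened without falling apart. The sketch below is only an illustration of that merging principle in miniature, not Google’s algorithm (which also aligns frames and tone-maps), and all the numbers in it are made up for the example:

```python
import random

# Toy illustration of the core idea behind burst photography like HDR+:
# averaging several short, noisy exposures of the same scene reduces noise.
# All values below are invented for the demo; this is NOT Google's algorithm.

random.seed(0)

TRUE_BRIGHTNESS = 40.0   # the "real" value of one shadow pixel
NOISE = 8.0              # per-frame sensor noise (standard deviation)
FRAMES = 9               # frames captured in the burst

def capture_frame():
    """One noisy observation of the pixel."""
    return random.gauss(TRUE_BRIGHTNESS, NOISE)

single = capture_frame()
burst = [capture_frame() for _ in range(FRAMES)]
merged = sum(burst) / len(burst)   # naive merge: average the frames

# The merged estimate is typically much closer to the true value,
# so the shadows can be brightened with far less visible grain.
print(abs(single - TRUE_BRIGHTNESS), abs(merged - TRUE_BRIGHTNESS))
```

Averaging N frames shrinks the noise by roughly the square root of N, which is why a burst of nine frames looks so much cleaner than a single shot.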

HDR+ produces beautiful images, and we have evolved the algorithm over the past year to utilize the Pixel 2’s application processor efficiently and enable the user to take multiple pictures in sequence by intelligently processing HDR+ in the background. In parallel with that engineering effort, we have also been working on creating capabilities which enable significantly greater computing power, beyond existing hardware, to bring HDR+ to third-party photography applications. To expand the reach of HDR+, to handle the most challenging imaging and machine learning applications, and to deliver lower-latency and even more power-efficient HDR+ processing, we have created Pixel Visual Core.

Pixel Visual Core is Google’s first custom-designed System on Chip (SoC) for consumer products. It is built into every Pixel 2, and in the coming months, we will turn it on through a software update to enable more applications to use Pixel 2’s camera for taking HDR+ quality pictures.

Let’s delve into some of the details. The centerpiece of Pixel Visual Core is the Google-designed Image Processing Unit (IPU)—a fully programmable, domain-specific processor designed from scratch to deliver maximum performance at low power. With eight Google-designed custom cores, each with 512 arithmetic logic units (ALUs), the IPU delivers raw performance of over 3 trillion operations per second on a mobile power budget. Using Pixel Visual Core, HDR+ can run 5x faster and at less than 1/10th the energy than running on the application processor (AP). A key ingredient to the IPU’s efficiency is the tight coupling of hardware and software—our software controls many more details of the hardware than in a typical processor. Handing more control to the software makes the hardware simpler and more efficient, but it also makes the IPU challenging to program using traditional programming languages. To avoid this, the IPU leverages domain-specific languages that ease the burden on both developers and the compiler: Halide for image processing and TensorFlow for machine learning. A custom Google-made compiler optimizes the code for the underlying hardware.
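The core and ALU counts above invite a quick sanity check. The 8 cores, 512 ALUs, and 3 trillion ops/sec figures are Google’s; the implied clock rate below is our own inference under the assumption of one operation per ALU per cycle, which Google does not state:

```python
# Sanity-checking the press release's throughput claim.
# Stated by Google: 8 custom cores, 512 ALUs each, >3 trillion ops/sec.
# The clock-rate inference is our own arithmetic, not an official figure.

cores = 8
alus_per_core = 512
claimed_ops_per_sec = 3e12

total_alus = cores * alus_per_core            # 4096 ALUs in total
ops_per_alu = claimed_ops_per_sec / total_alus

# If each ALU retires one operation per cycle, the implied clock is
# ops_per_alu Hz, i.e. a bit over 700 MHz -- plausible for mobile silicon.
print(total_alus, ops_per_alu / 1e6)
```

That the claim pencils out at a modest mobile-class clock rather than an exotic one is a good sign the performance comes from width (4096 ALUs working in parallel), which is exactly the kind of workload image processing provides.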

In the coming weeks, we will enable Pixel Visual Core as a developer option in the developer preview of Android Oreo 8.1 (MR1). Later, we will enable it for all third-party apps using the Android Camera API, giving them access to the Pixel 2’s HDR+ technology. We can’t wait to see the beautiful HDR+ photography, which you already get through your Pixel 2 camera, also become available in your favorite photography apps.

HDR+ is the first application to run on Pixel Visual Core. As noted above, Pixel Visual Core is programmable and we are already preparing the next set of applications. The great thing is that as we follow up with more new applications on Pixel Visual Core, Pixel 2 will continue to improve. We’ll keep rolling out other imaging and ML innovations over time—keep an eye out!

// Google

Kellen

It’s not often that you get to merge personal passions into a professional life, but that’s what Kellen did when he launched Droid Life in 2009. After working years of unsatisfying jobs in the medical and property management fields, he took a risk to try and create an online community while playing with the coolest gadgets on the planet each day, a risk that has turned out to be incredibly rewarding. Outside of Droid Life, Kellen is your typical Portlander who drinks way too much good beer, complains often about the Trail Blazers, and can be found out on the streets for a run, rain or shine.


14 Comments

  • HDR+ is the first application to run on Pixel Visual Core, and Google is already preparing the next set of applications. When Google unlocks it in developer mode it will be used for HDR+, and later on it will be enabled for custom apps, which means HDR+ photos will take advantage of the extra 8 cores on the Pixel Visual Core. That’s crazy, two 8-core chips in a phone, a first ever. I’m sure the camera won’t need an 8-core CPU; the next Pixel, or the one after, will drop Qualcomm, and Google’s own cores will compete with Apple’s A12 or A13. Google is the only company bigger than Samsung that could pull it off, and since it’s clean Android it could give Apple a serious challenge. It already did, after only the second Google phone (and no, the Nexus was only Google software, so yes, this is the real Google phone). It took only two years to produce better pictures than Samsung and LG, even Apple, and they did it not with the best hardware but with pure software. Goes to show you it’s software that makes great phones, not so much hardware. Apple used to be a software company; lately they’ve shifted to hardware and software takes a back seat, which is why every new iOS is really bad until it eventually gets patched a hundred times. My Google Pixel XL got better with 8.0, not 8.1; it needs no update, because Google has said they’re a software company and that’s what they’re going to do, give you the best software.

  • Not Google’s first silicon. Just two examples are the Gen 1 TPU and the Gen 2 TPU; the Gen 2 does 180 TFLOPS. Then they also have their own network silicon, in addition to using merchant silicon at the edges.

  • Awesome! There were rumours about a Google imaging chip a week or two before the announcement but I dismissed them when Google made no mention of it – until now.

  • Every time I say I’m not getting the Pixel 2, they make it a *little* more tempting…

  • I’m less excited about adding HDR+ to additional apps than I am about significantly increasing the speed of HDR+ in the Photos app. I mean, it’s supposed to be 5x faster. That would be HUGE!

  • Hopefully they get it working with Snapchat. Would love to be able to take actually good snaps like iPhones do.


  • HDR+?
    You could just as well install an app and enhance every photo the same way.
    It looks awful.

  • How will this affect the way all pictures look? Will this void the picture comparison post?

    • No, pictures coming out of the main camera shouldn’t look any better. This just allows other applications, like 3rd-party ones, to use the HDR+ algorithm in case you want manual controls or something.

      • No, it will do HDR+ by default if you set it. Google will move processing to this SoC, and could also make photos look better down the road with the main SoC’s hardware limitation removed.

        • No? What did I say that was wrong? Pictures out of the main (Google) camera shouldn’t look any better assuming you have HDR+ active. Does knowing it’s done via a different imaging processor matter? End result is HDR+ in 3rd party applications.

          The key word you said is “could”; they never said they were making improvements, aside from making it 5x faster.

Comments are closed.
