There was only one stand-out feature on the Pixel 3 phones: That fantastic (single-lens!) camera, which got better over time and made the Pixel 3a the best mid-ranger on the market too. Now Google has revealed the follow-ups, the Pixel 4 and the Pixel 4 XL — so can they keep the Pixels on top of the pile in terms of phone cameras?

The single-lens camera has now become a dual-lens affair, combining a 16MP f/2.4 2x optical zoom telephoto camera and a 12.2MP f/1.7 camera. Unusually for a smartphone in 2019, there's no wide-angle lens: "While wide-angle can be fun, we think telephoto is more important," said Marc Levoy from Google Research (and Stanford University) at the Made by Google launch event.

The Pixel 4, like the Pixels before it, relies on "computational photography" — a term Levoy himself came up with for a Stanford course, and one that means "doing less with hard-wired circuitry, and more with computer code," in his own words.


Photo: Alex Cranz (Gizmodo)

Fundamentally, we're talking about taking several photos at once, and blending them together to get a better end result — smartphone lenses and processing speeds have now evolved to the stage where this can happen in an instant, without you noticing once you've pressed the shutter button.

It means one snap can adjust its exposure to capture the darker parts of a scene, and another can do the same to retain details in the lighter areas. Put them together, and you don't lose any of the finer detail. It's a technique you'll now see on plenty of top-end phone cameras, including those from Samsung and Apple.
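
To make that concrete, here's a minimal sketch of the idea, with no claim to being Google's actual HDR+ pipeline: blend a deliberately dark capture and a deliberately bright one by leaning on whichever frame exposed each pixel best.

```python
# Illustrative only: a simple exposure-fusion blend, not Google's HDR+ code.
import numpy as np

def well_exposedness(img):
    # Weight pixels near mid-gray highly; crushed shadows and blown
    # highlights get little weight.
    return np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2))

def blend_exposures(dark, bright):
    # dark, bright: float arrays in [0, 1] with the same shape.
    w_dark, w_bright = well_exposedness(dark), well_exposedness(bright)
    total = w_dark + w_bright + 1e-8          # avoid division by zero
    return (dark * w_dark + bright * w_bright) / total

# A scene with deep shadows and a blown-out sky, captured twice.
dark_frame = np.clip(np.random.rand(4, 4) * 0.4, 0, 1)    # keeps highlight detail
bright_frame = np.clip(dark_frame * 3.0, 0, 1)             # keeps shadow detail
merged = blend_exposures(dark_frame, bright_frame)         # keeps both
```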

The Pixel 4 camera array also includes a hyperspectral sensor, listed as a "spectral + flicker sensor" on the Google Store. Google hasn't said too much about what this does, but we're assuming, from the ability of hyperspectral imaging to detect multiple channels of light, that the data this sensor captures is going to feed into the Pixel's algorithms to further improve how its new photo modes work.


Camera features took up plenty of time at Made by Google. Image: Google

What’s new

Every Pixel has featured what Google calls HDR+, where a burst of up to nine pictures is captured every time you hit the shutter button, then averaged out to reduce shadow noise. The first new Pixel 4 feature is Live HDR+, where you'll see this effect applied as you frame a shot — you won't have to guess at what the end result might look like.
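
The noise part is mostly statistics. Here's a toy demonstration of the general principle (not Google's implementation): random sensor noise partially cancels when aligned frames are averaged, shrinking by roughly the square root of the number of shots, so nine frames cut it by about a factor of three.

```python
# Toy demo of burst averaging: nine noisy captures of the same dark patch,
# averaged into one cleaner frame.
import numpy as np

rng = np.random.default_rng(0)
true_scene = np.full((100, 100), 0.1)            # a dark, flat patch of the scene
burst = [true_scene + rng.normal(0, 0.05, true_scene.shape) for _ in range(9)]

single = burst[0]
merged = np.mean(burst, axis=0)                  # average the nine-frame burst

print(np.std(single - true_scene))               # ~0.05
print(np.std(merged - true_scene))               # ~0.017, i.e. about 3x lower
```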

The Pixel 4 is also introducing dual exposure controls, sliders that let you adjust the brightness (the capture exposure) and the shadows (the tone mapping) before you take a shot (you might already know these sorts of tweaks from apps like Photoshop). If, for example, you want a dramatic silhouette shot rather than the even balance that HDR+ gives you, dual exposure controls make this possible.
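
As a rough sketch of what those two sliders conceptually do (the camera app's actual curves are Google's own), think of brightness as a scale on the captured exposure and shadows as a bend in the tone curve applied afterwards:

```python
# Conceptual only: one knob scales capture exposure, the other bends the
# tone curve that maps sensor values to the final image.
import numpy as np

def apply_dual_controls(linear_image, brightness=1.0, shadows=1.0):
    # linear_image: float array in [0, 1], in linear light.
    exposed = np.clip(linear_image * brightness, 0.0, 1.0)   # "brightness" slider
    # A gamma-style tone curve: shadows > 1 crushes dark regions toward a
    # silhouette, shadows < 1 lifts them.
    return exposed ** shadows                                 # "shadows" slider

gradient = np.linspace(0.0, 1.0, 11)
silhouette = apply_dual_controls(gradient, brightness=0.8, shadows=2.5)
lifted = apply_dual_controls(gradient, brightness=1.0, shadows=0.6)
```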

When it comes to zoom, Google says the new 2x telephoto lens on the Pixel 4, combined with its existing Super Res Zoom tech working across both lenses, results in superior hybrid zoom. Super Res Zoom, which debuted last year, uses the tiny differences between each of the nine images in a burst to fill in the details as you zoom in. It's making guesses, but very smart ones.
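
Here's a deliberately simplified 1-D illustration of that multi-frame idea (the real Super Res Zoom is far more sophisticated and has to cope with messy, real-world motion): because each frame in the burst lands a fraction of a pixel away from the last, the burst as a whole samples the scene on a finer grid than any single frame.

```python
# Toy super-resolution: four low-res frames, each shifted by a quarter
# "pixel", interleaved back onto the fine grid they jointly sampled.
import numpy as np

fine = np.sin(np.linspace(0, 4 * np.pi, 400))       # the "true" scene

def capture(offset, step=4):
    return fine[offset::step]                        # one low-res, shifted frame

burst = [capture(offset) for offset in range(4)]

reconstructed = np.empty(fine.size)
for offset, frame in enumerate(burst):
    reconstructed[offset::4] = frame                 # slot each frame's samples in

print(np.allclose(reconstructed, fine))              # True: detail recovered
```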


HDR+ now works in the viewfinder. Image: Google

The technology, Google says, works better than cropping after the picture has been taken — if you pinch-zoom before you take the photo you should get better results than if you crop it later, because of the computations that are applied as you zoom in.

The Pixel 4 is also smarter when it comes to automatic white balancing, a photography problem that's very tricky to fix — essentially making sure that white looks white no matter what the lighting conditions are like (if you're indoors, for example, you'll often get an orange tinge from the lighting).

Again, it's a question of training Google's algorithms to recognize when white should be white: "We've been using learning-based white balancing in Night Sight since Pixel 3," said Levoy on stage. "In Pixel 4 we're using it in all photo modes, so you get truer colors, especially in tricky lighting."
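
For comparison, the classic non-learned baseline is the decades-old "gray world" heuristic sketched below; it assumes the average color of a scene should be neutral, which is exactly the assumption that falls apart in the tricky lighting Levoy mentions and where a trained model can do better.

```python
# Gray-world white balance: scale each channel so the scene's average color
# becomes neutral. A decades-old heuristic, not Google's learned approach.
import numpy as np

def gray_world_white_balance(rgb):
    # rgb: float array of shape (H, W, 3) in [0, 1].
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / (channel_means + 1e-8)
    return np.clip(rgb * gains, 0.0, 1.0)

# An indoor shot with an orange cast: strong red, weak blue.
indoor = np.random.rand(8, 8, 3) * np.array([0.9, 0.7, 0.4])
corrected = gray_world_white_balance(indoor)
print(corrected.reshape(-1, 3).mean(axis=0))   # channel means now roughly equal
```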


Dual exposure controls on the Pixel 4. Image: Google

Other improvements are coming to portrait mode, the calculations for which are now applied in RAW mode, Levoy told CNET. The addition of the extra camera lens means more information for Google's machine learning algorithms to work with, and that should result in depth being measured more accurately across longer distances (each camera lens captures the shot at a slightly different angle).
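
The textbook stereo relation gives a feel for why a second viewpoint helps and why long distances stay hard (this is the generic parallax formula, not necessarily the model Google uses):

```latex
% Depth from two views: focal length f, baseline b between the lenses,
% disparity d = how far a point appears to shift between the two images.
\[
  Z \approx \frac{f \, b}{d}
\]
% Distant points produce tiny disparities, so small errors in d become large
% errors in Z; extra viewpoints and more data make d easier to pin down.
```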

Finally, the already impressive Night Sight is about to get even more capable with the Pixel 4 and Pixel 4 XL. You might have already seen the astrophotography shots taken by the phones, which are made possible by longer exposures and more of them: Specifically, 15 exposures of up to 16 seconds each for the Pixel 4 astrophotography mode.

Do the math and that means your Pixel 4 has to stay still for four minutes — but the results look worth it. As the stars move and the trees sway over those four minutes, the Pixel 4 algorithms will align and blend the pictures it takes to make one crisp, noise-free end result. If there are people in the frame, you'll have to tell them to stay very still.
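
For the record, the arithmetic behind that four-minute figure:

```latex
\[
  15 \text{ exposures} \times 16\,\text{s} = 240\,\text{s} = 4\,\text{minutes}
\]
```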


Three camera lenses have come to the back of the iPhone for the first time. Photo: Raul Marrero (Gizmodo)

As with the Pixel 3, expect the Pixel 4's photo-taking capabilities to get better over time because so much of the process relies on software. Levoy teased a future update that would enable a photo to balance a bright moon and a dark foreground — a brightness difference of about 1,500,000 times, Levoy says, or 19 f-stops.

The competition

Google isn't the only company working on this computational approach to photography, of course. Its main rivals Samsung and Apple also have multi-lens cameras that combine several shots into one to produce the best result — the number and type of shots in the burst might vary, as well as the processing algorithms, but the idea is the same.

As you would expect, these phone makers are keeping a lot of their algorithmic secrets to themselves, but the goal is always to produce the most detail and the least amount of noise in a photo, as well as the most accurate color reproduction — and to do all of this no matter what the lighting environment.

Apple's Deep Fusion camera update for the iPhone 11, which is due with iOS 13.2, uses the neural processing power of the A13 Bionic chip to optimize for detail and low noise across nine separate exposures, the same number that Google uses. (It was while describing Deep Fusion that Apple exec Phil Schiller used the "mad science" term rebuffed by Levoy in the Google presentation.)


Samsung Galaxy phones do a lot of processing before you hit the shutter button. Image: Samsung

The iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max handsets have a 12MP f/2.4 ultra-wide angle lens too. The Pixel 4 does not. That's matched with a 12MP f/1.8 wide-angle lens on all three phones, plus a 12MP f/2.0 telephoto lens on the Pro and Pro Max. As well as zooming out to 0.5x, letting you fit more in the frame from the same vantage point, you can zoom in to 2x optical zoom.

Samsung's best phone camera, meanwhile, is currently on the back of the Galaxy Note 10+. You get four lenses: A 16MP f/2.2 ultra-wide one, a 12MP f/1.5-2.4 wide-angle one, a 12MP f/2.1 telephoto one (with 2x optical zoom), and a "DepthVision Camera" to measure distances more accurately.

Samsung phones typically do more processing up front than Apple or Google ones, which is where that adjustable f-stop lens comes in handy: The lighting conditions are analyzed and the picture is adjusted while you're framing the shot. By capturing more information to begin with (something Samsung has been doing for years), less post-processing is required.


We don't yet know what the Pixel 4 is going to be like for taking photos day to day, but we do know the iPhone and Galaxy handsets have caught up with the Pixel this year — whether the Pixel 4 will shift the balance remains to be seen. More than ever before though, judging a phone camera is less about reading the specs on the page, and more about seeing the end results from all that on-board processing and trickery.
