Google Pixel 3 review: the best smartphone camera (for now)


This has been a banner year for flagship smartphones. In August, Samsung unveiled the Galaxy Note 9, which, despite its powerful hardware and some interesting tweaks, including a liquid-cooled processor, did not radically change the Galaxy line. Then Apple released the new iPhone XS, which offers a similarly incremental improvement over the previous iPhone X.
Now we're approaching the last stop of the 2018 flagship smartphone train: Google's Pixel 3 and 3 XL. Like its competitors, the Pixel 3 doesn't buck this trend. There are changes and new features, of course, but if you're waiting for a profound smartphone revolution, try your luck in 2019. What we're left with is one of Google's best products and one of the best Android handsets around, thanks in large part to its impressive camera.
What's new?
The Pixel 3 and Pixel 3 XL are Google's own hardware babies, following the Pixel 2 and Pixel 2 XL, which appeared almost a year ago.
The form factor has changed little, but the screen sizes have. The Pixel 3 has a 5.5-inch OLED screen, while the Pixel 3 XL stretches its display toward the top corners of the device, pushing it to 6.3 inches and making room for a front-facing camera in a notch.
Set the Pixel 3 next to the iPhone XS Max, and unless you notice the microphone slot at the bottom of the Pixel, it's easy to confuse the two. In other words, the 2018 smartphone has a "look," and the Pixels adhere to it strictly.
Is it really the best camera ever?
Let's get this out of the way: I really do think the Pixel 3 is the best camera I've ever used on a smartphone. It doesn't replace a DSLR for those who know how to use one, but the Pixel 3 is a very good all-around imaging device, and it genuinely impresses me at times. Of course, it frustrates me at others, but the AI and computational processing that kick in every time you take a picture feel like the future of the camera, at least outside the hardcore-enthusiast market, even if all of that can sometimes get in the way by "fixing" things you deliberately screwed up.
Google continues to push the concept of computational photography. Instead of trying to squeeze every possible bit of quality out of a tiny camera module in the traditional way, Google uses a single rear camera to capture as much data as possible and then combines it all into a good-looking image, even under harsh conditions.
Camera features

Last year, Google unveiled the Pixel Visual Core, a dedicated chip for processing image data. On the Pixel 2, every time you press the shutter button, the camera takes up to 10 separate photos and then blends all that information into a single picture. It underexposes some frames to keep highlights from blowing out, and uses others to draw detail out of the shadows. It compares the photographs against one another to find digital noise that shouldn't appear in the final image. It isn't just looking for your mistakes; it's also trying to compensate for the physical limits of tiny camera hardware. Google calls this HDR+.
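To see why merging a burst helps at all, here is a toy sketch of the averaging idea behind multi-frame capture. This is a simplification and an assumption on my part: the real HDR+ pipeline aligns frames, merges raw data tile by tile, and varies exposure, none of which is modeled here. It only shows the core payoff, that averaging N noisy frames shrinks random noise by roughly a factor of the square root of N.

```python
# Toy sketch of burst averaging (an assumption/simplification of HDR+,
# not Google's actual pipeline). Each "frame" is a list of pixel values.
import random

def capture_burst(scene, n_frames=10, noise=0.1):
    """Simulate n_frames noisy exposures of the same scene."""
    return [[p + random.gauss(0, noise) for p in scene]
            for _ in range(n_frames)]

def merge_burst(frames):
    """Average the burst pixel-by-pixel into one lower-noise frame."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

random.seed(42)
scene = [0.2, 0.5, 0.8]  # "true" pixel brightnesses
merged = merge_burst(capture_burst(scene))
print([round(p, 2) for p in merged])  # close to the true scene values
```

Any single simulated frame wanders up to a tenth of a brightness unit from the truth; the merged result sits much closer, which is the same statistical trick that lets a tiny phone sensor punch above its weight.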
High-dynamic-range images can sometimes look unnatural and cartoonish (I find Samsung's Galaxy cameras to be the worst offenders in the smartphone world, though that's partly down to their AMOLED screens). Apple has started doing something similar with the iPhone XS's photos, which also lean on a dedicated image-processing chip. Overall, though, I prefer the look of the Pixel 3's images, because they come out of the camera looking more natural.
The Pixel 3 also adds a new low-light shooting feature called Night Sight, which goes beyond typical HDR+ by taking even more pictures on each press of the shutter. Night Sight can capture up to 15 photographs, some exposed for as long as a third of a second, letting more light reach the sensor. It's almost impossible to hold a camera steady that long (the human hand starts showing signs of shake at around a third of a second), so the Pixel uses its internal motion sensors to track how your hand is shaking and correct for it.
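The trade-off described above can be sketched as a small planning function: longer per-frame exposures gather more light but risk motion blur, so measured hand shake caps the exposure and the burst makes up the rest. The 15-frame and one-third-second limits come from the article; the shake rule, the `plan_burst` name, and the target-light figure are illustrative assumptions, not Google's actual tuning.

```python
# Hypothetical sketch of the Night Sight exposure trade-off. Only the
# two caps below come from the article; everything else is illustrative.
MAX_EXPOSURE_S = 1 / 3   # per-frame exposure ceiling (from the article)
MAX_FRAMES = 15          # frames per Night Sight shot (from the article)

def plan_burst(shake_deg_per_s, target_light_s=1.5):
    """Pick a per-frame exposure the measured shake can tolerate, then
    enough frames to approach the target total light, capped at 15."""
    # Assumed rule: steadier hands allow longer single exposures.
    exposure = min(MAX_EXPOSURE_S, 0.1 / max(shake_deg_per_s, 1e-6))
    frames = min(MAX_FRAMES, max(1, round(target_light_s / exposure)))
    return frames, exposure

# Shaky hands: short exposures, so the burst maxes out at 15 frames.
print(plan_burst(1.0))   # → (15, 0.1)
# Steady hands: each frame can run the full third of a second.
print(plan_burst(0.2))
```

The design point is that total light gathered is roughly frames times exposure, so the camera can trade one against the other without ever asking you to hold still for a full second.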
To an extent, Night Sight works. One thing that suffers in dark environments is color reproduction, because the extra digital noise gets in the way of faithful tones. Dedicated cameras have made similar strides in recent years (have you noticed how good low-light footage in movies and TV shows looks lately?).
Google says flat out that Night Sight is a solution that will let you stop using your smartphone's flash for good, which is fine, because camera flashes built around LED light sources are bad, just as every other smartphone flash has always been.

