no kidding it actually works: google pixel


The Google Pixel appears to be a lightning rod for polarised comments - people either really hate it, or really love it. I'm (obviously) in the latter camp.

Yes it looks like an iPhone. Yes it is expensive. Yes it is computational photography on steroids.

But get over it. The Pixel is here to shake the Android (and possibly camera photography) world forever.

Let me preface by saying that I was very impressed with the 89% DxOMark score. One should always take benchmarks with a grain of salt though - the problem with benchmarking is that the exercise is done against some arbitrary "objective" scale of "common use". We know that common use, in reality... varies.

I'm not going to benchmark this. Heck, I'm not going to even compare it to the Nexus 5X.

I lie.

The Pixel is a marked improvement over the Nexus 5X in so many ways. Sure, the camera module has only marginally improved. However, the magic happens squarely at the image signal processor level, churning through all those pixels and crunching algorithms to offset the many shortfalls of photography through a tiny lens.

And what are these shortfalls? Let's ignore cheap optics, and say:

  1. Less light
  2. More noise
  3. Wide depth of field

The Pixel's HDR+ addresses points 1 and 2 perfectly. Let me clarify - HDR+ is not HDR in the truest sense of the term. Traditional HDR layers several captures shot at different exposures to produce a composite with a balanced exposure or a wide dynamic range.

HDR+ can do the above, but it also layers several captures shot at the same exposure. The theory from Google's engineers is this - if you capture a rapid succession of short-exposure photos, and noise is random, you can work out across the series what's noise and what's signal, suppress the noise to an extent, and align the images (addressing point 2). You can then layer the shots into a composite, reinforcing the signal (addressing point 1).
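The noise-suppression half of that idea can be sketched in a few lines. This is my own toy illustration, not Google's actual pipeline: it assumes the burst frames are already aligned and simply averages them, so the random noise cancels while the shared signal survives (the noise standard deviation drops roughly as 1/sqrt(N) for N frames).

```python
import numpy as np

rng = np.random.default_rng(42)

# "True" scene: a flat grey patch (hypothetical stand-in for real image data)
signal = np.full((64, 64), 100.0)
n_frames = 8       # size of the simulated burst
noise_std = 10.0   # per-frame random sensor noise

# Simulate a burst of identically exposed, pre-aligned frames:
# same signal, independent random noise in each frame.
burst = [signal + rng.normal(0.0, noise_std, signal.shape)
         for _ in range(n_frames)]

single = burst[0]
merged = np.mean(burst, axis=0)  # the layering/averaging step

print(f"single-frame noise std: {np.std(single - signal):.2f}")
print(f"merged-frame noise std: {np.std(merged - signal):.2f}")
```

Running this shows the merged frame's residual noise sitting well below a single frame's - which is the whole trick: short exposures avoid blur and clipping, and the merge buys back the signal-to-noise ratio. The hard part the real pipeline solves, and this sketch skips, is aligning handheld frames before merging.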


Point 3 is a little bit of a clanker. There is a Lens Blur feature mode on the camera app. It absolutely sucks noodles. No self-respecting photographer should use it.

So, how does it go, a couple of weeks into the Pixel? Honestly - from the perspective of a photographer, and along with everything mentioned above, this phone as a camera is amazing. There is no lag. There is no stutter. Throw anything at it, and you will most likely walk away with something decent.

Of course, it's not perfect, and it has its shortcomings - but like any phone camera, you quickly learn to play within its limits. For the Pixel, these limits relate to the lack of OIS, which shows once slow shutter speeds kick in. Because the Pixel also tends to choose slower shutter speeds than it probably should, you get plenty of motion blur if subjects move (which, subjectively, can be a nice effect).

I'm also not a fan of the oversaturation of blues and yellows. I prefer more natural tones, but I think this comes down to individual preference.

Note: People have complained about the lens halo when shooting into a bright light source. I don't think this is a limitation, though. No self-respecting photographer should be taking a photo with harsh backlight behind the subject. That is just... sad.

Here is some recent work using the Pixel.


There'll be more on my Instagram feed - photos taken with the Pixel are tagged with #MyPhoneByGoogle

So will the Pixel's application of computational photography replace photographers? This is in response to the wave of "OMG. Death of photography. Phone camera kiddies everywhere," which has gone up a notch since the Pixel's launch (and its focus on computational photography). My take is: no.

It is way too easy to take a decent shot with the Pixel. Sure, it can be perfectly exposed. You might even get lucky and have the app pick the one decent wefie where no one is caught mid-blink. However, you still need a good photographer to compose the damned thing. In other words, photography is an art, not a science. And computers, as we all know, are great at science, but terrible artists.

In summary, my thoughts are: The Google Pixel - holy sharks on a sailboat, they actually made it work.

TL;DR version: Get it.

introducing photoscan by google photos

I'm really loving Google's ad team of late - very on-point, fun, and relevant. Anyway, I'm really excited about the new PhotoScan by Google Photos, which, to put it simply, is:
phone camera + computational photography = perfect old photo scanner.

It is no secret that I'm a huge believer in computational photography, and I'm surprised it has taken this long for a feature like this to be folded into the core of a prime-time app (i.e., Google Photos).

Check out the spot for PhotoScan. I'll be busy telling my dad to ditch his scanner.

(p.s. for those who are curious, the music used in the ad is Chiddy Bang's Ray Charles.)