Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

Why Computational Photography is so Revolutionary

Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

What is computational photography, and why should you care? And how can it help your photography?

Don’t listen to me.

Wood texture. Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

To start off, I don’t know much about computational photography. I’m a newbie. So excuse my errors.

But to me, it is fascinating.

Diagonal yellow lines. Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

In brief, computational photography means using computers and algorithms to process your photographs so they look better: to remove noise and distortion, and to improve sharpness, skin tones, and so on.

Barbed wire. Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

For example, HDR+ is a HUGE REVOLUTION in the game of photography, specifically camera phone photography.

Why?

It means we are no longer in a megapixel war. We don’t need to keep improving the hardware or the pixel sizes of phone cameras for “better image quality”. Rather, we can improve our photo quality with these awesome algorithms that work with your pre-existing camera phone hardware.

Wooden textures in Kyoto. Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

To sum up, computational photography improves the software of your camera, not your hardware.

Why this is a big deal

Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

We all know the hype cycle of the new camera, with the new sensor, with the better image quality.

But instead, what if the secret to better “image quality” and aesthetics was in upgrading your camera software?

Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

For example, Google and their research team made HDR+, a computational photography camera program that shoots many photos in burst mode, automatically merges them into an HDR (high dynamic range) image, then removes noise, adds sharpness, improves color tones and saturation, etc.
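If you’re curious what the core idea looks like in code, here is a tiny sketch in Python (NumPy only). To be clear, this is NOT Google’s actual HDR+ pipeline, which also aligns frames, rejects motion, and does much smarter tone mapping; it’s just my toy illustration of why merging a burst of noisy frames helps, and the function names (merge_burst, tone_map) are made up for this example:

```python
# Toy sketch of the burst-merge idea behind HDR+ (not Google's real pipeline).
# Averaging N noisy frames of the same scene cuts the noise by roughly sqrt(N).
import numpy as np

def merge_burst(frames):
    """Average a burst of (already aligned) frames to reduce noise."""
    stack = np.stack([f.astype(np.float64) for f in frames], axis=0)
    return stack.mean(axis=0)

def tone_map(img, gamma=2.2):
    """Very simple global tone map: normalize to [0, 1], apply a gamma curve."""
    img = np.clip(img / img.max(), 0.0, 1.0)
    return img ** (1.0 / gamma)

# Simulate a low-light burst: one "true" scene plus per-frame sensor noise.
rng = np.random.default_rng(0)
scene = rng.uniform(0.05, 0.9, size=(120, 160))
burst = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(8)]

merged = merge_burst(burst)
final = tone_map(merged)

print("avg error, single frame:", float(np.abs(burst[0] - scene).mean()))
print("avg error, merged burst:", float(np.abs(merged - scene).mean()))
```

Run it and the merged result has roughly a third of the per-pixel error of a single frame, which is the basic reason HDR+ photos look so clean in low light.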

HDR+

Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

I tested the Google Nexus 6P from my sister Annette, which has the HDR+ function in the built-in Google camera. Generally, HDR+ is only available on phones made by Google (like the Nexus phones, or the Pixel phones).

Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

But very cool: a guy hacked it, so now you can also download a version of the Google HDR+ Camera App (provided you have a compatible Android phone).

Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

Anyways, I tested HDR+ on the Google Nexus 6P and I’m pretty fucking blown away.

Why?

The image quality is pretty phenomenal. The skin tones look very good. The images are VERY SHARP. In shots indoors or outdoors at night (low-light situations), there is very little noise. The images looked so good I wondered to myself,

Why don’t I just shoot all of my photos on a Google phone, with HDR+?

The Zen of Phone Photography

Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

I’ve been shooting with a phone the last few days while I’m at the KYOTO HAPTICLABS OFFICE. It’s an environment of experimentation and play.

Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

It has been fun: I’ve tested the OnePlus 5 phone camera from my sister Jennifer, the Google Nexus 6P, and also my old iPad Air (photos all processed in VSCO). It is so nice being able to pick, process, and upload your photos all from the phone with a WiFi connection, rather than dealing with the annoying transferring and exporting on a laptop.

Family selfie. Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

Why this is good for my mom.

Kyoto, 2017. Google Nexus 6P and HDR+. Processed with VSCO with a6 preset

Another thought: I remember how confusing it was trying to learn Photoshop, Lightroom, and how to process my RAW photos on my computer.

Now, the average photographer (like my mom) no longer needs to deal with all that crap to make good pictures. Rather than stressing about learning how to process her photos to “look good” or be aesthetically pleasing, she can focus on the most important thing a human can do… MAKE PHOTOS. To focus on experiencing something, to focus on framing, composition, and all the interesting parts of photography.

I recently got my mom an LG G6 camera phone (she loves it, with the dual lenses: one super wide-angle, one normal). It makes great photos automatically, and now with Google Photos she has unlimited backups to the cloud (so she will never run out of space on her phone). Also, she has been sending me beautiful landscape pictures she shoots (she’s currently backpacking through Sweden).

CREATISM: Artificial Intelligence, Machine Learning, and Photography

All these pictures were created, cropped, and processed from Google Street View imagery by a machine learning algorithm (artificial intelligence) from the “Creatism” paper:

Check out all these pictures shot on Google Street View (by a guy who went hiking in the mountains with a camera strapped to his backpack), and then read the research paper on their findings (download PDF).

How the images were cropped and processed:

How does a machine make a picture?

Anyways, to not bore you, this is what happened:

  1. A computer program analyzed a lot of pictures shot on Google Street View.
  2. The program then selects some landscape pictures, crops them, adds “dramatic filters”, and processes the images (a rough code sketch of this idea follows after this list).
  3. The researchers then gave the landscape pictures to a team of “professional” photographers to judge (a Turing Test of sorts, as the pro photographers didn’t know the pictures were created and edited by a machine).
  4. The photographers didn’t realize the pictures were made by a machine, and rated some of the pictures very highly.
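To make the pipeline above concrete, here is a rough little sketch in Python of the same loop: propose crops, apply a “dramatic” enhancement, and keep the crop a scoring function likes best. This is not the researchers’ code; aesthetic_score() below is just a dummy stand-in (image contrast), while the real Creatism system uses machine-learned models trained on professional photos, and every function name here is invented for illustration:

```python
# Toy sketch of a Creatism-style pipeline: crop, enhance, score, pick the best.
# aesthetic_score() is a placeholder; the real system uses trained ML models.
import numpy as np

def candidate_crops(panorama, size=(200, 300), step=100):
    """Slide a fixed-size window over the panorama to propose compositions."""
    h, w = size
    for top in range(0, panorama.shape[0] - h + 1, step):
        for left in range(0, panorama.shape[1] - w + 1, step):
            yield panorama[top:top + h, left:left + w]

def dramatic_filter(crop):
    """Crude 'drama': stretch the contrast to the full range."""
    lo, hi = crop.min(), crop.max()
    return (crop - lo) / (hi - lo + 1e-8)

def aesthetic_score(crop):
    """Dummy aesthetic score: higher contrast counts as 'better'."""
    return float(crop.std())

# Fake a Street View panorama with random values, just to run the loop.
rng = np.random.default_rng(1)
panorama = rng.uniform(0.0, 1.0, size=(600, 1200))

best = max((dramatic_filter(c) for c in candidate_crops(panorama)),
           key=aesthetic_score)
print("best crop:", best.shape, "score:", round(aesthetic_score(best), 3))
```

The interesting (and slightly scary) part is that once the scoring model is good enough, this dumb loop starts producing crops that professional photographers rate highly.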

Why do we need human photographers to make pictures, when artificial intelligence can do it for us?

What are the implications of this?

  1. Artificial intelligence (computers) can now make pictures that look very good. If a computer can make a beautiful landscape picture, why should a human being make landscape pictures?
  2. My theory: the joy of CREATING a nice landscape picture, the joy of going on the hike, is what makes landscape photography fun for humans.
  3. Therefore, photography is still important for humans: for the joy of walking, the joy of EXPERIENCING LIFE FIRST-HAND (experiencing the joy of hiking in the Sierras is not the same as looking at pictures of it), and also the joy of playing “Photo GOD” and morphing visual reality to our own whim.

Can an AI (artificial intelligence) program shoot the same pictures as you?

Practical insights

Processed with VSCO with a6 preset

Okay, this is a bunch of theoretical nonsense. Let me try to break it down, in terms of what it means to ME (and possibly you):

  1. Because of HDR+, I would prefer to use a Google phone camera over the iPhone. I wonder: maybe I should only use a phone for snapshots, and just shoot digital medium format for my “artistic” work? Or should I just use a phone for everything?
  2. Consider whether shooting with a Google phone makes me a slave to the Google ecosystem, or whether the “open” platform is better than Apple’s “walled garden.” I personally hate how Google uses all of my browsing and maps behavior to serve me more (very targeted and accurate) advertisements. I have no issues with “privacy”; I just don’t like being distracted and having less agency and control over the shit that pops up in my face that I cannot turn off (like that Black Mirror episode where you had to pay money to turn off the advertisement pop-ups). Apple respects privacy, and tries to sell you devices. Google gives away the software for free, but tries to sell you advertisements. In this sense, I prefer Apple.
  3. Stop buying “mid-range” digital cameras ($1000-2000 range). They are a waste of money, as phone cameras are good enough. Maybe start shooting more film again, for superior aesthetics.
  4. Don’t fuss over “image quality” in photography; rather, focus on personal photos that are meaningful to me. Fewer pretty landscape pictures (robots can do that), and more pictures of my loved ones and family, using photography as a tool to live a richer life.
  5. Memento Mori Photography. I will die. So use photography as a contemplation on life and death, for myself and my loved ones.

Conclusion

Processed with VSCO with a6 preset

And for you, friend: don’t worry about the camera so much. Avoid GAS (gear acquisition syndrome) and focus on MAKING PHOTOS and being CREATIVE EVERY DAY.

Share your best pictures in the ERIC KIM FORUM and invest in yourself to be creative every day with HAPTIC TOOLS.

HAVE FUN,
ERIC

