Fake Cues: Why The Next Photo Innovation May Be Helping You Tell a Lie

I want to share something that might be a bit unsettling – whenever I’m looking at you, I’m judging. Whether it’s in person or in a static photo. I just can’t help it. It’s how I’m wired. By the way, it’s how you’re wired too. Perhaps our most basic survival skill is the near-constant assessment and reassessment of other people based on their facial structures, their demeanor, slight changes in their movement. Which makes one wonder: can I “hack” this evolutionary necessity and influence the way you react to me?

For example, “microexpressions” are defined as “brief, involuntary facial expression shown on the face of humans according to emotions experienced. They usually occur in high-stakes situations, where people have something to lose or gain.” Perhaps you remember “Lie to Me,” the Tim Roth TV show in which he played the world’s leading expert in microexpression reading (now THAT would be a good LinkedIn Endorsement).


But it doesn’t even require the fidelity of a real-time interaction for our monkey minds to start forming an opinion of someone else. All it takes is a face, even a static picture or artistic representation. Did you know we typically find facial symmetry more attractive because it potentially signals reproductive fitness? Side note – I’m fairly certain I’m asymmetrical AF.

So when I heard that the new photo-sharing app Polygram reads the facial reactions of viewers to tell you whether they liked or disliked your photo, well, that set off a bunch of ideas. We’ve undoubtedly already trained machine learning models to predict the “attractiveness,” “honesty,” etc. of people depicted in a photo. What happens when we start running this software not just in post-photographic analysis but in photo creation and editing? For example:

  • A camera’s selfie mode could let you select which emotions you want to provoke in the average viewer, give you some basic facial motions to mimic, and then shoot a burst of photos, selecting from the burst the pictures most likely to work (a rough sketch of that selection step follows this list).
  • Photoshop could have buttons for “honesty,” “attractiveness,” “happiness,” etc. and move your facial bits around in the smallest ways needed to “enhance” that aspect of your person. The changes likely wouldn’t need to be very significant – they’re called “microexpressions” for a reason.
  • Run the software against model photos on commerce sites, Tinder profiles, realtor headshots and so on. ID any photos where the viewer would have a real but imperceptible negative reaction, and swap them out. Prime people to buy, to swipe right.
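
To make that first idea a bit more concrete, here’s a minimal sketch in Python of the burst-selection step. It assumes a hypothetical `predict_emotion_scores` model that rates how an average viewer would react to a face; nothing here is a real Polygram or Photoshop API, and every name is made up for illustration.

```python
# Sketch: pick the frames from a selfie burst most likely to provoke a target
# reaction in a viewer. `predict_emotion_scores` is a stand-in for whatever
# trained facial-reaction model you plug in; it is NOT a real library call.

from pathlib import Path
from typing import Dict, List, Tuple


def predict_emotion_scores(image_path: Path) -> Dict[str, float]:
    """Placeholder for a model that predicts viewer reactions to the face in
    `image_path`, e.g. {"honest": 0.7, "happy": 0.4, ...} with scores in [0, 1]."""
    raise NotImplementedError("plug in your own facial-reaction model here")


def pick_best_from_burst(
    burst: List[Path], target_emotion: str, top_k: int = 3
) -> List[Tuple[Path, float]]:
    """Score every frame in a burst and return the frames most likely to
    provoke `target_emotion` in the average viewer."""
    scored = [
        (frame, predict_emotion_scores(frame).get(target_emotion, 0.0))
        for frame in burst
    ]
    # Highest predicted reaction first.
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]


# Hypothetical usage: keep the three frames most likely to read as "honest".
# best = pick_best_from_burst(sorted(Path("burst/").glob("*.jpg")), "honest")
```

The Photoshop and commerce-site ideas are just the same loop pointed at a different objective: score, then nudge or swap until the predicted reaction moves the way you want.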

So yeah, if software is going to help us read emotional reactions, you can be sure it’s going to be used to manufacture them as well.