
Fake Cues: Why The Next Photo Innovation May Be Helping You Tell a Lie

I want to share something that might be a bit unsettling: whenever I’m looking at you, I’m judging. Whether it’s in person or in a static photo, I just can’t help it. It’s how I’m wired. By the way, it’s how you’re wired too. Perhaps our most basic survival skill is the near-constant assessment and reassessment of other people based on their facial structures, their demeanor, slight changes in their movement. Which makes me wonder: can I “hack” this evolutionary necessity and influence the way you react to me?

For example, “microexpressions” are defined as “brief, involuntary facial expression shown on the face of humans according to emotions experienced. They usually occur in high-stakes situations, where people have something to lose or gain.” Perhaps you remember “Lie to Me,” the Tim Roth TV show where he played the world’s leading expert in reading microexpressions (now THAT would be a good LinkedIn Endorsement).

But it doesn’t even require the fidelity of a real-time interaction for our monkey minds to start forming an opinion of someone else. All it takes is a face, even a static picture or artistic representation. Did you know we typically find facial symmetry more attractive, likely because it signals genetic fitness to a potential mate? Side note: I’m fairly certain I’m asymmetrical AF.
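
If you wanted to put a number on that, a crude first pass is easy: mirror the photo and measure how well it matches itself. A minimal Python sketch, assuming a roughly centered, frontal portrait (a real system would detect and align facial landmarks before comparing halves):

```python
# Toy facial-symmetry score: mirror the image left-to-right and measure
# pixel agreement. 1.0 means perfectly symmetric; lower means less so.
# Assumes the face is centered and frontal -- a real pipeline would align
# facial landmarks first rather than flipping the whole frame.
import numpy as np
from PIL import Image

def symmetry_score(path: str) -> float:
    gray = np.asarray(Image.open(path).convert("L"), dtype=float)
    mirrored = gray[:, ::-1]  # horizontal flip
    return 1.0 - np.abs(gray - mirrored).mean() / 255.0
```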

So when I heard that the new photo-sharing app Polygram reads the facial reactions of viewers to tell you whether they liked or disliked your photo, well, that set off a bunch of ideas. We’ve undoubtedly already trained machine learning models to predict the “attractiveness,” “honesty,” etc. of people depicted in a photo. What happens when we start running this software not just in post-photographic analysis but in photo creation and editing? For example:
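
Imagine an editing tool that quietly nudges your portrait until a trait-perception model scores it higher. Here’s a minimal sketch of the idea, where `score_trustworthiness` is a hypothetical stand-in (stubbed so the sketch runs; a real version would be a model trained on human trait ratings):

```python
# Sketch of "trait-optimized" photo editing: generate small brightness and
# contrast variants of a portrait and keep the one the scorer likes best.
from PIL import Image, ImageEnhance

def score_trustworthiness(img: Image.Image) -> float:
    # Hypothetical scorer, stubbed as "prefer a target mean brightness."
    # A real system would run a model trained on human trait judgments.
    pixels = list(img.convert("RGB").getdata())
    mean_brightness = sum(sum(p) for p in pixels) / (3 * len(pixels))
    return -abs(mean_brightness - 150.0)

def optimize_portrait(path: str) -> Image.Image:
    original = Image.open(path).convert("RGB")
    best, best_score = original, score_trustworthiness(original)
    # Sweep a small grid of edits -- these are the manufactured "cues."
    for b in (0.9, 1.0, 1.1, 1.2):
        for c in (0.9, 1.0, 1.1):
            candidate = ImageEnhance.Contrast(
                ImageEnhance.Brightness(original).enhance(b)
            ).enhance(c)
            score = score_trustworthiness(candidate)
            if score > best_score:
                best, best_score = candidate, score
    return best
```

Swap in a real perceptual model and that same loop becomes a machine for manufacturing exactly the cues our monkey minds evolved to trust.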

So yeah, if software is going to help us read emotional reactions, you can be sure it’s going to be used to manufacture them as well.
