Tuesday, January 9, 2018

"Learning to Fool Our Algorithmic Spies"


In a series of studies published in 2012, two psychologists, Ara Norenzayan and Will Gervais, set out to test a simple question: When people think about God, do they feel or act as if they are being monitored? Subjects of varying religiosity were primed to think about God, then asked to complete self-evaluations based on statements like “I am sometimes irritated by people who ask favors of me” and “No matter whom I’m talking to, I’m always a good listener.” Results, researchers said, were consistent with what they call the “supernatural-monitoring hypothesis” — that is, that “thinking of God triggers the same social cognitive processes that are activated by real-time social surveillance.” A sense that you’re being seen — whether by a fellow human or by a supreme consciousness from which no thought can be withheld — does seem to change how you see yourself as well as how you behave.


Services that used to depend on asking me questions — What do you like? Whom do you want to follow? — have started making more assumptions based on my behavior. I’m reminded that my actions are being recorded and factored into my experience when I am greeted on Twitter by a series of recommendations derived, apparently, from things I’ve liked, looked at or reposted. I’m implicitly prompted to think about algorithmic surveillance when, the day after I wasted a few minutes scrolling through a cycling publication’s account, my Instagram app floods my Explore tab with photos of mountain bikes. Weeks after tapping an unfamiliar name on Facebook and scrolling through the recent posts — a post-marriage surname change, a new profile photo — this person, a former acquaintance I haven’t spoken to in years, is given the same feed placement as current close friends.