Since 2016, Tristan Harris has been on a mission: to reveal the techniques social networks put in place to make users addicted to them. A former engineer at Google, he was recently heard by the US Senate.
It is increasingly difficult to ignore that our beloved social networks were designed specifically to make us addicted to them. On average, we spend 1 hour and 22 minutes a day on them, and we tell them a lot about ourselves. Maybe even more than we think.
Exposing these methods is the battle that Tristan Harris, a former Google engineer who has observed how social networks operate from up close, has taken upon himself. Heard by the US Senate on June 26, he recalled the reasons for his fight in the preamble to his testimony: "One thing I learned is that people don't listen when told 'it's bad for you.' If you tell them 'here's how you're being manipulated'... nobody wants to feel that."
The culture of shock
Since 2016, he has helped draw attention to how news feeds operate like "slot machines," creating "the same dependence that keeps people in Las Vegas." Another technique, well known in economics, is the network effect: with the appearance of "likes" and "followers," social networks "create an addiction to getting the attention of others - rather than capturing yours."
Moreover, the more time you spend in these ecosystems, the more machine learning learns about you and your preferences, optimizing an algorithm personalized to you. "It calculates what it can show you that will generate the most engagement from you," says Tristan Harris. The problem: what generates the most engagement is also usually what is the most morally reprehensible. "One study showed that for every word of moral outrage added to a tweet, the retweet rate increases by 17%."
And it is clear that this system works: on YouTube, for example, 70% of traffic comes from recommendations, which are generated by these kinds of algorithms.
The hearing did not leave the senators unmoved. Democrat Brian Schatz spoke of "amoral algorithms": "Companies are letting algorithms go wild and only using humans to clean up the mess... They are running wild, in secrecy."