Sister blog of Physicists of the Caribbean in which I babble about non-astronomy stuff, because everyone needs a hobby

Sunday 16 December 2018

Sometimes making things harder makes them more popular

On how cognitive ease makes things worse in some situations, and some possible solutions.

The frictionless design of social media platforms like Facebook and Twitter, which makes it trivially easy to broadcast messages to huge audiences, has been the source of innumerable problems, including foreign influence campaigns, viral misinformation and ethnic violence abroad. YouTube’s most famous frictionless feature — the auto-playing function that starts another video as soon as the previous one has finished — has created a rabbit-hole effect that often leads viewers down a path to increasingly extreme content.

What if Facebook made it harder for viral misinformation to spread by adding algorithmic “speed bumps” that would delay the spread of a controversial post above a certain threshold until fact checkers evaluated it?

Or if YouTube gave users a choice between two videos when their video finished, instead of auto-playing the next recommendation?

This approach might seem overly paternalistic. But the alternative — a tech infrastructure optimized to ask as little of us as possible, with few circuit breakers to limit the impact of abuse and addiction — is frightening. After all, “friction” is just another word for “effort,” and it’s what makes us capable of critical thought and self-reflection. Without it, we would be the blob people from “Wall-E,” sucking down Soylent while watching Netflix on our self-driving recliners.

I'd be prepared to endorse considerably more "paternalistic" approaches than those. If you design a system in which users are actively encouraged to take the easiest option regardless of the consequences, it should come as no surprise to find that's exactly what they do. People are not sheep: they can and do think for themselves, but they - we - are also, like it or not, part of a system. It does absolutely no good merely to tell people to be more skeptical or whatever; you've got to make it easier (and more rewarding) for them to do so. Perhaps there could be some easier way to search fact-checking websites for the content of posts, and/or some way to gamify skepticism. For all the faults of social media, designing a system which utilises both cognitive ease and cognitive difficulty is not a simple task.
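To make the "speed bump" idea from the article a bit more concrete, here's a minimal sketch of what a threshold-based delay might look like. This is purely illustrative - the threshold, the Post fields and the should_amplify function are my own assumptions, not anything Facebook actually does.

```python
# A minimal sketch of the "speed bump" idea: if a post is spreading faster
# than some threshold and hasn't been fact checked, stop amplifying it until
# a reviewer clears it. Every name and number here is an assumption made up
# for illustration, not any platform's real mechanism.
from dataclasses import dataclass

SHARES_PER_HOUR_THRESHOLD = 500  # assumed cut-off for "spreading too fast"

@dataclass
class Post:
    post_id: str
    shares_last_hour: int = 0
    fact_checked: bool = False
    held_for_review: bool = False

def should_amplify(post: Post) -> bool:
    """Decide whether the feed may keep promoting this post right now."""
    if post.fact_checked:
        return True  # already cleared, no further friction needed
    if post.shares_last_hour > SHARES_PER_HOUR_THRESHOLD:
        post.held_for_review = True  # speed bump: queue it for fact checking
        return False
    return True

# A fast-spreading, unchecked post gets paused; a slow one sails through.
print(should_amplify(Post("rumour", shares_last_hour=1200)))  # False
print(should_amplify(Post("cat pic", shares_last_hour=12)))   # True
```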

Tulerie’s co-founder Merri Smith told me a fascinating story from the company’s early days. In the beginning, Ms. Smith said, the company invited women to join via a brief Google survey, which it emailed to hundreds of prospective members. But only one person filled out the survey. So Ms. Smith and her co-founder decided to try a more complicated approach. Anyone who wanted to join had to conduct a brief video call with a company employee first.

Logically, the new strategy should have failed. But it was a huge hit. Prospective members flooded the invite list, filling up the company’s interview schedule weeks in advance. By creating a more complex sign-up, Tulerie had signalled that its service was special and worth the effort.

https://www.nytimes.com/2018/12/12/technology/tech-friction-frictionless.html

