10 Comments

Sorry, Arnold. You are wrong. As long as nasty barbarians continue to grab and consolidate their coercive power, they will erase and eliminate everything they don't like. You may not like it, but "thanks" to the pandemic, they have finished the destruction of all liberal values. Unfortunately, too many cowards are helping and appeasing them. As of today, it looks too late to save the U.S. bureaucracy, universities, and social media -- even if the barbarians and their servants were defeated, rebuilding them would be costly and take a long time.


Something that I think would help social media would be punishing the “echo chamber”. If you think of each pair of users in a social network as having a “similarity score” (how similar they are in the types of posts they interact with), then someone speaking to an echo chamber would get most of their “likes” from people with high similarity scores to themselves.

Deboosting people with high echo chamber scores and boosting people with low echo chamber scores would improve discourse drastically. And I think it would be hard to game without actually improving discourse.
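A rough sketch of how such a score might work, assuming each user's history of liked posts is available (all names and data structures here are hypothetical, just to make the idea concrete):

```python
# Hypothetical sketch of an "echo chamber score" built from like histories.
# `likes` maps each user to the set of post IDs they have liked; none of this
# reflects any platform's actual data model or API.

def similarity(a: set, b: set) -> float:
    """Jaccard similarity of two users' liked-post sets (0 = disjoint, 1 = identical)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def echo_chamber_score(author: str, likes: dict[str, set], likers: set[str]) -> float:
    """Average similarity between an author and the users who like their posts.

    A high score means most likes come from people who already interact with
    the same things the author does, i.e. the author is speaking to an echo chamber.
    """
    if not likers:
        return 0.0
    author_likes = likes.get(author, set())
    sims = [similarity(author_likes, likes.get(u, set())) for u in likers]
    return sum(sims) / len(sims)

def ranking_weight(author: str, likes: dict[str, set], likers: set[str]) -> float:
    """Deboost high echo-chamber authors, boost low ones (simple linear rule)."""
    return 1.0 - echo_chamber_score(author, likes, likers)

# Toy example: bob likes exactly what alice likes, carol does not.
likes = {"alice": {"p1", "p2", "p3"}, "bob": {"p1", "p2", "p3"}, "carol": {"p7", "p8"}}
print(ranking_weight("alice", likes, likers={"bob"}))    # 0.0 -- pure echo chamber
print(ranking_weight("alice", likes, likers={"carol"}))  # 1.0 -- reaching outside it
```

Gaming a weight like this would mean attracting likes from people outside one's usual audience, which is more or less the behavior the boost is trying to reward.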


I've been on the internet long enough to remember when people thought the war on spam was unwinnable, and then shortly later to see the use of technology and math to nearly eliminate spam. I don't see why semantic analysis, moderation, fact checking, statistics, and machine learning can't be used in the same way to make social media nicer, more pleasant, and more fact based.
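To make the spam analogy concrete, the same bag-of-words plus Naive Bayes recipe that early spam filters used can score posts. Here is a minimal sketch using scikit-learn and a made-up, hand-labeled toy dataset (purely illustrative, not any platform's actual moderation system):

```python
# Minimal sketch: spam-filter-style text classification applied to posts.
# Assumes a small labeled dataset (texts plus "ok"/"toxic" labels); the data
# below is invented for illustration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = [
    "Great point, here is a source that supports it",
    "You people are all idiots and traitors",
    "Interesting data, thanks for sharing the link",
    "Everyone on the other side should be banned",
]
train_labels = ["ok", "toxic", "ok", "toxic"]

# Bag-of-words features + Naive Bayes: the same basic recipe early spam filters used.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_texts, train_labels)

new_posts = ["Thanks, that study changed my mind",
             "Typical lies from you traitors"]
print(model.predict(new_posts))  # e.g. ['ok' 'toxic'] -- a score a feed could downrank on
```

A real system would need orders of magnitude more labeled data and human review, but nothing about the math is exotic.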

The problem isn't technological or mathematical. The problem is incentives. The social media companies have mostly weaponized our attention. They profit the more attention we give their services. And, sadly, we seem to devote much of our attention to sensationalist, low information, high conflict, and tribal nonsense.

I think of Pinterest and LinkedIn as two social media sites that don't endlessly promote conflict through attention mining, and it is no surprise to me that they are way less attention consuming than Facebook, Twitter, Instagram, and the like.


I think that a "tax" on advertising attached to heavy engagement might be useful. The trick would be how to define "heavy engagement."


Such speed bumps have been implemented in many places already, with mixed results as one might expect (it's basically anarcho-tyranny). For example, requiring CAPTCHA to post on anonymous imageboards. When this was first introduced in an attempt to improve atmosphere on boards, it quickly turned out that it did little to impede motivated wipers (i.e. those who flood a board with objectionable or low-quality content) because they soon created scripts which circumvented, solved or brute-forced the CAPTCHA and were able to elude IP address/range bans using a variety of tricks and maybe a little money. Regular users could not muster the numbers to overwhelm the malefactors' automated posting systems, and the more sophisticated CAPTCHA and other restrictions board managers introduced, the worse the problem got. Some regular users turned to the same wipe scripts to out-wipe the wipers. There were wipe wars, treaties, betrayals, and so on. Ultimately key people on all sides got bored with it and moved on.


I appreciate being pointed toward the article and Arnold's comments. I sort of feel about these issues the way I felt 20-odd months ago, when I had no relatives or acquaintances who had been infected with Covid: I've never actually been on the receiving end of the kind of untoward social media behavior being discussed. Anybody else in the same boat? Like OneEyedMan, I think the comparison with spam is useful, and it makes me think this stuff will in practice be limited by reasonable steps.
