Dateline: Washington, DC
1 April 2021
In testimony today before the United States Senate Commerce Committee, Facebook chairman Mark Zuckerberg announced a new feature that will correct one of Facebook's most irritating failures.
"For the past few years, Facebook has diligently sought a way to filter out misinformation in public posts," Zuckerman informed the committee. "Frankly, we have done that out of self-preservation to avoid having the government define what is appropriate content on our platform."
Zuckerberg then listed a series of initiatives Facebook has put in place to distinguish between True Facts and False Facts. Each of those initiatives has failed.
"We are done trying to be truth monitors. Instead, today, I am announcing a new program that will put an end to an even more dangerous trend in Facebook posts. Non sequiturs."
He provided the following example to the committee. Mary Hulbert of Southport, North Dakota, posted a request last month asking for suggestions on how to improve her potato salad. She clearly stated: "I do not want to add dill pickles, green onions, or capers to my salad. I do not like those flavors."
The first reply was: "Have you ever tried adding dill pickles?"
The second reply: "How about green onions? If you eat a dozen a day, you will never contract COVID. My daughter-in-law once saw a scientist in a play."
The third reply: "I don't like potato salad. That is because when that corrupt Grover Cleveland was president, everything in government went bad. Cleveland is responsible for ruining vegetables in general."
98% of Facebook users identified non sequiturs as their most irritating experience on Facebook. 99% confessed they had posted non sequiturs in the past 48 hours.
Zuckerberg revealed that Facebook has been working on an artificial intelligence algorithm that would ferret out non-responsive replies by combining the censorship skills of a Red Chinese functionary, the eagle eye of those lonely people who spend all day reporting content violations, and the breadth of knowledge of a retired reference librarian. Unfortunately, during testing, the algorithm allowed all three of the replies in the Mary Hulbert example to be posted.
Instead, Facebook has conceded that its discussion pages are probably a prime reason why people are incapable of carrying on an intelligent conversation -- not only online, but in person. To resolve that problem and the related non sequitur issue, Facebook will automatically apply Godwin's Law to every Facebook post.
Godwin's Law postulates that if an online discussion goes on long enough, sooner or later, someone will compare someone or something to Adolf Hitler or fascism in general. Once that happens, discussion ends.
"This is how the new system will work," Zuckerman explained to two dozing senators on the committee. "If Mary Hulbert asked for suggestions on her potato salad, Facebook would be programmed to immediately respond: 'Only Nazis eat potato salad. And anyone who responds to this post is a racist homophobe and will be reported to the administrator.'
"This action may be drastic, but our research team predicts that the chances are high that, in response, people will cut down on online discussions and actually start having real conversations with their actual friends.
"Or maybe we will all turn out to be April fools."