Publisher Bruce Trogdon

A majority of readers (56%) believe that running against Trump will not help Biden, but instead will be Biden’s worst nightmare.

We had a letter Monday thanking us for fighting “the good fight against ignorance and tyranny.” I commented back that “our mission is to present a balance of news and opinions so people can see both sides of issues. We think it is really needed in today’s world of clickbait news and mindless social media.”

Speaking of mindless social media, we have been reporting on the problems at Facebook and the testimony of whistleblower Frances Haugen to Congress. Today in our Technology section we have a follow-up story on Haugen and her suggestions for regulating Facebook's algorithms.

On Facebook, you decide whom to befriend, which pages to follow, which groups to join. But once you've done that, it's Facebook that decides which of the posts you actually see each time you open your feed - and which you don't.

The software that makes those decisions for each user, based on a secret ranking formula devised by Facebook that includes more than 10,000 factors, is commonly referred to as "the news feed algorithm," or sometimes just "the algorithm." On a social network with nearly 3 billion users, that algorithm arguably has more influence over what people read, watch and share online than any government or media mogul.
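For readers curious what "ranking by formula" means in practice, here is a toy sketch. The factors and weights below are invented for illustration only; Facebook's actual formula is secret and draws on thousands of signals.

```python
# A toy illustration of engagement-based feed ranking.
# The factors and weights are hypothetical, not Facebook's real formula.

def rank_feed(posts):
    """Return posts sorted by a made-up engagement score, highest first."""
    def score(post):
        return (2.0 * post["comments"]      # comments weighted most heavily
                + 1.5 * post["shares"]
                + 1.0 * post["likes"]
                - 0.1 * post["age_hours"])  # older posts slowly decay
    return sorted(posts, key=score, reverse=True)

feed = [
    {"id": "calm local news", "likes": 50, "comments": 2,  "shares": 1,  "age_hours": 3},
    {"id": "outrage bait",    "likes": 40, "comments": 30, "shares": 20, "age_hours": 3},
]
print([p["id"] for p in rank_feed(feed)])
# → ['outrage bait', 'calm local news']
```

Even in this simplified form, the point critics make is visible: a post that provokes comments and shares outranks one that merely gets likes, regardless of its quality.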

We are featuring a two-page Guest Column today authored by Jeff Kosseff, an associate professor in the U.S. Naval Academy's Cyber Science Department, and Daphne Keller, formerly associate general counsel at Google.

"Why outlawing harmful social media content would face an uphill legal battle," write the well-informed authors about the testimony and suggestions from Frances Haugen. Haugen urged Congress to prevent the spread of lies on social media platforms like Facebook. She also suggested that the law should restrict platforms' "amplification" of such content on features like Facebook's algorithmically ranked news feed.

Kosseff and Keller believe such remedies would run into problems with the Constitution. It is an interesting article for those willing to wade a little way into the weeds of technology and the law. For those who don't, here's my short synopsis.

What if Congress didn't directly restrict false speech, but instead created new liability for the platforms that amplify it? That approach appeals to lawmakers, and it does to me, too. "It's unlikely to convince courts, though," maintain Kosseff and Keller.

They say that even proposals that tackle only amplification of genuinely illegal content - for example, by eliminating platforms' Section 230 immunities for defamation - would raise real concerns.

How about just eliminating immunity for amplified content? That's what I am in favor of. Both Haugen and another former Facebook employee, Roddy Lindsey, have suggested that eliminating immunity for amplified content could have a different consequence: making platforms stop their current "engagement-based" ranking practices altogether. For critics like me who believe that platforms' current systems inevitably prioritize emotionally engaging but societally damaging content, this sounds like a winner.

But this method also has legal complications, say Kosseff and Keller in today’s Guest Column.

“To protect free speech, should Facebook be held liable just for their promoted posts?” That’s The Daily Post Reader Poll question for Wednesday.
