Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: [New] Rejected Content Section, published by Ruby on May 4, 2023 on LessWrong.
tl;dr:
The LessWrong team reviews all first-time submissions from new users
In the last month, we've built a new process where we reject some of those submissions (with an explanation) – you can now see what was rejected from the main site in the new Rejected Content section
New users shouldn't be afraid to submit; the bar may well be lower than you think (and now you can check, based on what we actually reject). The worst that happens now is that you get some feedback from us, rather than getting downvoted by many people.
Though maybe write something shorter as your first submission. Don't write thousands of words only for us to say "this has serious problem X".
What is the Rejected Content section?
When new users submit their first post or comment, it is reviewed by a LessWrong team member before it goes live on the site. Especially with an increased influx of users to LessWrong, we will sometimes reject these first posts/comments and not let them go live.
To increase transparency and accountability, we've made a new section of the site where you can view everything we've rejected, usually with some explanation of why. This means that anyone who is motivated can either go fishing for things they think were actually good but were mistakenly classified as reject-worthy, or more generally verify that the LessWrong team is (or isn't) exercising good and fair judgment in what we reject.
The Rejected Content section is part of the Moderation Log, viewable at
and
What to do if you disagree with a rejection decision?
Message us directly about it – maybe we didn't pay close enough attention and it'll be no big deal to change it. See the contact us page.
Mention in the monthly Open Thread.
Write a top-level post if you want to start a larger conversation about our moderation philosophy or approach.
The point of the feature is precisely so people can weigh in, so don't feel afraid to question our choices here. (This isn't a promise to change in response to feedback, but is a good amount of willingness to listen to feedback.)
Why build it?
As I wrote in a recent post of mine about moderation, the moderation team has historically faced a tricky choice. Either we:
Approve and let the userbase evaluate: When we see low-quality content submitted by new users, we nonetheless approve it (perhaps with a downvote) and allow the userbase to evaluate and vote accordingly.
Block using our own judgment: We block that content from going live (and send a message to the author explaining why).
Each choice has costs and benefits. Letting the userbase evaluate decreases reliance on the mod team's judgment. We will make mistakes, so it seems safer to have more people look at things. Also, I feel bad if, behind the scenes, the LessWrong team is filtering what goes live and people don't even know what choices we're making or why.
At the same time, letting the userbase decide means often:
a whole bunch of users will view a bad post that they won't benefit from. This seems true even if a moderator gives it an initial downvote.
you get negative-karma posts showing up in various places, which feels pretty ughy, e.g. on the All Posts page
they can end up in search results
negative-karma comments (even if they're collapsed) still make comment sections feel more ughy
Theoretically, the visibility of negative-karma content could be fixed in other ways, but it's kind of a weird situation to have this content around if you're going to go to a lot of trouble to prevent people from seeing it anyway.

On the other hand, when we moderators decide to block content from going live, we're relying on our own judgment, which is fallible. And heck, we often disagree even among ourselves. Plus, I generally dislike moderation actions that aren't visible to p...