Weaponized reporting: what we’re seeing and what we’re doing (self.ModSupport)
Hey all,
We wanted to follow up on [last week’s post](https://www.reddit.com/r/ModSupport/comments/...


What you said is correct as a broad philosophy. I do not purport to write code that is free of all possible bugs. But it does not apply to these specific instances.
A software development pipeline that allows a bug of the kind and magnitude of "people who report can get banned instead of the people they reported" to reach production is fundamentally broken. That Reddit is not developing aviation software does not excuse the level of negligence it takes for a bug like that to be missed at every possible level. I am presenting this in a black and white manner because these specific instances are black and white. There's no nuance.
The layers of checks I described are industry standard for professional software development: peer review, automated testing, human QA testing. This is incredibly basic, and ultimately those layers facilitate faster iteration by reducing the developer time that has to be spent fixing bugs after they reach production, where they have greater impact. Google does these things. Facebook does these things. Rinky-dink companies with four-person dev teams I've worked for do these things.
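To make the "automated testing" layer concrete: even a one-line unit test over the report-handling logic would catch a "ban the reporter" bug before it ships. This is a minimal sketch; the names here (`Report`, `resolve_ban_target`) are invented for illustration and are not Reddit's actual code or API.

```python
from dataclasses import dataclass

@dataclass
class Report:
    """Hypothetical report record: who filed it, and who it is about."""
    reporter: str
    reported: str

def resolve_ban_target(report: Report) -> str:
    """Return the account that moderation actions should apply to.

    The bug described above amounts to returning report.reporter here.
    """
    return report.reported

def test_ban_targets_reported_user():
    # The simplest possible regression test: actions must land on the
    # reported account, never on the account that filed the report.
    report = Report(reporter="good_faith_mod", reported="spam_account")
    assert resolve_ban_target(report) == "spam_account"
    assert resolve_ban_target(report) != report.reporter

test_ban_targets_reported_user()
```

A test this trivial runs in milliseconds in CI, which is the point: the cost of the check is negligible next to the cost of the bug reaching production.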