I'm sure Neo knows this:
Soft 404s are not actual defined errors. They are usually caused by thin content - examples: a URL that lands on a useless error message, or a basically empty page. Or, in the case of forums only: no posted response. Automatic generation of WordPress tags when someone searches for something new and non-existent is another example of the "oops page". Apparently a soft 404 is a Googlebot classification, not a real error. But you get dinged anyway.
The basic idea is:
I do a search on the word 'furkle' - UNIX.com returns a 200, with a page saying 'Oops'. Sort of like 'dangling URLs'. Googlebot then has the smarts to see that this is a thin page. For forums, when Googlebot sees a question with zero replies - i.e., possibly lots of views, but no posted answers - it generates this same classification. You also get this when someone writes a nice, high-content piece but nobody answers it; "likes" do not count.
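To make that heuristic concrete, here is a minimal sketch in Python, assuming the 'Oops' page carries some no-results marker text and very little body content. The search URL, marker string, and word-count threshold are all made up for illustration, not UNIX.com's actual markup:

    import urllib.request

    def looks_like_soft_404(url, marker="Sorry, no results", min_words=150):
        """Return True when the server answers 200 OK but the page looks thin."""
        with urllib.request.urlopen(url) as resp:
            status = resp.status  # the "Oops" page still reports 200
            body = resp.read().decode("utf-8", errors="replace")
        # A hard 404 raises HTTPError before we get here; a soft 404 is a
        # 200 whose body is an error message or a near-empty shell.
        return status == 200 and (marker in body or len(body.split()) < min_words)

    # Hypothetical search URL, per the 'furkle' example above:
    print(looks_like_soft_404("https://www.unix.com/search.php?q=furkle"))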
I helped clean up literally thousands of really old posts on a science site - for the very reason I mentioned above, per the site owner. The discussion sub-forum alone had about 1500 posts we removed, for example. Humans had to go in on every zero-reply post and do one of the following (see the sketch after this list):
1. delete the post
2. add a small bit of content, like a link to a relevant comment or to an external/internal page
3. flag the post, because it has some merit, for someone else who knows the required subject matter.
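For what it's worth, the pass that found those posts could look something like this - a rough sketch assuming a hypothetical posts table with (id, title, reply_count, views, created_at) columns, not the actual schema of that site's forum software:

    import sqlite3

    conn = sqlite3.connect("forum.db")  # hypothetical database file
    rows = conn.execute(
        """SELECT id, title, views FROM posts
           WHERE reply_count = 0 AND created_at < date('now', '-2 years')
           ORDER BY views DESC"""
    ).fetchall()

    for post_id, title, views in rows:
        # A human then decides per post: delete it, add a relevant link,
        # or flag it for a subject-matter expert.
        print(f"REVIEW {post_id}: {title!r} ({views} views, 0 replies)")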
For Neo:
Do we have a way to track the original source of possible airball URL requests? Please share if you do.
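One low-tech way to get at this, assuming the web server keeps combined-format access logs: pull the Referer field for every request that hits a path which can airball. The log location and the /search path prefix below are assumptions, not how UNIX.com is actually set up:

    import re
    from collections import Counter

    LOG_LINE = re.compile(
        r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST) (?P<path>\S+) [^"]*" '
        r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)"'
    )

    referers = Counter()
    with open("/var/log/apache2/access.log") as log:      # assumed location
        for line in log:
            m = LOG_LINE.match(line)
            if m and m.group("path").startswith("/search"):  # assumed search path
                referers[m.group("referer")] += 1

    for ref, hits in referers.most_common(10):
        print(hits, ref or "(no referer)")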
How big is our zero-reply problem?
I can only play with ordering by reply count on a given forum:
https://www.unix.com/programming/?or...ort=replycount
Lots of zero replies. I do not know if this is bad or not.
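One way to tell would be to put a number on it. A quick sketch, assuming you can export per-thread stats (forum, reply_count) to a CSV - the file name and column names here are invented for illustration:

    import csv
    from collections import defaultdict

    totals = defaultdict(lambda: [0, 0])  # forum -> [zero_reply, all_threads]
    with open("threads.csv", newline="") as f:
        for row in csv.DictReader(f):
            stats = totals[row["forum"]]
            stats[1] += 1
            if int(row["reply_count"]) == 0:
                stats[0] += 1

    for forum, (zero, total) in sorted(totals.items()):
        print(f"{forum}: {zero}/{total} threads "
              f"({100 * zero / total:.1f}%) have zero replies")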