Steersman wrote:
You might try to read a little more closely, as otherwise it just supports Zvan's argument that "you don’t read for comprehension, and you don’t understand even the basics of the issues we’re talking about." More particularly, what I said, in part, was:

The Yeti wrote:
Did Steersman just agree with Hornbeck's "ridiculously dumb statistical 'reasoning', which has errors so glaring that even a grade school kid could spot them"?

You might note that while I conceded one aspect of HJ's argument, I rather clearly acknowledged and supported CuntajusRationality's argument that the problem with HJ's argument was that he was clearly ignoring other classes of factors - notably those cases where police reports were not filed, clearly the case in the Shermer-Smith debacle.

And, relative to that illustration, one might argue that set B is the "odds of a false rape report" given the filing of a police report (4%), and that set A is the odds of "filing a rape report with the police" given that a supposed rape took place; it is then "reasonable" to conclude that the intersection of A and B is therefore 0.4%. However, as your analogy suggests, there are a number of other classes - notably the likelihood of police reports not being filed, which many feminists have argued is substantially larger than the cases in which police reports were filed - which changes the assessment of the probabilities rather significantly: Not-A (everything outside of A) might be construed as something like 75%.
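For what it's worth, the arithmetic in that quoted argument can be sketched in a few lines. Note the 10% filing rate below is not stated anywhere in the post; it's the figure implied by the 4% → 0.4% step, so treat it as an assumption for illustration only:

```python
# Sketch of the multiplication in the quoted argument (illustrative numbers only;
# the 10% filing rate is inferred from the 4% -> 0.4% step, not a sourced figure).
p_false_given_filed = 0.04   # set B: "odds of a false rape report" given a police report
p_filed_given_rape = 0.10    # set A (assumed): odds a supposed rape gets reported at all

# Multiplying the two conditionals yields a joint-style figure,
# not the overall odds that any given rape report is false.
joint = p_false_given_filed * p_filed_given_rape
print(round(joint, 4))  # 0.004, i.e. the 0.4% in the quote
```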
And while I'm guessing at this point, I think that graph suggests that the "odds of a false rape report" - made long after the supposed crime, with or without alcohol as a confounding factor, and to people other than the police - are likely to be quite high.
http://cs.calstatela.edu/wiki/images/th ... heorem.png
Despite having your errors explained to you multiple times, you repeat them yet again. In your example, "False rape reports filed with the police" is a fucking subset of "rape reports filed with the police". Since one set of events is a subset of the other, it would be more appropriate for your Venn diagram to have one circle completely inside the other. As for multiplying the two probabilities together, it would not give you the odds of a false rape report; instead it would give the probability that a given rape report is both false and filed with the police. And this operates on the very unjustified assumption that the statistics we have are accurate enough to come up with useful probabilities. Hornbeck is just attempting a bit of extremely simplistic sophistry. Hornbeck's reasoning is even worse because he makes the same error multiple times, throwing in other random factors like the involvement of alcohol and coming up with dubious "probabilities" for them.
But Hornbeck's entire endeavor of trying to use criminal justice statistics in this manner to "prove" a crime is just fucking stupid. Whenever someone is accused of a crime, we could simply look at the conviction rate for that crime and calculate "probabilities" from there. Of course, in the US most prosecutors don't even accept the weak cases, and they plea bargain as often as possible. Most cases don't even go to trial. This selection means that many (if not most) prosecuting attorneys in the US have a 90+% conviction rate. If we take Hornbeck's ideas to their fucking logical conclusion, we can just eliminate the courts entirely and lock up everybody. The whole idea of judging an individual criminal case based upon statistics from other cases is absolutely fucking absurd.
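Here's a quick sketch of why a conviction rate is a selection artifact rather than a probability of guilt for any fresh accusation; all the counts are invented purely to show the mechanism:

```python
# Invented numbers showing how case selection inflates the conviction rate.
accusations = 1000
charged = 300      # prosecutors decline or plea out the weak cases...
convicted = 285    # ...so the rate among the cases actually tried looks very high

conviction_rate = convicted / charged                 # among prosecuted cases
rate_over_all_accusations = convicted / accusations   # over all accusations

# The 95% figure describes a heavily filtered pool; applying it to a brand-new
# accusation that never went through that filter is exactly the base-rate error.
print(round(conviction_rate, 2), round(rate_over_all_accusations, 3))  # prints: 0.95 0.285
```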
The fact that you are defending this stupidity does not reflect well on your intellect, Steers. I apologize in advance to the rest of the pit for engaging with the Steersbot.