My point has been that you can’t be subjective and objective at the same time.
Some of the assessment can be objectively accurate, like land use and frequency of targets, and some of it has an element of subjective judgement.
So you can assign a fairly accurate number to parts of the assessment but not to others.
The problem would be in presenting one objective number or probability for an assessment based on data which is in part subjective.
I am quite sure that intention to deceive has not been mentioned.
It is more that the drive to fulfil the brief of assessing risk may have resulted in a system which, by presenting risk as an overall numerical probability (if this is indeed the case), gives an impression of objectivity that does not hold up if you follow the data trail to its source, because subjective and objective observations are mixed along the way.
If this is not the case then I will stand corrected. That is why Tony used my doctor analogy: it is acceptable to assign a number on a scale of 1-10 to a patient's subjective feeling of pain.
Therefore it is acceptable to place a numerical probability on risk in a situation where people are exposed to tree failure, using, in part, subjective data.
Affirming the consequent, sometimes called converse error, is a formal fallacy, committed by reasoning in the form:
1. If P, then Q.
2. Q.
3. Therefore, P.
An argument of this form is invalid, i.e., the conclusion can be false even when statements 1 and 2 are true. Since P was never asserted as the only sufficient condition for Q, other factors could account for Q (while P was false).
The name "affirming the consequent" derives from the premise Q, which affirms the "then" clause of the conditional premise.
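For anyone who wants the invalidity spelled out, here is a minimal worked counterexample as a single truth-table row (standard propositional logic; the table is my own illustration, not part of the quoted definition):

\[
\begin{array}{cc|cc|c}
P & Q & P \to Q \ (\text{premise 1}) & Q \ (\text{premise 2}) & P \ (\text{conclusion}) \\
\hline
\text{F} & \text{T} & \text{T} & \text{T} & \text{F}
\end{array}
\]

With P false and Q true, both premises are true while the conclusion P is false, which is exactly what it means for the form to be invalid.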