Unfortunately, it was not confined to the Cold War era - we are still seeing a lot of it, at least on the Navy side. Statistical analysis gets confused with decision theory, too.
Even worse is the idea that you can put number labels on subjective opinions, then "do math on them" and get results that make sense. You can't average 5 "yes-no" questions, 5 "good-to-bad 1-5 scale" questions, and 5 "agree-disagree 1-5" questions and get "a quantitative answer". We see that kind of thing all the time.
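A quick sketch of why this fails, using made-up survey data: the "average" you get depends entirely on arbitrary coding choices, so the same set of opinions can produce different "quantitative answers". (The responses and codings below are hypothetical, just to illustrate the point.)

```python
# Hypothetical responses to three mixed question types.
responses = {
    "yes_no": [1, 0, 1, 1, 0],          # yes=1, no=0
    "good_bad": [2, 4, 3, 5, 1],        # 1=good ... 5=bad
    "agree_disagree": [4, 4, 5, 3, 2],  # 1=agree ... 5=disagree
}

# Coding A: just average all 15 raw numbers as-is.
all_a = responses["yes_no"] + responses["good_bad"] + responses["agree_disagree"]
score_a = sum(all_a) / len(all_a)

# Coding B: recode yes/no onto 1-5 (yes=5, no=1) and flip good_bad so
# "high = favorable" on every scale -- an equally defensible choice.
yes_no_b = [5 if r else 1 for r in responses["yes_no"]]
good_bad_b = [6 - r for r in responses["good_bad"]]
all_b = yes_no_b + good_bad_b + responses["agree_disagree"]
score_b = sum(all_b) / len(all_b)

# Same opinions, two different "quantitative answers".
print(score_a, score_b)  # → 2.4 vs ~3.33
```

Since the number labels are ordinal at best, and on incompatible scales besides, the arithmetic mean has no defensible interpretation - the "result" is an artifact of the coding, not a measurement.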
And then there is the numerical confusion of activity with progress - we see it all the time with "averaging" MOPs to get the "quantitative" result for the MOE those MOPs are filed under. When the question is asked "how does the performance of those activities achieve the objective?", the answer is usually "we briefed the MOEs and MOPs to the Commander and he didn't push back on it." And then there is the classic "if these were really problems, the TTP would warn not to do that."
The Navy desperately needs a cadre of ORSA-specialty folks like the other services have.