Quote Originally Posted by Ken White View Post
is anyone like, you know, doing anything about that?

All the real benefit I saw was in the last item quoted: "...and (c) to examine the viability of counter-measures aimed at reducing or eliminating them...."
Over the decades many analytic methods have been developed that specifically seek to reduce or eliminate bias. There are at least a couple of dozen general methods and many more tailored to specific problems. Too many analysts do not even know these methods exist, much less have the training to use them properly, or, if trained, the time to apply them properly.

That biases are known to exist and the issue is recognized is totally true; my perception is that little has been done to correct the problem, for fear of trampling artistic little psyches (pardon my hyperbole, and present company excepted, but there are some out there...) OR, far more importantly, for fear of not giving the Boss what he's looking for by skewing the analytical process to suit.
In my experience it's more often the Boss who does the skewing, since most are not receptive to information that conflicts with what they believe. Beyond that perpetual tension, I think it's fine to do yet more research, but IMO the problems with intelligence do not stem from a lack of knowledge of cognitive bias, nor from a dearth of methods to combat it. There are a host of structural and bureaucratic problems, but at the analyst level the three biggest problems I see are:

1. Lack of analyst training.
2. Lack of analyst introspection and self-awareness.
3. Lack of analyst curiosity.

The latter two problems stem, IMO, from a limited pool of analyst candidates (a topic that deserves its own thread). The first problem is something that can and should be solved but is perennial. Look at the intelligence schoolhouse for any of the services. The vast majority of training time is spent on memorizing information, briefing skills, and intel systems. Those are important, but there is comparatively little (or nothing) on research skills (beyond "look in the pub" or "search SIPRNET"), evaluating information, the strengths and weaknesses of the different "ints" (and what they can and can't provide), or actual analysis.

I will avoid mention of an overwhelming desire on the part of some to never be wrong. That is based on the rationale that if you do not go out on a limb, they cannot or will not cut it off... (An attitude that I have on a couple of occasions seen proven mistaken by irate Commanders not apprised of all that was known.)
That's undoubtedly true and, as you mention, at times there are strong incentives to never be wrong even if that means being useless or never being right either. This is another topic that we could discuss at length since it is deep and endemic.

Agree with your sentiment on contractors but have to point out that in-house studies tend to reach foregone conclusions (dare I say preordained?) and little changes. Outside studies sometimes have an effect.
True, sometimes they do but mostly they don't. When they do the effect is usually temporary until the next intelligence failure comes along.

Rex,
Also, there are biases and there are biases. The focus here is on some pretty specific issues that may not have been adequately explored, such as whether formal methods to reduce certain types of bias (for example, Analysis of Competing Hypotheses methodologies) might actually introduce other sorts of biases (for example, ones associated with the sequencing of information); the impact that classification levels may have on the perceived weight of information (less a problem among analysts who understand how sausages are made than clients who consume the output, in my view); how probability assessments may be skewed by psychological processes, etc.
Then it's probably valuable research, but the sad reality is that the vast majority of such research never trickles down to the individual analyst.
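As an aside, for anyone reading along who hasn't run across ACH: the core of Heuer's method is just a matrix that scores each piece of evidence against every hypothesis and then favors the hypothesis with the *least* disconfirming evidence, rather than the one with the most support. A minimal sketch of the bookkeeping follows; the hypotheses, evidence, and scores are invented placeholders, not any fielded tool:

```python
# Minimal sketch of an Analysis of Competing Hypotheses (ACH) matrix.
# Hypotheses and consistency scores below are invented placeholders.
# Scores: "C" = evidence consistent with the hypothesis,
#         "I" = inconsistent, "N" = neutral/not applicable.
matrix = {
    "H1: adversary is massing for attack":     ["C", "I", "C", "N"],
    "H2: adversary is running an exercise":    ["C", "C", "N", "C"],
    "H3: adversary is rotating units":         ["I", "C", "I", "C"],
}

def inconsistency_count(scores):
    """ACH ranks hypotheses by how much evidence *disconfirms* them."""
    return scores.count("I")

# Rank hypotheses from least to most disconfirmed; the survivor of
# disconfirmation, not the best-supported hypothesis, comes out on top.
ranked = sorted(matrix, key=lambda h: inconsistency_count(matrix[h]))
for h in ranked:
    print(f"{inconsistency_count(matrix[h])} inconsistencies: {h}")
```

Note that the matrix is filled row by row, evidence item by evidence item, which is exactly where the sequencing-of-information concern raised above comes in: the order in which an analyst scores the rows can anchor how later rows get scored.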