Thread: intelligence analysis, overcoming bias and learning

  1. #1
    Council Member Ken White's Avatar
    Join Date
    May 2007
    Location
    Florida
    Posts
    8,060

    Default Competition is the spice of life.

    It also is a significant aid to keeping everyone honest...

    The Canadians know that and are smart enough to have someone else also take a look and pull best ideas from both. We, on the other hand...

    The efficiencies of consolidation and centralization are known, what is often ignored is the adverse impact of those moves on effectiveness. Two minds / approaches are always better than one; three even mo' betta...

    There's another aspect aside from effectiveness. Put another way, you can always have your best gunners do the shooting or the best guy on point -- but no one else will learn much or become good at what they do and your bench will not be very deep.

    All that has been known for centuries; "Quis custodiet ipsos custodes?" is not new and is practiced in most of the world; only the insane American predilections for one size fits all, one size does all, "whatever the boss wants" and "always show your good side" ignore that logic.

    You're not likely to see any US countermeasures. We prefer those who totally support what the Boss wants, no matter how inane or even criminal -- as your example proves...

  2. #2
    Council Member IntelTrooper's Avatar
    Join Date
    May 2009
    Location
    RC-S, Afghanistan
    Posts
    302

    Default

    Quote Originally Posted by Ken White View Post
    You're not likely to see any US countermeasures. We prefer those who totally support what the Boss wants, no matter how inane or even criminal -- as your example proves...
    Yes we do... as I had the distinct impression that her "analysis" was thinly-disguised propaganda to make her battalion commander look good...
    "The status quo is not sustainable. All of DoD needs to be placed in a large bag and thoroughly shaken. Bureaucracy and micromanagement kill."
    -- Ken White


    "With a plan this complex, nothing can go wrong." -- Schmedlap

    "We are unlikely to usefully replicate the insights those unencumbered by a military staff college education might actually have." -- William F. Owen

  3. #3
    Council Member
    Join Date
    Mar 2008
    Posts
    1,457

    Default Hmm, seems to me

    ...the money might be better spent elsewhere

    Cognitive biases are pretty well understood already. Hopefully this isn't reinventing the wheel, particularly since this is contracted research.

    What's really needed, IMO, is better training for analysts; the money would be better spent there.

  4. #4
    Council Member Ken White's Avatar
    Join Date
    May 2007
    Location
    Florida
    Posts
    8,060

    Default Well, yeah but...

    Quote Originally Posted by Entropy View Post
    Cognitive biases are pretty well understood already. Hopefully this isn't reinventing the wheel, particularly since this is contracted research.
    is anyone like, you know, doing anything about that?

    All the real benefit I saw was in the last item quoted: "...and (c) to examine the viability of counter-measures aimed at reducing or eliminating them...."

    That biases are known to exist and the issue is recognized is totally true; my perception is that little has been done to correct the problem for fear of trampling artistic little psyches (pardon my hyperbole and present company excepted but there are some out there...) OR, far more importantly, of not giving the Boss what he's looking for by skewing the analytical process to suit.

    I will avoid mention of an overwhelming desire on the part of some to never be wrong. That is based on the rationale that if you do not go out on a limb, they cannot or will not cut it off... (An attitude that I have on a couple of occasions seen proven mistaken by irate Commanders not apprised of all that was known.)

    Agree with your sentiment on contractors but have to point out that in-house studies tend to reach foregone conclusions (dare I say preordained?) and little changes. Outside studies sometimes have an effect.

  5. #5
    Council Member
    Join Date
    Aug 2007
    Location
    Montreal
    Posts
    1,602

    Default

    In this case, as I understand it, the contractors are largely administering the tests and providing the data, not actually doing the analysis.

    Also, there are biases and there are biases. The focus here is on some pretty specific issues that may not have been adequately explored, such as whether formal methods to reduce certain types of bias (for example, Analysis of Competing Hypotheses methodologies) might actually introduce other sorts of biases (for example, ones associated with the sequencing of information); the impact that classification levels may have on the perceived weight of information (less a problem among analysts who understand how sausages are made than clients who consume the output, in my view); how probability assessments may be skewed by psychological processes, etc.

    I don't think it is reinventing wheels.
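    The point about ACH possibly introducing sequencing biases is easier to see with the mechanics in front of you. Below is a bare-bones sketch of the ACH scoring step (Heuer-style: rank hypotheses by least inconsistency with the evidence, not by most support). The hypotheses, evidence scores, and helper function are illustrative assumptions, not anything from the tender:

    ```python
    # Minimal sketch of the Analysis of Competing Hypotheses (ACH) scoring step.
    # Hypotheses and evidence scores are illustrative placeholders.
    # Each cell holds a consistency score: -1 inconsistent, 0 neutral, +1 consistent.
    # Classic ACH ranks hypotheses by LEAST inconsistency, not most support.

    def ach_rank(matrix):
        """matrix: {hypothesis: [scores against each evidence item]}.
        Returns (hypothesis, inconsistency count) pairs, least inconsistent first."""
        inconsistency = {
            h: sum(1 for s in scores if s < 0) for h, scores in matrix.items()
        }
        return sorted(inconsistency.items(), key=lambda kv: kv[1])

    matrix = {
        "H1: insurgent attack":  [+1, -1, 0, +1],
        "H2: criminal activity": [-1, -1, +1, 0],
        "H3: accidental event":  [0, +1, +1, -1],
    }

    for hypothesis, count in ach_rank(matrix):
        print(f"{hypothesis}: {count} inconsistent item(s)")
    ```

    Note that the arithmetic itself is order-independent; the sequencing-bias question is about how humans fill in the cells in the first place, which is exactly why it is worth studying.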
    They mostly come at night. Mostly.


  6. #6
    Council Member
    Join Date
    Mar 2008
    Posts
    1,457

    Default

    Quote Originally Posted by Ken White View Post
    is anyone like, you know, doing anything about that?

    All the real benefit I saw was in the last item quoted: "...and (c) to examine the viability of counter-measures aimed at reducing or eliminating them...."
    Over the decades many analytic methods have been developed that specifically seek to reduce or eliminate bias. There are at least a couple of dozen general methods and many more tailored for specific problems. Too many analysts do not even know these methods exist, much less have the training to use them properly, or, if trained, the time to apply them.

    That biases are known to exist and the issue is recognized is totally true; my perception is that little has been done to correct the problem for fear of trampling artistic little psyches (pardon my hyperbole and present company excepted but there are some out there...) OR, far more importantly, of not giving the Boss what he's looking for by skewing the analytical process to suit.
    In my experience it's more often the Boss that does the skewing since most are not receptive to information that conflicts with what they believe. Beyond that perpetual tension, I think it's fine to do yet more research but IMO the problems with intelligence do not stem from a lack of knowledge on cognitive bias, nor a dearth of methods to combat that bias. There are a host of structural and bureaucratic problems but at the analyst level the three biggest problems I see are:

    1. Lack of analyst training.
    2. Lack of analyst introspection and self-awareness.
    3. Lack of analyst curiosity.

    The latter two problems stem, IMO, from a limited pool of analyst candidates (and that is a topic that deserves its own thread). The first problem is something that can and should be solved but is perennial. Look at the intelligence schoolhouse for any of the services. The vast majority of training time is spent on memorizing information, briefing skills, and intel systems. Those are important, but there is comparatively little (or nothing) on research skills (beyond "look in the pub" or "search siprnet"), evaluating information, or the strengths and weaknesses of the different "ints" (what they can and can't provide), along with actual analysis.

    I will avoid mention of an overwhelming desire on the part of some to never be wrong. That is based on the rationale that if you do not go out on a limb, they cannot or will not cut it off... (An attitude that I have on a couple of occasions seen proven mistaken by irate Commanders not apprised of all that was known.)
    That's undoubtedly true and, as you mention, at times there are strong incentives to never be wrong even if that means being useless or never being right either. This is another topic that we could discuss at length since it is deep and endemic.

    Agree with your sentiment on contractors but have to point out that in-house studies tend to reach foregone conclusions (dare I say preordained?) and little changes. Outside studies sometimes have an effect.
    True, sometimes they do but mostly they don't. When they do the effect is usually temporary until the next intelligence failure comes along.

    Rex,
    Also, there are biases and there are biases. The focus here is on some pretty specific issues that may not have been adequately explored, such as whether formal methods to reduce certain types of bias (for example, Analysis of Competing Hypotheses methodologies) might actually introduce other sorts of biases (for example, ones associated with the sequencing of information); the impact that classification levels may have on the perceived weight of information (less a problem among analysts who understand how sausages are made than clients who consume the output, in my view); how probability assessments may be skewed by psychological processes, etc.
    Then it's probably valuable research, but the sad reality is that the vast majority of such research never trickles down to impact the individual analyst.

  7. #7
    Council Member
    Join Date
    Jan 2007
    Location
    Thunder Bay, Ontario, Canada
    Posts
    156

    Default Word of even BROADER research....

    ....into how the bad guy does things, and how to predict what they'll do.

    Public posting:
    The Department of National Defence, Defence Research and Development (DRDC), Toronto, Ontario has a requirement for the provision of Scientific & Technical support for behavioral, cognitive, and social sciences research in various security environments and operational contexts. Examples of such contexts include command, communications, computer intelligence, surveillance and reconnaissance; collection, analysis and dissemination of intelligence products; effects-based operations; all with an explicit focus on human performance at the individual, group, and organizational levels.
    Now, a bit more detail from the Statement of Work here:
    “DRDC Toronto is now actively building its capacity for human sciences in a new research domain: Understanding, prediction and influence of adversaries’ intent …. DRDC requires contractual support for research in this new area. In particular, the Adversarial Intent Section (AIS) of DRDC Toronto anticipates a requirement for significant research effort that addresses the following general topic:

    (a) elucidation of contemporary security concepts such as Effects-based Approaches to Operations (EBAO), Defence Development and Diplomacy (3D), and the Comprehensive Approach (CA) and their use in military capability development and doctrine;

    (b) understanding the social, organizational, and cognitive determinants of human capabilities for the effective production of military and civilian intelligence pertinent to domestic and international issues;

    (c) influence processes that play a role in conflict-ridden environments characterized by complex interactions among adversaries, allied forces, and bystander groups; (and)

    (d) methods and tools for structuring, portraying and analyzing complex contemporary security environments.”
    I've "plain languaged" it a bit here.

  8. #8
    Council Member
    Join Date
    Jan 2007
    Location
    Thunder Bay, Ontario, Canada
    Posts
    156

    Default Why not get a machine to "average out" the analyses?

    This, from Wired.com's Danger Room:
    The U.S. intelligence community has a long history of blowing big calls — the fall of the Berlin Wall, Saddam’s WMD, 9/11. But in each collective fail, there were individual analysts who got it right. Now, the spy agencies want a better way to sort the accurate from the unsound, by applying principles of mathematics to weigh and rank the input of different experts.

    Iarpa, the intelligence community’s way-out research arm, will host a one-day workshop on a new program, called Aggregative Contingent Estimation (ACE). The initiative follows Iarpa’s recent announcement of plans to create a computational model that can enhance human hypotheses and predictions, by catching inevitable biases and accounting for selective memory and stress.

    ACE won’t replace flesh-and-blood experts — it’ll just let ‘em know what they’re worth. The intelligence community often relies on small teams of experts to evaluate situations, and then make forecasts and recommendations. But a team is only as strong as its weakest link, and Iarpa wants to fortify team-based outputs, by using mathematical aggregation to “elicit, weigh, and combine the judgments of many intelligence analysts" ....
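    The "elicit, weigh, and combine" idea can be sketched in a few lines. What follows is a generic performance-weighted pool, where each analyst's weight comes from their past Brier score; it is offered only as an illustration of the general approach the article describes, not IARPA's actual ACE model, and all names and numbers are made up:

    ```python
    # Illustrative sketch of performance-weighted forecast aggregation -- the
    # general idea behind ACE-style combining of analyst judgments, NOT the
    # actual IARPA model. Analysts with better past calibration (lower Brier
    # score) get more weight on the current question.

    def brier(forecasts, outcomes):
        """Mean squared error between probability forecasts and 0/1 outcomes.
        0.0 is perfect; 1.0 is maximally wrong on binary events."""
        return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

    def aggregate(current, history):
        """current: {analyst: probability estimate for the new question}.
        history: {analyst: (past forecasts, past 0/1 outcomes)}.
        Returns one pooled probability, weighting accurate analysts more."""
        weights = {a: 1.0 - brier(f, o) for a, (f, o) in history.items()}
        total = sum(weights.values())
        return sum(weights[a] * current[a] for a in current) / total

    history = {
        "analyst_a": ([0.9, 0.8, 0.1], [1, 1, 0]),   # well calibrated
        "analyst_b": ([0.9, 0.2, 0.9], [0, 1, 0]),   # poorly calibrated
    }
    current = {"analyst_a": 0.7, "analyst_b": 0.2}

    # The pooled estimate lands much closer to the better-calibrated
    # analyst's 0.7 than to the naive average of the two inputs.
    print(round(aggregate(current, history), 3))
    ```

    A real system would also have to handle calibration drift, questions with no resolution yet, and analysts with thin track records, which is presumably part of what makes the research program non-trivial.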

  9. #9
    Council Member
    Join Date
    Aug 2007
    Location
    Montreal
    Posts
    1,602

    Default

    Quote Originally Posted by Ken White View Post
    The Canadians know that and are smart enough to have someone else also take a look and pull best ideas from both. We, on the other hand...
    We have some absolutely top-notch analysts—and, obviously, some less so. HOWEVER, one disadvantage of having a very small community is... that we have a very small community. Consequently, there is less opportunity for exchanging ideas/debating/challenging assumptions/etc. Virtually all of the folks that work on the Middle East, for example, could comfortably fit into the Sparks Street Tim Hortons for a double-double.

    There are ways of offsetting that, of course--for example, by bringing folks into discussions who aren't in the IC, but rather in the aid, diplomatic, or even (heaven forbid) the academic and NGO communities. Some agencies and managers do it. Some don't. It is obviously harder in MI (especially on the deployed, pointy end) than in political assessment, however, and the folks in the LE and security intelligence communities aren't always used to working that way either.

    There's also interchange with allied communities, and especially the US, not just at the level of data but also in terms of conferences/meetings/discussions/etc.

    From what I can see, the US has got much better at this post-9/11 than before.

    On the original public tender that milnews posted, you'll find some of the very interesting work that the lead researcher, Dr. David Mandel, is doing on these and other issues listed here.
    They mostly come at night. Mostly.


  10. #10
    Council Member marct's Avatar
    Join Date
    Aug 2006
    Location
    Ottawa, Canada
    Posts
    3,682

    Default

    Hi Rex,

    Quote Originally Posted by Rex Brynen View Post
    We have some absolutely top-notch analysts—and, obviously, some less so. HOWEVER, one disadvantage of having a very small community is... that we have a very small community. Consequently, there is less opportunity for exchanging ideas/debating/challenging assumptions/etc. Virtually all of the folks that work on the Middle East, for example, could comfortably fit into the Sparks Street Tim Hortons for a double-double.
    Unfortunate, but true.

    Quote Originally Posted by Rex Brynen View Post
    There are ways of offsetting that, of course--for example, by bringing folks into discussions who aren't in the IC, but rather in the aid, diplomatic, or even (heaven forbid) the academic and NGO communities. Some agencies and managers do it. Some don't.
    Also works in reverse - one of my colleagues is ex-MI. Still, stovepiping is a major problem all around. On that note, are you coming to Ottawa for CASIS this year? It's looking like I will be there and I am expecting some lively discussions (in the bars and Timmy's, not the panels!) on recent CSIS "revelations".

    Cheers,

    Marc
    Last edited by marct; 07-10-2009 at 08:50 PM.
    Sic Bisquitus Disintegrat...
    Marc W.D. Tyrrell, Ph.D.
    Institute of Interdisciplinary Studies,
    Senior Research Fellow,
    The Canadian Centre for Intelligence and Security Studies, NPSIA
    Carleton University
    http://marctyrrell.com/

  11. #11
    Council Member
    Join Date
    Aug 2007
    Location
    Montreal
    Posts
    1,602

    Default

    Quote Originally Posted by marct View Post
    On that note, are you coming to Ottawa for CASIS this year?
    I will if I can, but I may have to be somewhere more Middle Eastern those particular days.

    I am expecting some lively discussions (in the bars and Timmy's, not the panels!) on recent CSIS "revelations".
    On the current issues with CSIS, the courts, and security certificates, I think this is proof that the 2008 changes introduced to the security certificate system (at the insistence of the Supreme Court) have been useful.

    Under the old system, the government presented two cases: a public unclassified one, and a private classified one wherein only the judge (and not the defence lawyers) heard the supposed evidence. Under the new system (like that in the UK), a special advocate working for the defence team, with appropriate clearances, has the opportunity to review and challenge the classified part of the evidence presented.

    The need for the system was evident to me some years ago, when I was an expert witness in the public part of a trial under the old system. The government's unclassified presentation was full of errors: at one point they confused an elderly history professor with the head of Fateh's Force 17 executive protection/special activities group; another time, they presented varying transliterations of the same Arabic name (made by different visa clerks) as evidence of the use of aliases; and so forth. It made me wonder what weaknesses lay in the secret version of the evidence, and how on earth the judge could possibly know what was accurate without counsel for the defence present to challenge and raise questions.
    They mostly come at night. Mostly.

