Page 1 of 2
Results 1 to 20 of 39

Thread: intelligence analysis, overcoming bias and learning

  1. #1
    Council Member
    Join Date
    Jun 2007
    Location
    Ohio
    Posts
    10

    Default intelligence analysis, overcoming bias and learning

    All Source Analysts Training

    After a recent DGS DART working group, my office was discussing all source analyst training. There seems to be some misconception, at least in the DGS community, as to what training should be required for those analysts who are working in an all source capacity. I think part of this stems from the fact that the AF does not have an All Source Analyst AFSC. Each AFSC has traditionally been INT-focused, with the exception of the 1N0s, who are Ops-focused and generally assigned to flying squadrons, and the 14N Intelligence Officer.

    The line of thinking at the working group tended to steer towards mission-specific training and application training rather than core skills that could be used at any unit.

    I wanted to get everyone's thoughts on some of the basic skills training that any all source analyst should have in the toolkit.

    The idea we have come up with is a three-track approach: all source training, regional training, and mission-specific training.

    The All Source track would be a pyramid with courses built on a foundation of basic skills.

    Foundation Skills
    • Analytical Methods
    – Problem definition
    – Process (scientific method)
    – Statistics
    • Research
    – Strategies
    – Source evaluation
    • Communications
    – Technical and Editorial Writing
    – Briefing
    101 Training
    • AFSC/MOS Technical School
    – 1NX
    • Fire-hose training
    – 1N1: Receives SIGINT, HUMINT, AGI, OSINT
    – Intel: Receives Ops familiarization
    – AF: Receives Army-centric, Navy-centric, Joint
    • Military Capabilities
    • General Intelligence (CIA, NSA, NGA, etc)

    Advanced Skills
    • Scenario-based training/exercises
    • Specialized analytical techniques
    – HUMINT-specific support
    – MASINT-specific support
    • Region-centric integration
    – Able to correlate language, geo-political, and cultural aspects to specific intelligence problems

  2. #2
    Council Member
    Join Date
    Jan 2007
    Location
    Thunder Bay, Ontario, Canada
    Posts
    156

    Default Canada Studying Int Analyst Bias

    This from MERX, Canada's public tender posting page (bolding mine):
    "....Defence Research & Development Canada (DRDC) have a requirement to retain the services of a contractor to provide support for DRDC Toronto's data collection involving a series of behavioural science experiments with human subjects. The experiments described in this Statement of Work (SOW) are motivated by DRDC Toronto's Applied Research Program (ARP) project, entitled "Understanding and Augmenting Human Capabilities for Intelligence Production," which is under the project management of Dr. David R. Mandel, Group Leader of the Thinking, Risk, and Intelligence Group within DRDC Toronto's Adversarial Intent Section. The overarching objectives of the ARP project and the experiments described herein for which contractor support is sought are: (a) to identify systematic biases in human performance that may affect the quality of intelligence analysis; (b) to identify factors that may attenuate or amplify such biases or otherwise influence judgmental performance; and (c) to examine the viability of counter-measures aimed at reducing or eliminating them...."

    More details in Statement of Work here (.pdf download).

  3. #3
    Council Member Ken White's Avatar
    Join Date
    May 2007
    Location
    Florida
    Posts
    8,060

    Default Hoo, Boy. Are they going to have fun...

    With some slight experience as one of them and with analysts, I noted the problem, and I acknowledge that many can park their bias and do an effective job (subject and issue dependent), but I am firmly convinced it is a problem. Glad to see someone delving into it. Look forward to the result.

  4. #4
    Council Member IntelTrooper's Avatar
    Join Date
    May 2009
    Location
    RC-S, Afghanistan
    Posts
    302

    Default

    What about the bias of the people conducting the research? Is there another panel to investigate their possible bias?

    While there's been some discussion of this in American intelligence circles, I haven't seen any definitive counter-measures. For example, there was no method for us to challenge the bizarre analysis of a battalion S-2 (so-called battalion senior intelligence analyst) who proclaimed that the main problem in our province was "criminal activity" and not the Taliban. We figured that since this conclusion wasn't based on any actual reporting, she must have some kind of cognitive bias against, say, reality.
    "The status quo is not sustainable. All of DoD needs to be placed in a large bag and thoroughly shaken. Bureaucracy and micromanagement kill."
    -- Ken White


    "With a plan this complex, nothing can go wrong." -- Schmedlap

    "We are unlikely to usefully replicate the insights those unencumbered by a military staff college education might actually have." -- William F. Owen

  5. #5
    Council Member marct's Avatar
    Join Date
    Aug 2006
    Location
    Ottawa, Canada
    Posts
    3,682

    Default

    Quote Originally Posted by IntelTrooper View Post
    For example, there was no method for us to challenge the bizarre analysis of a battalion S-2 (so-called battalion senior intelligence analyst) who proclaimed that the main problem in our province was "criminal activity" and not the Taliban. We figured that since this conclusion wasn't based on any actual reporting, she must have some kind of cognitive bias against, say, reality.
    No, no, no! It's not a bias against reality, it's a definition of the Taliban as criminals !

    Now, it would have been a bias against reality if she had defined the Taliban as Smurfs... then again, maybe not !
    Sic Bisquitus Disintegrat...
    Marc W.D. Tyrrell, Ph.D.
    Institute of Interdisciplinary Studies,
    Senior Research Fellow,
    The Canadian Centre for Intelligence and Security Studies, NPSIA
    Carleton University
    http://marctyrrell.com/

  6. #6
    Council Member Ken White's Avatar
    Join Date
    May 2007
    Location
    Florida
    Posts
    8,060

    Default Competition is the spice of life.

    It also is a significant aid to keeping everyone honest...

    The Canadians know that and are smart enough to have someone else also take a look and pull best ideas from both. We, on the other hand...

    The efficiencies of consolidation and centralization are known, what is often ignored is the adverse impact of those moves on effectiveness. Two minds / approaches are always better than one; three even mo' betta...

    There's another aspect aside from effectiveness: you can always have your best gunners do the shooting or put your best guy on point -- but no one else will learn much or become good at what they do, and your bench will not be very deep.

    All that has been known for centuries; "Quis custodiet ipsos custodes?" is not new and is practiced in most of the world; only the insane American predilections for one size fits all, one size does all, "whatever the boss wants" and "always show your good side" ignore that logic.

    You're not likely to see any US countermeasures. We prefer those who totally support what the Boss wants, no matter how inane or even criminal -- as your example proves...

  7. #7
    Council Member IntelTrooper's Avatar
    Join Date
    May 2009
    Location
    RC-S, Afghanistan
    Posts
    302

    Default

    Quote Originally Posted by marct View Post
    No, no, no! It's not a bias against reality, it's a definition of the Taliban as criminals !
    Curses, now I'm the one with the bias...
    Now, it would have been a bias against reality if she had defined the Taliban as Smurfs... then again, maybe not !
    I thought I recognized the guy in those briefings! Mullah Papa Smurf AKA Abu Smurfette...

  8. #8
    Council Member IntelTrooper's Avatar
    Join Date
    May 2009
    Location
    RC-S, Afghanistan
    Posts
    302

    Default

    Quote Originally Posted by Ken White View Post
    You're not likely to see any US countermeasures. We prefer those who totally support what the Boss wants, no matter how inane or even criminal -- as your example proves...
    Yes we do... as I had the distinct impression that her "analysis" was thinly-disguised propaganda to make her battalion commander look good...

  9. #9
    Council Member
    Join Date
    Aug 2007
    Location
    Montreal
    Posts
    1,602

    Default

    Quote Originally Posted by Ken White View Post
    The Canadians know that and are smart enough to have someone else also take a look and pull best ideas from both. We, on the other hand...
    We have some absolutely top-notch analysts—and, obviously, some less so. HOWEVER, one disadvantage of having a very small community is... that we have a very small community. Consequently, there is less opportunity for exchanging ideas/debating/challenging assumptions/etc. Virtually all of the folks that work on the Middle East, for example, could comfortably fit into the Sparks Street Tim Hortons for a double-double.

    There are ways of offsetting that, of course--for example, by bringing folks into discussions who aren't in the IC, but rather in the aid, diplomatic, or even (heaven forbid) the academic and NGO communities. Some agencies and managers do it. Some don't. It is obviously harder in MI (especially on the deployed, pointy end) than it is in political assessment however, and the folks in the LE and security intelligence communities aren't always used to working that way either.

    There's also interchange with allied communities, and especially the US, not just at the level of data but also in terms of conferences/meetings/discussions/etc.

    From what I can see, the US has got much better at this post-9/11 than before.

    On the original public tender that milnews posted, you'll find some of the very interesting work that the lead researcher (Dr. David Mandel) is doing on these and other issues listed here.
    They mostly come at night. Mostly.


  10. #10
    Council Member
    Join Date
    Mar 2008
    Posts
    1,457

    Default Hmm, seems to me

    ...the money might be better spent elsewhere

    Cognitive biases are pretty well understood already. Hopefully this isn't reinventing the wheel, particularly since this is contracted research.

    What's really needed, IMO, is better training for analysts, and the money would be better spent there.

  11. #11
    Council Member marct's Avatar
    Join Date
    Aug 2006
    Location
    Ottawa, Canada
    Posts
    3,682

    Default

    Hi Rex,

    Quote Originally Posted by Rex Brynen View Post
    We have some absolutely top-notch analysts—and, obviously, some less so. HOWEVER, one disadvantage of having a very small community is... that we have a very small community. Consequently, there is less opportunity for exchanging ideas/debating/challenging assumptions/etc. Virtually all of the folks that work on the Middle East, for example, could comfortably fit into the Sparks Street Tim Hortons for a double-double.
    Unfortunate, but true.

    Quote Originally Posted by Rex Brynen View Post
    There are ways of offsetting that, of course--for example, by bringing folks into discussions who aren't in the IC, but rather in the aid, diplomatic, or even (heaven forbid) the academic and NGO communities. Some agencies and managers do it. Some don't.
    Also works in reverse - one of my colleagues is ex-MI. Still, stovepiping is a major problem all around. On that note, are you coming to Ottawa for CASIS this year? It's looking like I will be there and I am expecting some lively discussions (in the bars and Timmy's, not the panels!) on recent CSIS "revelations".

    Cheers,

    Marc
    Last edited by marct; 07-10-2009 at 08:50 PM.

  12. #12
    Council Member Ken White's Avatar
    Join Date
    May 2007
    Location
    Florida
    Posts
    8,060

    Default Well, yeah but...

    Quote Originally Posted by Entropy View Post
    Cognitive biases are pretty well understood already. Hopefully this isn't reinventing the wheel, particularly since this is contracted research.
    is anyone like, you know, doing anything about that?

    All the real benefit I saw was in the last item quoted: "...and (c) to examine the viability of counter-measures aimed at reducing or eliminating them...."

    That biases are known to exist and the issue is recognized is totally true; my perception is that little has been done to correct the problem for fear of trampling artistic little psyches (pardon my hyperbole, and present company excepted, but there are some out there...) OR, far more importantly, for fear of not giving the Boss what he's looking for by skewing the analytical process to suit.

    I will avoid mention of an overwhelming desire on the part of some to never be wrong, based on the rationale that if you do not go out on a limb, they cannot or will not cut it off... (An attitude that I have on a couple of occasions seen proven mistaken by irate Commanders not apprised of all that was known.)

    Agree with your sentiment on contractors but have to point out that in-house studies tend to reach foregone conclusions (dare I say preordained?) and little changes. Outside studies sometimes have an effect.

  13. #13
    Council Member
    Join Date
    Aug 2007
    Location
    Montreal
    Posts
    1,602

    Default

    In this case, as I understand it, the contractors are largely administering the tests and providing the data, not actually doing the analysis.

    Also, there are biases and there are biases. The focus here is on some pretty specific issues that may not have been adequately explored, such as whether formal methods to reduce certain types of bias (for example, Analysis of Competing Hypotheses methodologies) might actually introduce other sorts of biases (for example, ones associated with the sequencing of information); the impact that classification levels may have on the perceived weight of information (less a problem among analysts who understand how sausages are made than clients who consume the output, in my view); how probability assessments may be skewed by psychological processes, etc.

    I don't think it is reinventing wheels.


  14. #14
    Council Member
    Join Date
    Aug 2007
    Location
    Montreal
    Posts
    1,602

    Default

    Quote Originally Posted by marct View Post
    On that note, are you coming to Ottawa for CASIS this year?
    I will if I can, but I may have to be somewhere more Middle Eastern those particular days.

    I am expecting some lively discussions (in the bars and Timmy's, not the panels!) on recent CSIS "revelations".
    On the current issues with CSIS, the courts, and security certificates, I think this is proof that the 2008 changes introduced to the security certificate system (at the insistence of the Supreme Court) have been useful.

    Under the old system, the government presented two cases: a public unclassified one, and a private classified one wherein only the judge (and not the defence lawyers) heard the supposed evidence. Under the new system (like that in the UK), a special advocate working for the defence team, with appropriate clearances, has the opportunity to review and challenge the classified part of the evidence presented.

    The need for the system was evident to me some years ago, when I was an expert witness in the public part of a trial under the old system. The government's unclassified presentation was full of errors: at one point they confused an elderly history professor with the head of Fateh's Force 17 executive protection/special activities group; another time, they presented varying transliterations of the same Arabic name (by visa clerks) as evidence of using aliases; and so forth. It made me wonder what weaknesses were in the secret version of the evidence, and how on earth the judge could possibly know what was accurate in the absence of a counsel for the defence being present to challenge and raise questions.


  15. #15
    Council Member
    Join Date
    Mar 2008
    Posts
    1,457

    Default

    Quote Originally Posted by Ken White View Post
    is anyone like, you know, doing anything about that?

    All the real benefit I saw was in the last item quoted: "...and (c) to examine the viability of counter-measures aimed at reducing or eliminating them...."
    Over the decades many analytic methods have been developed that specifically seek to reduce or eliminate bias. There are at least a couple of dozen general methods and many more tailored for specific problems. Too many analysts do not even know these methods exist, much less have the training to use them properly, or, if trained, the time to utilize them properly.

    That biases are known to exist and the issue is recognized is totally true, my perception is that little has been done to correct the problem for fear of trampling artistic little psyches (pardon my hyperbole and present company excepted but there are some out there...) OR, far more importantly, of not giving the Boss what he's looking for by skewing the analytical process to suit.
    In my experience it's more often the Boss that does the skewing since most are not receptive to information that conflicts with what they believe. Beyond that perpetual tension, I think it's fine to do yet more research but IMO the problems with intelligence do not stem from a lack of knowledge on cognitive bias, nor a dearth of methods to combat that bias. There are a host of structural and bureaucratic problems but at the analyst level the three biggest problems I see are:

    1. Lack of analyst training.
    2. Lack of analyst introspection and self-awareness.
    3. Lack of analyst curiosity.

    The latter two problems stem, IMO, from a limited pool of analyst candidates (a topic that deserves its own thread). The first problem is something that can and should be solvable but is perennial. Look at the intelligence schoolhouse for any of the services. The vast majority of training time is spent on memorizing information, briefing skills, and intel systems. Those are important, but there is comparatively little (or nothing) on research skills (beyond "look in the pub" or "search SIPRNET"), evaluating information, the strengths and weaknesses of the different "INTs" (and what they can and can't provide), or actual analysis.

    I will avoid mention of an overwhelming desire on the part of some to never be wrong. That based on the rationale that if you do not go out on a limb, they cannot or will not cut it off... (An attitude that I have on a couple of occasions seen proven mistaken by irate Commanders not apprised of all that was known. ).
    That's undoubtedly true and, as you mention, at times there are strong incentives to never be wrong even if that means being useless or never being right either. This is another topic that we could discuss at length since it is deep and endemic.

    Agree with your sentiment on contractors but have to point out that in-house studies tend to reach foregone conclusions (dare I say preordained?) and little changes. Outside studies sometimes have an effect.
    True, sometimes they do but mostly they don't. When they do the effect is usually temporary until the next intelligence failure comes along.

    Rex,
    Also, there are biases and there are biases. The focus here is on some pretty specific issues that may not have been adequately explored, such as whether formal methods to reduce certain types of bias (for example, Analysis of Competing Hypotheses methodologies) might actually introduce other sorts of biases (for example, ones associated with the sequencing of information); the impact that classification levels may have on the perceived weight of information (less a problem among analysts who understand how sausages are made than clients who consume the output, in my view); how probability assessments may be skewed by psychological processes, etc.
    Then it's probably valuable research, but the sad reality is that the vast majority of such research never trickles down to impact the individual analyst.

  16. #16
    Council Member
    Join Date
    Jan 2007
    Location
    Thunder Bay, Ontario, Canada
    Posts
    156

    Default Word of even BROADER research....

    ....into how the bad guy does things, and how to predict what they'll do.

    Public posting:
    The Department of National Defence, Defence Research and Development Canada (DRDC), Toronto, Ontario has a requirement for the provision of Scientific & Technical support for behavioural, cognitive, and social sciences research in various security environments and operational contexts. Examples of such contexts include command, control, communications, computers, intelligence, surveillance and reconnaissance; collection, analysis and dissemination of intelligence products; effects-based operations; all with an explicit focus on human performance at the individual, group, and organizational levels.
    Now, a bit more detail from the Statement of Work here:
    “DRDC Toronto is now actively building its capacity for human sciences in a new research domain: Understanding, prediction and influence of adversaries’ intent …. DRDC requires contractual support for research in this new area. In particular, the Adversarial Intent Section (AIS) of DRDC Toronto anticipates a requirement for significant research effort that addresses the following general topic:

    (a) elucidation of contemporary security concepts such as Effects-based Approaches to Operations (EBAO), Defence, Development and Diplomacy (3D), and the Comprehensive Approach (CA) and their use in military capability development and doctrine;

    (b) understanding the social, organizational, and cognitive determinants of human capabilities for the effective production of military and civilian intelligence pertinent to domestic and international issues;

    (c) influence processes that play a role in conflict-ridden environments characterized by complex interactions among adversaries, allied forces, and bystander groups; (and)

    (d) methods and tools for structuring, portraying and analyzing complex contemporary security environments.”
    I've "plain languaged" it a bit here.

  17. #17
    Council Member
    Join Date
    Jan 2007
    Location
    Thunder Bay, Ontario, Canada
    Posts
    156

    Default Why not get a machine to "average out" the analyses?

    This, from Wired.com's Danger Room:
    The U.S. intelligence community has a long history of blowing big calls — the fall of the Berlin Wall, Saddam’s WMD, 9/11. But in each collective fail, there were individual analysts who got it right. Now, the spy agencies want a better way to sort the accurate from the unsound, by applying principles of mathematics to weigh and rank the input of different experts.

    Iarpa, the intelligence community’s way-out research arm, will host a one-day workshop on a new program, called Aggregative Contingent Estimation (ACE). The initiative follows Iarpa’s recent announcement of plans to create a computational model that can enhance human hypotheses and predictions, by catching inevitable biases and accounting for selective memory and stress.

    ACE won’t replace flesh-and-blood experts — it’ll just let ‘em know what they’re worth. The intelligence community often relies on small teams of experts to evaluate situations, and then make forecasts and recommendations. But a team is only as strong as its weakest link, and Iarpa wants to fortify team-based outputs, by using mathematical aggregation to “elicit, weigh, and combine the judgments of many intelligence analysts" ....

  18. #18
    Council Member AnalyticType's Avatar
    Join Date
    Jun 2009
    Location
    Austin, Texas
    Posts
    66

    Talking Still a firm believer in Heuer's research...

    I found him to be spot on. It's not possible to eliminate cognitive biases from the analyst entirely, as they're based in how we learn and incorporate knowledge. What IS possible is significant mitigation of those biases by first being trained to identify them; second being willing to address them; third being trained in structured analytic methodologies which go far in levelling the analytic playing field; fourth being willing to retool theories to fit the facts rather than retooling the facts to fit the theories.

    The fourth I perceive to be the most important - and least engaged.

    On the previously mentioned structured analytic method ACH, I found that Heuer's software was effective in mitigating cognitive bias (as is a simple hand-written matrix, BTW) but only if it's worked in a particular fashion.

    As an example:
    Any typical competing hypotheses matrix has a fairly straightforward design. The first column on the left is populated with all of the facts generated by the analyst's research. The top cells of the second, third, fourth columns (et cetera) contain ALL working hypotheses, each in its own column. There may be several variations on a couple of themes, or simply a pair of mutually exclusive theories.

    The analyst then examines the facts in relation to the hypotheses, determining consistency, lack of applicability, or inconsistency. But this is where I found that the way this simple matrix is worked matters regarding the outcome.

    The first couple of times I utilised ACH software, while in college in intelligence analysis classes, I had not yet learned that there may be a difference in how the process should be run for assuring the least bias possible. So I started at the top of the column for Hypothesis A, and worked my way down. I compared data points 1-45 to Hypothesis A, attempting to assign a value (highly consistent, consistent, not applicable, inconsistent, highly inconsistent) to each data point as it related to Hypothesis A. Then I went through the same exercise with the same data for Hypothesis B's column.

    What a mess. For the particular project I was working on at the time, my results were inconclusive and an exercise in frustration.

    Finally another student clued me in. Work across! Is data point 2 consistent, not applicable, or inconsistent with Hypothesis A? Is data point 2 consistent, not applicable, or inconsistent with Hypothesis B? C? Next, is data point 3 consistent, not applicable, or inconsistent with Hypothesis A? B? C? Working across, apply a data point to all hypotheses, then the next fact, then the next, down the matrix.

    I will tell you that it surprised the heck out of me to find that, without having rearranged or changed ANY of my data points or hypotheses, the direction in which I worked the matrix made a HUGE difference in the utility of the results.

    Next, what must be done is to eliminate (or rework) the hypotheses which have large numbers of either "not applicables" or "inconsistents" in their columns.
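    For illustration, the matrix and "least disproven" rule described above can be sketched in a few lines of hypothetical Python; the data points, scores, and hypothesis labels are invented, and the scoring has been worked "across" (each data point against every hypothesis) as recommended:

    ```python
    # Sketch of a hand-scored ACH matrix, worked "across": each data point
    # is scored against every hypothesis before moving to the next point.
    # Scores: +1 consistent, 0 not applicable, -1 inconsistent.

    hypotheses = ["A", "B"]

    matrix = [
        ("data point 1", {"A": +1, "B": -1}),
        ("data point 2", {"A": -1, "B": +1}),
        ("data point 3", {"A": -1, "B": +1}),
    ]

    def least_disproven(matrix, hypotheses):
        """ACH keeps the hypothesis with the FEWEST inconsistencies --
        the one least disproven -- not the one with most confirmations."""
        inconsistencies = {
            h: sum(1 for _, scores in matrix if scores[h] == -1)
            for h in hypotheses
        }
        best = min(inconsistencies, key=inconsistencies.get)
        return best, inconsistencies
    ```

    Here Hypothesis B survives with one inconsistency against A's two; the point, as in the text, is to discard or rework hypotheses that accumulate inconsistencies rather than to cherry-pick confirmations.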

    Having just spent the last year working on the problem of US border security and Mexican drug cartel violence in Texas, I've watched several coworkers repetitively discard confirmed data because it doesn't fit their theories. This stuff has frustrated the living tar out of me! The individuals in question habitually cherry-picked the facts to "prove" their hypotheses, rather than working at trying to disprove all theories. That hypothesis which is least able to be disproven tends to have the highest validity.

    Structured methodologies, such as those tools identified and taught by Richards Heuer and Morgan Jones (among others), are the best tools I've found for removing ego and bias from the work of being an analyst.

    As mentioned or alluded to in previous posts, the wheel does not need to be reinvented, nor does the process by which it rolls need to be studied again some more. The tools are there, and have been highly effective for decades; but they must be taught consistently and reinforced often throughout intelligence analysts' careers, regardless of venue or gov't agency.
    "At least we're getting the kind of experience we need for the next war." -- Allen Dulles

    A work of art worth drooling over: http://www.maxton.com/intimidator1/i...r1_page4.shtml

  19. #19
    Council Member
    Join Date
    Jan 2007
    Location
    Thunder Bay, Ontario, Canada
    Posts
    156

    Default New related paper: What do Analysts' Managers Say?

    A bit of follow-up work - from the paper's abstract:
    Intelligence analysis provides important informational support to civilian and military decision makers. Recent intelligence failures of Canada’s allies have been attributed mostly to cognitive, social, and organizational deficits and biases of individual analysts and intelligence agencies. Such attributions call for a comprehensive examination of intelligence production from the sociopsychological perspective. The present report discusses findings from interviews conducted with Canadian managers of intelligence analysts. The interviewed managers identified a number of pertinent issues in the intelligence production process that may be explicated through the application of the behavioural sciences’ accumulated knowledge and methodology. The identified issues are discussed in light of the intelligence studies and behavioural sciences literature, and a roadmap for the behavioural sciences research program in support of the intelligence function is outlined.
    Executive summary downloadable (right click and "Save as") here, full report downloadable here.

  20. #20
    Council Member
    Join Date
    Feb 2007
    Location
    Rocky Mtn Empire
    Posts
    473

    Default Hmmm More studies

    In my collection and analysis class I use Heuer as one of my readings, as I, too, believe that he is spot on. There are numerous recommendations on how to overcome the effects of various biases. I introduce critical thinking as advocated by Tim van Gelder, but also use Cass Sunstein's book Infotopia, which examines statistical norming, market models, and internet collaboration à la Intellipedia.

    Based on my depraved development, I am convinced that learning critical thinking and alternative analysis can assist analysts in overcoming the pitfalls of being "normal".

    But it will be interesting to see what any new data may reveal, or not reveal.
