Originally Posted by
Seahorse
Thanks to Wilf for initiating this thread and to the many contributors who have richly explored the issues related to metrics. I wish to contribute to this exchange with my own observations and experiences.
My experience began with the development of EBO concepts and theory and the ability to apply measures to assess the desired effects with JFCOM in 2002. I have since had the dubious benefit of actually applying the theory of MOP/MOE development, collection, assessment, analysis and reporting at the strategic, operational and tactical levels of the ISAF mission with separate deployments to both Kabul and Kandahar as an operational analyst.
Notwithstanding the vigorous debates surrounding EBO itself, the continued and insatiable demand for measures has frustrated me considerably, and I echo many of the problems identified in several earlier posts. Unlike many of my contemporaries (scientists), I feel there are serious limitations to the collection, analysis and reporting of measures. It is my position that reliance on the measurable has led to a situation I describe as the real EBO: effects-blurred operations. This blurring is based on several factors:
i) Reliance on MOPs versus MOEs - MOPs are necessary to evaluate whether we are doing things right. Numbers killed in an AO can be used to evaluate insurgent activity. However, numbers of KIA, development projects or Shura meetings are meaningless when divorced from the requirement for patrols, development or local interaction. MOEs are intended to evaluate whether we are doing the right things - such as the earlier post regarding the value and priority associated with combat patrols, intelligence-driven operations, or medical and humanitarian assistance. True MOEs could be used, for example, to support transitioning ink spot strategies away from clear, hold and build dominant operations in an AO. Unlike MOPs (which are straightforward and faster to measure), the selection and development of MOEs is not easy, and their evaluation lags the operational tempo.
ii) measuring the measurable versus the important - too often there is far too much reliance on what we can measure, and that measure is then associated with the progress of an operation. The number of schools built may make one feel good, but what has it to do with education requirements and need? Are the schools located where they are needed, are they built to specified standards, are they protected from Taliban vandalism, are they staffed with qualified teachers, do they have secured funding for operations, do they have teaching materials, are the local children attending the school, etc.? I have seen UNICEF tents that were more effective schools than dedicated buildings. Measuring 'things' also results in what I call an effects explosion. This refers to the ever-increasing numbers of metrics that are foisted on deployed personnel to collect and report irrelevant data (ISAF progressed from 25 to 100 to over 700 monthly measures over a one-year period, for example). Coupling this with the problems of collecting and reporting this information across the provinces results in inconsistencies, omissions, real and unintentional errors, inflationary or creative reporting, etc.
iii) reporting measures and producing slides or Excel tables instead of analysing their significance and providing advice or recommendations - often analysts divorced from the actual operations are given the responsibility to assess and report on the data. The results are equally divorced from the operations and are virtually useless in terms of influencing the operations themselves; they are predominantly used to report to higher headquarters and are later repackaged and distilled into IO materials that show the progress of the mission to national audiences and the media.
iv) metrics obsession - certain metrics (KIAs, TICs and other SIGACTs, for example) become THE metric for reporting, and this detracts from the purpose of collecting the metric in the first place, which should be determining effectiveness and supporting HQ decision analysis requirements. Too often I witnessed senior staff who were more concerned with the numbers than with the associated trend and analysis. We need to emphasize the relationship between metrics and operations. The counter-narcotics reliance on the number of acres of poppy eradicated is an overly relied-upon and ultimately useless metric when one considers that, despite the ever-increasing (presumably good) statistic, poppy cultivation has exploded in terms of its proliferation both within problem provinces and to a large number of previously poppy-free provinces. Where is the analysis of the eradication program in relation to the growth of the problem? Secondly, is there a secondary relationship between eradication and insecurity which may be counter to the dominant mission in Afghanistan?
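The eradication example above can be sketched numerically. This is an illustrative sketch only - the yearly figures below are invented, not ISAF data - but it shows how a raw count can rise every year while the eradicated share of total cultivation falls, i.e. the problem grows faster than the programme addressing it:

```python
# Invented, illustrative figures: eradication (the reported metric) rises
# every year, yet cultivation grows faster, so the metric misleads.
eradicated_acres = [10_000, 15_000, 22_000]     # metric as reported: always "improving"
cultivated_acres = [100_000, 180_000, 320_000]  # hypothetical total cultivation

for year, (erad, total) in enumerate(zip(eradicated_acres, cultivated_acres)):
    share = erad / total          # eradicated fraction of the total problem
    remaining = total - erad      # net cultivation left after eradication
    print(f"year {year}: eradicated {erad:,} acres "
          f"({share:.0%} of cultivation), net remaining {remaining:,}")
```

With these numbers the headline metric climbs from 10,000 to 22,000 acres, while the eradicated share falls from 10% to about 7% and net remaining cultivation more than triples - precisely the trend-versus-number distinction argued above.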
v) distillation of metrics - too often I witnessed the development of metrics that are so distilled (down to the tactical level) that they are relatively meaningless to the original purpose. This distillation results from combinations of the above problems. Any one measure can be rationalized as important to a particular purpose in and of itself; however, in the roll-up of the comprehensive assessment, the relative weighting of the metric becomes too prominent and can dominate the reported results. I believe certain metrics are key and quite dominant on their own (numbers of IEDs could be one example); however, the overall assessments need to account for more than the constituent elements. I assert that qualitative measures are more important to MOEs, which should form the basis of higher-level assessments.
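The roll-up problem described above can be made concrete. In this sketch the component names, weights and scores are all invented for illustration; it shows how a composite built as a weighted sum lets one heavily weighted metric (here, an IED trend) dominate the headline number even when most components are slipping:

```python
# Hypothetical composite assessment: weights and scores are invented.
# Scores are normalised 0-1, higher = better.
weights = {"ied_trend": 0.5, "governance": 0.2, "development": 0.15, "perception": 0.15}
scores  = {"ied_trend": 0.9, "governance": 0.3, "development": 0.35, "perception": 0.25}

# Weighted roll-up: a single favourable, heavily weighted metric
# drags the composite above the midpoint.
composite = sum(weights[k] * scores[k] for k in weights)
print(f"composite: {composite:.2f}")

# The same data viewed per component tells a different story.
declining = [k for k, v in scores.items() if v < 0.5]
print("components below 0.5:", declining)
```

With these invented numbers the composite comes out at 0.60 - a passable-looking headline - even though three of the four components sit well below 0.5, which is exactly the sense in which one dominant metric can mask the wider qualitative picture.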
Quantitative assessments, which dominate metric measures, can inform qualitative assessments; however, the subjectivity of these qualitative measures detracts from their adoption and use. I would assert that an informed and effective force understands its AO qualitatively. This understanding fails to be conveyed within our dominant quantitative metric measurement systems.
These are a few of my observations. I continue to work in this field and am attempting to develop proposals for how to more effectively implement measures in support of decision making and influencing operations.
V/R
David