
Theoretical underpinnings of Assessment?



pvebber
03-03-2009, 05:36 PM
I've been working on several exercises/"experiments" dealing with the Navy's Maritime Operations Center (MOC) development, and one of the big issues I'm seeing is in the area of assessment. The broad framework the MOC is working under supports a commander's decision cycle consisting of Plan - Decide - Monitor - Assess. We are making some progress (finally) on the Plan side, and Decide and Monitor have bumps but have been the most solid, leaving Assessment as the red-headed stepchild of the bunch.

The closest thing to a "theory of assessment" I've been able to find is in the ONA-based system-of-systems analysis (SoSA) writing (lately under fire, if not discredited, at least at JFCOM). So what has replaced ONA-SoSA? Or are its principles still around under fresh acronyms? Joint doctrine seems jam-packed with a wide variety of assessments to be done, but says almost nothing about methodology.

Thanks for any enlightenment!

slapout9
03-03-2009, 05:42 PM
Check out this month's issue of Military Review. There's an article on the Art of Design and a couple of others. You are essentially correct that it is just a rewording of existing methods, but they have some good insights. Later, Slap.



http://usacac.army.mil/CAC2/MilitaryReview/

pvebber
03-04-2009, 03:19 PM
Thanks! Sorry for the misplaced post!!

pvebber
03-04-2009, 07:12 PM
The article on Design hits a lot of the points I am interested in, and builds on the Commander's Appreciation and Campaign Design pamphlet. The need for a "design" step, part of which is to develop a "theory of action" - the missing "model" piece - fills what is to me a large void in the current plan - decide - monitor - assess framework. Whether this is considered part of "plan" or a separate precursor to planning, I leave to the doctrine types to ruminate on, but I found this VERY useful.

slapout9
03-04-2009, 08:12 PM
pvebber, you may want to Google Dr. Russell Ackoff; a lot of this comes from his theory of Interactive Planning, which is pretty good stuff. It's a planning theory based on the idea that the only good plan is one that is constantly adapting. Ackoff was one of the leaders of the systems thinking movement from WWII on, and is still alive, I believe. Later

Ken White
03-04-2009, 08:41 PM
Soft systems methodology LINK. (http://www.scu.edu.au/schools/gcm/ar/areol/areol-session13.html)

USMC MAGTFerist
03-05-2009, 07:23 PM
It sounds to me like you're looking for info on how to do different types of assessment, one of them being "operational assessment." At II Marine Expeditionary Force, we developed an assessment methodology beginning back in '02 that served us well throughout OIF-I, 04-06, and (hopefully) to this day. Send me a message and I'll give you a point of contact. I believe there are pubs on this as well. Keep in mind that senior commanders should be focused on progress toward major decision points in a campaign, not so much on "combat assessment" of current operations. The commander wants to know if (and when) he's approaching a decision point, especially those that require either sequel or branch planning.

pvebber
03-06-2009, 05:49 PM
What I'm digging into gets at a number of levels of assessment and the relationships among them.

One is the interrelationship between the planning side, where COA development assembles a scheme of causal actions believed to result in fulfilling the commander's intent, and the assessment side, where the divergence between what was anticipated in planning and what is actually occurring is determined and fed back into planning for adaptation.
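To make that concrete, here is a minimal sketch in Python (all numbers and the tolerance are invented for illustration) of that feedback: compare the progress the plan anticipated against what monitoring actually observes, and flag replanning when the divergence exceeds a tolerance.

# Minimal sketch of plan-vs-observation divergence feeding replanning.
# Values and tolerance are purely notional.
predicted = [0.2, 0.4, 0.6, 0.8]   # progress the plan anticipated, per period
observed  = [0.2, 0.3, 0.3, 0.2]   # what monitoring/assessment actually measured
TOLERANCE = 0.25

for period, (p, o) in enumerate(zip(predicted, observed), start=1):
    divergence = abs(p - o)
    flag = "-> replan" if divergence > TOLERANCE else ""
    print(f"period {period}: predicted {p:.1f}, observed {o:.1f} {flag}")
# Periods 3 and 4 exceed the tolerance, signaling the COA's causal
# assumptions need revisiting rather than more of the same actions.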

In current planning and assessment doctrine, there is an implicit assumption of independence between objectives and COA tasks, allowing a hierarchical decomposition. There is also an implicit assumption that lower-level assessments and metrics can be aggregated in a linear manner to indicate progress or regress toward objectives.
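A toy illustration of what that linear-aggregation assumption looks like in practice (the task names, scores, and weights here are purely notional):

# Hierarchical rollup under the linearity assumption: task scores are
# combined by a weighted sum, which presumes the tasks are independent.
task_scores = {            # normalized 0..1 "progress" per task
    "secure_port":   0.8,
    "train_police":  0.4,
    "restore_power": 0.6,
}
weights = {                # assumed relative contribution to the objective
    "secure_port":   0.5,
    "train_police":  0.3,
    "restore_power": 0.2,
}

objective_progress = sum(weights[t] * task_scores[t] for t in task_scores)
print(f"Objective progress: {objective_progress:.2f}")  # 0.64

# The trouble: if restoring power actually depends on the port being
# secure, the tasks are not independent and the weighted sum can
# overstate (or understate) real progress toward the objective.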

Two is the relationship between MOEs and MOPs (and MOE "indicators", which are sometimes used but are not an official joint term that I've found thus far), and LOOs and LOEs. The notion that MOEs answer "are we doing the right things" and MOPs answer "are we doing things right" sheds no light on how you determine how much "right thing doing" is enough (a thresholding versus aggregation issue), or how you know that the things you are collecting MOP data on are related to the MOEs. (The article Slap references above proposes a "theory of action", or what I would call a "cognitive model", of how and why the tasks you are performing cause the results you observe and attempt to measure with MOEs.) This is compounded by a lack of doctrinal discussion (at least that I've found) of the relationship between COA development and LOO/LOE development, and of how you employ MOEs and MOPs in assessing across scales.
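Here is a small illustration (notional MOEs, values, and thresholds) of why thresholding and aggregation give different answers: an averaged rollup can look acceptable while a single critical MOE sits below its threshold.

# Aggregation vs. thresholding of MOEs, with invented data.
# Each entry: (measured value, required threshold), normalized so
# higher = better.
moes = {
    "attacks_per_week": (0.9, 0.7),
    "market_activity":  (0.8, 0.6),
    "tip_line_reports": (0.2, 0.5),   # failing its threshold
}

aggregate = sum(value for value, _ in moes.values()) / len(moes)
all_met = all(value >= threshold for value, threshold in moes.values())

print(f"Aggregate score: {aggregate:.2f}")   # 0.63 -- looks "amber"
print(f"All thresholds met: {all_met}")      # False -- one MOE is red
# The average hides the failing MOE; the threshold test does not.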

Three is the issue of "not everything that counts can be counted, and not everything that can be counted counts." The focus on algorithmic analysis of quantitative trends, over heuristics and thresholding, makes people think "more information is always better, and quantifiable information is always better than subjective information." Confirmation bias seems rampant in the exercises I've been a part of, with the focus always on metrics that appear to imply "good news" and an avoidance, or at least a burial in the details, of those implying bad news. This harkens back to the period in the history of science when it was believed that carrying out experiments to confirm a theory improved the probability that the theory was correct. The more modern interpretation is that the truth of a statement is best judged by the least contradictory evidence, not the most confirming evidence. This appears to fall victim, in at least some cases, to the desire not to give the Boss bad news...
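As a toy illustration (the observations and hypotheses are entirely invented), scoring competing explanations by what they fail to explain, rather than by confirmation counts, looks something like this:

# Judge hypotheses by least contradictory evidence, not most confirming.
observations = ["O1", "O2", "O3", "O4", "O5"]

contradicts = {   # which observations each hypothesis cannot explain
    "insurgency_weakening":  {"O3", "O5"},
    "insurgency_relocating": {"O5"},
}

for hypothesis, unexplained in contradicts.items():
    confirming = len(observations) - len(unexplained)
    print(f"{hypothesis}: {confirming} confirming, "
          f"{len(unexplained)} contradicting")
# "insurgency_weakening" has plenty of confirming data points, but
# "insurgency_relocating" contradicts less of the evidence and so
# survives better under this standard.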

Again, I fully understand the difficulties of dealing with these complexities in real-world, time-constrained, staff-limited situations. But rather than admit "we don't know what we don't know," we tend to dress up metrics of questionable pedigree into thermographs and stoplight charts that portray a very chaotic and complex situation in a manner that makes it look better understood than it actually is.
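For example (invented metrics and thresholds), a stoplight rollup keyed only to a point estimate throws away exactly the uncertainty that matters:

# A stoplight mapping discards uncertainty: two metrics with the same
# mean but very different confidence get the same color.
def stoplight(mean):
    if mean >= 0.7:
        return "GREEN"
    if mean >= 0.4:
        return "AMBER"
    return "RED"

metrics = {
    "polling_confidence":  (0.75, 0.05),  # (mean, std) -- solid data
    "informant_reporting": (0.75, 0.30),  # same mean, huge uncertainty
}

for name, (mean, std) in metrics.items():
    print(f"{name}: {stoplight(mean)} (mean={mean}, std={std})")
# Both print GREEN; the chart looks equally certain about each, even
# though one rests on far shakier evidence.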

I appreciate the opportunity to learn more from those who have had to deal with it directly!