
Thread: Theoretical underpinnings of Assessment?

  1. #1
    Council Member pvebber's Avatar
    Join Date
    May 2007
    Location
    Rho Dyelan
    Posts
    130

    Default Theoretical underpinnings of Assessment?

    I've been working on several exercises/"experiments" dealing with the Navy's Maritime Operations Center (MOC) development, and one of the big issues I'm seeing is in the area of assessment. The broad framework the MOC is working under is supporting a commander's decision cycle consisting of Plan - Decide - Monitor - Assess. We are making some progress (finally) on the Plan side - and Decide and Monitor have bumps but have been the most solid - leaving us with Assessment as the red-headed stepchild of the bunch.

    The closest thing to a "theory of assessment" I've been able to find is in the ONA-based system-of-systems analysis writing (lately under fire, if not discredited, at least at JFCOM). So what has replaced ONA-SoSA? Or are its principles still around under fresh acronyms? Joint doctrine seems jam-packed with a wide variety of assessments to be done, but almost nothing on methodology.

    Thanks for any enlightenment!
    "All models are wrong, but some are useful"

    -George E.P. Box

  2. #2
    Council Member slapout9's Avatar
    Join Date
    Dec 2005
    Posts
    4,818

    Default

    Check out this month's issue of Military Review. There's an article on the Art of Design and a couple of others. You are essentially correct that it is just a rewording of existing methods, but they have some good insights. Later, Slap.



    http://usacac.army.mil/CAC2/MilitaryReview/

  3. #3
    Council Member pvebber's Avatar
    Join Date
    May 2007
    Location
    Rho Dyelan
    Posts
    130

    Default

    Thanks! Sorry for the misplaced post!!
    "All models are wrong, but some are useful"

    -George E.P. Box

  4. #4
    Council Member pvebber's Avatar
    Join Date
    May 2007
    Location
    Rho Dyelan
    Posts
    130

    Default

    The article on Design hits a lot of the points I am interested in, and builds on the Commander's Appreciation and Campaign Design pamphlet. The need for a "design" step, part of which is to develop a "theory of action" - the missing "model" piece - to me fills a large void in the current plan - decide - monitor - assess framework. Whether this is considered part of "plan" or a separate precursor to planning, I leave to the doctrine types to ruminate on, but I found this VERY useful.
    "All models are wrong, but some are useful"

    -George E.P. Box

  5. #5
    Council Member slapout9's Avatar
    Join Date
    Dec 2005
    Posts
    4,818

    Default

    pvebber, you may want to Google Dr. Russell Ackoff; a lot of this comes from his theory of Interactive Planning, which is pretty good stuff. It's a planning theory based on the idea that the only good plan is one that is constantly adapting. Ackoff was one of the leaders of the Systems Thinking movement from WW2 on, and is still alive, I believe. Later

  6. #6
    Council Member Ken White's Avatar
    Join Date
    May 2007
    Location
    Florida
    Posts
    8,060

    Default You might also investigate

    Soft systems methodology LINK.

  7. #7
    Registered User
    Join Date
    Mar 2009
    Posts
    2

    Default

    It sounds to me like you're looking for info on how to do different types of assessment, one of them being "operational assessment." At II Marine Expeditionary Force, we developed an assessment methodology beginning back in '02 that served us well throughout OIF-I, 04-06, and (hopefully) to this day. Send me a message and I'll give you a point of contact. I believe there are pubs on this as well. Keep in mind that senior commanders should be focused on progress toward major decision points in a campaign, not so much on "combat assessment" of current operations. The commander wants to know if (and when) he's approaching a decision point, especially those that require either sequel or branch planning.

  8. #8
    Council Member pvebber's Avatar
    Join Date
    May 2007
    Location
    Rho Dyelan
    Posts
    130

    Default

    What I'm digging into gets at a number of levels of assessment and relationships involving assessment.

    One is the interrelationship between the planning side, where COA development assembles a scheme of causal actions believed to result in fulfilling the Commander's intent, and the assessment side, where the divergence between what was anticipated in planning and what is actually occurring is determined and fed back into planning for adaptation.

    Current planning and assessment doctrine makes an implicit assumption of independence between objectives and COA tasks, which allows a hierarchical decomposition. There is also an implicit assumption that lower-level assessments and metrics can be aggregated in a linear manner to provide an indication of progress or regress toward objectives.
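That linear-aggregation assumption is easy to illustrate. The sketch below (my own toy example, not anything from doctrine; the scores, weights, and floor value are invented) contrasts a weighted-average roll-up with a threshold-based one, showing how a linear roll-up can mask a single collapsing subordinate task behind a "mostly green" aggregate:

```python
# Illustrative toy example: two ways to roll up subordinate assessment
# scores (0.0 = failing, 1.0 = achieved). All numbers are invented.

def linear_rollup(scores, weights):
    """Weighted average -- the implicit doctrinal assumption that
    lower-level metrics aggregate linearly toward the objective."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

def threshold_rollup(scores, floor=0.5):
    """Alternative: progress only counts if every supporting task
    clears a minimum floor; otherwise the objective reads as failing."""
    return min(scores) if all(s >= floor for s in scores) else 0.0

# Three supporting tasks: two going well, one collapsing.
scores = [0.9, 0.9, 0.1]
weights = [1.0, 1.0, 1.0]

print(round(linear_rollup(scores, weights), 2))  # 0.63 -- looks "mostly green"
print(threshold_rollup(scores))                  # 0.0  -- the failing task dominates
```

Neither roll-up is "right"; the point is that the choice of aggregation rule is itself a modeling assumption that doctrine leaves implicit.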

    Two is the relationship between MOEs and MOPs (and MOE "indicators", which are sometimes used but are not an official joint term that I've found thus far), and LOOs and LOEs. The notion that MOEs answer "are we doing the right things" and MOPs answer "are we doing things right" sheds no light on how you determine "how much right-thing-doing is enough" (a thresholding vice aggregation issue), or on how you know that the things you are collecting MOP data on are related to MOEs. (The article Slap references above proposes a "theory of action", or what I would call a "cognitive model", of how and why the tasks you are performing cause the results you observe and attempt to measure with MOEs.) This is compounded by a lack of doctrinal discussion (at least that I've found) of the relationship between COA development and LOO/LOE development, and of how you employ MOEs and MOPs in assessing across scales.
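One way to see the value of an explicit "theory of action" is to write the assumed MOP-to-MOE link down as a testable model. The sketch below is entirely hypothetical (the MOP names, coefficients, and MOE values are invented for illustration): once the causal belief is explicit, assessment can compare the observed MOE against the model's prediction, so a divergence flags the model itself rather than just the tasks:

```python
# Hypothetical sketch: an explicit "theory of action" linking MOPs to an
# MOE prediction. All names and numbers below are invented illustrations.

def predicted_moe(patrols_per_week, tips_received,
                  slope=0.02, intercept=0.1):
    """A deliberately simple linear cognitive model: more patrols and
    more citizen tips are *believed* to raise perceived security (0..1)."""
    return min(1.0, intercept + slope * (patrols_per_week + tips_received))

def model_divergence(observed_moe, patrols_per_week, tips_received):
    """Assessment feedback: how far reality diverges from what the
    theory of action predicted. Large negative values suggest the
    causal model, not just task execution, needs revisiting."""
    return observed_moe - predicted_moe(patrols_per_week, tips_received)

# MOPs are up (20 patrols/week, 5 tips), but the measured MOE is flat
# at 0.35 against a predicted 0.60 -- a -0.25 divergence.
print(round(model_divergence(0.35, 20, 5), 2))
```

The model here is crude on purpose; the design point is that making the prediction explicit gives assessment something to falsify.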

    Three is the issue of "not everything that counts can be counted, and not everything that can be counted counts". The focus on algorithmic analysis of quantitative trends, over heuristics and thresholding, makes people think "more information is always better, and quantifiable information is always better than subjective information". Confirmation bias seems rampant in the exercises I've been a part of, with the focus always on metrics that appear to imply "good news" and an avoidance, or at least burial in the details, of those implying bad news. This harkens back to the period in the history of science when it was believed that carrying out experiments to confirm a theory improved the probability that the theory was correct. The more modern interpretation is that a statement is best judged by how well it survives attempts to contradict it, not by the amount of confirming evidence. This appears to fall victim, in at least some cases, to the desire not to give the Boss bad news...

    Again, I fully understand the difficulties of dealing with these complexities in real-world, time-constrained, staff-limited situations. But it appears that rather than admit "we don't know what we don't know", we tend to dress up metrics of questionable pedigree into thermographs and stoplight charts that portray a very chaotic and complex situation in a manner that makes it look better understood than it actually is.

    I appreciate the opportunity to learn more from those who have had to deal with it directly!
    "All models are wrong, but some are useful"

    -George E.P. Box

