Armchair Dragoons Forums


Author Topic: The Obstacles On The Road To Better Analytical Wargaming (WotR)  (Read 5758 times)

bayonetbrant

https://warontherocks.com/2019/10/the-obstacles-on-the-road-to-better-analytical-wargaming/

Quote
Now that Bob Work has left the building — well, two years ago — what is the future of wargaming as an analytical method in the Department of Defense? That’s not an easy question to answer. The last few years have seen both steps forward and backward as supply sought to meet demand. But now demand may be tapering off, as traditional biases for more mathematically based methods begin to reemerge and attempt to reclaim the mantle of analytical validity, further complicated by the uninformed fascination with machine learning and “big data” as an alternative to gaming.

As an advocate for, and practitioner of, analytical wargaming within the Department of Defense, I’ve witnessed some good things emerge over the past few years, but also enough poor practices to reinforce, to no small extent, the criticisms of gaming made within the operations research community. Peter Perla’s 2016 call to improve wargaming, while widely read and commented on at the time, was mostly ignored by wargame practitioners among federally funded research and development centers, universities, and defense contractors, who, frankly, seem largely content to continue on with business as usual.

It would be easy to dole out a list of examples here, but some time spent perusing the Defense Department’s Wargame Repository (if you have access) will tell the story. With million-dollar wargames putting out such profound insights as “Cyber will be very important in future warfights,” it becomes hard to justify a method that is often costly, yet generates such paltry and obvious results.

Perla’s description of the BOGSAT (“bunch of guys sitting around a table”) masquerading as an analytical exercise hits the nail on the head. Attendees of such games often come away wildly enthusiastic about the success of the wargame in which they often “learned a great deal.” Yet when pressed they are hard-put to provide a single insight that most of the participants didn’t already know when they started. This is made worse by wargame results that are often all too easily falsified when underlying assumptions are subjected to more rigorous analytical scrutiny. The pathologies catalogued in Chris Weuve’s monograph on Wargame Pathologies are demonstrated often enough in practice, but if we’re honest with ourselves it’s not hard to understand how going into an intellectually stimulating environment for an extended period of time with people of similar interests would be very gratifying personally. But that’s not the same thing as a successful wargame.

In my experience, frustrations with this trend were foremost when Work, then deputy secretary of defense, tried to realize a more robust wargaming enterprise. The real challenge was to create a wargaming paradigm that would be accepted by various analytical organizations. Wargame providers, however, were content to dismiss analytical concerns and requirements, in one case going so far as to bluntly state that despite client concerns, wargames were theirs to execute as they saw fit. They were, after all, the professionals, and they would not be told how to do their work. Unfortunately, the work of these professionals did not meet informed client needs.

As a result of the direct failure of the wargame community to adequately address analytical concerns, the department created wargame teams capable of designing and executing wargames that would be fully integrated with more traditional analytical methods. It had to be done on tight timelines, and it required that every game be custom-designed specifically to address level-of-analysis concerns. The games had to be rigidly adjudicated and conducted with as few external participants as possible, bringing in subject-matter experts only where needed. Designers were brought in from the commercial board-game world, along with subject-matter experts, research methodologists, and analysts. Over the course of a six-month study there could be as many as five different rigidly adjudicated game systems created, with execution consisting of as many as 20 to 30 games, all while integrating with traditional analysts who would run or create models and simulations to validate or falsify game assumptions. The efforts often spanned levels of analysis from the tactical all the way to the geostrategic within one study. The results of these studies had direct and significant impact on several high-profile defense programs, to include net assessments, operational plans, and significant acquisition programs. The games were analytically credible by virtue of the fact that multistage efforts created end-to-end logical narratives for why the results were what they were, and why it mattered. And, in some instances, some genuine innovation emerged.

It is significant that much of this effort had to do with the concept of analytical ownership. Traditional wargame practitioners provided a service, not an analytical product. This is to say that the traditional approach of getting a bunch of subject-matter experts into a room, dividing them into teams, and facilitating a red-blue interaction with the end result being a report back on what happened was not acceptable. What’s more, it was not a valid method with which to get at any complex issue in any analytical detail. Instead, the department sought a process of analytical ownership in which there was a design of research that incorporated wargames along with other methods, and with a final product that described in narrative detail why the effort calibrated around certain theories of success.

Several agencies within the Defense Department, particularly within the Office of the Secretary of Defense and the combatant commands, have now seen the effectiveness and impact of a complete analytical process that incorporates wargames and are now beginning to consider how they might do the same thing. The notable exception to this interest has been among more traditional wargame practitioners in the wargame community. To date, not a single federally funded research and development center, contractor, or educational institution that purports to provide a wargame service has shown the slightest interest in providing a complete analytical solution that incorporates wargames, nor have they shown interest in analytical ownership of the outcome. To my knowledge, none has even been interested enough to ask what the requirements are. Apparently there is enough demand elsewhere to keep the wargame community busy, and if the reports I’ve read generated from recent wargames are any indication, the BOGSAT is alive and well.

And with new leadership comes new priorities. I can’t predict what any of that will mean for the wargame community at large, but the desire for more analytically robust wargames is certainly present and consumers at the Defense Department are now aware that better is possible. The question remains whether the community at large will step up to address that desire, or continue along the current path of least resistance, providing visceral experiences devoid of rigor or analytical depth.

It is not my intent to paint the entire community with one brush. But frustration with the professional wargame community of practice is real and growing among many of us in the department. Substantial investments have been made in facilities and institutions to improve wargaming as an analytical tool, but many feel this investment has been wasted on practitioners too wedded to what they’ve always done to make any real improvements to the process. Witness the great concern in the wargame community over how the next generation of wargamers will be trained, even as it avoids any substantive discussion or investigation of what analytical requirements wargame consumers actually need met. Concerns over the community of practice’s future might be best addressed by finding out what the consumers actually want out of wargames and what has been done along those lines already within the analytical departments of the Department of Defense.

Dr. Jon Compton is a senior analyst and wargame subject-matter expert in the Office of the Secretary of Defense. He holds a doctorate in formal research methods and world politics. The views expressed in this article are those of the author and do not reflect the official policy or position of the Department of Defense or the U.S. government. The appearance of external hyperlinks does not constitute endorsement by the U.S. Department of Defense of the linked websites, or the information, products, or services contained therein. The Defense Department does not exercise any editorial, security, or other control over the information you may access at these locations.

=+=+=+=+=+=+=+=+=+=+=+=+=+=++

Random acts of genius and other inspirations of applied violence.
-~-~-~-~-~-~-~-~
Six Degrees of Radio for songs you should know by artists you should love


bayonetbrant

Reply #1 on: October 23, 2019, 07:07:15 AM
There's been some follow-up

Perla and some others have directly addressed Compton's article

https://warontherocks.com/2019/10/rolling-the-iron-dice-from-analytical-wargaming-to-the-cycle-of-research/

Quote
Is U.S. wargaming in dramatic need of reform? Jon Compton of the Office of the Secretary of Defense insisted the answer is yes in a recent article that caused some consternation and debate in the community of defense wargamers. Compton said much of value, but more needs to be said. He poured a heavy dose of criticism over the wargaming community, not all of which is well targeted.

We certainly agree about the need to integrate wargames with analyses, exercises, and assessments as part of — dare we say it? — the cycle of research. Indeed, CNA and others have striven to do exactly that — when the sponsors of our work have been open to doing so. We disagree with Compton, however, about giving the wargaming community the central role and responsibility for integrating all aspects of the cycle of research.

It is long past time for the leadership of the Department — perhaps acting through the groups Compton calls on the carpet (federally funded research and development centers, other contractors, and educational institutions) — to break apart the stovepipes of analysis, wargaming and, indeed, of “analytical wargaming” as Compton terms it. Pentagon leadership needs to focus on integrating those stovepipes into a new paradigm for providing comprehensive advice to senior leadership. These senior leaders should include not only those within the Office of the Secretary of Defense, but also those of the services, the various operational and functional commands, and the research community writ large. That senior leadership will best be served not merely by better analytical wargaming but primarily by their own broad-based commitment to integrate wargaming with analysis, exercises, experiments, and real-world assessments. It is through such integration that senior leaders — indeed, leaders at all levels — can base their crucial strategic, programmatic, operational, and tactical decisions on the most comprehensive information and insight available.

Compton has illuminated the way forward through his description of the project he undertook at the Office of Cost Assessment and Program Evaluation (CAPE) to create what he termed “analytical ownership.” It was a process involving repeated cycles of analysis in the traditional sense in order to, as he describes it, “run or create models and simulations to validate or falsify game assumptions.” In turn, the games themselves became “analytically credible by virtue of the fact that multistage efforts created end-to-end logical narratives for why the results were what they were, and why it mattered.” This process is precisely what one of us proposed in coining the “cycle of research” nearly 30 years ago.

The underlying power behind the cycle of research has been demonstrated on several occasions. The integration of technical research and experimentation, fleet exercises, and Naval War College wargames played an essential part in preparing the leadership of the U.S. Navy for their roles during World War II. More recently, the detailed technical analyses and iterated wargaming carried out by the Naval War College’s Halsey Alfa group have influenced fleet experiments and exercises and informed Navy decisions about both future programs and operations.

The organizations that make up the “wargaming community” that Compton criticizes so harshly — federally funded research and development centers, other contractors, and educational institutions — are not all in the position of being their own masters, distinct from the government agencies that must sponsor and fund such work. Although Compton implied that federally funded research and development centers, as well as others, should “take analytical ownership” of this process, it is important to recognize that the CAPE effort was sponsored and executed with government leadership. The Naval War College’s Halsey Alfa group has been using a similar paradigm for more than a decade.

Indeed, we use that term, paradigm, with malice aforethought. Since the McNamara era’s introduction of the concepts of systems analysis into the Pentagon’s lexicon, analysis has become a mantra of truth. Even the term Compton uses — analytical wargaming — demonstrates obeisance to the concepts of analytical rigor and objectivity based on the principles of economics and the physical sciences. For too long that paradigm has seduced both the analysis and wargaming communities within the Defense Department into judging the value of all tools, regardless of their character and use, by standards of validity and utility too narrow to encompass the full range of truth and value.

The paradigm should change.

Instead of imposing the tenets of systems analysis and operations research on wargaming, or those of wargaming on analysis, it is time for the Department — not its supporting contractors and institutions — to recognize the essential need to integrate all the intellectual tools at its disposal across all levels of decisions. And it is at the locus of those decisions that the need should be recognized and the supporting expertise tasked and funded to meet new requirements. The supporting contractors and institutions are very much in a position to fill those requirements but are not — and should not be — in a position to determine what those requirements are. Defense leaders in government and the military ought to take responsibility to use all the tools at their disposal — including operational forces as well as the operations research, analysis, and wargaming communities — to gather the information they need to make the best possible decision.

But it is also imperative that each of these communities draw on the knowledge and experience of the others. Paper analyses and wargames should leverage and be leveraged by practical activities such as experiments, exercises, and real-world assessments. One simple way to think about the essential cycle is this. Analyses help us understand the effects and effectiveness of current and future weapons, systems, and concepts. Wargames help identify how that understanding — and what we don’t know or are mistaken about — may influence how we act, and in doing so can help us identify critical analyses that need to be done. Exercises, experiments, and real-world assessments help us understand better how real people and systems perform in the real environment. Which of these pieces can we do without?

At every stage, the owners of the problem — the strategists, programmers, operational commanders, and others — are the ones who ultimately must specify what they want to learn and translate what the results of all those efforts mean for their own decisions. It is the Department of Defense — not the federally funded research and development centers, contractors, and educational institutions — that should take the “analytical ownership” Compton calls for. In his own recent article highlighting the value of the cycle of research, Phillip Pournelle proposes a way forward for the Department of Defense to address the challenges it faces. His prescription includes the traditional application of common scenarios across the department. There may be value to such scenarios, but they may also prove just another bureaucratic box to check while designers of games and analyses find creative ways to “drive a truck through” them to focus on their preferred issues and solutions.

Within that context, the wargaming community is not without sin. As Compton points out, there are bad wargames — and even worse events masquerading as wargames — being perpetrated on the department. The community of those who recognize and understand the strengths and weaknesses of wargaming should be unrelenting in their critiques of such events, just as they ought to be unflagging in their efforts to help educate and energize the leaders of the department — those who fund and sponsor wargaming and other research. Both the leaders and their supporting communities need to look beyond individual philosophies and tools and to integrate wargames, exercises, analyses, experiments, and real-world assessments into the comprehensive information and advice needed by those who must make the critical decisions affecting national security.



bayonetbrant

Reply #2 on: October 23, 2019, 07:08:47 AM
Apparently Pournelle started writing his article before Compton's was ever published, and updated it for context afterwards.

https://warontherocks.com/2019/10/can-the-cycle-of-research-save-american-military-strategy/

Quote
There’s a debate in the Pentagon about wargaming, and it’s heating up. With a recent War on the Rocks article, Jon Compton, senior analyst and wargame subject-matter expert in the Office of the Secretary of Defense, has thrown his hat into the ring. Titled “The Obstacles on the Road to Better Analytical Wargaming,” the essay lays out a powerful case that the Defense Department’s wargaming enterprise is broken.

Compton argues that wargamers have ignored Peter Perla’s call to reform the art of wargaming. Many practitioners continue to execute wargames which aren’t wargames (e.g. Bunch of Guys and Gals Sitting Around a Table), and have failed to adapt their types and styles of games to what the customers ask for. He then describes, accurately in my view, how many wargaming practitioners lack “analytic ownership,” and fail to properly construct their games using multiple methods.

While I largely agree with Compton’s criticism, I think he paints with too large a brush. Many in the wargaming community are working for the very reforms he calls for. Others work in fields which don’t directly apply, such as training or education. In some areas, however, he doesn’t go far enough. His article fails to highlight the danger of the status quo, and the real risk that poorly constructed analysis (not just wargaming) can lead to battlefield losses. The future force is in danger of being designed based on the impetus of services’ prerogatives and history rather than on a proper inquiry, exploration, and evaluation worthy of a joint force. The detachment of wargaming and the other elements of analysis from an integrated approach cuts the military adrift from its analytic moorings just when the nation and its allies need it the most.

The United States has the analytic talent to implement a proper cycle of research. But this will not be sufficient unless the Department of Defense can organize its functions and incentivize integration of the disparate analytic activities. Fortunately, the Department of Defense has an analytic community with the tools necessary to accomplish the task and a historically successful formula to follow if it chooses to reenergize its efforts. By recreating the cycle of research — which was a success during the Cold War — the Department of Defense can build a truly innovative joint force to take on our new challenges. To do so requires coordinated effort, common scenarios the services can work from, and the synchronized use of multiple diagnostic techniques.

The Challenge of the Century

Choosing and building the right innovative force is hard. It’s difficult because it involves an enormous organization for which change is painstaking, a need to think long-term when focusing on the current crises is so much easier, and cunning competitors who are attempting to outwit the United States.

Developing the right force is also vital to American national interests. The National Defense Strategy Commission of 2018 reported that the military doesn’t have the operational concepts it needs, and struggles to “link objectives to operational concepts to capabilities to programs and resources.” None of these issues can be fixed unless the Defense Department repairs its analytical capability. A report by the Government Accountability Office identifies a good place to start: it specifically notes the lack of a common set of scenarios to be used in strategic analysis. This set of scenarios was once known as the ‘analytic agenda,’ and later became ‘support to strategic analysis.’ Support to strategic analysis has been hindered by three interrelated challenges. The scenarios were said to be cumbersome and inflexible. It was claimed that the analysis did not significantly deviate from services’ programmed force structures or test key assumptions. Finally, it was claimed that the Defense Department lacked the joint analytic capabilities to assess force structures. As frustration increased, key stakeholders and leaders disengaged from the support to strategic analysis process by 2014. Just as importantly, the services’ preferred force structures and their assumptions were not tested until the budgetary process had already been completed, leaving little time and no incentive to examine true alternatives.

Congress required one service to develop truly alternative force structures. Specifically, it made the Navy deliver three alternatives from three different sources, but these alternatives do not appear to have had any significant impact on the 30-year shipbuilding plan. However, the Navy is conducting a new force structure assessment, which may incorporate some of those concepts.

There are those within the defense analysis community who believe that these critical assessments are inaccurate and overstated. Indeed, there is plenty of reason to be optimistic. The Defense Department is blessed with an energetic analytic community capable of tackling these problems with multiple complementary techniques, and who can look back on a historically effective approach for guidance in the future. Meanwhile, an organic grassroots solution has sprung up from within the analytic community. Those who wish to join in the effort can contribute to the solution.

The combined talents of the analytic community — wargamers and operations analysts alike — could help the Pentagon meet the challenges posed in the National Defense Strategy. While there has been a historic distrust between the two communities, an alliance was forged at a special workshop under the aegis of Deputy Secretary of Defense Bob Work as he campaigned to reinvigorate wargaming. While wargaming, when done correctly, is very valuable for a rapid exploration of the decisions and alternatives competing sides may take, computer-based campaign analysis is extremely valuable in rigorously examining the details and assumptions behind those decisions.

When wargaming and analysis are combined with data collected from experimentation, fleet or field exercises, and weapons tests, it completes a cycle of research that was a key ingredient to winning the Cold War. For example, during the 1980s the Chief of Naval Operations’ Strategic Studies Group conducted a series of games and analyses on how to conduct anti-submarine warfare operations against the Soviet navy using team tactics coordinating maritime patrol aircraft, ships, and submarines. The captains who were part of the Strategic Studies Group then went out into the fleet in leadership positions where they implemented those tactics, making it clear to the Russians that they would be no match if the Cold War went hot.

The Cycle of Research

Wargames are crucial in the examination of what competitors can do. They provide insight on the range of options available to rivals who do not have the same perspective as Americans and their allies. The choices of a competitor are strongly shaped by national history, philosophy, and organizational psychology. Furthermore, wargames provide the means to generate a variety of outcomes that assist in the exploration and understanding of potential future scenarios, and the preparation of those who participate in them to respond to the range of potential settings. However, wargames can often lack the precision and rigor needed for the development of an order of battle, budgets, and data necessary for the services and the Defense Department to complete their task of manning, training, and equipping the future forces.

Computer-based campaign models are crucial to the use of science and quantitative analysis to gain insight into the necessary forces, the trained personnel needed to operate those forces, the logistics crucial to operating them, the command and control network to enable them, and myriad other issues. However, all such “analysis must simplify and often discard much that is not reproducible or readily predictable — including at times, human behavior.” The danger is that, as the analyst searches for a measurement of performance, that measurement can often be based on the American perspective, thus either discounting a competitor’s perspective or, worse, projecting an American perspective onto competitors. This can lead the analysis into an intellectual cul-de-sac where the actions of competitors can neutralize American capabilities.

Wargaming and campaign analysis are dependent on real-world events for their validity. The conduct of, and data collection in, experimentation, fleet or field exercises, and weapons tests are critical to ensuring that wargames and analysis are rooted in reality. While the games ask what we can do, and analysis answers how best to do it, real-world events tell us whether it can really be done and what factors influence it.

During the Cold War, the Office of the Secretary of Defense and the armed services employed a very successful multi-disciplinary approach that contributed to the American victory over the Soviet Union. The Department developed a series of recommendations to implement peacetime (and if necessary, wartime) approaches to win a long-term competition, including what people now call grey zone competition. While a large set of wargames was used as a diagnostic tool, the Pentagon leveraged these wargames by using them to inform the analysis conducted by other sections of the Defense Department. This enabled the department to use the games to contribute to the development of truly alternative force structures from which current forces arose.

After the end of the Cold War, the pursuit of efficiencies in the face of budget cuts, the disruptions caused by sequestration, and uncertainty about the nature of the dominant threat combined to erode leadership support for the cycle of research. While individuals and offices tried to continue to play their part, without central direction the various disciplines reverted to intellectual silos.

The Way Forward

If the Department of Defense is to successfully take on the challenges described in the National Defense Strategy and compete with other major powers, it will need to support the analytic community to employ all diagnostic approaches — wargaming, analysis (including modeling), and exercises — towards the problem.

It will also have to take a very hard look at its analytic organizations and ask how they do or do not contribute to the cycle of research. How is my wargame contributing to the identification of hypotheses for follow-on analysis or issues to be examined in an exercise? How do the assumptions in my models and simulations stand up to decisions made by experienced players in my games or in fleet/field exercises? Have I fully explored the range of decisions competitors (particularly from the Red perspective) may make before I build my model? How am I capturing data in fleet and field exercises to validate or negate assumptions in my games and modeling? How are my efforts contributing to an overall cycle of research? If these questions cannot be answered, then the public good has not been fully supported.

In the current budget process, the Joint and Service Staff assessment capabilities (including campaign analysis) are brought to bear on the force structures after the budgetary processes have chosen a force. This means there is no real consideration or analysis of truly alternative joint forces in the process, nor trade-off analysis within the joint force. The result is that the services continue to build what they have built in the past. If the Department of Defense is to be effective, then the Secretary of Defense and his staff must have the means to rapidly conduct wargames and assessments of alternative forces (including alternative concepts of operations) early and often in the process.

The value of a common and consistent set of scenarios for the joint analytic process is widely recognized. As a result, an organic grassroots effort has arisen from the services. The services have voluntarily combined their efforts to rebuild the common set of scenarios in the form of the Joint Forces Operational Scenario initiative. Rather than a slow top-down approach, the services have been working together to develop scenarios and a common set of tools to enable sharing of scenarios, assumptions, and results. The Joint Forces Operational Scenario effort recently produced the first joint baseline scenario concept of operations document in only six months.

The Department of Defense should support the communities of practice within the analytic community as they share and refine best practices and explore new techniques. It should use wargaming to examine the wide range of potential scenarios, and employ the more precise analytic approaches to tackle common critical issues within them or detailed near-term campaigns. All of this should be validated by fleet and field exercises and experiments to make sure all analysis and gaming is rooted in reality.

Those who wish to contribute to the solutions of these dilemmas can join in the deliberations of the analytic community at two upcoming workshops — one on wargaming and analysis of cyber operations, and another on improving campaign analysis (including its interrelations with other techniques).

The secretary of defense can build on these combined endeavors to enable truly alternative forces to be examined and assessed. Continued success requires all the elements of the national security analytic community to work together, each contributing their own perspective, led by the secretary of defense, to build the innovative joint force that the United States needs.




trailrunner

Reply #3 on: October 23, 2019, 05:56:21 PM
I'm pretty familiar with the MS&A (modeling, simulation, and analysis) done within DoD to support force structure and especially major acquisition programs.  I have spent countless hours arguing the validity of models.  In my opinion, there are three barriers to truthful outcomes:

1)  The simulations are performed to give us a desired answer. Early in an acquisition program, they are used to justify the new start by portraying the enemy as having an overmatching capability that will defeat us if we don't buy the new system. Then, later in the program, the simulation will be used to show that if we go to full-rate production, America will be victorious, even though the system didn't meet any of its requirements. Keep in mind that the simulations are developed by contractors (either the prime building the system or support contractors), so if they don't give the desired answer, jobs will be lost. Even among the government folks (like me), if you try to be objective but it goes against the flow, you will find yourself with a basement office in the Pentagon, your funding will be jeopardized, and you won't be given the chance to present your results. There are not a lot of honest brokers, nor checks and balances.

2)  Modeling full-scale battle is hard.  In the 1990s, when PCs were becoming powerful and the maturing internet was allowing massive networked simulations, DIS (distributed interactive simulation) was going to be able to replicate hundreds of thousands of entities with physics-based models, and *therefore* it would be realistic, and anyone who disagreed was a naysayer.  Well, I was that naysayer, and I presented some papers showing some limitations with this way of thinking.  The point is that even if we are trying to be completely objective, modeling even small-scale force-on-force warfare is very, very hard, and we are nowhere close to being able to do it credibly.  That doesn't mean that MS&A is worthless (not at all), but the limitations need to be discussed objectively.  This sounds simple, but it is rarely done honestly.

3)  Predicting the next war is very difficult.  Wars often bring huge surprises.  The enemy will do things that nobody would have ever thought of.  Besides that, we don't know where we'll be fighting next.  Sure, we can come up with plausible scenarios and learn a lot from those.  Red-teaming or OPFORs are useful, but they must be given free rein, and even then, they are constrained by what's built into the model.  The "unknown-unknowns" are by definition not accounted for.


I’ve spent half my life’s earnings on wargames, women, and drink. The rest I wasted.


bob48

Reply #4 on: October 24, 2019, 07:34:35 AM
That's pretty sobering stuff.

“O Lord God, let me not be disgraced in my old days.”

'We few, we happy few, we band of brothers'