
Dr. John R. "Buck" Surdu

Senior Scientist
Cole Engineering Services, Inc.

Dr. John R. “Buck” Surdu retired from the US Army after almost 29 years of service. After working in the research directorate of the National Security Agency, he joined Cole Engineering as a senior scientist. He served as project manager for the Army’s OneSAF program and as a program manager at DARPA for the Deep Green program, among other projects. He holds a Ph.D. in computer science, with refereed publications in artificial intelligence, modeling and simulation, and software engineering. He is a charter member of the Modeling and Simulation Professional Certification program. His current research interests are diverse and include developing technologies to improve live, force-on-force training.

Abstract 1: Optically Based Small Arms Targeting for Live Force-on-Force Training

Current tracking technologies used to estimate Soldier and weapon location and orientation are insufficient to support long-range direct-fire engagements in live force-on-force exercises and in mixed- and augmented-reality (AR) training applications. Tracking solutions developed for augmented reality support engagements only at ranges of a little over 50 meters, yet Soldiers and Marines are trained to fire at targets at 375 meters or more. Laser-based systems suffer from a number of drawbacks: (1) they do not require shooters to lead moving targets, (2) they do not require soldiers to properly adjust weapon elevation based on target range, (3) they cannot represent grenade launchers, and (4) they are blocked by foliage. Each of these drawbacks results in negative training. We present a novel approach to determining hits and misses with small arms fire that enables longer-range engagements, without the use of lasers, for live force-on-force training applications. The basic intuition behind this approach is that instead of requiring high-resolution estimates of soldier and weapon location (to within four centimeters) and orientation (to within 100 microradians or better), the calculation starts from the reported locations of the shooter and target and the shooter’s sight picture at the moment the round is fired. No harnesses, markers, reflectors, or indicia are used in this approach. With the sight picture as the starting point of the computation, we know the aim point with respect to the target precisely at the time the trigger is pulled. Was the shooter leading the target properly? Did the shooter aim above the target’s center of mass because he was firing at long range? This information is used to compute whether a shooter should be credited with a hit or a miss at long range. The system, Optically Based Small Arms Targeting (OBSAT), has successfully demonstrated accurate recording of hits and misses at distances of up to 375 meters against stationary, moving, and partially occluded targets. Most recently we have integrated the M-203 grenade launcher into OBSAT, enabling grenadiers to participate in live training for the first time. We have also integrated a wound model that enables OBSAT to determine the part of the body impacted by the round and to generate location-appropriate wounds. This will improve training by enabling soldiers to conduct appropriate self- and buddy-aid and by informing medic training. Results of this work are presented, as well as areas of future research required to make this technology fieldable.
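
To make the kind of calculation described above concrete, the minimal Python sketch below compares the aim offset read from a sight picture against the lead and hold-over that simplified, drag-free ballistics would require. All names, constants, and silhouette dimensions are illustrative assumptions, not the OBSAT implementation.

"""
Minimal sketch of a sight-picture-based hit/miss check.
Hypothetical names and simplified, drag-free flat-fire ballistics;
sign conventions are simplified (lead is treated as a positive
horizontal offset in the direction of target motion).
"""
from dataclasses import dataclass
import math

G = 9.81                 # gravity, m/s^2
MUZZLE_VELOCITY = 900.0  # m/s, assumed constant (no drag) for this sketch


@dataclass
class Shot:
    shooter_xy: tuple[float, float]   # reported shooter position (m)
    target_xy: tuple[float, float]    # reported target position (m)
    target_vel: tuple[float, float]   # target velocity (m/s)
    aim_offset: tuple[float, float]   # aim point relative to target centre of
                                      # mass, taken from the sight picture
                                      # (horizontal, vertical), in metres


def required_corrections(shot: Shot) -> tuple[float, float]:
    """Lead and hold-over the shooter should have applied."""
    dx = shot.target_xy[0] - shot.shooter_xy[0]
    dy = shot.target_xy[1] - shot.shooter_xy[1]
    rng = math.hypot(dx, dy)
    tof = rng / MUZZLE_VELOCITY            # time of flight (no drag)
    # lateral lead: distance the target moves during the time of flight
    lead = math.hypot(*shot.target_vel) * tof
    # hold-over: bullet drop accumulated over the time of flight
    drop = 0.5 * G * tof ** 2
    return lead, drop


def is_hit(shot: Shot,
           half_width: float = 0.25,
           half_height: float = 0.45) -> bool:
    """Compare the observed aim offset against the required corrections.

    half_width / half_height roughly approximate a target silhouette (m);
    these values are illustrative only.
    """
    lead, drop = required_corrections(shot)
    err_h = shot.aim_offset[0] - lead      # horizontal aiming error
    err_v = shot.aim_offset[1] - drop      # vertical aiming error
    return abs(err_h) <= half_width and abs(err_v) <= half_height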

Abstract 2: Simulation Integrated Into Mission Command Systems

This work examines the practicality and effectiveness of embedding simulations into a mission command device. This is a different approach from merely making simulations interoperable with mission command systems, which has been possible for many years. The goal is to use only the operational plan in theater as simulation input, hiding all simulator details from the operator so that he or she does not need to learn new tools. A prototype capability that produces a Course of Action (COA) analysis is discussed. For this effort, we used SitaWare to create courses of action using that tool’s native planning capabilities. The user created no scenario files and knew nothing about the simulation. When the user clicks the “simulate” button, either the Marine Air Ground Task Force (MAGTF) Tactical Warfare Simulation (MTWS) or the One Semi-Automated Forces (OneSAF) simulation is used to simulate the plan. The intelligence planner creates courses of action for the enemy. After inputting the operational plan, the commander selects the number of simulation runs to execute and presses a button to start the simulation, which runs faster than real time in the background. The user never sees the simulation or controls any of the simulation entities. There are no scenario files or other intermediate artifacts. Both MTWS and OneSAF pull data directly from the SitaWare data store and place the simulation results back into the data store when complete. When the simulation runs complete, the commander may view the results in graphics and charts that compare the multiple runs. The system presents a decision support matrix to aid the commander. The user may then conduct some “sensitivity analysis” by weighting selected factors more heavily than others, such as personnel losses or fuel consumption, or by weighting the most likely and most dangerous enemy courses of action. This aids the commander in making an informed choice as to which course of action to develop into a full plan.
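
As an illustration of the sensitivity-analysis step, the short Python sketch below re-weights a decision-support matrix built from simulation outputs. The metric names, values, and weighting scheme are hypothetical and do not reflect the SitaWare, MTWS, or OneSAF data models.

"""
Illustrative re-weighting of a decision-support matrix.
Averaged simulation results per friendly COA; lower values are better.
Metrics (losses, fuel, hours) and numbers are hypothetical.
"""
results = {
    "COA 1": {"losses": 42, "fuel": 1800, "hours": 14},
    "COA 2": {"losses": 35, "fuel": 2400, "hours": 11},
    "COA 3": {"losses": 51, "fuel": 1500, "hours": 16},
}

def score(weights: dict[str, float]) -> dict[str, float]:
    """Weighted sum after normalising each metric to [0, 1] across COAs."""
    scored: dict[str, float] = {}
    for metric, weight in weights.items():
        lo = min(r[metric] for r in results.values())
        hi = max(r[metric] for r in results.values())
        for coa, r in results.items():
            norm = (r[metric] - lo) / (hi - lo) if hi > lo else 0.0
            scored[coa] = scored.get(coa, 0.0) + weight * norm
    return scored

# Baseline: all factors weighted equally.
print(score({"losses": 1.0, "fuel": 1.0, "hours": 1.0}))
# Sensitivity run: the commander weights personnel losses more heavily.
print(score({"losses": 3.0, "fuel": 1.0, "hours": 1.0}))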
