Researchers at the U.S. Army Research Laboratory continue to develop and evaluate methods for navigation and communication that are ‘hands-free, eyes-free and mind-free’ to aid Soldiers in the field.
Soldiers wear a lightweight belt containing miniature haptic technology around the torso. The belt provides vibratory, or tactile, cues that allow a Soldier to navigate to map coordinates and receive communications while still carrying a weapon.
Researchers said initial feedback from Soldiers testing the device is positive. Soldiers said they liked being able “to concentrate on other things and not the screen.”
Soldiers are able to move and communicate while keeping visual map displays in their pockets and their eyes on the surroundings.
Vibratory signals are communicated through tactile actuators inside the device. Navigation signals correspond to vibrations or pulses that tell the Soldier which direction to go.
“Data are still being compiled; however, it is clear that Soldiers rarely looked at the visual display when the tactile belt was ‘on.’ Soldier feedback was very positive,” said Gina Hartnett, from HRED’s Fort Rucker, Ala., field element. “This assessment gave us a great example of how a device can free up the senses so effectively. Course times were faster on tactile-assisted navigation legs. Soldiers reported being more situationally aware of their surroundings because they rarely, if ever, had to take their eyes off of their environment. Additionally, not having to interact with a visual display allowed their hands to stay on their weapon.”
As long as the tactile sensation is felt at the front of the torso, the Soldier moves forward. If the sensation is at the side or back, the Soldier simply turns until the GPS-enabled signal is felt at the front.
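To make that mechanism concrete, the sketch below shows one way such a cue could be computed in software: take the bearing from the Soldier’s GPS position to the next waypoint, subtract the Soldier’s compass heading, and pulse the tactor closest to that relative direction. The function names, coordinates and the assumption of eight evenly spaced tactors are illustrative only, not details of the ARL belt.

```python
import math

# Hypothetical sketch: map the direction to the next waypoint onto one of
# N tactors spaced evenly around a belt, with tactor 0 at the front of the torso.
NUM_TACTORS = 8  # assumed spacing of 45 degrees between tactors

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees from true north."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def tactor_for_waypoint(soldier_lat, soldier_lon, heading_deg, wp_lat, wp_lon):
    """Return the tactor index to pulse: 0 = front, increasing clockwise."""
    relative = (bearing_deg(soldier_lat, soldier_lon, wp_lat, wp_lon) - heading_deg) % 360.0
    return round(relative / (360.0 / NUM_TACTORS)) % NUM_TACTORS

# Example: a waypoint roughly northeast of a north-facing Soldier activates the
# tactor just right of front (index 1); the Soldier turns until tactor 0 fires.
print(tactor_for_waypoint(32.3540, -84.9680, 0.0, 32.3600, -84.9600))
```

In a scheme like this, a cue felt at the front simply means the relative bearing is near zero, which matches the “keep moving forward” behavior described above.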
At the same time, tactile signals also deliver communications from other Soldiers or from intelligent ground robots, such as status updates or warnings about a potential threat.
The vibration pattern the Soldier feels indicates what task to perform, based on a tactile language developed for the system, much as Morse code assigns meaning to patterns of signals.
The patterns are developed to be distinct, unique and consistent with the information at hand, to allow the Soldier to quickly and easily interpret the cues. For example, hand signal information or specific messages such as “robot battery low” can be assigned to patterns, learned and recognized.
One may think of the vibration signals as similar to different ringtones on a cellular phone. A person may know who is calling without actually looking at the screen to see the caller’s name or number. It is the sound that provides the alert, not the sight of it.
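In software, that mapping could look like the small, hypothetical tactile vocabulary sketched below, where each message is encoded as a pattern of pulse and pause durations. The message names, timings and the drive_tactor callback are assumptions made for illustration, not the actual patterns or interface of the ARL system.

```python
import time

# Hypothetical tactile vocabulary: each message maps to a pattern of
# (vibrate_ms, pause_ms) pairs played on a single tactor.
TACTILE_VOCAB = {
    "halt":              [(800, 0)],                          # one long buzz
    "move_out":          [(150, 100), (150, 100), (150, 0)],  # three short pulses
    "robot_battery_low": [(150, 100), (600, 0)],              # short then long
    "threat_detected":   [(100, 50)] * 5,                     # rapid burst, high urgency
}

def play_pattern(pattern, drive_tactor):
    """Play a pattern; `drive_tactor(on_ms)` stands in for whatever hardware
    call actually energizes the actuator for the given number of milliseconds."""
    for on_ms, off_ms in pattern:
        drive_tactor(on_ms)
        time.sleep(off_ms / 1000.0)

# Example: warn the Soldier that the robot's battery is low.
# play_pattern(TACTILE_VOCAB["robot_battery_low"], chest_tactor.buzz)
```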
Tactile actuators could be placed in any number of objects, such as a glove or belt, or inside a helmet or vest.
Researchers from the U.S. Army Research Laboratory, known as ARL, at the Human Research and Engineering Directorate’s Fort Benning, Ga., field element are testing such tactile systems for navigation and communication during mission-relevant exercises, to determine how effective the devices are while worn and how they perform in actual use. Soldiers quickly learn the system, attaining proficiency with the signals within 10-15 minutes.
Soldiers recently participated in an assessment of the NavCom system at Fort Benning to evaluate the simultaneous presentation of navigation cues and robot communication/monitoring, using tactile patterns from two types of advanced tactors during operationally relevant scenarios. Researchers asked Soldiers to complete several combat-related tasks during this exercise.
The scenarios involved night land navigation on equivalent courses of about 900 meters. While navigating from waypoint to waypoint, Soldiers also received communications from a hypothetical autonomous robot regarding either the robot’s status or a possible threat it had detected. Additionally, Soldiers negotiated exclusion zones and identified enemy targets along the course.
The system automatically collected data such as the time to reach each waypoint and the accuracy of arrival at each waypoint. Observer-based data collection included the accuracy of responses to robot alerts, the number of times Soldiers looked down at their screens or took a hand off their weapon, and whether they correctly identified targets on the course. Subjective data were also collected after each mission in the form of a workload assessment and questionnaire, followed by an after-action review at the end of the night.
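To make those measures concrete, here is a small, purely illustrative sketch of how the automatically logged values and observer-scored events described above might be recorded for later analysis; the field names are assumptions, not the NavCom system’s actual data format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class WaypointLeg:
    waypoint_id: str
    time_to_reach_s: float       # logged automatically by the system
    arrival_error_m: float       # distance from the true waypoint on arrival
    glances_at_screen: int = 0   # observer-scored
    hands_off_weapon: int = 0    # observer-scored
    targets_identified: int = 0  # observer-scored

@dataclass
class MissionRecord:
    soldier_id: str
    condition: str               # e.g. "tactile_on" vs. "visual_only"
    legs: List[WaypointLeg] = field(default_factory=list)

    def mean_leg_time_s(self) -> float:
        """Average time per navigation leg, one simple summary for comparison."""
        return sum(leg.time_to_reach_s for leg in self.legs) / len(self.legs)
```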
Hartnett said that some specific comments from the Soldiers included:
“I was more aware of my surroundings.”
“I don’t land nav much, but this made it a no-brainer.”
“I loved the belt, it worked perfectly.”
“This stream of research is very dear to my heart,” said Dr. Linda Elliott, from HRED’s Fort Benning field element. “It’s not often a Soldier can pick up a piece of equipment, be trained in five to 10 minutes, and have a very positive experience. In a previous night study, Soldiers said they were blind (night, fog, rain, night vision devices fogging up, etc.) and the belt led them straight to point, allowing them to focus attention on their surroundings.”
Elliott said the system supports the three basic Soldier tasks of moving, shooting and communicating, all while allowing individuals to move more quickly and accurately, find more targets in their environment and communicate more covertly.
“At the same time, we are trying to collect more basic data, to identify the factors that make a tactile signal ‘salient’ — easily felt, immediately recognized and distinguished from others. That has to do with the type of tactile signal strength (and other engineering factors), individual differences (such as fatigue), and environmental factors.”
Tactile systems for military performance have demonstrated their potential with regard to capability achievement and performance advantage, across a number of applications. Experiments and demonstrations have been conducted across a wide range of settings, from laboratory tasks to high-fidelity simulations and real-world environments.
Several ARL studies have been conducted within the context of Soldier land navigation to investigate effects of tactile cues in context. Many of these studies have been published as ARL technical reports.
Elliott said subsequent experiments demonstrated the value of tactile systems for supporting Soldier navigation and communication, but that the systems must be improved and refined before they are practical in combat situations.
“They must be made lightweight, comfortable, rugged, networked within a command and control system and they must be easy to use and easy to maintain,” Elliott said. “As tactile displays are increasingly used for communication of more complex and multiple concepts, it will become evident that tactile and multi-sensory systems in general must be designed for rapid and easy comprehension.”
-----
The U.S. Army Research Laboratory is part of the U.S. Army Research, Development and Engineering Command, or RDECOM, which has the mission to develop technology and engineering solutions for America’s Soldiers.
RDECOM is a major subordinate command of the U.S. Army Materiel Command. AMC is the Army’s premier provider of materiel readiness — technology, acquisition support, materiel development, logistics power projection, and sustainment — to the total force, across the spectrum of joint military operations. If a Soldier shoots it, drives it, flies it, wears it, eats it or communicates with it, AMC provides it.