Friday, March 26, 2010

Looking Down the Road: Human Engineering an Indirect Vision System

Written by T.J. Sharkey and P.A. Lakinsmith.

Summary

While advances in sensor and display technology make it possible to operate military vehicles with indirect vision, a number of unique issues must be addressed to make it usable by the human operators. This article presents a human factors view of the issues and ways to approach them.

Indirect Vision History

Indirect vision systems have been envisioned for use in aircraft since the 1950s. Indirect vision systems can be classified as Synthetic Vision Systems (SVS) or Enhanced Vision Systems (EVS).

An SVS presents an image of the scene ahead that is generated from a stored database. The database contains terrain features, and may also contain cultural features such as buildings and power lines. An SVS image is generated in much the same way a scene is generated in a training simulator or video game. An SVS allows the pilot to see a color, daylight view regardless of the time of day or the weather conditions.

In contrast, EVS imagery comes from sensors aboard the vehicle. The sensors can include daytime cameras, infrared sensors, image intensifiers, and millimeter wave radar. While these sensors can outperform the human eye in many useful ways, there are limitations. SVS and EVS imagery is usually presented on a head-down display on the instrument panel (dashboard) of an aircraft. This allows an unobstructed view of the outside scene through the aircraft’s windshield.

The FAA has certified a number of SVS and EVS for use in commercial and general aviation aircraft. Some of these systems allow an approach to continue to a lower altitude before the runway must be seen directly than would be permissible without the SVS or EVS.

Recognizing that soldiers operating ground vehicles are at risk whenever they are not under armor protection, the U.S. Army directed that the Manned Ground Vehicle (MGV) component of the Future Combat System (FCS) program would use indirect vision as the primary mode of operation, leveraging the work that had been conducted in the aviation domain. To accomplish this, LCD displays were designed into the MGV (similar to replacing the windshield of the HSCT with displays), and cameras were added to provide 360° coverage around the vehicle.

The vehicle manufacturer’s main goal when building these systems is to meet performance requirements at the desired cost and to achieve a balanced solution. The human is a critical element of these systems, but is sometimes also the weakest link. An overbuilt engine is of no use if the indirect vision system won’t let the human drive at top speed.

The Role of the Human Factors Engineer

The role of human factors engineers (HFEs) in Soldier Machine Interface design is to understand the soldier’s capabilities (and sometimes weaknesses) and to modify the system design to best support the soldier. The goal is to create a soldier-centric solution in the context of the balanced design, using data from prior human performance research and gathering new data where required. Fortunately, HFEs are often able to identify and apply information from previous studies to address the human-system design problem. In the case of indirect vision there is a large body of work performed by the U.S. government and by academic organizations. HFEs are a valuable asset to the design team because they assess the relevance and empirical strength of this prior work, bring those data to the design space, and inform product teams about the expected performance benefits and losses associated with each design path.

In some cases the existing body of work does not completely answer the design team’s questions, and additional studies are required. The Human Factors Engineer has the skills to design simulation and field studies that apply to the target population, that comply with Federal Regulations concerning the use of human subjects in DoD research, and that provide meaningful, reliable data to ensure confidence in a subsequent design decision.

Key Indirect Vision System Design Parameters

The remainder of this article briefly discusses some of the design parameters that we as human factors professionals consider important to the design of an indirect vision system. In the authors’ prior work these issues were traded off against one another and against other system design parameters (such as cost, weight, and volume) to develop a COTS indirect vision system that proved effective for day and night off-road driving of a surrogate vehicle.

Glass-to-Glass Lag. When viewing a scene directly, light takes virtually no time to travel between the object and the soldier’s eye. However, in an indirect driving system the imagery seen by the driver is delayed. This delay is measured as the elapsed time from the light initially striking the lens of the sensor (the first piece of “glass”) until the light is emitted by the driver’s display (the second piece of “glass”). The delay at the sensor is usually a function of the frame rate. At 30 frames per second the delay is approximately 33 msec. Similarly, if the display updates at 60 frames per second the delay is approximately 17 msec. Additional delays are added for the transmission of the imagery and for each processing phase or piece of equipment in the video stream. When overlaying symbology on a video scene the additional delay can be significant.
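As a rough illustration of how these per-stage delays add up, here is a minimal lag-budget sketch in Python. Only the frame-rate arithmetic comes from the discussion above; the stage names and the other delay values are hypothetical placeholders, not measurements from any particular system.

```python
# A minimal glass-to-glass lag-budget sketch, assuming hypothetical stage delays.

def frame_period_ms(frames_per_second: float) -> float:
    """Worst-case delay added by a device that updates once per frame."""
    return 1000.0 / frames_per_second

lag_budget_ms = {
    "sensor capture (30 fps)":  frame_period_ms(30),  # ~33 msec, from the frame rate
    "video compression":         5.0,                 # assumed value
    "transmission":              2.0,                 # assumed value
    "symbology overlay":        15.0,                 # assumed value
    "display refresh (60 fps)": frame_period_ms(60),  # ~17 msec, from the refresh rate
}

for stage, delay in lag_budget_ms.items():
    print(f"{stage:26s} {delay:5.1f} msec")
print(f"{'glass-to-glass total':26s} {sum(lag_budget_ms.values()):5.1f} msec")
```

Even with optimistic per-stage numbers, a budget like this shows how quickly the total approaches values that matter for vehicle control.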

Glass-to-glass lag affects the ability of the operator to drive because the vehicle moves in the interval between the time the image is obtained by the sensor and when the image is displayed to the driver. This is usually a small error, but can be important when precise driving is critical. Arguably more important is the effect that glass-to-glass lags have on the ability of the driver to determine the effect of control inputs. With large lags the driver’s control inputs get out of phase with the vehicle. This often leads to imprecise and non-aggressive vehicle control. In extreme cases this can lead to abrupt changes in a vehicle’s heading. This is similar to what aviators refer to as “Pilot Induced Oscillation”.

Human performance research suggests that the smallest glass-to-glass lag known to produce decrements in tracking performance is approximately 40 milliseconds (Boff & Lincoln, 1988). This value comes from studies of humans performing compensatory tracking tasks that are very similar to vehicle control tasks.

Control lag. The next generation of vehicles is likely to use drive-by-wire technology. With drive-by-wire there is no physical connection between the steering wheel and the tires or tracks, which introduces a lag between the time the driver makes a steering input and the time the tires or tracks begin to execute it. Fortunately, this delay is small with modern data buses, such as the CAN and FlexRay buses that are increasingly used in commercial automotive applications. However, computer processing of the steering input (e.g., for safety reasons) may add to the delay. Unfortunately, there are few data available that let us accurately predict how glass-to-glass and control lags interact with one another to affect vehicle control.

Image minification and magnification. The images of objects presented on a display can be smaller, larger, or the same angular size as those same objects viewed directly through the windshield (see Figure 1). Many designers have chosen to minify sensor imagery in order to fit a wider field of view onto the displays, but this can result in distance and speed judgment errors. Humans use the angular size of familiar objects as one source of information about the distance to those objects. If a familiar object, such as an automobile ahead on the road, subtends a larger angle on the retina than it would if viewed directly (a magnified image), it will tend to be perceived as closer to the viewer than it would if it subtended a smaller angle (a minified image). So, for accurate distance perception when operating a vehicle one would expect that the image should be shown at 1:1 magnification. However, there is some evidence that magnifying the image on a screen by about 30% results in the most accurate distance perception (Roscoe, Hasler, & Dougherty, 1966).

Figure 1. Examples of image magnification, minification, and unity vision.

Image magnification presents the sensor imagery so that it subtends a larger FOV on the screens than it does at the sensor. Image minification takes a wider FOV from the sensor and presents it within a narrower FOV on the display. With both of these approaches the spatial correspondence to objects in the real world (i.e., the heading of an object with respect to your vehicle) is lost, and speed and distance are not perceived correctly. "Unity Vision" is when the sensor FOV is the same as the FOV the displays subtend at the driver's eye. With unity vision, the spatial relationships between you and objects in the environment are preserved, and objects appear in the same direction on the displays as they do in direct vision or through periscopes.
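As a small numerical sketch of that relationship (the screen width and eye-to-screen distance below are assumed example values, not requirements from any program), unity vision simply means choosing a sensor FOV equal to the angle the display subtends at the driver's eye:

```python
# A minimal sketch of the unity-vision relationship, assuming example
# screen geometry; none of these numbers come from an actual vehicle.
import math

def display_fov_deg(screen_width_cm: float, eye_distance_cm: float) -> float:
    """Horizontal FOV the screen subtends at the driver's eye."""
    return math.degrees(2 * math.atan(screen_width_cm / (2 * eye_distance_cm)))

screen_width_cm = 40.0   # assumed display width
eye_distance_cm = 50.0   # assumed eye-to-screen distance

fov = display_fov_deg(screen_width_cm, eye_distance_cm)
print(f"Display subtends about {fov:.1f} deg at the driver's eye")
print(f"Unity vision therefore calls for a sensor FOV of about {fov:.1f} deg")
print(f"Feeding this screen from a {1.5 * fov:.1f} deg sensor would minify the image")
```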

An often overlooked drawback of minification or magnification of an indirect vision image is the loss of directional accuracy (see Figure 2). Specifically, when the image on a screen is minified or magnified, the direction from the observer to an object is distorted in a continuous but non-linear manner; the farther from straight ahead the object is, the larger the directional error. This error would likely go undetected until the driver attempts to locate the object through a direct vision device, such as vision blocks, which neither magnify nor minify the image. In this situation, the heading of the object on the display would diverge from that seen in the direct vision device. This may increase the time a soldier requires to locate and visually identify the object in the vision block, and negatively affect his situation awareness.

Figure 2. Loss of directional accuracy resulting from minification and magnification.
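To make that non-linear error concrete, here is a minimal sketch using a simple pinhole-camera, flat-screen model; the sensor and display FOV values are assumptions chosen only to show how the error grows away from straight ahead:

```python
# Illustrative only: directional error when a sensor's FOV is remapped onto
# a display with a different FOV (minification in this assumed example).
import math

def perceived_bearing_deg(true_bearing_deg: float,
                          sensor_fov_deg: float,
                          display_fov_deg: float) -> float:
    """Bearing at which an object appears to the driver on a flat display."""
    # Normalized horizontal position of the object in the sensor image.
    u = math.tan(math.radians(true_bearing_deg)) / math.tan(math.radians(sensor_fov_deg / 2))
    # Map that position onto the display and convert back to an angle at the eye.
    return math.degrees(math.atan(u * math.tan(math.radians(display_fov_deg / 2))))

sensor_fov_deg = 90.0    # assumed wide-FOV sensor
display_fov_deg = 60.0   # assumed narrower display FOV (minified image)

for bearing in (5, 15, 30, 40):
    seen = perceived_bearing_deg(bearing, sensor_fov_deg, display_fov_deg)
    print(f"true bearing {bearing:2d} deg -> appears at {seen:5.1f} deg "
          f"(error {seen - bearing:+5.1f} deg)")
```

In this assumed geometry an object 5° off the nose appears only a couple of degrees off, while an object 40° off the nose appears more than 14° closer to straight ahead than it really is.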

Vertical and Horizontal Field of View. The vertical and horizontal field of view (FOV) of the scene may be altered independently of the magnification by making the displays wider and/or taller, or by adding more displays. However, in most vehicles space and weight are at a premium, so changing the eye-to-screen distance (moving the screen(s) closer to the driver to increase the FOV) is often preferred, provided that the screens don’t interfere with controls, impair ingress or egress, or present other hazards to the driver or other crewmembers. Note that changing the FOV of the screens requires the field of view of the sensors providing the imagery to be changed to match that of the screens in order to keep the same magnification level.

The importance of FOV varies depending on the driving conditions. On a smooth road with moderate slopes a limited FOV is acceptable. However, when operating off-road the vertical FOV needs to be large enough to allow drivers to make correct height and depth estimations of obstacles ranging from ditches to hills. In extreme off-road conditions a very large vertical FOV is needed, or a means of aiming the sensor so it looks up or down is required. Similarly, when driving in urban situations the ability to look 90° left and right is required to safely negotiate right-angle turns at intersections, and some ability to look upward is essential to providing area security against rooftop threats. With the display FOV fixed by the physical screen layout, and the magnification fixed to provide accurate and reliable depth perception, a user interface allowing the soldier to aim the sensors (or select imagery from other sensors that are aimed in the desired direction) is needed.

Conclusions

Indirect vision is one example of an area in which Human Factors Engineers use their human research skills to provide value to a military vehicle design team. Other areas include crew station physical layout and Graphical User Interface (GUI) design.

As vehicle development efforts continue to incorporate new technologies such as indirect vision, intelligent sensor algorithms, vehicle protection systems, and autonomous mobility solutions, Human Factors Engineers will be very valuable assets to ensure that human-machine systems will work as desired and accomplish mission objectives.

References

Boff, K.R., & Lincoln, J.E. (1988). Engineering Data Compendium: Human Perception and Performance. Wright-Patterson AFB, OH: AAMRL.

Roscoe, S.N., Hasler, S.G., & Dougherty, D.J. (1966). Flight by Periscope: Making Takeoffs and Landings; The Influence of Image Magnification, Practice, and Various Conditions of Flight. Human Factors, pp. 13-40.


Thursday, May 28, 2009

Water, water everywhere...

by Patty Lakinsmith
This one is for the "This Is Broken" usability file.
I found myself in an embarrassing situation on a recent business trip. I was attempting to take a shower, and could not locate the control to redirect the water to flow to the shower head instead of the tub. I'm well traveled and have seen a lot of plumbing fixtures, but I was so perplexed by this one that I had to phone the front desk...or resign myself to taking a bath.
Here is the top part of the control.

Here is the bottom part.

Can you tell how to make the water come out of the shower head instead of the tub spigot? When the helpful front desk clerk explained it to me, it was clear that they had gotten a lot of calls about this one.

No fair peeking below.

Wait for it.

Wait for it.

OK, here's how it works.

Here's the answer - the bottom edge of the spigot pulls down to activate the shower redirect. Clever, huh?

This is one example of how powerful population stereotypes can be. People expect frequently used controls to work in certain ways, whether those controls are well designed or not. A light switch is one example (we expect it to flip up and down); water faucets are another (I'll cover the temperature control on my Grohe kitchen faucet another time).

I'm sure that the designer of the faucet above was quite pleased that the mechanism to redirect the water was cleverly hidden, allowing the clean elements of the design to prevail. But how much money are these faucets costing the hotel chain that installed them? Every time a new guest stays at the hotel there is likely a call to the front desk to ask how to use it, which costs them in lost productivity. How many guests get annoyed, don't bother to call the front desk, and choose not to stay there again?

Sometimes a seemingly small user interface design decision can cause a chain reaction of events that ultimately cost a company money.

* * *

Dr. Lakinsmith is a senior scientist with Monterey Technologies, Inc., and has performed in a technical leadership role in a number of major commercial and government human engineering projects. She has applied user-centered principles and processes to the design and evaluation of both traditional and intelligent user interfaces on devices from screen-based telephones to critical cockpit systems.

Wednesday, February 18, 2009

Designing for Maintainability

by Tom Sharkey


One of my hobbies is racing motocross on vintage motorcycles. One of my favorites is a 1971 CZ 400, which was manufactured in Czechoslovakia. An important part of the off-season maintenance on this particular bike is to clean, lubricate, and readjust the swing arm pivot. Without going into a lot of detail, the swing arm fits between frame members and is held in place by a steel pin that passes through holes in the frame and swing arm. Tension is adjusted by tightening bolts that thread into the pin, which compresses everything together via a pair of cone washers. Figure 1 shows the basic configuration. This design was in use on this motorcycle from the early 1960s to the mid-1970s.

Figure 1. Schematic of CZ swing arm setup.

As you can imagine, once the pin that passes through the assembly is taken out it is easy to disassemble the components so they can be maintained. Reassembly is not quite so simple, at least for one person, in large part because inserting the pin while positioning the swing arm and fitting the cone washers requires at least three hands. There are some tricks to make this easier, such as putting a dollop of grease on the cone washers to hold them in place and doing each side in series, but these are only marginally successful. The net result is that reassembly usually takes multiple tries as the cone washers drop off and fall on the work stand. While spending a few extra moments in the shop indulging in my hobby isn’t a terrible hardship for me, this is clearly not a design that had the maintainer in mind.

Jumping ahead four or five decades, I was recently asked to conduct a human engineering analysis of a new seat being designed for the military. This analysis was conducted from both the warfighter’s perspective and from the perspective of the maintainer, who happens to be the same person in the target user community. This seat is being designed to some demanding requirements including, but not limited to, weight, volume, adjustability, reliability, mean times to repair and replace, soldier survivability, and of course acquisition and life cycle costs. One of the requirements is that the seat back be removable and replaceable by a single maintainer in a very short period of time. Removal of the seat back is required to allow a maintainer to access other equipment more easily. Figure 2 shows the basic design of the attachment between the seat back and the seat base. Does this look familiar?

Figure 2. Schematic of military seat back-seat base junction.

Yes, the design being proposed for this new, state-of-the-art system is very similar to that found on my old motorcycle. One problem in the military seat design is the use of washers to eliminate lateral play between the seat base and seat back. Reinstalling these washers presents the maintainer with almost exactly the same problem as I find on the old motorcycle – how do you keep them in place while reinserting the pin that holds everything together with just two hands?

I tried to perform this operation in a nice, warm, dry, well-lit mock-up of the military vehicle and indeed managed to drop the washers. Now imagine a military maintainer attempting to perform this task in a stressful environment, perhaps while wearing winter or other protective gloves. The likely outcome is dropped washers that aren’t found on the floor and consequently aren’t reinstalled. I should mention that in military systems of this type it is frowned upon to use the “dollop of grease” trick to hold the washers in place, because the grease will collect dirt in the operational environment, becoming its own maintenance headache. It is also unacceptable to simply drop a washer on the floor and not recover it. There is no way of telling where the washer will end up and what the effect will be. For example, the lost washer could roll around on the floor and end up jamming the vehicle’s controls.

The bottom line is that by making a minor change in the design of the seat back that reduces the space between the seat’s base and back rest we can eliminate the need to use these washers, and alleviate the maintainability problems with the use of these washers in this venerable design. As a result of making the design change, one person can easily reinstall the seat back, the accuracy of the maintainers is increased (no washers are left out) and the time to replace the seat back after it has been removed is reduced, all of which are important metrics in the acceptability of the system. There are also cost implications. While improving the maintainability of a design to save a minute or two is of little consequence to a hobbyist working on a single machine, when hundreds or thousands of military maintainers perform a task repeatedly on their systems the cumulative time and cost adds up quickly. Now, if I could just redesign the old motorcycle I could spend a few more minutes riding.


Tom Sharkey is a senior scientist at MTI. He has over 25 years of experience providing human factors engineering support across a broad range of ground vehicle and aviation programs, both military and commercial. He is based in our Denver, CO office and can be reached at tsharkey [insert the at sign here] montereytechnologies.com.

Friday, January 16, 2009

Ethical Human Research and Your Company

by Patty Lakinsmith, Ph.D.
Is your user-centered research and design company following the federal regulations pertaining to the use of human participants? If it isn't, this recent story about a group of Cold War veterans filing a suit against the federal government should be a wake-up call. The regulations pertain to behavioral and social science research as well as biomedical research, and what you don't know about them can hurt you and your chances of being funded in the future.

Six decades ago, members of the military volunteered for experiments on "nerve agents, biological weapons and mind-control techniques" conducted by branches of the Department of Defense. They are now experiencing health issues and are suing the government for failing to inform them about the nature of these experiments and for failing to obtain their consent to participate.

Following numerous tragic incidents like these, in 1974 the National Research Act was signed into law and a Commission was formed to establish ethical principles guiding research involving human participants.

The Belmont Report summarizes the principles established by this law, which fall into the basic categories of Respect for Persons, Beneficence, and Justice.

Respect for Persons means, first, that individuals should be treated as autonomous agents and, second, that persons with diminished autonomy are entitled to protection. The concept of Beneficence means that human subjects should not be harmed and, where possible, should benefit from the research in which they participate. The principle of Justice suggests that all persons should be treated equally in receiving the benefits or burdens of research participation.

According to the Code of Federal Regulations, Title 45, Part 46 (Protection of Human Subjects), any research conducted or supported by any Federal Department or Agency must comply with certain requirements for procedures, review, and oversight.

The regulation defines the term "research" and identifies which populations of individuals are protected and which are considered vulnerable in a research setting. The makeup and functions of an Institutional Review Board (IRB) are defined, as are the criteria by which an IRB will evaluate prospective research plans involving human subjects. Required elements of an informed consent form are listed, as are the rules governing the use of vulnerable populations such as pregnant women, minors, prisoners, and the disabled.

Any entity wishing to conduct human research must apply for and hold an Assurance of Compliance, which is an official, legally binding written commitment made by an institution (private or public entity) to the Federal Government to comply with the Federal regulations while conducting research with human participants. It outlines the procedures and policies in place at the research organization that enable it to comply with federal regulations, and it designates the Institutional Review Board that will oversee research proposals.

While you might think this doesn't apply to you because your work doesn't involve medical procedures, these regulations apply to behavioral studies as well. An IRB reviewing your usability study will be interested in how you recruit your participants, whether they belong to a vulnerable population (e.g., soldiers assigned to participate in your DoD study), how you explain what will happen to them in your study and solicit their consent to participate (and to be photographed or videotaped), the compensation your participants receive, the risks and benefits to them, and how you protect their anonymity. An IRB can review your plans and determine whether they fall into a low-risk category and are potentially exempt or eligible for expedited review.

Your ability to work efficiently and stay within the law while conducting research for a client is greatly enhanced by familiarity with these laws and review processes. Minor differences in a test plan can mean months less review time, and fewer hours spent in painful reviews.

We'll cover some of the key considerations for obtaining study approval in future posts.


Dr. Lakinsmith is a senior scientist with Monterey Technologies, Inc., and has performed in a technical leadership role in a number of major commercial and government human engineering projects. She has applied user-centered principles and processes to the design and evaluation of both traditional and intelligent user interfaces on devices from screen-based telephones to critical cockpit systems.