Research Statement

Dr. Jill Lynn Drury, jldrury (at) mitre.org

The one-sentence summary of my research is: I evaluate and optimize human interaction technology and work processes to support team-based decision-making in safety-critical applications. This sentence contains many concepts, which are easiest to discuss in reverse order.

Leveson (1986) defined safety-critical systems as: “complex, time-critical physical processes or mechanical devices, where a run-time error or failure could result in death, injury, loss of property, or environmental harm.” Two examples of safety-critical situations that appear in my research are robot search and rescue (because robots need to quickly locate victims but avoid further injuring them) and air defense (because operators must quickly determine whether an unidentified aircraft is a “friend or foe”). Safety-critical domains are both important and challenging, imposing requirements for error-free and efficient interaction. Besides the domains cited above I have worked with command and control systems (e.g., Drury and Cuomo, 1997), unmanned aerial vehicles (e.g., see Drury, Richer, Rackliffe, and Goodrich, 2006), and hazardous materials handling (Drury et al., 2008).

The air defense example illustrates the importance of providing technology, such as fused information displays, that supports efficient and accurate decision-making. I have used cognitive task analysis methods such as Militello and Hutton’s (1998) Applied Cognitive Task Analysis technique to determine the major cognitive challenges for command and control center operators (Drury and Darling, 2008). More recently, I have been exploring "decision spaces": information that facilitates comparing the relative desirability of different courses of action. Our team developed the concept of using decision space visualizations to provide users with "option awareness" (Drury et al., 2009; Pfaff et al., 2012).

It is even more challenging to support team-based decision-making. One approach I have taken is to decompose the awareness needs of team members within a particular domain: both about each other and about the environment in which they are making decisions (the latter usually called “situation awareness”). For example, I decomposed team awareness needs, starting with the two overall categories of presence/identity and activities, in Drury and Williams (2002). Simply having information is not sufficient for making good decisions; it must be presented in a way that helps users understand the interactions among options, environmental conditions, and outcomes. Accordingly, we extended our option awareness work to understand how it can aid team-based decision making (Liu et al., 2011).

Sometimes the right approach to aiding team-based decision-making is not to provide any new technology at all. Rather, a team’s work processes may be suboptimal, resulting in decreased efficiency or accuracy. I always try to observe people in their normal work environment, for example using the ethnographic technique known as Contextual Inquiry (Holtzblatt and Jones, 1993), to determine the combination of interaction technology and work processes that may be appropriate. We used this technique with a military planning group (Swanson, Drury, and Lewis, 2004), and our recommendations included changing the format of an important daily meeting to make better use of the group’s time, as well as adopting a tool already present in the group’s collaborative tool suite.

Regardless of whether new processes or new technologies are being proposed, evaluation methods are key to ensuring both that the user’s experience will be improved and that research progress has occurred. It is usually more difficult to employ human-computer interaction techniques for multi-user, rather than single-user, systems (especially synchronous, non-collocated systems) because of the challenge of running tests with multiple people in multiple places simultaneously (or simulating such conditions). In my thesis work I developed the Synchronous Collaboration Awareness and Privacy Evaluation (SCAPE) method (Drury, 2001) to provide a fine-grained understanding of how well an application supports awareness of team members’ presence/identity and activities when such an understanding is appropriate, and team members’ privacy when the information needs to be protected. Robots are also collaborative systems in the sense that even one robot and one human operator form a team; accordingly, I have co-developed new evaluation methods for determining whether the robot interface provides operators with sufficient awareness (Drury et al., 2007). In general, I enjoy the challenge of adapting or developing new methods for evaluating emerging technologies.

I characterize the best of my work as being foundational, interdisciplinary, and collaborative (in the sense of having been developed in conjunction with very talented colleagues). For example, our decomposition of awareness needs for unmanned aerial vehicles (Drury, Riek, and Rackliffe, 2006) has provided a foundation upon which to build displays for these airborne robots. Our taxonomy of human-robot interaction (Yanco and Drury, 2004) is widely cited and is often used in human-robot interaction courses. I have used techniques, concepts, and approaches from HCI and computer-supported cooperative work when researching human-robot interaction. For example, we adapted HCI techniques to be more appropriate for use with robotics, resulting in different kinds of metrics for usability testing (Yanco, Drury, and Scholtz, 2004) and new “operators” (users’ actions or operations upon the system) for the Goals, Operators, Methods, and Selection rules (GOMS) method (Drury, Scholtz, and Kieras, 2007).

Finally, I have benefited from researching and publishing with more than three dozen distinguished colleagues, including an MIT-trained roboticist, Prof. Holly Yanco of the University of Massachusetts Lowell; a co-developer of GOMS, Prof. David Kieras of the University of Michigan; and a renowned HCI metrics developer, Dr. Jean Scholtz of Pacific Northwest National Laboratory (formerly of the National Institute of Standards and Technology). A summary of my recent collaborations can be seen on my collaborations page.

REFERENCES

Drury, J. (2001). Extending Usability Inspection Evaluation Techniques for Synchronous Collaborative Computing Applications. Sc.D. Thesis, University of Massachusetts Lowell, Department of Computer Science, November.

Drury, J. L., Arambula, J., Kuhn, S., Micire, M., and Yanco, H. A. (2008). Identifying Technology Gaps in Hazardous Materials Operations. In Proceedings of the IEEE International Conference on Technologies for Homeland Security, Waltham, MA, May 2008.

Drury, J., and Cuomo, D. (1997). Usability Issues in Complex Government Systems. NIST Special Publication 500-237, Symposium Transcription, Usability Engineering: Industry-Government Collaboration for System Effectiveness and Efficiency. Gaithersburg, MD, February 1996.

Drury, J. L. and Darling, E. (2008). A “Thin-Slicing” Approach to Understanding Cognitive Challenges in Real-Time Command and Control. Journal of Battlefield Technology, Vol 11(1), March 2008.

Drury, J. L., Keyes, B., and Yanco, H. A. (2007). LASSOing HRI: Analyzing Situation Awareness in Map-Centric and Video-Centric Interfaces. In Proceedings of the Second Annual Conference on Human-Robot Interaction, Arlington, VA, March 2007.

Drury, J. L., Klein, G. L., Pfaff, M., and More, L. (2009). Dynamic Decision Support for Emergency Responders. In Proceedings of the 2009 IEEE Technologies for Homeland Security Conference, Waltham, MA, May 2009.

Drury, J. L., Richer, J., Rackliffe, N. and Goodrich, M. A. (2006). Comparing Situation Awareness for two Unmanned Aerial Vehicle Human Interface Approaches. In Proceedings of the IEEE International Workshop on Safety, Security and Rescue Robotics. National Institute of Standards and Technology (NIST), Gaithersburg, MD, August 2006.

Drury, J. L., Riek, L., and Rackliffe, N. (2006). A Decomposition of UAV-Related Situation Awareness. In Proceedings of the First Annual Conference on Human-Robot Interaction, Salt Lake City, UT, March 2006.

Drury, J. L., Scholtz, J. and Kieras, D. (2007). The Potential for Modeling Human-Robot Interaction with GOMS. In Human-Robot Interaction, Nilanjan Sarkar, Ed. Vienna, Austria: I-Tech Education and Publishing, pp. 21–38.

Drury, J. and Williams, M. G. (2002). A Framework for Role-Based Specification and Evaluation of Awareness Support in Synchronous Collaborative Applications. In Proceedings of the Eleventh IEEE International Workshops on Enabling Technologies: Infrastructure for Collaborative Enterprises (WET ICE) 2002, Pittsburgh, June 2002.

Holtzblatt, K. and Jones, S. (1993). Contextual Inquiry: A Participatory Technique for Systems Design. In D. Schuler and A. Namioka (eds.), Participatory Design: Principles and Practice, Hillsdale, NJ: Lawrence Erlbaum Associates, 177-210.

Leveson, N. G. (1986). Software Safety: Why, What and How. ACM Computing Surveys 18(2): 125–162, June.

Liu, Y., Moon, S. P., Pfaff, M. S., Drury, J. L. and Klein, G. L. (2011). Collaborative Option Awareness for Emergency Response Decision Making. In Proceedings of the 8th International Conference on Information Systems for Crisis Response and Management (ISCRAM 2011), Lisbon, Portugal, May 2011.

Militello, L.G. and Hutton, R.J.B. (1998). Applied Cognitive Task Analysis (ACTA): A Practitioner’s Toolkit for Understanding Cognitive Task Demands. Ergonomics, Vol. 41(11), pp.1618–1641.

Pfaff, M. S., Klein, G. L., Drury, J. L., Moon, S. P., Liu, Y., and Entezaro, S. O. (2012). Supporting Complex Decision Making Through Option Awareness. Journal of Cognitive Engineering and Decision Making, first published online 10 September 2012.

Swanson, K., Drury, J. and Lewis, R. (2004). A Study of Collaborative Work Practices in a Joint Military Setting. In Proceedings of the International Command and Control Research and Technology Symposium, Copenhagen, Denmark, September 2004.

Yanco, H. A. and Drury, J. (2004). Classifying Human-Robot Interaction: An Updated Taxonomy. In Proceedings of the IEEE Conference on Systems, Man and Cybernetics, The Hague, The Netherlands, October 2004.

Yanco, H. A., Drury, J. L. and Scholtz, J. (2004). Beyond Usability Evaluation: Analysis of Human-Robot Interaction at a Major Robotics Competition. Journal of Human-Computer Interaction, Volume 19, Numbers 1 and 2, pp. 117-149.

Updated 21 September 2013.