Human-centered computing graduate student Eric Vorm recently received a $75,000 research grant from the U.S. Air Force. The award, titled “Designing for transparent automated visual classification systems,” will support Vorm in improving human-machine teaming by designing interfaces that enhance human operator awareness and facilitate explanation-based reasoning when operators are assisted by automation, i.e., artificial intelligence (AI).
Lieutenant Vorm, a Ph.D. candidate at the IU School of Informatics and Computing at IUPUI, is an Aerospace Experimental Psychologist on active duty with the U.S. Navy. He has a background in aerospace and mechanical engineering and experimental psychology, and served with the Marine Corps in Iraq.
Davide Bolchini, chair of the Department of Human-Centered Computing, said, “We are very proud of this prestigious grant that Eric received. His research is very timely—and broadly applicable to everyday digital products that promise higher autonomy and intelligence in supporting users.” Vorm’s thesis advisor, Andrew Miller, assistant professor of Human-Computer Interaction, added, “This project, and Eric’s dissertation as a whole, exemplifies our school’s commitment to applied, interdisciplinary, human-centered research. I’m excited to continue to work with Eric to improve transparency in human-machine interaction.”
According to Vorm, today’s military intelligence analysts must monitor and search through vast amounts of complex visual data for hours on end in order to identify and classify potential threats to the U.S. and its allies. Automated visual classification systems can assist in these tasks by processing vast amounts of visual data and alerting operators when the computer finds something of interest.
Sometimes, however, the data computers must work with is incomplete, and visual classification systems employing deep learning neural networks are still prone to misidentification. These limitations often result in errors and false alarms, which the operator must be able to detect in order to ensure system integrity and meet mission performance goals. In periods of high stress and limited time, however, studies have shown that humans often fail to detect these kinds of errors and instead defer to the system’s recommendations, even when those recommendations are incorrect.
“It all comes down to decision making,” Vorm said. “Humans are great at using a combination of logic and intuition to figure things out, but their decisions are only as good as the information they have to work with. We need to make sure that information is provided in the right format at the right time to support the operator’s ability to reason and handle challenging events. That is what designing for ideal human-computer interaction is all about.”
Vorm’s research, which begins this fall and will be conducted at the Air Force’s Human Performance Wing at Wright-Patterson Air Force Base, seeks to evaluate and develop interface designs and provide design guidelines for achieving appropriate automation transparency to support decision making in complex and time-sensitive conditions. According to Vorm, this research will also be useful in improving a variety of civilian technologies, from healthcare applications like robot-assisted surgery and AI-assisted diagnoses to AI-based financial trading and self-driving cars.
“Our future work environments will be increasingly supported by artificial intelligence, which will continue to assume larger roles in our daily lives. In order for this human-machine team to work,” Vorm said, “we must make sure that both the human and computer communicate effectively with one another. This research is just a small step toward achieving that vision.”