Season 1 Ep.13 Missy Cummings asks: should the US Military use AI weapons?
Summary

In Season 1 Ep.13 of Dr. Pawd, host Dr. Daniel Levitin sits down with Missy Cummings, a professor at Duke University's Pratt School of Engineering, to discuss the complexities of human-AI collaboration. With extensive experience in both the military and academia, Missy has seen firsthand the dangers of poorly designed human-machine interfaces and argues for greater control over how AI is deployed in order to ensure safety.

Missy's expertise lies in deploying AI and robots alongside humans in the real world. She founded the Human and Autonomy Laboratory to address the challenges of human-machine interfaces, and has since become a prominent critic of Elon Musk's claims about Tesla's self-driving capabilities. She believes such claims create dangerous misperceptions of what self-driving cars can do, and that AI should work in conjunction with humans rather than replace them entirely.

Computer vision has made real progress, but it cannot match the human eye and brain in projection, imagination, judgment under uncertainty, and background knowledge. The automotive industry is the prime example: it has invested billions of dollars in computer vision, advancing the field, yet the technology still remains research. Computer vision works well in narrow settings that are not time- or safety-critical; safety-critical settings, by contrast, require a collaborative arrangement in which humans and AI jointly consider the environment, the car, and the person to identify high-risk states that could lead to accidents.
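The idea of combining environment, car, and person factors into a risk state can be sketched as a simple rule-based assessor. This is purely an illustration of the framing described above; the factor names, thresholds, and scoring are hypothetical, not anything discussed in the episode:

```python
# Hypothetical sketch: fuse environment, vehicle, and driver factors
# into a coarse risk state. All factor names and thresholds below are
# illustrative assumptions, not values from the episode.

def risk_state(environment: dict, car: dict, person: dict) -> str:
    """Classify the driving situation as 'low', 'elevated', or 'high' risk."""
    score = 0
    # Environment: poor visibility or a wet road raises risk.
    if environment.get("visibility_m", 1000) < 100:
        score += 2
    if environment.get("road_wet", False):
        score += 1
    # Car: high speed raises risk.
    if car.get("speed_kph", 0) > 100:
        score += 2
    # Person: distraction or fatigue raises risk.
    if person.get("distracted", False) or person.get("fatigued", False):
        score += 2
    if score >= 4:
        return "high"
    if score >= 2:
        return "elevated"
    return "low"

print(risk_state({"visibility_m": 80, "road_wet": True},
                 {"speed_kph": 110},
                 {"distracted": False}))  # → high
```

The point of the sketch is the collaborative shape, not the rules themselves: the system flags a high-risk state, and the human remains in the loop to act on it.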

The conversation then turns to AI in manufacturing. In the fermentation units used to grow vaccines, there is a foaming problem that researchers have struggled to solve. AI helps by using a vision system to track foam growth, and an app is being built so that scientists can monitor the fermentation unit. The human and the AI share their predictions, improving accuracy over time.
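One way the human and the AI could share predictions and improve over time is to blend the two forecasts, weighting each source by its recent accuracy. The class below is a minimal sketch of that idea under assumed mechanics (inverse-error weighting with an exponential moving average); it is not the actual system described in the episode:

```python
# Hypothetical sketch: blend a human expert's foam-level forecast with a
# vision model's forecast, trusting each source in proportion to its
# recent accuracy. Assumed mechanics, not the system from the episode.

class BlendedForecaster:
    def __init__(self):
        # Start with equal (unit) running error, i.e. equal trust.
        self.human_err = 1.0
        self.model_err = 1.0

    def predict(self, human_pred: float, model_pred: float) -> float:
        """Weighted average: each source is weighted inversely to its error."""
        w_h = 1.0 / self.human_err
        w_m = 1.0 / self.model_err
        return (w_h * human_pred + w_m * model_pred) / (w_h + w_m)

    def update(self, human_pred: float, model_pred: float, actual: float):
        """After the true foam level is observed, refresh each source's
        running error with an exponential moving average."""
        alpha = 0.3  # smoothing factor (assumed)
        self.human_err = (1 - alpha) * self.human_err + alpha * abs(human_pred - actual)
        self.model_err = (1 - alpha) * self.model_err + alpha * abs(model_pred - actual)

blend = BlendedForecaster()
print(blend.predict(10.0, 20.0))   # equal trust → 15.0
blend.update(10.0, 20.0, 20.0)     # the model was right this round
print(blend.predict(10.0, 20.0))   # blend now leans toward the model (> 15)
```

As observations accumulate, the weights shift toward whichever source has been more accurate, which is one simple way a human-autonomy team can sharpen its joint predictions over time.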

The use of AI in the military is a topic of particular interest to Missy and Dr. Levitin, and they discuss the Air Force's claim of using AI to guide bombers. They candidly examine the ambiguity around how AI performs when dynamic adaptation is required; its limitations make it unviable for time-critical applications of that kind.

Missy and Dr. Levitin conclude the conversation by acknowledging the need for greater collaboration between the defense industry and academic researchers to create practical solutions, reduce risk, and give soldiers the tools they need to do their jobs effectively.