Two professors from Syracuse University’s S.I. Newhouse School of Public Communications are developing technology to detect manipulated media and combat the spread of fake news, supported by a subcontract that, following a recent expansion, now tops $1.1 million.
Jason Davis, research professor and co-director of the Real Chemistry Emerging Insights Lab, and Regina Luttrell, associate dean for research and creative activity, will continue refining a theoretical framework for creating and testing AI algorithms that can identify manipulated media. The work is part of the Defense Advanced Research Projects Agency (DARPA) Semantic Forensics program.
“While the challenges associated with fake news and misinformation may not be new, the speed, scale and global impact created by digital media channels certainly is,” Davis says. “This research effort underscores Newhouse’s continuing commitment to addressing some of today’s most challenging problems and contributing to solutions with global impact. It is our intention that this research will help develop solutions that can detect and combat the effects of disinformation across a rapidly evolving digital landscape.”
Over the last semester, the program explored new methods for evaluating artificial intelligence/machine learning-driven analytics and their ability to detect and characterize the intents behind falsified media that uses common propaganda tactics. Davis, Luttrell and a student research team created controlled data sets, using both text and image manipulations, to embed two distinct intents into news articles via four common technical propaganda tactics: bandwagoning, diktat, scapegoating and name-calling. During the data creation campaign, the research team added specific manipulations designed to create either a “call to action” intent or a “discredit entity” intent.
The research team’s data set provided more than 600 individual probes that will be used to test a wide range of analytics on their ability to accurately detect and predict the intent behind specific media manipulations. The results of this research effort will help lay the groundwork for new digital tools to combat the threat of mis/disinformation on a global scale.
(Updated July 2022)