In the 2020 election, you might not be able to believe your eyes or your ears, thanks to advances in artificial intelligence that researchers warn could be used in the next wave of election meddling. The rise of AI-enhanced software will allow people with little technical skill to easily produce audio and video that make it nearly impossible to distinguish between what is real and what isn’t, according to a report released Wednesday by researchers led by Oxford University and Cambridge University.
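To give a sense of how low that barrier already is, here is a minimal sketch of our own (it does not appear in the report) using the open-source Hugging Face transformers library. Text generation stands in for the audio and video synthesis the researchers describe, since the workflow is the same: a pretrained model wrapped behind a few lines of code.

```python
# Illustrative sketch only: the report does not reference this library.
# A pretrained model generates fluent synthetic text from a short prompt,
# showing how little code modern generative tools require.
from transformers import pipeline

# Download a pretrained language model and wrap it in a one-call API.
generator = pipeline("text-generation", model="gpt2")

# Produce a fluent continuation of a fabricated prompt.
result = generator("The senator announced today that", max_length=50)
print(result[0]["generated_text"])
```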

Titled “The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation,” the report sounds the alarm about how artificial intelligence is becoming easier to use, and how it could become a key tool in the arsenal of foreign operatives seeking to spread disinformation. It was written by 26 of the world’s leading researchers in artificial intelligence.

“There is no obvious reason why the outputs of these systems could not become indistinguishable from genuine recordings, in the absence of specially designed authentication measures,” the authors warn. “Such systems would in turn open up new methods of spreading disinformation and impersonating others.”

While the industry celebrates the positive effects AI could have on the future, the researchers warn that equal consideration must be given to the technology’s dark side. They hope the community will mobilize now to mitigate its future harms.

Artificial intelligence will “set off a cat and mouse game between attackers and defenders, with the attackers seeming more human-like,” said Miles Brundage, a research fellow at Oxford University’s Future of Humanity Institute and one of the report’s authors. The report was a joint project of researchers and technologists from Oxford University’s Future of Humanity Institute, Cambridge University’s Centre for the Study of Existential Risk, and OpenAI, a non-profit AI research company.

Read more from nbcnews.com…
