Fake Videos

Do you think you’re good at spotting fake videos, where famous people say things they’ve never said in real life? See how they’re made in this astonishing talk and tech demo.

Computer scientist Supasorn Suwajanakorn shows how, as a grad student, he used AI and 3D modeling to create photorealistic fake videos of people synced to audio. Learn more about both the ethical implications and the creative possibilities of this tech—and the steps being taken to fight against its misuse.

Watch the TED Talk below:

[Image: fake video TED Talk]

---

Fake And Manipulated Videos Find Fertile Ground As Elections Approach

By Wayne Rash
Forbes
July 14, 2020

By now you’ve seen them on Facebook being shared by your friends. They’re videos of one politician or another saying something you never believed they’d say. Perhaps it’s a video of British Prime Minister Boris Johnson endorsing his opponent. Or perhaps it’s Facebook CEO Mark Zuckerberg openly admitting that he wants to own you.

But as the 2020 elections draw near in the U.S., more and more unlikely videos involving candidates for President or other high office are being shared on social media.

As the technology for creating fake videos becomes more sophisticated, and the videos themselves get better and harder to detect, they’re showing up in all sorts of places, ranging from revenge porn to clickbait. Videos ostensibly of famous people are showing up all over, sometimes to cash in on their wealth, or simply to collect clicks for advertising revenue.

When it comes to elections, however, the threat is more serious. During the 2016 Presidential election, Russian intelligence agencies produced a series of fake videos that purported to show then-candidate Hillary Clinton saying things she never said. At the time, this formed part of the FBI’s investigation into election meddling.

The Russians are Coming

Now the Russians are coming back. They’re joined by intelligence agencies in Iran, China and North Korea in producing high-quality fake videos that attempt to make the rounds on social media. The resulting videos are good enough that gullible people post and share them, and of course people whose agenda agrees with a video will share it even more. Last year, the Department of Homeland Security expressed its concern about these fakes in a report to the Homeland Security Advisory Council.

“Information warfare techniques are being used by all sides of the political spectrum, and being used by all governments around the world, now more than ever,” said Alex Heid, chief research officer at SecurityScorecard. “Like any war that is waged between established powers, the casualties are oftentimes civilian and considered collateral damage.”

Finding the Fakes

There are really two types of videos that are considered fakes. One is manipulated video, in which a real video is simply changed. A recent one, revealed by The Washington Post, showed House Speaker Nancy Pelosi delivering a speech, but the playback speed was altered so that she appeared to be slurring her words. The narrative that accompanied the video suggested that she was drunk. The real video of the event revealed what had been done.
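The slowdown behind a clip like that is technically trivial. As a rough illustration (not the actual tool used to alter the Pelosi video, just a minimal sketch in Python), resampling a waveform at a fraction of its original rate stretches the timing and lowers the pitch, which can make clear speech sound slurred:

```python
# Minimal sketch: slowing audio by resampling, the kind of simple edit
# behind a "manipulated" (as opposed to deep fake) video. A speed of
# 0.75 stretches the clip to ~133% of its length and lowers the pitch.

def slow_down(samples, speed=0.75):
    """Resample a mono waveform so it plays at `speed` x the original rate.

    speed < 1.0 stretches the clip (slower, lower-pitched speech);
    linear interpolation between neighboring samples fills the gaps.
    """
    out = []
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        # Linearly interpolate between adjacent samples.
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += speed
    return out

# A short clip slowed to 75% speed comes out about a third longer.
clip = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
stretched = slow_down(clip, speed=0.75)
print(len(clip), len(stretched))
```

The point is not the code itself but how little is required: no AI, no face synthesis, just a playback-speed change, which is why manipulated video is far more common than true deep fakes.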

A deep fake video uses a combination of still images and videos to manufacture a video of something that never existed.


“We’ve created two generations of deep fake detectors,” said Wael Abd-Almageed, research associate professor at the University of Southern California. “We didn’t release them publicly.” The USC deep fake detector is said to be 99 percent effective, far better than the open-source products available for spotting deep fakes. The detectors aren’t released because purveyors of deep fakes could use them to improve their software.

Deep fakes are becoming extremely sophisticated, and it’s hard to tell whether a given video is fake or not, Abd-Almageed explained. He said his group at USC works with the entertainment industry, where deep fakes are also a problem, and he expects other industries will want to work with USC as well. He said he can work at scale with social media companies, as well as with other organizations.

“Other users could be media outlets,” he said.

So how do you work with USC? “We can give you access to our system where you upload the video,” Abd-Almageed said. “You will get notified when the system gets finished analyzing the video. It’s free for media outlets and other interested parties.”

He said that those interested parties should reach out to him by email at wamageed at isi.edu.

“I think it’s a big problem, part of a bigger problem, which is disinformation,” he said. “Don’t share something on social networks without thinking. This whole notion that seeing is believing is gone forever.”
