
Can we trust what we see?


By Lethbridge Herald on July 5, 2018.

In an era in which we’re warned to beware “fake news,” it seems technology has reached a point where we can no longer even trust our eyes.
High-tech deception has reached new levels with the development of what’s called “deepfake” videos.
According to an Associated Press story in Tuesday’s Herald, this new technology uses facial mapping and artificial intelligence to create videos that appear so amazingly genuine that it can be difficult to tell a phoney from the real thing.
And that’s producing some scary possibilities for the future. Everyone from lawmakers to intelligence officials is warning that the technology could be used to threaten national security or interfere in elections. Imagine the release of a video on the eve of an election that purportedly shows a candidate saying something damaging to his or her campaign.
“I expect that here in the United States we will start to see this content in the upcoming midterms and national election two years from now,” Hany Farid, a digital forensics expert at Dartmouth College in New Hampshire, said in the AP story. “The technology, of course, knows no borders, so I expect the impact to ripple around the globe.”
Farid said because the technology enables average people to create realistic-looking fake videos of, for example, Donald Trump saying whatever they want him to say, “we have entered a new world where it is going to be difficult to know how to believe what we see.”
So far, deepfakes have been used primarily as gags or to smear celebrities. But Republican Sen. Marco Rubio, a member of the U.S. Senate intelligence committee, envisions more nefarious uses of the technology. He offers examples such as a fake video showing an American politician using a racial slur or taking a bribe, or one showing a U.S. soldier massacring civilians overseas.
“It’s a weapon that could be used — timed appropriately and placed appropriately — in the same way fake news is used, except in a video form, which could create real chaos and instability on the eve of an election or a major decision of any sort,” Rubio said in the AP story.
Two former U.S. ambassadors to Russia have claimed they were the victims of fabricated audio-video creations during their time in Moscow.
While the technology is still new enough to not be entirely foolproof, experts warn these deepfakes will become increasingly difficult to identify. The AP story quotes Andrew Grotto, an international security fellow at the Center for International Security and Cooperation at California’s Stanford University: “Within a year or two, it’s going to be really hard for a person to distinguish between a real video and a fake video.”
That should be disturbing news to all of us. It would be naive to think such technology couldn’t be similarly used to influence public opinion in Canada, too.
This new technology puts even more onus on citizens to take a cautious approach to news and information that discredits individuals, particularly political leaders and other public figures. As Edgar Allan Poe famously noted, “Believe nothing you hear, and only one half that you see.”
Sounds like wise advice in this age when technology can so cleverly distort what we see and hear.
Comment on this editorial online at http://www.lethbridgeherald.com/opinions/.
