View Full Version : Can you spot a deepfake
ramus
28th June 2019, 12:56
Can You Spot a Deepfake? Does It Matter?
By Max Read
http://nymag.com/intelligencer/2019/06/how-do-you-spot-a-deepfake-it-might-not-matter.html
A shadow looms over the 2020 election: Deepfakes! The newish video-editing technology (or really, host of technologies) used to seamlessly paste one person’s face on another’s body has activated a panic among pundits and politicians. During an appearance on CBS This Morning this week, Instagram CEO Adam Mosseri summed up the general attitude toward deepfakes, which his platform currently doesn’t have a policy against: “I don’t feel good about it.” Earlier this month, deepfaked and manipulated videos of Mosseri’s boss Mark Zuckerberg and Nancy Pelosi were each the subject of breathless mainstream media coverage; last week, Congress held hearings on deepfakes. The media, a Politico headline claims, is “gearing up for an onslaught of fake video.” An onslaught! I don’t feel good about it!
Into this fray steps the Washington Post’s Glenn Kessler, “Fact Checker” columnist, who’s published a “guide to manipulated video” with Nadine Ajaka and Elyse Samuels. The result is a beautifully designed taxonomy of what I think of as the deepfakes extended cinematic universe. The writers divide “manipulated video” into three categories — “missing context,” “deceptive editing,” and “malicious transformation.”
----------------------------------------------------------------------------------
The ramifications of this are overwhelming. This could be the excuse needed to censor everything.
Intranuclear
28th June 2019, 13:46
Well, currently deepfake videos are laughably easy to spot because they are very blurry and have a great many color artifacts. Internally they use deep generative adversarial networks (GANs), which are just good enough to "fool" convolutional neural networks (CNNs). Having said that, if you are not looking for the artifacts, then one can easily be fooled. Technology is improving all the time, so maybe in 10 years it will get good enough to beat the human eye. Further, blindly trusting a video to "speak" the truth is never a good idea, just as blindly trusting the person next to you. It's like "trusting" a priest to deliver God's words verbatim. However, if you have a complicated scene with complicated backgrounds, no deepfake tech can deal with that anytime in the foreseeable future, because that requires object recognition systems able to parse everything in the scene, which is well beyond current or upcoming tech.
Look for high quality near the edges and no distortion of background objects and you'll be OK for a very very long time.
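To make the "look for artifacts" advice above a bit more concrete: GAN upsampling layers tend to leave periodic, grid-like high-frequency artifacts that show up as excess energy in the upper part of an image's frequency spectrum. Here is a toy, numpy-only sketch of that idea (the function name and thresholds are my own, and real detectors are far more sophisticated):

```python
import numpy as np

def high_freq_energy_ratio(gray, cutoff=0.25):
    """Fraction of spectral energy above a radial frequency cutoff.

    GAN upsampling often leaves periodic grid artifacts that add
    excess energy in this high-frequency band.
    """
    f = np.fft.fftshift(np.fft.fft2(gray))
    power = np.abs(f) ** 2
    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # Normalised radial distance from the centre of the spectrum
    r = np.hypot((yy - h / 2) / h, (xx - w / 2) / w)
    return power[r > cutoff].sum() / power.sum()

# A smooth synthetic patch vs. the same patch with a
# checkerboard-style grid artifact added on top.
smooth = np.outer(np.hanning(64), np.hanning(64))
grid = smooth + 0.2 * (np.indices((64, 64)).sum(axis=0) % 2)

print(high_freq_energy_ratio(smooth) < high_freq_energy_ratio(grid))  # True
```

On real footage you would run this per face crop and per frame; the point is only that the artifacts Intranuclear describes are measurable, not just visible.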
Tintin
28th June 2019, 15:11
I particularly enjoyed the obvious (?) play on words here in convolutional neural networks (CNNs) :bigsmile:
The mind boggles, pardoning the pun.
Intranuclear
28th June 2019, 15:31
I particularly enjoyed the obvious (?) play on words here in convolutional neural networks (CNNs) :bigsmile:
The mind boggles, pardoning the pun.
Haha, the irony actually occurred to me as I typed it :)
Denise/Dizi
28th June 2019, 18:04
I find it interesting that the government, and the politicians, and those that are considered the "Elite"... Essentially those that are willing to sell out the population for their own gains, are suddenly concerned about media manipulations HAHAHA...
Fake News, Deep Fake, Info Wars, all things now becoming "Serious issues" in our society apparently..
"Ok folks, we're going to address this, but forget that we have been manipulating the media since its invention".. "Nothing to see there... walk away"
Perhaps it isn't funny? But I can see some humor in this.
Intranuclear
28th June 2019, 20:41
No, there is nothing funny about these technologies, but it may end up being a blessing.
BTW, today DeepNude was taken off the market. I will not post pictures, as it basically unclothes ANY woman in real time. https://www.vice.com/en_us/article/kzm59x/deepnude-app-creates-fake-nudes-of-any-woman
When photographic tools that we used to depend on become unreliable, we then are back to having to really listen and trust each other, and of course we all know the perils and delights associated with that.
TomKat
29th June 2019, 15:45
No, there is nothing funny about these technologies, but it may end up being a blessing.
BTW, today DeepNude was taken off the market. I will not post pictures, as it basically unclothes ANY woman in real time. https://www.vice.com/en_us/article/kzm59x/deepnude-app-creates-fake-nudes-of-any-woman
When photographic tools that we used to depend on become unreliable, we then are back to having to really listen and trust each other, and of course we all know the perils and delights associated with that.
I used to fantasize that some politician blackmail material would be leaked, exposing them as pedophiles or something. But they would just say "deep fake."
Denise/Dizi
29th June 2019, 16:21
No, there is nothing funny about these technologies, but it may end up being a blessing.
BTW, today DeepNude was taken off the market. I will not post pictures, as it basically unclothes ANY woman in real time. https://www.vice.com/en_us/article/kzm59x/deepnude-app-creates-fake-nudes-of-any-woman
When photographic tools that we used to depend on become unreliable, we then are back to having to really listen and trust each other, and of course we all know the perils and delights associated with that.
Wow, Intranuclear... They have an app for everything, don't they? I don't think I will ever look at someone holding their phone up towards me the same ever again..
And TomKat.. You're probably right about that.. Seems whenever real things do get leaked, they're immediately dismissed, and everyone believes the cover story.. For the most part. We're still laughing about a weather balloon... I tend to believe the first story in that situation. I have hope that someday rather than fake news, and manipulated stories, that humanity will learn to engage in sharing some truth for a change.. But I won't hold my breath for too long at any one time..
There seems to be a percentage of humanity that thinks it is much better that only a few know the really important things.. And that would be ok, if there wasn't deliberate manipulation of such things.. More of a "If you find it, well then you get it" kind of thing. Those that manipulate to deceive, however? Are just the worst of the worst in my book..
Intranuclear
29th June 2019, 18:38
It's not so much a conspiracy as it is millions of teenagers getting their hands on very powerful tools, which is akin to handing a teenager a small tactical nuke.
Clearly most of them would not abuse it, but with millions of samples, many will "experiment" with their powers.
The alternative is probably worse though, classifying everything technological and making it illegal to own or even discuss such things.
Living in a paranoid society is no life at all.
We have to learn to understand all these issues with patience and love and teach our kids of the powers they will wield instead of just shielding them from all that is dangerous.
I also cannot live in a society which plugs their ears and hums la la la, don't worry...be happy...
Modern life is and will get more complicated as we add newer and more powerful technologies and it is difficult to imagine a future of humanity that does not change drastically.
People want to know the truth, but the truth is that anything is possible, so what is true?
How can a single mind take it all in and solve it all, if not with love and acceptance?
Hatred and paranoia backfire big-time and amplify those negative feelings, so clearly it is not an option.
I cannot protect my children from what is to come. All I can do is to let them know that I love them and for them to pass on the love with open eyes.
Isn't it obvious that all the jihads, school shootings, murders, wars, genocides, and more are the result of people being misunderstood, not loved, ignored, placated, mistreated, abused or taken for granted?
DaveToo
30th June 2019, 19:12
I used to fantasize that some politician blackmail material would be leaked, exposing them as pedophiles or something. But they would just say "deep fake."
There's no need to fantasize about it anymore. It actually happened!
Dennis Hastert, the former American congressman who served as the 51st Speaker of the U.S. House of Representatives from 1999 to 2007 and was deemed a "serial child molester", was sentenced to 15 months in prison, two years' supervised release, and a $250,000 fine.
He entered the Federal Medical Center prison in Rochester, Minnesota, in 2016 and was released the following year, after 13 months in prison.
https://www.nytimes.com/2017/07/18/us/dennis-hastert-released.html
Cara
11th October 2019, 08:31
This article from The Register covers a recently released report that tracks the increasing proliferation of Deep Fake videos and the easy accessibility of Deep Fake tools.
Online deepfakes double in just nine months, scaring politicians – and fooling the rest of us
Surprise, surprise, a whopping 96 per cent of them are X-rated
By Katyanna Quach 8 Oct 2019 at 09:15
Deepfakes - counterfeit content generated by AI algorithms - are on the rise, staining the internet with doctored pornography, fake videos of political leaders, and bot accounts.
There are now 14,678 deepfake videos plastered on the net, according to a report [PDF] written by Deeptrace (https://regmedia.co.uk/2019/10/08/deepfake_report.pdf), a startup focused on building software that can detect the machine learning forgeries. That number has roughly doubled over the last nine months, up from 7,964 videos.
It should come as no surprise that most of them - 96 per cent of them, in fact - are pornographic. Deepfakes (https://www.theregister.co.uk/2018/01/25/ai_fake_skin_flicks/) first made headlines when internet perverts started using the technology to swap out the faces of porn stars for celebrities in x-rated clips, back in January last year, after all.
Creepy people began swapping fake videos of their favorite pop stars or actresses or sharing tips on how to craft your own custom porn. The code to generate these horrendous creations was all publicly available on repositories like GitHub, so with a little technical know-how and the right training data, they weren’t difficult to make. GitHub has been trying to delete DeepNude ripoffs (https://www.theregister.co.uk/2019/07/09/github_deepnude_code_discord/).
Fast forward more than a year, and it’s now even easier. Computer applications like DeepNude (https://www.theregister.co.uk/2019/06/27/deepfake_nudes_app_pulled/), allow users to forge their own nude images with a few clicks. All you have to do is give the software engine a picture and it’ll paste it onto a naked body. The folks from DeepNude retracted their code when it was widely criticized, but the damage had already been done.
The DeepNude devs sold the software under a premium license to Windows and Linux desktop users for $50 a pop. Some of these people then went on to resell the app to other corrupted individuals, hoping to make some money of their own.
The report also found internet marketplaces advertising to help people produce custom deepfakes, asking for up to $30 to clone a victim's voice and make them say something they haven't actually said in real life in audio clips, or $10 for a simpler fake text.
All these smutty fake clips have racked up a whopping 134,364,438 views across the top four porn websites dedicated to deepfakes. And guess what, 100 per cent of them targeted women.
In the small 4 per cent of deepfake content that’s not pornographic, however, 61 per cent of videos feature men. These videos normally feature Hollywood actors, political leaders, or occasionally tech CEOs.
One of the biggest concerns is that the technology could sow political discord and undermine elections. Deepfake videos have already rocked Gabon and Malaysia. An appearance of Gabonese president Ali Bongo (https://www.facebook.com/tvgabon24/videos/324528215059254/?v=324528215059254) was jittery and stilted, prompting people to question its authenticity. The clip also appeared at a time when the President was laying low amidst an attempted coup from Gabon’s military; the government was accused of hiding behind Bongo’s supposed health issues.
In Malaysia, the Minister of Economic Affairs Azmin Ali was depicted engaging in homosexual sex acts with a rival political aide. Sodomy is illegal in Malaysia. Ali has denounced the video as a deepfake churned up to destroy his career.
And that’s not all either. Fake bot accounts on Twitter and LinkedIn using deepfake images as profile pictures are plastered on the internet too. Two of the most famous ones are for a so-called Maisy Kinsley (https://www.theregister.co.uk/2019/06/17/roundup_ai/) and Katie Jones. Kinsley claimed to be a journalist at Bloomberg and was attempting to contact Tesla short sellers, whilst Jones was a researcher at a think tank hoping to spy on government officials.
AI researchers are racing ahead to develop machine learning algorithms to detect deepfakes. In many cases, judgement still relies on educated guesswork. If something looks strange, maybe it’s an earring out of place or a telltale blurry wrinkle, then it might just be a deepfake.
“The speed of the developments surrounding deepfakes means this landscape is constantly shifting, with rapidly materializing threats resulting in increased scale and impact. It is essential that we are prepared to face these new challenges. Now is the time to act,” the report concluded. ®
PS: The US state of California has approved (https://a24.asmdc.org/news/20191004-california-bans-deep-fakes-video-audio-close-elections-associated-press) a law that "bans the distribution of manipulated videos and pictures that maliciously aim to give someone a false impression about a political candidate’s actions or words within 60 days of an election," according to Assemblyman Marc Berman (D).
From: https://www.theregister.co.uk/2019/10/08/deepfake_videos_report/
Here’s the PDF report referred to in the article:
https://regmedia.co.uk/2019/10/08/deepfake_report.pdf
Cara
16th October 2019, 07:13
More news coverage on deepfakes: this one is rather sensationalist.
Deepfake videos 'will cause worldwide chaos and pull society apart' expert warns
There are fears fake videos will be used to spread political turmoil and divisions
News
Deepfake videos are becoming increasingly easy to make, and harder than ever to detect, experts warn.
As fake videos are used to spread misinformation quickly around the world, there are fears that more effort is going into developing deepfake-generating tools than into detection.
EU tech policy analyst at the Centre for Data Innovation Eline Cheviot warned that there is a growing imbalance between technologies for detecting and producing deepfakes and that this means there is a lack of tools needed to efficiently tackle the problem.
And she warned humans are no longer able to spot the difference between deepfakes and the real thing, and so are unable to stop weaponised fake news from spreading.
"Debunking disinformation is increasingly difficult, and deepfakes cannot be detected by other algorithms easily yet," she said.
"As they get better, it becomes harder to tell if something is real or not.
"You can do some statistical analysis, but it takes time, for instance, it may take 24 hours before one realises that video was a fake one, and in the meantime, the video could have gone viral."
She added that simply bringing in laws banning or regulating deepfakes is unlikely to be enough and that more understanding is needed by politicians of the technology.
"Partnerships should be developed with industry including social media companies, university researchers, innovators, scientists, startups, to build better manipulation detection and ensure these systems are integrated into online platforms," Eline went on.
https://i2-prod.dailystar.co.uk/incoming/article20574180.ece/ALTERNATES/s615b/0_GettyImages-1172921055.jpg
Social media apps should have better deepfake detection software, tech CEOs have claimed (Image: Getty)
https://i2-prod.dailystar.co.uk/incoming/article20574237.ece/ALTERNATES/s615b/0_unnamed-4.jpg
Deepfake videos have already been used in eerie clips online
Tech entrepreneur Fernando Bruccoleri said tech platforms need to make it easier for people to work out what is real and what is fake.
"I think it will not be as simple as it seems to be able to pass and legislate in the short term," he said.
"Surely any platform will design tools to detect if a video is fake or not, as a counterpart."
But CEO of video verification site Amber, Shamir Allibhai, whose company specialises in detecting fakes, said that it would be impossible to regulate the creation of deepfakes.
Instead, he said, platforms should work to tackle the distribution of such videos, in the same way that they already work to stop the spread of revenge porn.
And he also warned that deepfakes are here to stay, adding: "I think we are going to see significantly more of it in the run-up to the US presidential elections in 2020."
It comes after a study revealed almost all deepfake videos are porn (https://www.dailystar.co.uk/news/world-news/deepfake-videos-online-almost-porn-20534256) .
Despite the fears about the political use of such videos, around 96% of AI-generated clips online were adult fakes.
From: https://www.dailystar.co.uk/news/world-news/deepfake-videos-will-cause-worldwide-20574174
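Cheviot's point about "statistical analysis" can be illustrated with a toy example. One classic forensic check is that a pasted-in face region usually carries different sensor noise than the surrounding frame; a quick noise estimate per region can flag the mismatch. A minimal numpy-only sketch (function name and synthetic data are my own, not from any real detector):

```python
import numpy as np

def noise_level(patch):
    """Rough noise estimate: std of the Laplacian-filtered patch.

    Spliced (pasted-in) regions often carry different sensor noise
    than the rest of the frame, so their estimates disagree.
    """
    lap = (4 * patch[1:-1, 1:-1]
           - patch[:-2, 1:-1] - patch[2:, 1:-1]
           - patch[1:-1, :-2] - patch[1:-1, 2:])
    return lap.std()

rng = np.random.default_rng(1)
background = rng.normal(0, 2.0, (64, 64))   # noisier "camera" region
pasted_face = rng.normal(0, 0.5, (64, 64))  # smoother synthetic region

print(noise_level(pasted_face) < noise_level(background))  # True
```

As the article notes, even cheap checks like this take time to run at scale, which is exactly why a fake can go viral before anyone finishes the analysis.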
Mashika
16th October 2019, 08:05
Well, currently deepfake videos are laughably easy to spot because they are very blurry and have a great many color artifacts. Internally they use deep generative adversarial networks (GANs), which are just good enough to "fool" convolutional neural networks (CNNs). Having said that, if you are not looking for the artifacts, then one can easily be fooled. Technology is improving all the time, so maybe in 10 years it will get good enough to beat the human eye. Further, blindly trusting a video to "speak" the truth is never a good idea, just as blindly trusting the person next to you. It's like "trusting" a priest to deliver God's words verbatim. However, if you have a complicated scene with complicated backgrounds, no deepfake tech can deal with that anytime in the foreseeable future, because that requires object recognition systems able to parse everything in the scene, which is well beyond current or upcoming tech.
Look for high quality near the edges and no distortion of background objects and you'll be OK for a very very long time.
Only in the eyes of someone who knows what to look for
If someone were looking at them on a small screen and possibly distressed or in any other way not in their right mind to think twice about it, would they be able to tell the difference?
Mark (Star Mariner)
22nd December 2019, 17:46
Deep fake George Lucas, and the Rise of Skywalker..
This could probably go in the Movies and TV section, being pertinent to Star Wars, but this deep fake is really quite something. Especially when you consider it's home-made and put together over just a couple of days. It's pretty funny, but ironically funny, as the awful Rise of Skywalker is more or less the death-knell of our beloved Star Wars saga. It is also disturbing what this technology, even the low-budget versions of it, is now capable of.
g5LcbX_USg4
Paul D.
31st October 2023, 14:54
Iain Davis, @_InThisTogether, reposted this tweet with the comment "yes, it is". F.w.i.w., I agree, but it's not only this tech that will be used.
Deep fake Tom Cruise video, 1 min. 37 sec.
Text:
I’m telling you right now this technology is going to be used as an excuse to regulate the internet and force users to adopt digital id.
All it will take is enough people being fooled that it causes real world implications. A stock price plummets or a person is misrepresented and it leads to violence. The internet version of the PATRIOT Act is sitting in a drawer somewhere on Capitol Hill just waiting for the day.
1719110128648040880
Powered by vBulletin™ Version 4.1.1 Copyright © 2025 vBulletin Solutions, Inc. All rights reserved.