DoD develops software to spot fakes
10th August 2018, 06:53
Posted by Stefan Mileschin, [M] Reviewer

AI forensics tools

The first forensics tools for catching fakes created with AI have been developed through a programme run by the US Defense Department.

The most common technique for generating fake videos involves using machine learning to swap one person's face onto another's. The resulting videos, known as "deepfakes", are simple to make and can look surprisingly realistic, at least to a casual viewer.

However, after tweaking by a skilled video editor, deepfakes can be made to seem even more real, which makes the technology a US defence threat. For example, a conspiracy nut could release improbable fake footage purporting to show Barack Obama confessing to crimes he could not possibly have committed.

Tools for catching deepfakes were developed through a programme run by the US Defense Advanced Research Projects Agency (DARPA) called Media Forensics. The programme was created to automate existing forensics tools, but has recently turned its attention to AI-made forgery.

Apparently, there are subtle cues in current GAN-manipulated images and videos that allow detection tools to spot the presence of alterations.

Siwei Lyu, a professor at the University at Albany, State University of New York, realised that faces made using deepfakes rarely, if ever, blink, and when they do blink, the eye movement is unnatural. This is because deepfakes are trained on still images, which tend to show a person with his or her eyes open.
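The blink cue Lyu describes can be sketched in code. This is a minimal illustration, not Lyu's actual tool: it assumes eye landmarks are already supplied per frame by some external face detector, and it uses the common eye-aspect-ratio (EAR) heuristic, where the ratio of the eye's height to its width drops towards zero when the eye closes. A clip whose EAR never dips is a candidate fake.

```python
import math

def eye_aspect_ratio(eye):
    # eye: six (x, y) landmarks ordered p1..p6 around the eye contour,
    # p1/p4 at the corners, p2/p3 on the upper lid, p6/p5 on the lower lid.
    # EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it falls as the eye closes.
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    # Count a blink for each run of at least min_frames consecutive
    # frames whose EAR is below the threshold.
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:
        blinks += 1
    return blinks

# Synthetic per-frame EAR traces: a real clip dips twice, a
# deepfake-like clip stays wide open for the whole sequence.
real = [0.3] * 10 + [0.1] * 3 + [0.3] * 10 + [0.1] * 3 + [0.3] * 5
fake = [0.3] * 31
print(count_blinks(real), count_blinks(fake))  # → 2 0
```

In a real detector the EAR trace would come from facial landmarks estimated on each video frame, and the blink rate would be compared with the typical human rate of roughly 15 to 20 blinks per minute; the threshold and frame counts here are illustrative only.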

Others involved in the DARPA challenge are exploring similar tricks for automatically catching deepfakes: strange head movements, odd eye colour.

Lyu says a skilled forger could get around his eye-blinking tool simply by collecting images that show a person blinking. But he adds that his team has developed an even more effective technique, which he is keeping secret for the moment.

“I’d rather hold off at least for a little bit. We have a little advantage over the forgers right now, and we want to keep that advantage.”

https://fudzilla.com/news/ai/46908-d...-to-spot-fakes