You can be a victim of ‘Deepfake’ technology

By Demion McTair. Updated 1:24 p.m., Sunday, January 23, 2022, Atlantic Standard Time (GMT-4).

An image published by Bloomberg showing a real photo of Russian President Vladimir Putin (left) alongside a deepfake of someone else made to look like the Russian president.

You probably already know that someone can use technology to take your face and put it on someone else’s in a video.

Such a scenario is unlikely to hit home, however, until someone places spyware on your devices and uses artificial-intelligence software to launch a dastardly attack on your character.

But how likely is this to happen, and how far has deepfake technology advanced? First, let’s take a look at what deepfakes really are.

According to the BBC, so-called deepfakes use machine learning to modify video footage, usually replacing one person’s face with another, with realistic results.

According to the Wall Street Journal, the term deepfake has its origins in pornography, but it has come to mean the use of AI to create synthetic media (images, audio, video) in which someone appears to be doing or saying what in reality they haven’t done or said. The technology isn’t always misused.

Deepfakes can be helpful in the healthcare and education sectors. For instance, in healthcare, the World Economic Forum says that “the development of deep generative models raises new possibilities in healthcare, where we are rightly concerned about protecting the privacy of patients in treatment and ongoing research.”

In education, Towards Data Science says that deepfakes can help an educator deliver innovative lessons that are far more engaging than traditional visual and media formats.

THE HORRIBLE SIDE OF DEEPFAKE TECHNOLOGY

While this technology can be used for good purposes, it has recently been used by unscrupulous individuals the world over to scam people, bring them into disrepute, steal identities, blackmail, and commit other evils.

There is now increasing access to deepfake apps and software.

FaceApp, Zao, Reface, SpeakPic, DeepFaceLab, and FakeApp are examples of deepfake apps that just about anyone can download on their smart devices.

Hackers, the world over, are becoming aggressive with the technology.

People in St. Vincent and the Grenadines have had to get their friends to report their Instagram accounts because hackers used deepfake technology to create videos of them endorsing products and money schemes.

This particular issue is one way deepfake technology is being misused.

According to the Wall Street Journal, in October 2021, MIT Prof. Sinan Aral, a leading expert on the study of misinformation online, fell victim to deepfake technology.

Prof. Aral had to warn his Twitter followers that he had discovered a video, which he had never recorded, of himself endorsing an investment fund’s stock-trading algorithm. In reality, it wasn’t Prof. Aral in the video, but an artificial-intelligence creation in his likeness, or what is known as a highly persuasive “deepfake.”

As the Wall Street Journal puts it, “Thanks to a number of free deepfake apps that are just a Google search away, anyone can become a victim of such a scam”.

Another way that deepfake technology has been misused is for pornographic purposes. One notable victim is Noelle Martin, then an Australian law student.

According to CBSN, when Noelle Martin was 18, she did what most people who’ve grown up with the internet do: she Googled herself. But rather than finding photos of her family vacations, she was shocked to find explicit photos of herself posted to pornographic sites. Martin, however, had never taken those photos. Her face had been edited onto the bodies of adult film actresses and posted online.

“I saw images depicting me having sexual intercourse, images of me in solo positions where my face was doctored onto the naked bodies of adult actresses,” she said.

What makes deepfake technology increasingly worrying is when it is combined with spyware. Hackers can spy on you for months, capturing video, photos, and audio of you and of private areas of your home, then process that material to fabricate convincing fakes.

According to Norton Security, a company that makes one of the world’s leading antivirus software packages, “deepfake technology goes a lot further in how it manipulates visual and audio content. For instance, it can create people who don’t exist. Or it can make it appear to show real people saying and doing things they didn’t say or do.

“As a result, deepfake technology can be used as a tool to spread misinformation,” Norton Security added.

Though recognizing deepfakes is becoming increasingly difficult, Norton Security says that certain telltale characteristics can help give away deepfake videos, including these:

  • Unnatural eye movement.
  • A lack of blinking.
  • Unnatural facial expressions.
  • Facial morphing — a simple stitch of one image over another.
  • Unnatural body shape.
  • Unnatural hair.
  • Abnormal skin colors.
  • Awkward head and body positioning.
  • Inconsistent head positions.
  • Odd lighting or discoloration.
  • Bad lip-syncing.
  • Robotic-sounding voices.
  • Digital background noise.
  • Blurry or misaligned visuals.

Researchers are developing technology that can help identify deepfakes. For example, researchers at the University of Southern California and the University of California, Berkeley are using machine learning that looks at soft biometrics, such as how a person speaks, along with facial quirks. Detection has been successful 92 to 96 percent of the time.
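The “lack of blinking” cue in the list above is one of the few telltale signs that can be checked programmatically. A minimal sketch, assuming a face-landmark detector (such as dlib or MediaPipe, not shown here) has already produced six (x, y) points around each eye per video frame, is the widely used eye aspect ratio (EAR) heuristic: the ratio stays roughly constant while an eye is open and drops sharply during a blink, so a face that never blinks produces an almost flat EAR signal.

```python
import math


def eye_aspect_ratio(eye):
    """Compute the eye aspect ratio (EAR) from six (x, y) eye landmarks.

    Landmark order assumed: eye[0] and eye[3] are the horizontal corners;
    eye[1], eye[2] are upper-lid points; eye[4], eye[5] are lower-lid points.
    EAR is high for an open eye and drops toward zero during a blink.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    v1 = dist(eye[1], eye[5])  # vertical eyelid distance (inner pair)
    v2 = dist(eye[2], eye[4])  # vertical eyelid distance (outer pair)
    h = dist(eye[0], eye[3])   # horizontal corner-to-corner distance
    return (v1 + v2) / (2.0 * h)


def blink_count(ear_series, threshold=0.2):
    """Count blinks as runs of consecutive frames with EAR below threshold."""
    blinks, below = 0, False
    for ear in ear_series:
        if ear < threshold and not below:
            blinks += 1
            below = True
        elif ear >= threshold:
            below = False
    return blinks
```

A clip of a real speaker typically shows several blinks per minute of footage, so an EAR series with a blink count of zero over a long clip is one signal (not proof) that the face may be synthetic. The 0.2 threshold and landmark ordering here are illustrative assumptions, not values from the article.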

Organizations also are incentivizing solutions for deepfake detection. 

What would you do if you fell victim to deepfake technology?
