
Ask The Techspert: Beware of Deepfakes

Have you seen one of the many fabricated images or videos of public figures circulating on social media? If so, then you know what deepfakes look like. In this article, we will cover what deepfakes are, the dangers they pose, how to avoid falling for them, and what to do if you ever find yourself the victim of a deepfake attack.

What are deepfakes?

The term deepfake is derived from “deep learning,” a branch of machine learning that uses artificial neural networks to learn representations of data, and the word “fake,” which needs no explanation.

The term was coined in November 2017, when an anonymous user on Reddit (a social news and forum website) known as “deepfakes” used open-source face-swapping technology built on existing artificial intelligence algorithms to create and share fake videos.

The word deepfake has since expanded to include any application or software that can create realistic yet fabricated images, sounds, or videos of people.

Creating a deepfake video at the time required considerable skill and complex applications. Now, in 2023, the ever-evolving world of artificial intelligence (AI) makes deepfakes more realistic and easier to produce…and more dangerous and widespread.

Deepfake Examples

  • What if the Moon Landing had gone wrong? The MIT Center for Advanced Virtuality presents an alternative history using deepfake technology, showing its potential for misinformation. Watch it below.

  • In this next short YouTube video, Gayle King and Tom Hanks’ deepfakes were used to promote products they never endorsed. Watch it below.

The dangers of deepfakes

The ability to fabricate video and audio of any individual is dangerous and enables all kinds of unethical behavior.

Here are three of the most significant threats posed by deepfakes:

  • Misinformation: As in the examples above, people use deepfake technology to create videos or audio showing someone doing or saying things they never actually did or said. That can influence public opinion and create confusion about important issues.
  • Privacy and consent issues: Perpetrators violate victims’ privacy by fabricating content and then using it maliciously to blackmail, harass, or defame them, or to steal their identities.
  • Security risks: Criminals can use deepfake technology to impersonate people in video calls or create fraudulent content to deceive security systems. In 2019, one of the first major deepfake attacks happened when the CEO of a U.K.-based energy firm listened over the phone as what sounded like his boss, the head of the firm’s German parent company, ordered the transfer of €220,000 (~$235,000) to a supplier in Hungary. The voice sounded legitimate, and the CEO fell for it. You can read more about this cybercrime in this Wall Street Journal article.

How can you avoid falling for deepfakes?

In today’s world, we should remember not to believe everything we see online, especially on social media. The best way to avoid falling for deepfakes is by getting your news from trusted news outlets. If you stumble upon a newsworthy video online, you should be able to find different news sources covering the same story. Otherwise, it would be best to take that video with a grain of salt.

What should I do if I am the victim of a deepfake?

Technology is moving faster than the legislation that should be in place to protect individuals against its misuse. If you are the victim of a deepfake attack, here are a few things you can do:

  • Collect evidence by documenting everything, from screenshots of the abuse to your communications with the platform’s managers asking them to take it down.
  • Flag or report the deepfake and alert the platform’s administrators or moderators.
  • Contact legal professionals to explore civil or criminal avenues. It may be possible to seek justice under laws covering related crimes, such as extortion, harassment, or defamation.
  • Speak out about it, and don’t hesitate to seek medical help if you have trouble coping.

What to do next?

Check out Senior Planet’s resources that can help you avoid falling for false information.

Also, join the Ask a Tech Expert groups on the Senior Planet Community platform to ask any tech questions.

Your turn

What do you think about the dangers posed by deepfakes? Share your thoughts – or remedies – in the comments. 

 

Techspert Jonathan is Senior Planet’s Sr. Digital Community Relations and Product Specialist and a former Senior Planet San Antonio technology trainer. He is also an iOS developer with a background in Information Systems and Cyber Security.

Are You Digital Skills Ready?

Want to get even more digitally savvy? Senior Planet is proud to work with AARP Foundation on the Digital Skills Ready@50+™ initiative, made possible through a generous grant from Google.org. The resources focus on digital essentials to help older adults find and secure jobs, change careers, or explore entrepreneurship. Visit here to learn more and register – registration is required.

Have a tech question that’s got you stumped? Send your tech questions to Techspert Jonathan using THIS FORM. He’ll be tackling one question a month from readers.
