Deepfakes and Manipulation


Last week I wrote about quishing and the rise of phishing using QR codes. This week I want to delve into the world of deepfakes and manipulation, and how easy it is to be tricked. With the rise of AI, and the pace at which it is moving, new cyber threats are emerging and becoming more sophisticated by the day.

I've shared an example of a viral deepfake that stormed the internet. While you know it's a deepfake, because I've told you, and so has the title of the video, the resemblance and voice are pretty convincing, wouldn't you say?

So let's back it up. What are deepfakes? They're a form of artificial intelligence (AI) that has ushered in a new era of counterfeit images, sounds, and videos, posing serious challenges to authenticity and privacy.

Understanding Deepfakes

Deepfake technology is essentially an AI-driven blend of hoaxed images and sounds, woven together through intricate machine learning algorithms. The result? Fabricated individuals and events that never occurred in reality, like the one above of Morgan Freeman.

This technology has gained notoriety for its malicious applications, where it is exploited to deceive the public, spread misinformation, and manipulate opinions.

The Many Faces of Deepfake

The possibilities for this technology, and the threats we are seeing emerge from it, are endless:

Scams and Hoaxes

Cybercriminals can employ deepfakes to orchestrate elaborate scams and hoaxes, potentially destabilizing organizations. These deceptions can range from false confessions to financial crimes, causing damage to a company's reputation and finances.

Celebrity Pornography

Nonconsensual pornography, which mostly targets celebrities, is a major threat associated with deepfakes. Using this technology, malicious actors can create explicit content featuring famous individuals without their consent.

Election Manipulation

Deepfake videos of world leaders, such as Donald Trump and Barack Obama, have raised concerns about their potential use in election manipulation. The 2020 U.S. election campaign was particularly vulnerable to this technology.

Social Engineering

Deepfake audio recordings have been used to trick people into believing they are interacting with trusted individuals who never actually spoke those words. This deception can lead to financial losses, as demonstrated by a case involving a CEO transferring funds based on a deepfake voice impersonation.

Automated Disinformation Attacks

Deepfakes can be used to spread automated disinformation attacks, including conspiracy theories and false narratives about political and social issues. One memorable example involved a fake video of Facebook founder Mark Zuckerberg claiming to control vast amounts of user data.

Identity Theft and Financial Fraud

Attackers can use deepfake technology to create fake identities or steal real ones, facilitating various forms of fraud, including account creation and product purchases under someone else's name.

Creating deepfakes involves several methods, with two prominent approaches being:

  • Generative adversarial networks (GANs), AI systems that pit two algorithms against each other to learn the patterns in real images; they are often used to create fake images.
  • Autoencoders, specialized AI algorithms that play a crucial role in face-replacement and face-swapping technology. By encoding and swapping facial images, autoencoders allow one face to be seamlessly transposed onto another, producing convincing deepfake videos.
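To make the autoencoder approach a little more concrete, here is a minimal sketch of the face-swap data flow. It assumes the common deepfake setup of one shared encoder plus a separate decoder per person; the "networks" here are just random linear maps and the "faces" are random vectors, purely to illustrate how encoding person A and decoding with person B's decoder produces the swap. Nothing here is trained or real.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a flattened 8x8 grayscale "face" and a small latent code.
FACE_DIM, LATENT_DIM = 64, 8

# One shared encoder, one decoder per identity. In real deepfake pipelines
# these are deep networks trained on many images of each person; here they
# are plain random linear maps, purely to show the data flow.
W_enc = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.1
W_dec_a = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.1
W_dec_b = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.1

def encode(face):
    # Compress the face to a latent code (expression, pose, lighting).
    return W_enc @ face

def decode(latent, W_dec):
    # Reconstruct a face from the latent code with one identity's decoder.
    return W_dec @ latent

face_a = rng.standard_normal(FACE_DIM)  # a frame of person A

# The swap: encode person A's frame, then decode with person B's decoder,
# giving a frame with B's appearance but A's expression.
latent = encode(face_a)
swapped = decode(latent, W_dec_b)
print(swapped.shape)
```

The key design point the sketch shows is the shared latent space: because both decoders learn to reconstruct faces from the same kind of code, swapping decoders swaps identities while preserving everything the code captures.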

How do you spot these?

Identifying deepfakes can be challenging, but there are telltale signs to watch for:

  • Unnatural eye movement, as replicating natural eye behavior is difficult.
  • A lack of blinking, as deepfakes struggle to mimic this human action.
  • Unnatural facial expressions and facial morphing resulting from image stitching.
  • Unnatural body shape, since deepfake technology primarily focuses on faces.
  • Abnormal hair, as fake images often lack realistic individual characteristics.
  • Odd skin colors, a result of the technology's limitations.
  • Awkward head and body positioning, leading to jerky movements.
  • Inconsistent facial positions, causing distorted images when heads move.
  • Strange lighting or discoloration, contributing to unrealistic visuals.
  • Poor lip-syncing, where audio doesn't align with the speaker's mouth movements.
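The blinking cue above is one of the few that lends itself to a simple automated check. Below is a hedged, toy sketch: it assumes you already have a per-frame eye-openness score (an eye-aspect-ratio from a facial landmark detector, which is not implemented here), and the threshold and minimum blink rate are illustrative values, not calibrated ones.

```python
BLINK_THRESHOLD = 0.2    # eye-openness below this counts as "closed" (assumed value)
MIN_BLINKS_PER_MIN = 5   # humans typically blink far more often than this

def count_blinks(ear_series):
    """Count closed -> open transitions in a per-frame eye-openness series."""
    blinks, closed = 0, False
    for ear in ear_series:
        if ear < BLINK_THRESHOLD:
            closed = True        # eye is currently shut
        elif closed:
            blinks += 1          # eye reopened: one full blink completed
            closed = False
    return blinks

def looks_suspicious(ear_series, fps=30):
    """Flag clips whose blink rate is implausibly low for a real person."""
    minutes = len(ear_series) / fps / 60
    return count_blinks(ear_series) / minutes < MIN_BLINKS_PER_MIN

# 60 seconds of hand-made "footage" at 30 fps containing a single blink.
series = [0.3] * 900 + [0.1] * 5 + [0.3] * 895
print(looks_suspicious(series))  # True - one blink per minute is too few
```

In practice, detection tools combine many such signals (lighting, lip-sync, facial boundaries) rather than relying on any single heuristic, and deepfake generators keep improving at mimicking each one.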

I think these are here to stay, and we'll begin to see them used more in electoral campaigns and in fraud and scam activity. The options we have are to raise awareness, so individuals know what to look out for to avoid being scammed, and to build out new laws and legislation governing how these can be created and used.
