The Dark Side of AI: How GPT and Socials Can Be Used for Manipulation
In my last article, I spoke about the negatives of AI, and here I go again. I want to be clear: I am a huge proponent of AI when it is used in the right ways. I have many examples and use cases that I would love to share here, but many are related to my daily work and, as such, need to be kept confidential. I promise I will share some practical use cases for GenAI at home soon... my daughter and I have been working on some great ones!
Today, however, I am again going to share the dark side of AI, because reality in today's digital world is getting harder to recognize, especially with new artificial intelligence tools like ChatGPT revolutionizing the way we access information. These technologies can be incredibly useful, but they also open the door to serious privacy risks and fraud. Recently, I decided to look up information on myself using GPT-based search tools, and the results were surprising. Not only did I find multiple people with my name, but I also saw how easily AI could connect those results to social media profiles, professional directories, and even legal records. This experiment made me realize just how vulnerable our personal information is, and how criminals are taking advantage of this technology for deception.
AI-Powered Fraud
A recent incident involving a colleague illustrated just how dangerous this can be. She received a phone call claiming her child had been involved in a serious situation and required bail money. The caller instructed her to send the money via Venmo immediately, stressing the urgency of the situation. Thankfully, she was skeptical and verified the situation before sending anything, but many parents aren't so lucky.
Scammers are now using AI-generated voices and online data scraping to create highly convincing fraud schemes. They gather names, phone numbers, addresses, and family connections from social media, university websites, and professional directories to make their scams more believable. In this case, it’s easy to see how a scammer could have found the names of students, their parents, and even team members from a university athletics website, then used AI tools to create a convincing fake emergency.
Personal Data Exploited
Most people don’t realize just how much of their information is available online. Here’s how a scammer could build a fraudulent scheme step by step:
1. Data Collection from University Websites – Many university websites list student athletes, their parents, and emergency contacts. Scammers can scrape this information using automated tools (see the sketch after this list).
2. Social Media Connections – Parents often tag their children in Facebook and Instagram posts. This helps scammers build a more complete profile.
3. AI Voice Cloning – Using just a few seconds of a person’s voice from a social media video, AI tools can generate a realistic imitation, making phone scams more convincing.
4. Emotional Manipulation – By combining all this data, scammers can create highly personalized scams, making it harder for victims to detect fraud.
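To make the scale of this exposure concrete, here is a minimal Python sketch of the kind of automated collection described in step 1. The URL, names, and page structure are hypothetical placeholders, and the example is deliberately framed as a way to audit your own public footprint, not anyone else's.

```python
# A minimal sketch of how trivially a public web page can be mined for
# personal details. The URL and names below are hypothetical placeholders;
# run this only against pages about yourself, to audit your own exposure.
import re

import requests
from bs4 import BeautifulSoup

# Hypothetical example: a public roster or directory page that mentions you.
PAGE_URL = "https://example.edu/athletics/roster"
NAMES_TO_CHECK = ["Jane Doe", "John Doe"]  # your own family's names

def find_exposed_details(url: str, names: list[str]) -> dict[str, list[str]]:
    """Fetch a public page and report sentences that mention the given names."""
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
    # Split the page text into rough sentences.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    exposed = {}
    for name in names:
        hits = [s for s in sentences if name.lower() in s.lower()]
        if hits:
            exposed[name] = hits
    return exposed

if __name__ == "__main__":
    for name, sentences in find_exposed_details(PAGE_URL, NAMES_TO_CHECK).items():
        print(f"{name} appears in {len(sentences)} sentence(s) on this page:")
        for s in sentences:
            print("  -", s[:120])
```

Everything this script does takes a handful of lines with free, widely available libraries. Scammers combine the same basic approach with social media data to build the personalized profiles described above.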
Previously, it was relatively easy to identify a fake email, voicemail, or chat message because the grammar wasn't perfect. Unfortunately, one of the benefits ChatGPT offers these criminals is flawless language. It is now much harder to detect a scam from the wording of the communication alone, because the misspellings and verb-tense mistakes simply aren't there any longer.
Protecting Yourself Against AI-Enabled Fraud
To prevent falling victim to these advanced scams, consider:
1. Verify Before You Act – Hang up and call your family member directly, or contact the institution through an official number, before sending any money.
2. Be Skeptical of Urgency and Unusual Payment Methods – Demands for immediate payment via Venmo, gift cards, or wire transfer are classic red flags.
3. Limit What You Share Online – Review your privacy settings, and think twice before tagging family members or posting videos that capture your voice.
4. Establish a Family Code Word – A pre-agreed word or question can quickly confirm whether a caller really is who they claim to be.
While AI offers incredible benefits, it also presents serious risks. My own search experience showed just how easy it is to link a name to multiple online profiles, and my colleague's terrifying phone call highlights how scammers are exploiting AI to manipulate emotions and steal money. As AI continues to evolve, so do the dangers of fraud and deception. The best defense is awareness: always think twice.