The New Way Scammers Are "Poisoning" Your Search Results
- Tasha Hoskins

- Mar 2
- 3 min read
Even those of us who grew up with a smartphone in our hands can fall into a trap when we’re in a rush or stressed out.
Take my nephew, for example. He’s a Gen Z University of Georgia grad with a degree in Mechanical Engineering. He isn't some deep-level tech expert, but he’s definitely comfortable with it. He lives and breathes technology, so he’s used to it just... working. But a few weeks ago, he found out the hard way that you can't always trust what shows up on your screen.
The Scam Explained
He needed to book a flight, so he did what any of us would do: he Googled "American Airlines customer service number." Instead of going directly to the official site, he grabbed a number from the AI Overview at the top of his Google search results.

It looked official, the person on the other end was professional, and they walked him through the whole booking process just like a real agent would. But then, the friction started:
The "Failed" Payment Loop - He gave his credit card info, but the rep claimed it was "declined." They had him try a second card, and when that "failed" too, they asked: "Do you have a debit card we can try instead?" By then he was frustrated and just wanted to be done with the call, so he handed over his debit info, which is much harder to dispute than a credit card charge.
The "Booking Queue" Trick - This is where they get clever. They told him he was in a "booking queue" and it would take 24 hours to confirm. This delay isn't a glitch; it’s a tactic to bury the transaction. While he was waiting, they used his debit card to pay themselves a $350 "service fee" through their own portal.
The Double Charge - The next day, when he still hadn't seen his tickets, he called back. They proceeded to book the flight through American Airlines, but they charged the actual ticket cost to one of the other credit cards he’d given them earlier.
In the end, he had a seat on a plane, but he’d been charged twice: once for the real flight, and once for a "service" he never asked for.
The Modern Scam: Search Engine and AI Poisoning
This is a newer tactic known as "AI Search Poisoning" or "Search Engine Poisoning," and scammers are leveraging AI Chatbots to reach more potential victims. Before we try to understand the "hack," let's dive into how these AI Chatbots work.
AI Chatbots generate summarized responses by reading millions of pages across the internet and ranking sources, a practice content publishers target through Answer Engine Optimization (AEO). Scammers know this, so they have started littering the internet with fake data for the AI Chatbots to find.
How They Poison our Search Results
Seeding the Trap: Scammers post fake phone numbers on random forums, social media comments, and even fake "help" documents (like PDFs) uploaded to otherwise trustworthy-looking sites.
Creating a "Chorus": If a scammer posts the same fake number in 50 different places, the AI starts to think, "Wow, I see this number everywhere associated with American Airlines. It must be the right one!"
The Confidence Gap: When you perform a traditional Google search, you are presented with a list of websites. You intuitively look at the company logo, the URL (www.aa.com > customer-service > contact-american) and click on a link that meets your internal risk score. But when an AI Chatbot gives you an AI Overview, or summarized response, it picks what it thinks is the best answer based on its ranking signals, stripping away the logo, URL, and other cues you would normally use to judge it.
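To see why the "chorus" works, here is a toy sketch (with made-up snippets and a fictional scam number) of a naive, frequency-based summarizer. Real ranking algorithms are far more sophisticated, but the failure mode it illustrates is the same: repetition across many pages looks like consensus.

```python
from collections import Counter

# Simulated snippets "scraped" from across the web (hypothetical data).
scraped_pages = (
    # One page quoting the airline's real reservations number.
    ["Call American Airlines at 800-433-7300 for reservations."]
    # Fifty planted forum posts and fake "help" PDFs repeating a scam number.
    + ["American Airlines customer service: 888-555-0199"] * 50
)

def extract_number(snippet: str) -> str:
    """Pull the first phone-number-looking token out of a snippet (toy parser)."""
    for token in snippet.split():
        candidate = token.strip(".,:;")
        if candidate.replace("-", "").isdigit() and len(candidate) >= 10:
            return candidate
    return ""

# The "chorus" effect: the summarizer trusts whatever appears most often.
votes = Counter(extract_number(page) for page in scraped_pages)
top_number, count = votes.most_common(1)[0]
print(top_number, count)  # the planted number wins, 50 votes to 1
```

Fifty cheap fake posts outvote one authoritative page, which is exactly why seeding the same number everywhere pays off for scammers.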
Why It Fools the Best of Us
You didn't click a sketchy link in a spam email. You asked a trusted tool a simple question, and it gave you a direct answer. When my nephew saw that number at the very top of his screen, his brain didn't see a "result"; it saw the answer.
How to Protect Yourself
You don't need to be a tech genius; you just need to be a little skeptical:
Go to the source: Skip the AI summary and navigate directly to the company's official website (like aa.com) for contact info.
Cross-check the number: If an AI Overview gives you a phone number, verify it against the company's own "Contact Us" page before you dial.
Prefer credit over debit: Credit card charges are far easier to dispute than debit transactions.
Watch for friction tactics: "Declined" payment loops, requests for a second or third card, and vague delays like a 24-hour "booking queue" are classic stalling moves.
I love how easy tech makes things, but it’s definitely not perfect. I’ve realized that taking an extra thirty seconds to double-check a source is a small price to pay to avoid a massive headache.