
Artificial Intelligence (AI) scams will continue to increase as the technology becomes more sophisticated.
Artificial intelligence has simplified everyday tasks for many Americans. Although the technology offers real benefits, it can also be misused.
Scammers are using AI to make their schemes harder to detect.
These criminals often target older adults by using AI to impersonate legitimate companies, mimic the voice of a relative, or create false emergencies to trigger panic.
Awareness, clear communication, and preparation are key to preventing loved ones from becoming victims.
Families should implement thoughtful strategies to support and protect themselves and their older members.

AI Scams Are Growing More Sophisticated
AI tools can analyze a short audio clip of someone's voice and then synthesize that voice speaking an entirely new script.
This technology makes “grandparent scams” harder to detect and easier to believe.
Criminals perpetrating this scam often provide “evidence” that a loved one is in trouble.
They then request money and pressure the targeted family member to act in haste.
Because the call sounds like a loved one's voice, these AI-assisted scams can be difficult to recognize as fraud.
AI has even made phishing via email and messaging more difficult to detect.
Messages can now be tailored to the target's online habits and activity.
The grammar and spelling mistakes that once gave phishing messages away are now far less common.
Any request to download a file, click a link, or provide sensitive information should be approached with caution and investigated.
Because people think less logically under high emotional stress or threat, criminals deliberately try to trigger fear and panic. Scammers may claim to represent tech support, government agencies, or banks, building pressure and urgency to force a quick decision.
Older adults and all other family members should be reminded that legitimate organizations rarely demand immediate action.
Remembering this simple fact can minimize vulnerability to AI scams.
Artificial intelligence chatbots and “virtual companions” are often used by scammers to create emotional rapport with older adults.
Although these online conversations may appear harmless at first, they often escalate into requests for personal information, account access, or money.
These chatbots can appear helpful and friendly, reducing the target's reluctance to respond to their requests.
Some AI companion services also charge small recurring fees or microtransactions for use. Although each charge may seem insignificant, the costs can quickly add up, and the constant contact can foster emotional dependency in lonely individuals.
What are the signs to look for to identify a companion scam?
If a new online “friend” claims to care for you or a loved one but asks for gift cards or money, then it is likely a companion scam.
Other signs include frequent messages insisting on urgency and confidentiality regarding a personal crisis, and requests to communicate outside public forums.
Caregivers and family members should talk with seniors about these new online risks and promote an open conversation about online relationships.
Awareness and Verification Systems
Families should set up a code word or password for urgent money requests.
Loved ones can encourage seniors to pause and verify any unexpected message or call by contacting a trusted advisor or a known family member directly.
Strengthening Digital Habits
It is beneficial to enable two-factor authentication and use a password manager for all accounts.
Taking time to review all privacy settings, limit the posting of personal details, and enable call blockers and spam filters will reduce your risk of exposure.
By reminding family members that reputable companies never demand immediate payment in cryptocurrency or gift cards, you can help prevent them from falling victim to pressure tactics.
As AI scams grow, families must work together to protect all members, including older adults.
By practicing open communication, providing nonjudgmental support, and using practical tools, you can help aging loved ones stay digitally connected while minimizing their risk of falling victim to an AI scam.
Regularly updating and reviewing your plans and communication strategies will support both security and personal independence.
Artificial intelligence has made traditional scams harder to identify by using realistic phishing messages and sophisticated voice-mimicking techniques.
Scammers are using virtual companion scams to build trust with older adults and then exploit them.
Families can help protect all members by setting up verification steps and implementing digital safety habits.
When it comes to protecting yourself and loved ones from online criminals, a little common sense goes a long way.
This post is for informational purposes only and does not provide legal advice. You should consult an attorney for advice on any specific issue or problem. Nothing herein creates an attorney-client relationship between Harvest Law KC and the reader.
Reference: National Council on Aging (Oct. 31, 2024) "What Are AI Scams? A Guide for Older Adults"
REMEMBER: “The choice of a lawyer is an important decision and should not be based solely upon advertisements.”
This statement is required by rule of the Supreme Court of Missouri.
