March 5th, 2026

Police warn of AI deepfake audio and videos used in frauds

By MEDICINE HAT NEWS on March 5, 2026.

Local police are using Fraud Prevention Month to warn Hatters of the many ways scammers attempt to defraud victims, including with AI 'deepfake' technology.--NEWS FILE PHOTO

newsdesk@medicinehatnews.com

During the first week of Fraud Prevention Month, the Medicine Hat Police Service has released a public service announcement about the increasing number of sophisticated scams designed to copy or mimic a person using artificial intelligence.

Police say criminals are finding new ways to exploit AI technology for financial gain, including AI voice cloning scams and AI-created “deepfake” videos, which are designed to appear realistic, urgent, convincing and difficult to recognize.

In some cases a fraudster can copy a person’s voice from online videos or social media clips, then use the AI-generated fake voice to call family members, claiming the person is in distress and urgently requires money to be transferred.

“These calls often sound genuine and are designed to create panic, pressuring victims to send money or share personal information,” reads the PSA. “In some cases, scammers add background noise to simulate emergencies or kidnappings, increasing the sense of urgency.”

Criminals are also using AI to make “deepfake” videos of coworkers, executives and public figures. In these videos an employee may be instructed to transfer funds, share internal documents or provide login credentials.

“Because the videos appear authentic, victims may comply before realizing they have been deceived.”

This technology is also frequently used to create fake customer support agents, investment advisers and business profiles, say police. Fraudsters are also able to design realistic online chat windows that can impersonate well-known banks, airlines and technology companies.

Police warn scammers may try to request remote access to your computer or smart device to access personal information, steal money or commit further frauds.

Job seekers are also being directly targeted by AI-generated scams. Police say criminals are able to stage AI-generated interviews that use synthetic voices or video avatars to appear legitimate and ask applicants for personal and banking information.

Individuals using online rental marketplaces are also being targeted, as criminals can use AI tools to generate realistic images and detailed descriptions for fraudulent listings, then pressure victims to send deposits or e-transfers quickly before the listing is gone.

AI technology is also being used in romance scams. Police say fraudsters are now able to manage multiple fake online profiles using AI-powered chatbots to run conversations 24 hours a day.

Police say the best defence against AI fraud is awareness and verification.

“If you receive an unusual or urgent request for money or personal information, pause and verify the request,” reads the PSA. “Contact the person or organization directly using trusted phone numbers or in person visits.”

Police say if something feels too urgent, emotional or perfect, it may be a scam.

Hatters who believe they are being targeted or have questions about a suspicious interaction are encouraged to reach out to the MHPS at 403-529-8481.
