Practice Update October 2025
AI scams in Australia: how to spot them and stay safe
AI now makes it cheap and easy to fake a person’s face and voice. Scammers are using these “deepfakes” in calls, direct messages, emails and ads to push investment schemes, steal logins, or socially engineer payments.
Why this matters
- Australians reported $2.03 billion in scam losses in 2024, down from $2.74 billion in 2023.
- On social media specifically, $43.4 million in losses was reported in January–August 2024 alone, and thousands of “celebrity-bait” deepfake pages and ads, including those targeting Australian banks, have been removed under Meta’s FIRE program.
- Globally, deepfakes accounted for roughly 40% of biometric fraud in 2024, highlighting just how convincing synthetic voice and video have become.
Real-world cases
- Celebrity deepfake investment ads. Australia’s consumer watchdog has repeatedly warned about fake news pages and deepfake videos of public figures pushing trading platforms.
- Deepfake video meetings. In a widely reported case, a finance employee in Hong Kong was tricked by a deepfaked CFO and colleagues on a video call into paying approximately US$25 million, illustrating the convincing and coordinated nature of these attacks.
- Bank and exchange impersonation alerts. The AFP and the National Anti-Scam Centre (NASC) have issued warnings about bank impersonation scams and crypto-exchange impostors targeting Australians via text messages, emails, and phone calls.
What deepfakes look and sound like
- Voice cloning: just a few seconds of audio can produce a near-perfect clone of someone’s voice. Expect pressure, urgency and requests to move money quickly.
- Video fakes: slick interviews, Zoom calls, or ads where lip-sync is almost perfect, backgrounds look subtly odd, or lighting on a face doesn’t match the room.
- Image fakes: profile pics or proof screenshots with mismatched jewellery, blurred ears/hairlines, or warped text.
Tips
- Avoid the urgency: Scammers create a sense of panic. Hang up or leave the chat. Call back using a number you trust, such as the number on the back of your bank card or one from the official website.
- Verify out of band: If a boss or family member asks for money, call a known number or set a pre-agreed safe word for video calls.
- Challenge the media: Ask the caller to perform a simple action live, such as turning left or showing today’s date on paper. Watch for unusual lighting, frozen teeth/tongue, out-of-sync blinks, or jerky shadows.
- Never click payment links in texts: Go directly to your bank app; don’t follow links or numbers supplied in the message.
- Treat celebrity money ads as scams: major platforms in Australia are moving to stricter verification of financial advertisers, including checks on ASIC licensing; if you don’t see clear provenance, assume it’s fake.
Practical prevention
- Use passkeys or app-based 2FA (prefer authenticator apps over SMS where possible).
- Use a password manager and create unique passwords for every account.
- Use PayID and check that the account name shown matches the person or business you intend to pay; consider transfer limits and “cooling-off” delays for new payees (a minimal sketch follows this list).
- Keep devices up to date; use built-in password and website warnings.
- Hide your voice and video samples from public profiles where practical; lock down who can direct message or tag you.
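For readers who build or configure payment workflows, the transfer-limit and cooling-off idea above reduces to a small rule check. The Python sketch below is a minimal illustration only; the payee class, limit values, and function names are assumptions made for this example, not any bank’s actual policy or API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Illustrative policy values only; real limits vary by bank and account.
COOLING_OFF = timedelta(hours=24)   # delay before a new payee can receive large amounts
NEW_PAYEE_LIMIT = 1_000.00          # cap on transfers while the payee is still "new"

@dataclass
class Payee:
    name: str
    added_at: datetime

def check_transfer(payee: Payee, amount: float, now: Optional[datetime] = None) -> str:
    """Return 'allow', 'capped', or 'hold' for a proposed transfer."""
    now = now or datetime.now()
    if now - payee.added_at >= COOLING_OFF:
        return "allow"    # established payee: normal rules apply
    if amount <= NEW_PAYEE_LIMIT:
        return "capped"   # new payee, small amount: allowed within the limit
    return "hold"         # new payee plus a large amount: pause and verify first

# A "safe holding" account added minutes ago and asked to receive $25,000 is held.
if __name__ == "__main__":
    scam_payee = Payee("Safe Holding Account", added_at=datetime.now() - timedelta(minutes=10))
    print(check_transfer(scam_payee, 25_000.00))   # -> hold
```

The point of the delay is simply to buy time: by the time the cooling-off window has passed, the panic a scammer relies on has usually worn off.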
Scenarios based on actual scam techniques
Scenario 1: The Urgent Bank Security Call
Characters:
- John, a 52-year-old teacher in Sydney
- Scammer posing as “Mary from his bank” using a cloned voice
What happened
John received a call late at night. The caller sounded exactly like a security officer from his bank (the voice had been cloned from a short sample of publicly available audio) and told him that his account was under cyberattack and that he needed to transfer $25,000 into a safe-holding account urgently.
The caller used urgent, fear-inducing language: “We can see criminals draining your account right now.” John, panicked, made the transfer through the link they texted him.
Implications
- John lost $25,000, unrecoverable because he authorised the transaction.
- He spent weeks dealing with ID theft risks after sharing personal details with the caller.
- Emotional stress: loss of sleep, anxiety about financial security.
What could have stopped it
- Stop & breathe: Urgent requests = red flag.
- Verify out-of-band: Call back using the number on the back of the bank card, not the one given in the text.
- Channel check: Banks never ask you to move money to a “safe account” via text links or over the phone.
Scenario 2: The Celebrity Investment Video
Characters:
- Priya, a small business owner in Melbourne
- Scammer running a fake crypto investment ad using a deepfake video of a famous Australian TV presenter
What happened
Priya saw a slick Facebook ad featuring a well-known TV presenter explaining how she “doubled her money” with a new crypto platform. The lip movements and voice were nearly perfect; in reality, the whole video was a deepfake.
She clicked through, spoke with support staff, and invested $10,000 via bank transfer, expecting guaranteed returns. The platform vanished after two weeks.
Implications
- Total financial loss.
- Ongoing spam calls targeting Priya for more investments — she was added to a victim list sold on the dark web.
- Little practical legal recourse: the ad originated offshore, raising complex jurisdictional issues.
What could have stopped it
- Question the pitch: no genuine investment opportunity relies on urgency or secrecy.
- Treat celebrity money ads as scams: ASIC warns that promises of guaranteed returns are a hallmark of investment fraud.
- Report immediately to Scamwatch and eSafety for ad takedown.
Scenario 3: The Deepfake “Boss” on Video Call
Characters:
- Li Wei, accounts officer in a Brisbane construction firm
- Scammers impersonating her CEO and two other managers in a deepfaked Zoom call
What happened
Li Wei joined a Zoom call where she saw her CEO and two colleagues asking her to urgently pay $250,000 to a new overseas supplier. The faces blinked, nodded, and spoke naturally — but it was a fully AI-generated video based on real LinkedIn photos and YouTube speeches.
Trusting the “CEO,” she processed the payment.
Implications
- $250,000 company loss; internal investigation triggered.
- Regulatory reporting obligations under anti-fraud and corporate governance rules.
- Staff morale issues; fear of disciplinary action despite being a victim herself.
What could have stopped it
- Challenge the media: Request a live “safe word” or a unique gesture during video calls.
- Maker-checker control: Payments should require a second verification via a different channel (e.g., a call or SMS to the CEO on a known number); a minimal sketch follows this list.
- Incident response drill: Staff need training for deepfake risks in payment authorisation.
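For firms that automate payment approvals, the maker-checker rule can be written down as a simple release check. The Python sketch below is illustrative only; the class, threshold, and field names are assumptions made for this example, not a reference implementation of any payment system.

```python
from dataclasses import dataclass
from typing import Optional

HIGH_VALUE_THRESHOLD = 10_000.00   # illustrative figure, not a real policy

@dataclass
class PaymentRequest:
    amount: float
    payee: str
    requested_by: str                  # the "maker" who keyed the payment
    request_channel: str               # how the instruction arrived, e.g. "zoom"
    approved_by: Optional[str] = None  # the "checker": a second, independent person
    approval_channel: Optional[str] = None

def can_release(p: PaymentRequest) -> bool:
    """High-value payments need a second approver who is a different person,
    confirmed on a different channel from the original instruction
    (e.g. a call back to a known phone number, never the same video call)."""
    if p.amount < HIGH_VALUE_THRESHOLD:
        return True
    return (
        p.approved_by is not None
        and p.approved_by != p.requested_by
        and p.approval_channel is not None
        and p.approval_channel != p.request_channel
    )

# Scenario 3's payment fails the check: no independent, out-of-band approval exists.
if __name__ == "__main__":
    rushed = PaymentRequest(
        amount=250_000.00,
        payee="New Overseas Supplier",
        requested_by="accounts.officer",
        request_channel="zoom",
    )
    print(can_release(rushed))   # -> False: hold until verified out of band
```

Note that the second channel matters as much as the second person: an approval given inside the same compromised video call or email thread adds nothing.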



