A UK bank has warned that “millions” of people could fall victim to scams that use artificial intelligence to clone their voices. According to Starling Bank, an online-only lender, fraudsters can use AI to accurately mimic a person’s voice from just three seconds of audio, taken, for example, from a video the person has posted online.
Once scammers have identified the victim’s friends and family, they can use the AI-cloned voice to stage a phone call and ask for money.
They have already affected hundreds
Such scams have the potential to “catch millions out,” Starling Bank said in a news release on Wednesday, and they have already affected hundreds of people.
In a survey of more than 3,000 people that the bank conducted with Mortar Research last month, more than a quarter of respondents said they had been the victim of an AI voice-cloning scam in the previous 12 months.
The survey also found that 46% of respondents did not know such scams existed, and that 8% would send whatever money a friend or family member asked for, even if they thought the call seemed strange.
“People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters,” said Lisa Grahame, chief information security officer at Starling Bank.
The bank is encouraging people to agree a “safe phrase”
The bank is advising customers to agree a “safe phrase” with their loved ones that can be used to verify their identity over the phone. The phrase should be short, random, easy to remember, and unrelated to their other passwords.
The lender advises against sharing the safe phrase by text, since that could make it easier for scammers to find out. If it is shared that way, the message should be deleted as soon as the recipient has seen it.
As AI becomes increasingly adept at mimicking human voices, concerns are growing about its potential to harm people, for example by helping criminals access their bank accounts and spread misinformation.
Earlier this year, OpenAI, the company behind the generative AI chatbot ChatGPT, unveiled Voice Engine, a voice-replication tool, but did not make it publicly available at that stage, citing concerns about the “potential for synthetic voice misuse.”