ChatGPT proxy scam warning as fraudsters steal personal details

Sep 09, 2023 at 8:32 AM
Scammers are capitalising on the popularity of ChatGPT to dupe people into handing over personal details.

Ray Canzanese, director of Netskope Threat Labs, explained how criminals create proxy imitations of the ChatGPT webpage, allowing them to see all the information a person inputs into the AI-powered service.

He said: “The proxy, which now acts as a ‘man-in-the-middle’ between the victim and ChatGPT, allows the scammer to see everything the victims send to ChatGPT – and all of the responses returned by ChatGPT.

“Scammers do this to collect information about their victims that can be used to target them with additional scams.

“Suppose I used ChatGPT to edit an email to my financial planner and to research a medical condition I was just diagnosed with.

“The scammer now knows my financial situation and can target me with scams that prey on my medical condition.”

Criminals may use any potentially embarrassing personal details to try to blackmail a person into sending them money.

The fraudsters may also harvest any passwords or keys sent to ChatGPT through the proxy websites under their control.

Mr Canzanese explained what people can do to avoid being taken in by the fake websites.

He mentioned: “The approach is very similar to recognising phishing pages. ChatGPT is a product of OpenAI and the URL is chat.openai.com.

“If the page you visited has a different URL, it may be a scam. To avoid such scams, always browse directly to websites.

“Do not click on links that you receive via text message, email, or social media. Do not click on links in ads.”
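The check Mr Canzanese describes – confirming that a page’s address exactly matches the official ChatGPT URL – can be sketched in a few lines. This is an illustrative example only, not code from the article; the function name and the lookalike URL are hypothetical.

```python
# Minimal sketch of the URL check described above: trust a page only
# if its hostname exactly matches the official ChatGPT domain.
from urllib.parse import urlparse

LEGITIMATE_HOST = "chat.openai.com"  # the official URL cited in the article

def looks_legitimate(url: str) -> bool:
    """Return True only if the URL's hostname is exactly the official one."""
    host = urlparse(url).hostname or ""
    return host.lower() == LEGITIMATE_HOST

print(looks_legitimate("https://chat.openai.com/chat"))          # True
print(looks_legitimate("https://chat.openai.com.evil.example"))  # False
```

Note that the exact-match comparison matters: a proxy page can embed the real domain as a subdomain (as in the second example above), which fools a casual glance but not a strict hostname check.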

There is also the danger that AI tools such as ChatGPT could be used by scammers to proliferate their ploys and make them more convincing.

Chris Vaughan, vice president of Technical Account Management at cybersecurity group Tanium, said ChatGPT can be used to write “convincing scripts” for scam phone calls.

He also said AI can be used for voice cloning to mimic a person’s voice on a phone call.

He defined: “AI technology can mimic your voice, and replicate mannerisms with only 15 minutes of audio recording, making it indistinguishable from the real person.

“Imagine hearing a distressed relative’s voice through the phone, asking you to transfer them money.

“Adding a personal touch to scams has the potential to significantly increase the number of victims.”

Fortunately, there are things people can do to avoid being taken in by scam calls pretending to be from an organisation or from a loved one in need.

Mr Vaughan said: “Listen for unnatural pauses or a distorted voice quality. These can be signs of pre-recorded messages or voice synthesis software.

“If you hear these red flags, don’t be afraid to hang up – don’t stick around if it’s suspicious.”

Fraudsters may also use AI to create fake social media profiles that can look very convincing.

For the latest personal finance news, follow us on Twitter at @ExpressMoney_.