Threat actors are using generative AI to fuel identity attacks and fraud, according to a new report released last week by Transmit Security.
The report, The GenAI-Powered Threat Landscape: A Dark Web Research Report by Transmit Security, is the result of an ongoing investigation by a team of fraud analysts in the Transmit Security Research Lab. It reveals that the capabilities of blackhat generative AI platforms are helping fraudsters create new fraud campaigns with unprecedented sophistication, speed and scale.
Australia and New Zealand have seen a significant increase in sophisticated scams and fraud cases. The Australian Payment Fraud Report showed a 35.6% increase in payment card fraud in the 12 months to June 2023, costing AUD 677.5 million. Additionally, the New Zealand Banking Ombudsman highlighted an increase in sophisticated unauthorised payment scams, costing New Zealanders more than NZD 200 million per year. Fraudsters are using new tools to create more realistic attacks that would previously have been very difficult to execute due to the effort required to get the language, look and feel right. The report covers:
Proliferation of GenAI tools
- Ease of access and use: Blackhat generative AI tools like FraudGPT and WormGPT are easily accessible on the dark web and require minimal skill to use. This lowers the barrier for novice fraudsters to launch sophisticated attacks.
- Advanced fraud capabilities: These tools automate the creation of malicious code, data collection, and the execution of highly deceptive fraud campaigns, increasing the volume, speed and variety of attacks.
Improved fraud techniques
- Automated penetration testing: Generative AI tools can identify business vulnerabilities quickly and efficiently, allowing fraudsters to exploit security holes.
- Synthetic identity creation: Fraudsters are using generative AI to generate high-quality synthetic identity data and fake identities that bypass security controls, including AI-driven identity verification (a brief illustration of how easily such records can be fabricated follows this list).
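To give a sense of how low the barrier to fabricating identity data already is, the benign sketch below uses the open-source Faker library (ordinary test-data tooling, not a blackhat GenAI platform) to produce plausible synthetic identity records; the field choices and locale are illustrative assumptions on our part, and generative AI only raises the realism further.

```python
# Benign illustration: the open-source Faker library generates plausible
# synthetic identity records in a few lines. Field choices and locale are
# assumptions for demonstration only; all output is fictional.
from faker import Faker

fake = Faker("en_AU")  # locale-aware output, e.g. Australian-style addresses


def synthetic_identity() -> dict:
    """Return one fabricated identity record (purely fictional data)."""
    return {
        "name": fake.name(),
        "date_of_birth": fake.date_of_birth(minimum_age=18, maximum_age=75).isoformat(),
        "address": fake.address().replace("\n", ", "),
        "email": fake.free_email(),
        "phone": fake.phone_number(),
    }


if __name__ == "__main__":
    for _ in range(3):
        print(synthetic_identity())
```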
Dark web markets
- Robust ecosystem: These marketplaces offer services such as remote desktop protocol (RDP) access and credit card verifiers, along with high seller ratings and escrow services to ensure product effectiveness. This ecosystem supports a wide range of fraudulent activities.
Deepfakes and voice cloning
“Fraudsters work much better together as a community, collaborating and sharing information on generative AI tools and techniques,” said David Mahdi, chief identity officer at Transmit Security. “This collaborative approach among fraudsters requires IT leaders to arm themselves with information and leverage advanced technologies to stay ahead.”
Transmit Security says that to strengthen security, organizations must implement converged fraud prevention, identity verification and customer identity management services powered by generative AI, artificial intelligence and machine learning. A unified, intelligent defense is essential to eliminate data silos, close security gaps, and detect and stop today’s advanced fraud with accuracy and speed.
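As a rough sense of what machine-learning-driven detection can look like in practice, the sketch below trains an off-the-shelf anomaly detector (scikit-learn's IsolationForest) on hypothetical event features and flags outliers for review. The feature set, data and thresholds are illustrative assumptions, not a description of Transmit Security's products.

```python
# Minimal sketch: scoring account/transaction events with an anomaly detector.
# Feature names, synthetic data and thresholds are illustrative assumptions,
# not Transmit Security's implementation.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)

# Hypothetical per-event features: amount (USD), account age (days),
# failed logins in the last hour, and geo-velocity (km/h between logins).
legit = np.column_stack([
    rng.normal(60, 25, 2000),       # typical purchase amounts
    rng.integers(200, 3000, 2000),  # established accounts
    rng.poisson(0.2, 2000),         # rare failed logins
    rng.normal(5, 3, 2000),         # low geo-velocity
])

model = IsolationForest(contamination=0.01, random_state=42).fit(legit)

# Two new events: one ordinary, one resembling an automated account takeover.
events = np.array([
    [55.0, 900, 0, 4.0],     # ordinary purchase from an established account
    [4800.0, 3, 12, 950.0],  # new account, many failures, impossible travel
])
scores = model.decision_function(events)  # lower = more anomalous
for event, score in zip(events, scores):
    print(f"score={score:+.3f}  decision={'review' if score < 0 else 'allow'}")
```

In a production setting this kind of model would be only one signal among many; the report's point is that such signals need to be combined across fraud, identity verification and customer identity data rather than kept in silos.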
You can read the full report here.