Safeguarding Investments In The Age Of Artificial Intelligence
Regulators Issue Fraud Alert on Artificial Intelligence Scams
By Bill Reilly
In January 2024, the SEC, FINRA and NASAA issued an investor alert in response to an increasing number of investment frauds involving the use of artificial intelligence (AI) and other emerging technologies.
The regulators noted four particular areas of concern for keeping investors, especially senior investors, safe from these frauds. The intent of the fraudsters is the same as it has been for years: entice investors with a story promising high returns and little or no risk. The game remains the same, but these scams now use the latest technology to offer fraudulent investments.
Unregistered/Unlicensed Investment Platforms Exploiting Artificial Intelligence
The alert reminds investors that federal and state securities laws generally require securities firms, professionals, exchanges and other investment platforms to be registered. Lack of registration should prompt investigation before investing any funds. Evidence suggests unlicensed online platforms, firms and individuals are capitalizing on the popularity of AI by promoting AI systems that make unrealistic claims.
Navigating the Hype: Investing in AI-Related Companies Wisely
Artificial intelligence companies are the new craze of the 2020s, each claiming to lead the way in emerging technologies. However, the alert indicates that bad actors often use the hype around modern technologies to lure investors into investment scams. These scams may include false claims about a public company's AI-related products and services, and may be part of a "pump and dump" scheme in which promoters profit at the expense of investors.
AI-Enabled Scams Utilizing “Deepfake” Technology
Scammers can use AI technology to copy voices, alter images, and create fake videos to spread false or misleading information. Some fraudsters are using AI-generated "deepfake" audio to create fake messages that sound like a grandchild in trouble, then targeting senior investors with claims that the grandchild urgently needs money.
In other instances, they may use deepfake videos to imitate a company's CEO announcing fake news in an attempt to manipulate the price of a stock. They might also impersonate SEC staff or other government officials to commit fraud.
Evaluating AI-Generated Information in Investment Decisions
Investors should be cautious about using AI-generated information when making investment decisions, as it may rely on inaccurate, incomplete or misleading data. The alert notes that this type of information can be used by both registered and unregistered entities and individuals.
Communication, Compliance Opportunities
Although this fraud alert was issued to notify investors, industry firms can use it to provide additional client communication and to educate their compliance, supervision and sales staff, so they can be diligent in their dealings with clients.
Scams come and go, always targeting the vulnerable, but their delivery methods change to match advancing technology. The adage remains the same: "If it sounds too good to be true, it probably is."
Oyster’s Compliance experts provide practical advice when it comes to your firm’s communications policies, supervisory review procedures and record keeping requirements. Get the advice you need from experts who have the FINRA, SEC and state regulatory experience to help solve complex regulatory challenges.