Artificial intelligence (AI) is an increasingly important technology within the investment management industry.1 AI has been used in a variety of ways, including as the newest strategy in attempts to "beat the market" by outperforming passive index funds benchmarked against the S&P 500, despite the long-standing finding that index funds consistently win that contest.2
Investment advisers who use AI should consider the unique issues the technology raises in light of an adviser's fiduciary duty to its clients. In this client alert, we provide an overview of how AI is being used by investment advisers, the fiduciary duties applicable to investment advisers, and particular issues advisers should consider in designing AI-based programs, to ensure they are acting in the best interests of their clients.3
How Artificial Intelligence Is Being Adopted by Investment Advisers
AI is currently used by investment advisers in a variety of innovative ways. For example, hedge funds have used machine learning models to analyze large volumes of market data and drive trading decisions.4 Some of these systems refine their own strategies as they process new data.5 Robo-advisers6 have also begun incorporating AI into their online, algorithm-based advisory programs; Wealthfront, for instance, has turned to AI to improve the advice provided through its platform.7
Issues Raised by an Investment Adviser's Fiduciary Duties
Under federal law, an investment adviser is a fiduciary to its clients.8 An adviser's fiduciary duty involves a duty of care and a duty of loyalty, which, although not defined specifically in the Investment Advisers Act of 1940 (Advisers Act), have been addressed and developed through U.S. Securities and Exchange Commission (SEC) interpretive releases and guidance, as well as case law.9 As discussed below, these duties have implications for an adviser's use of AI. The specific obligations required by an adviser's fiduciary duty will depend upon what functions the adviser has agreed to assume for the client.10 While the SEC has not provided specific guidance for advisers using AI, existing guidance raises considerations unique to advisers that use the technology.11
Duty of Loyalty
The duty of loyalty requires investment advisers not to place their own interest ahead of their clients' interests.12 An adviser must make full and fair disclosure to its client of all material facts relating to the advisory relationship and employ reasonable care to avoid misleading clients. Information provided to clients must be sufficiently specific so that a client is able to understand the investment adviser's business practices and conflicts of interest.
An adviser's duty of loyalty raises, among others, the following issues with respect to AI-based investment management programs:
What facts does an adviser need to disclose about its use of AI?
Advisers should consider disclosing information such as the following:
How should advisers think about the tension between disclosure obligations and confidentiality regarding proprietary technologies?
It is important for advisers to disclose enough information for investors to make an informed decision about engaging, and then managing the relationship with, the investment adviser. Advisers should be careful not to mislead clients, and information provided to clients should be sufficiently specific for a client to understand the investment adviser's business practices. However, highly technical information about the process behind the AI's decisions might not be beneficial to a client's understanding of the adviser's platform.
Does an adviser need to disclose the historical success rate of returns from using artificial intelligence?
Historically, funds that employ AI have not outperformed the S&P 500.13 Investment advisers might therefore be expected to disclose that AI's ability to predict securities prices has not been conclusively established and that an AI-based strategy may not "beat the market."14
Duty of Care
The duty of care includes, among other things, the duty to provide advice appropriate for the client and the duty to monitor a client's investments, and the ongoing suitability of those investments, over the course of the relationship.15 An adviser must develop a reasonable understanding of the client's objectives and have a reasonable belief that the advice it provides is in the best interest of the client, based on the client's portfolio and objectives.
An adviser's duty of care raises, among others, the following issues with respect to AI-based investment management programs:
Can an adviser replace traditional suitability assessments with alternative data or other AI-based tools?
If an AI-based system makes investment choices on behalf of clients using deep learning or machine learning that develops on its own, by tracking client behavior, or by using alternative data, the adviser should pay particularly close attention to how the recommendations generated from those data might differ from, or conflict with, a client's explicit preferences and investment objectives. An adviser using AI-based tools may reach different conclusions about what is best or appropriate for the client than it would using more traditional tools, such as suitability questionnaires that ask a client about her risk profile, investment objectives, and other characteristics. As a result, an adviser using AI-based systems to manage a client's investment program may find itself at cross-purposes with the client, which would raise issues under the adviser's duty of care.
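To illustrate the kind of cross-check an adviser might build, the sketch below compares a hypothetical AI engine's output against a client's questionnaire answers and flags conflicts for human review. It is a minimal sketch only; the field names, risk scale, and threshold are assumptions made for illustration and are not drawn from SEC guidance or any particular platform.

```python
from dataclasses import dataclass

@dataclass
class ClientProfile:
    # Answers drawn from a traditional suitability questionnaire (hypothetical fields).
    stated_risk_tolerance: float   # 0.0 (very conservative) to 1.0 (very aggressive)
    excluded_asset_classes: set    # e.g., {"leveraged_etf", "crypto"}

@dataclass
class AiRecommendation:
    # Output of a hypothetical AI-based allocation engine.
    estimated_portfolio_risk: float  # the model's own risk estimate on the same 0.0-1.0 scale
    asset_classes: set

def flag_suitability_conflicts(profile, rec, risk_margin=0.10):
    """Return flags where the AI output departs from the client's explicit preferences;
    anything flagged would be routed to a human reviewer rather than executed."""
    flags = []
    if rec.estimated_portfolio_risk > profile.stated_risk_tolerance + risk_margin:
        flags.append("Recommended portfolio risk exceeds the client's stated risk tolerance.")
    disallowed = rec.asset_classes & profile.excluded_asset_classes
    if disallowed:
        flags.append("Recommendation includes asset classes the client excluded: "
                     + ", ".join(sorted(disallowed)))
    return flags
```

The design choice worth noting is that the model's output is checked against the client's stated objectives before it is acted upon, rather than treated as a substitute for them.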
How frequently should an investment adviser evaluate its AI program?
Because AI programs create their own rules based on the data they analyze, and autonomously make trading decisions, advisers should develop internal procedures for ensuring their programs are operating correctly.16 For example, advisers should adequately test their AI before it is integrated into the investment platform and periodically thereafter. In addition, advisers should develop procedures for adjusting their AI programs if they do not produce favorable results. Advisers should also monitor for possible cybersecurity threats.
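As one illustration of the kind of internal procedure described above, the following sketch compares a program's live results against its pre-deployment testing and escalates to a human when they diverge. The function name, inputs, and the two-standard-deviation threshold are hypothetical assumptions, not regulatory standards.

```python
import statistics

def needs_human_review(live_daily_returns, backtest_daily_returns, threshold_stdevs=2.0):
    """Flag the AI program for human review if its live average return drifts well
    outside the range suggested by pre-deployment testing (illustrative only)."""
    live_mean = statistics.mean(live_daily_returns)
    test_mean = statistics.mean(backtest_daily_returns)
    test_stdev = statistics.stdev(backtest_daily_returns)
    # Escalate when live performance sits more than `threshold_stdevs` standard
    # deviations away from the backtested average.
    return abs(live_mean - test_mean) > threshold_stdevs * test_stdev
```

A check like this does not tell the adviser what to change; it simply ensures that a person, rather than the model, decides whether the program continues to run as is.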
How should an adviser review investment decisions directed by AI to ensure the decisions still fit within a client's investment goals?
Advisers using AI should adopt and implement procedures that will periodically review the performance of their AI, to ensure that performance is within expected parameters and that decisions are not being made to the detriment of clients' investment goals. Ultimately, the adviser is responsible for all decisions made by its AI-based program and therefore cannot let an AI-based program simply run without the adviser's active monitoring.17
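Building on the sketches above, an adviser might record each review in an audit log so that it can demonstrate active monitoring of the AI-based program. Again, this is only a sketch under assumed names and fields; it is not a description of any required compliance workflow.

```python
import datetime
import json

def record_review(account_id, flags, reviewer):
    """Create an audit-log entry documenting that a human reviewed the AI program's
    flagged output for a given account (illustrative only)."""
    entry = {
        "account_id": account_id,
        "reviewed_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "reviewer": reviewer,
        "flags": flags,
        "action": "escalated for adjustment" if flags else "no action required",
    }
    return json.dumps(entry)  # in practice, written to a durable compliance log
```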
For more information about the use of artificial intelligence by investment advisers or the regulation of investment advisers generally, please contact any member of the Fintech and Financial Services group at Wilson Sonsini Goodrich & Rosati.
This Wilson Sonsini Alert was prepared by Amy Caiazza, Danielle Sartain, and Robert H. Rosenblum.
[1] For example, on May 26, 2020, HSBC announced the launch of its AI Powered US Equity Index family, the first equity index product powered by artificial intelligence and big data, using data insights from IBM Watson to guide equity trading decisions. “HSBC Launches First Equity Index Products Powered by AI and Big Data,” Business Wire (May 26, 2020), https://www.businesswire.com/news/home/20200526005556/en/HSBC-Launches-Equity-Index-Products-Powered-AI.
[2] In his 1972 book “A Random Walk Down Wall Street,” Burton Malkiel maintains that share prices evolve similarly to a random walk and have no relationship with historic values or other variables. Because future share prices are not based on any pattern, they cannot be predicted, and it is unlikely that any given fund adviser will be able to outperform a passive investment fund. Burton Malkiel, A Random Walk Down Wall Street: The Time-Tested Strategy for Successful Investing. W.W. Norton, 1972. Since its publication in 1972, research has continued to show that beating the market is unlikely. For example, a study released at the end of 2019 indicated that over the past 10 years, 89 percent of large-cap funds, 84 percent of mid-cap funds, and 89 percent of small-cap funds failed to outperform their respective S&P benchmarks on a relative basis. SPIVA U.S. Year-End 2019 Scorecard, S&P Dow Jones Indices (2019), https://us.spindices.com/indexology/core/spiva-us-year-end-2019.
[3] Of course, the federal securities laws raise issues that are important for investment advisers who employ AI to consider even outside their fiduciary duties. For example, advisers using AI may want to rely on Rule 3a-4 under the Investment Company Act of 1940 with respect to their investment programs; the rule provides a safe harbor from registration as an investment company for advisory programs that are reasonably customized to particular investors. If an AI-based program provides similar advice to a variety of clients, however, the adviser may not be able to rely on Rule 3a-4. Thus, advisers who manage client accounts based on AI may need to consider the specific contours of their AI programs in light of the requirements of Rule 3a-4. This alert by no means covers the full range of issues AI models raise for investment advisers.
[4] Adam Satariano and Nishant Kumar, “The Massive Hedge Fund Betting on AI,” Bloomberg (Sept. 27, 2017), https://www.bloomberg.com/news/features/2017-09-27/the-massive-hedge-fund-betting-on-ai.
[5] Adam Satariano and Nishant Kumar, “The Massive Hedge Fund Betting on AI,” Bloomberg (Sept. 27, 2017), https://www.bloomberg.com/news/features/2017-09-27/the-massive-hedge-fund-betting-on-ai.
[6] Robo-advisers are online platforms, and typically registered investment advisers, that provide discretionary asset management services to their clients through online algorithmic-based programs. See Division of Investment Management, Robo Advisers, IM Guidance Update No. 2017-02 (Feb. 2017).
[7] See, e.g., Ryan W. Neal, “Wealthfront Turns to Artificial Intelligence to Improve Robo Advice” (Mar. 31, 2016), https://www.wealthmanagement.com/technology/wealthfront-turns-artificial-intelligence-improve-robo-advice.
[8] SEC v. Capital Gains Research Bureau, Inc., 375 U.S. 180, 194 (1963). An investment adviser’s fiduciary duty is imposed under the Advisers Act in recognition of the relationship of trust between an investment adviser and a client and is made enforceable by the antifraud provisions of Section 206 of the Advisers Act.
[9] See, e.g., Capital Gains, 375 U.S. at 194; SEC v. Tambone, 550 F.3d 106, 146 (1st Cir. 2008); SEC v. Moran, 944 F. Supp. 286, 297 (S.D.N.Y. 1996); Transamerica Mortgage Advisors, Inc. v. Lewis, 444 U.S. 11, 17 (1979); Commission Interpretation Regarding Standard of Conduct for Investment Advisers, Investment Advisers Act Release No. 5248 (July 12, 2019); IM Guidance Update No. 2017-02 (Feb. 2017); Proxy Voting by Investment Advisers, Investment Advisers Act Release No. 2106 (Jan. 31, 2003). See also SEC Commissioner Kara M. Stein, “Surfing the Wave: Technology, Innovation, and Competition,” Remarks at Harvard Law School’s Fidelity Guest Lecture Series (Nov. 9, 2015), available at https://www.sec.gov/news/speech/surfing-wave-technology-innovation-and-competition-remarks-harvard-law-schools-fidelity (former SEC Commissioner Stein discussing the application of fiduciary duties to digital advice provided by robo-advisers).
[10] See, e.g., Investment Advisers Act Release No. 5248 (in interpretive guidance related to investment advisers’ fiduciary duties, explaining that while all investment advisers owe their clients a fiduciary duty, “that fiduciary duty must be viewed in the context of the agreed-upon scope of the relationship between the adviser and the client; the fiduciary duty itself may not be waived, but the exact responsibilities of the adviser in managing the client’s account may be limited”).
[11] Although not directly on point, guidance released by the SEC’s Division of Investment Management with respect to robo-advisers sheds some light on the types of considerations advisers should address, based on their fiduciary duties to clients, when adopting non-traditional methods of providing investment advisory services. IM Guidance Update No. 2017-02 (Feb. 2017). Additionally, former SEC personnel have spoken on the SEC’s own use of AI. See, e.g., SEC Commissioner Kara M. Stein, “From the Data Rush to the Data Wars: A Data Revolution in Financial Markets” (Sept. 27, 2018), available at https://www.sec.gov/news/speech/speech-stein-092718; Scott W. Bauguess, Acting Director and Acting Chief Economist, SEC Division of Economic and Risk Analysis, “The Role of Big Data, Machine Learning, and AI in Assessing Risks: a Regulatory Perspective” (June 21, 2017), available at https://www.sec.gov/news/speech/bauguess-big-data-ai.
[12] Investment Advisers Act Release No. 5248.
[13] Mark Hulbert, “Using AI for Picking Stocks? Not So Fast,” Wall Street Journal (Jan. 5, 2020), https://www.wsj.com/articles/use-ai-for-picking-stocks-not-so-fast-11578279960.
[14] In response to a request for a no-action letter, the staff of the SEC explained to an investment adviser purporting to rely on his abilities as a psychic medium that he would be required to disclose that the predictive value of his methods had not been scientifically established. John Anthony, SEC Staff No-Action Letter (Mar. 19, 1975). Similarly, because the predictive value of AI has not been conclusively established, investment advisers should consider the extent to which the SEC would expect them to disclose the lack of proof that AI-based algorithms perform better than, or even as well as, more traditional tools.
[15] Investment Advisers Act Release No. 5248.
[16] The Financial Industry Regulatory Authority (FINRA) published guidance in 2016 regarding the governance and supervision framework firms should have in place when adopting client-facing digital investment advice tools. While not directly related to AI, the guidance emphasizes the importance of supervising algorithms used in digital-advice tools and periodically assessing “whether [the] algorithm is consistent with the firm’s investment and analytical approaches.” See FINRA Report on Digital Investment Advice (Mar. 2016), available at https://www.finra.org/sites/default/files/digital-investment-advice-report.pdf.
The SEC has previously brought enforcement actions against investment advisers for failing to supervise those in charge of programming and monitoring algorithmic or other automated trading strategies. In one enforcement action, the founder of a hedge fund gave his co-founder complete control over operating and monitoring an algorithm that made trading decisions. Although the founder was made aware that the algorithm was not working as expected, he made no effort to follow up with the co-founder or to inform investors or prospective investors of any issues related to the algorithm. In the Matter of Timothy S. Dembski, Investment Advisers Act Release No. 4671 (Mar. 24, 2017).
[17] For example, robo-adviser Wealthfront was the subject of an SEC enforcement action for falsely stating that it monitored all client accounts to avoid transactions that might result in a wash sale, which would potentially reduce investor returns, when in fact it did not do so. See In the Matter of Wealthfront Advisers, Investment Advisers Act Release No. 5086 (Dec. 21, 2018).