A new machine-learning algorithm developed by business researchers not only accurately predicted the success of individual directors at public companies but also identified which directors were most likely to be unpopular with shareholders, an innovation that could help diversify corporate boards.
In a study published April 20 in The Review of Financial Studies, researchers sought to determine whether algorithms could assist firms in making smarter decisions when nominating corporate directors. Directors predicted by the study's algorithms to perform poorly in their position did indeed perform poorly in practice, compared with a realistic pool of candidates in out-of-sample tests.
Another noteworthy finding: Directors predicted by the algorithm to be "bad" were more likely to be male, accumulate more directorships, and have larger networks than the directors the algorithm would recommend to be nominated in their place. "Bad" candidates identified by the algorithm were also more likely to receive a nomination from companies with weaker corporate governance structures.
"Boards of directors are key pillars of corporate governance," said Léa Stern, co-author of the study and an assistant professor at the University of Washington's Michael G. Foster School of Business. "We've known for a long time that the traditional way by which directors are selected may not be in the best interest of investors and may be shutting out people who do not belong to C-suites' networks. ... Machine-learning prediction tools are particularly well suited to help us make progress on this front."
For their machine-learning approach to selecting the directors of publicly traded companies, Stern and her colleagues constructed a large database of publicly traded U.S. firms and independent directors appointed between 2000 and 2014. They then built several algorithms to predict director performance, analyzing combined data at the director, board and firm level.
Four machine-learning models — XGBoost, ridge regression, lasso and neural network — were trained to select director candidates, and the outcomes of those candidates were compared to directors actually chosen by firms. To estimate theoretical board performance, researchers constructed a realistic pool of potential candidates from directors who joined the board of a smaller neighboring company within a year. The study assumed these individuals would have seriously considered the opportunity to be on the board of a larger nearby company, since directorships at larger companies tend to pay more and be more prestigious. Relative shareholder support received by directors across three annual reelections was then used as a market-based measure of individual directors' performance.
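The multi-model setup described above can be sketched as follows. This is a minimal illustration on synthetic data, not the study's actual pipeline: the feature names and target are hypothetical, and scikit-learn's `GradientBoostingRegressor` stands in for XGBoost.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import Ridge, Lasso
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 2000
# Hypothetical director/board/firm features (e.g. network size,
# number of directorships, board size, firm size, tenure, age).
X = rng.normal(size=(n, 6))
# Synthetic stand-in for relative shareholder support across reelections.
y = 0.4 * X[:, 1] - 0.3 * X[:, 2] + rng.normal(scale=0.5, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# The four model families named in the article (gradient boosting
# substitutes for XGBoost here).
models = {
    "gboost": GradientBoostingRegressor(random_state=0),
    "ridge": Ridge(alpha=1.0),
    "lasso": Lasso(alpha=0.01),
    "neural_net": MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
}

preds = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    # Out-of-sample predicted director performance, to be compared
    # against realized shareholder support.
    preds[name] = model.predict(X_te)
```

In the study, the held-out predictions were compared against the performance of directors the firms actually chose, with the realistic candidate pool supplying the counterfactual alternatives.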
The machine-learning models demonstrably outperformed more traditional econometric approaches to prediction problems, such as ordinary least squares regression, a common estimation method for linear models. The study's OLS model, the researchers wrote, was unable to predict who would perform well compared with alternative candidates and who would not, and it found no relation between predicted and actual director performance.
The study's XGBoost model, however, provided particularly strong estimates. Directors predicted by the XGBoost algorithm to perform poorly did, in fact, exhibit poor performance compared to potential available alternatives. On average, directors predicted by XGBoost to be in the bottom percentile of performance, measured by shareholder support, actually received 3.1% less shareholder support; directors in the top percentile of predicted performance had 1.1% greater observed shareholder support.
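The percentile comparison described above can be illustrated on synthetic data. Here `predicted` and `observed` are made-up stand-ins for predicted and realized shareholder support, so the resulting gaps are illustrative, not the paper's -3.1% and +1.1% figures.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
# Hypothetical predicted performance scores and their noisy realizations.
predicted = rng.normal(size=n)
observed = 0.6 * predicted + rng.normal(scale=0.8, size=n)

# Rank directors into percentiles (0..99) of predicted performance.
ranks = np.argsort(np.argsort(predicted))
pct = ranks * 100 // n

# If predictions are informative, the bottom predicted percentile
# underperforms the average and the top percentile overperforms it.
bottom = observed[pct == 0].mean()
top = observed[pct == 99].mean()
overall = observed.mean()
print(bottom - overall, top - overall)
```

The sign pattern — a negative gap for the bottom predicted percentile and a positive gap for the top — is the qualitative result the article reports for the XGBoost model.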
The results also suggest that machine learning holds promise for understanding the process by which governance structures are chosen and has the potential to help real-world firms improve their governance.
According to Stern, directors who join a company's board out of financial or self-promotional motivations are unlikely to prove helpful to the firm during critical times of crisis. A genuine motivation to advance the company's mission, then, should be among the qualities firms prioritize when selecting new directors.
If industry experience is not in short supply on the current board of directors, for instance, then such experience might not be prioritized in subsequent nominations. Capturing this flexibility was the goal behind developing the algorithm and, according to the researchers, precisely what the XGBoost model delivered.
"Our algorithm was helpful to show that predictably bad directors tend to have larger networks and accumulate board memberships. This matches well with what I believe is one of the most overrated qualities of a director, which is being a big name," Stern said. "Big names have little time. And time and commitment to make a difference by supporting the executive team are underrated."
Stern said showing that an algorithm can lead to "better and potentially more diverse" corporate boards is "an important step forward."
"Boards have not really been held accountable for how they select new members," she said. "It is possible that in the future, institutional investors consider algorithmic input as a basis on which to hold boards accountable for their selection of directors."
Algorithms' recommendations are not devoid of potential biases, the researchers point out, and care should still be taken to ensure that underrepresented minorities are included among potential candidates. Another limitation of the study's machine-learning approach is that algorithms are blind to chemistry between individuals, which the researchers say is a key ingredient for any well-functioning board of directors.
Stern said the ideal board operates as a partnership whose common goal is to support and help the executive team achieve the company's mission.
"This is an important reason why we strongly believe that directors are unlikely to ever be selected solely based on an algorithmic score, and why they probably shouldn't," Stern said. "We believe that algorithmic input can be useful to expand the pool of potential candidates, broaden the board's horizons and potentially be a tool to hold boards accountable for their choices of directors."
She added that boards "must obviously exercise their judgment and bring in the best people they can for the job," even if they were not recommended by an algorithm.
"But if a board systematically brings in people with low scores who end up not doing well, this is potentially raising a red flag for investors," Stern said. "In that sense, the algorithmic decision aid, as black box as it is, makes the human decision-making process more transparent and accountable."
The study, "Selecting directors using machine learning," published April 20 in The Review of Financial Studies, was authored by Isil Erel and Michael S. Weisbach, Ohio State University, National Bureau of Economic Research, and European Corporate Governance Institute; Léa Stern, University of Washington; and Chenhao Tan, University of Chicago.