From officials’ mouths come words that predict bankruptcy, interest spreads and more.
Using an analysis of thousands of words spoken by corporate executives, Washington University in St. Louis’ Jared Jennings and three other researchers have created a new way to help lenders make better loan decisions.
Their study uses qualitative information to assess a business’ credit risk. “It’s all based on language,” said Jennings, associate professor of accounting at Olin Business School. “Our measure captures unique attributes of credit risk that are not readily identified by existing measures.”
As it turns out, the words company officials use in quarterly earnings calls with investors and analysts can be, well, telling.
“Our results suggest that our measure improves the ability to predict future bankruptcies, future interest spreads and future credit rating downgrades,” Jennings said.
Evidence also suggests their measure more consistently captures a borrower’s credit risk than other methods.
They call their measure the “text-based credit score,” or “TCR Score.” The TCR Score could be particularly useful when other market-based measures of a firm’s credit risk aren’t available, Jennings said. “Our analyses suggest that only about 22% of firms with long-term debt are assigned credit ratings by leading rating agencies.”
Their working paper, “Measuring Credit Risk Using Qualitative Disclosure,” is under revision for the Review of Accounting Studies.
‘A tighter link’
Traditional credit risk measures mostly use numerical, or quantitative, data.
Jennings and his co-authors set out to measure the spoken word. They used three machine-learning methods to create a measure of credit risk based on information disclosed in 132,060 conference call transcripts from 2003 to 2016.
Jennings and co-authors John Donovan of the University of Notre Dame; Kevin Koharki of Purdue University; and Joshua Lee of the University of Georgia grouped hundreds of the top words, phrases and topics their machine-learning methods identified into categories.
One method identified language associated with liquidity, debt and performance. The other two identified phrases associated with performance, industry and accounting.
“By connecting the language identified by the machine-learning methods to economic intuition, we are able to draw a tighter link between the construct of credit risk and our proxy,” the researchers wrote.
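To make the intuition concrete, here is a toy sketch of category-based transcript scoring. The word lists below are invented for illustration; the study derived its hundreds of words and phrases empirically with machine learning rather than fixing them by hand, and its actual model is more sophisticated than a raw frequency count.

```python
# Illustrative sketch only (not the authors' code): score a transcript by
# the relative frequency of credit-risk-related language, grouped into the
# kinds of categories the study describes (liquidity, debt, performance).
from collections import Counter
import re

# Hypothetical category word lists, chosen here for illustration.
CATEGORIES = {
    "liquidity": {"liquidity", "cash", "covenant"},
    "debt": {"debt", "leverage", "refinance"},
    "performance": {"loss", "decline", "impairment"},
}

def text_score(transcript: str) -> dict:
    """Return each category's share of all words in one transcript."""
    tokens = re.findall(r"[a-z]+", transcript.lower())
    counts = Counter(tokens)
    total = max(len(tokens), 1)  # avoid division by zero on empty input
    return {cat: sum(counts[w] for w in words) / total
            for cat, words in CATEGORIES.items()}

example = ("We drew on our revolver to preserve liquidity and are in "
           "talks to refinance debt ahead of a possible covenant breach.")
print(text_score(example))
```

A transcript dense with liquidity- and debt-related language would score high on those categories, giving a lender a qualitative signal alongside the usual financial ratios.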
The study adds to the growing body of research using machine-learning methods to gather information from conference calls and 10-Ks to explain accruals, future cash flows, fraud and other outcomes.
It also adds to research that examines other useful signals extracted from conference calls, such as vocal and video cues, and tone. (See “When Upbeat Language Belies Downbeat Results,” about research by Xiumin Martin, professor of accounting at Olin, and Guofu Zhou, professor of finance.)
“We expect that practitioners and academics could use our measure to supplement existing credit risk models to obtain a more comprehensive and independent estimate of credit risk,” Jennings and co-researchers wrote.