When is a loss not a loss?

Ten years ago, when the financial and banking crisis first erupted, accountants were suddenly in the spotlight. One particular method, recognised as good practice by both the International Accounting Standards Board (IASB) and the United States' Financial Accounting Standards Board (FASB), appeared to have contributed to the escalation of problems. Why weren’t the high levels of risk-taking being better reflected and made more explicit in reported accounts?

The method for dealing with credit losses on loans and other financial assets was based on the ‘incurred-loss’ model. This meant a credit loss could only be recognised if there was evidence based on past events that a loss had actually been incurred: losses couldn’t be recognised on the basis of expectations of future events. There was a reasonable justification for this approach. It restricted banks' ability to build up excessive loss allowances that could then be used to facilitate an undesirable ‘smoothing’ of their reported profits. But in the wake of the crisis, it was claimed to have delayed banks' recognition of what had been predictable credit losses, causing a sudden and nightmarish clustering in recognised losses when the crisis hit.

The FASB and the IASB moved quickly to propose changes that would improve the timeliness of how credit losses are reported. They wanted information about likely losses to be reflected more quickly in financial statements. Within this, they wanted to deal with the fact that, when banks make loans, they typically expect that there will be some as-yet-unidentified defaults on the loans. These initially-expected losses are compensated for within the interest charged to borrowers but, under a restrictive incurred-loss method, they are only recognised when loss events occur, which is typically after much of the associated interest income has been booked.

Failure Of Convergence

The boards developed methods based on the principle of ‘expected loss’. They initially issued separate proposals, and then attempted, unsuccessfully, to put together a joint solution. Their failure to achieve convergence led to the two boards each developing their own different accounting standards.

The IASB initially proposed a theoretically-ideal method which spread the recognition of initially-expected credit losses over time alongside interest income, but this was seen as too difficult to implement. The FASB initially proposed a method that recognised at day one all losses that were expected to occur over the full contractual life of in-scope assets (the recognition of all initially-expected losses immediately after a loan is made, for example).
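As a stylised sketch of the spreading idea (the figures here are hypothetical, not the boards' actual mechanics): interest income is booked at a credit-adjusted rate rather than the contractual rate, so initially-expected losses are absorbed gradually alongside the interest that prices them in.

```python
# Hypothetical one-year loan of 100 at a 10% contractual rate, where
# expected defaults mean the lender actually expects to earn only 6%.
principal = 100.0
contractual_rate = 0.10
credit_adjusted_rate = 0.06  # yield net of initially-expected losses

# Spreading approach: book interest income at the credit-adjusted rate,
# so initially-expected losses are absorbed over time alongside interest.
interest_at_credit_adjusted_rate = round(principal * credit_adjusted_rate, 2)
loss_absorbed_over_time = round(
    principal * (contractual_rate - credit_adjusted_rate), 2)

# A restrictive incurred-loss approach would instead book the full 10 of
# contractual interest and recognise the 4 only when loss events occur.
print(interest_at_credit_adjusted_rate)  # 6.0
print(loss_absorbed_over_time)           # 4.0
```

This is the matching the article describes: the credit premium inside the contractual rate and the losses it compensates for are recognised in the same periods.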

Within their joint deliberations, the FASB and the IASB were able to agree quite straightforwardly on a broadening of the information set that could be used as a basis for the recognition of credit losses: it could now include reasonable and supportable forecasts. This should itself go a long way towards addressing the delays in credit-loss recognition highlighted at the time of the crisis. In their attempts to deal with initially-expected losses, the boards eliminated from consideration methods that explicitly spread initially-expected losses over time, in part because of the practical problems of implementing them. They explored a joint proposal involving the recognition of some (but not all) initially-expected losses immediately on the initial recognition of financial assets (at day one). However, after lengthy deliberation, they couldn’t agree on this proposal, and developed different accounting standards on credit losses (effective from 2018 for the IASB and from 2020/21 for the FASB):

•  The FASB's final standard required recognition at day one of all losses expected to occur over the full contractual life of assets.

•  The IASB's final standard required a loss allowance, including at day one, for only 12 months' expected losses for assets that have not deteriorated in credit quality since initial recognition, with a lifetime loss allowance for those that have. It was felt that, although it wasn't strictly a spreading-based approach, the 12-month allowance at each reporting date would approximate the outcomes of a spreading-based approach.

The FASB's method was seen by the IASB as excessively conservative (loss allowances too high); the IASB's method was seen by the FASB as insufficiently conservative (loss allowances too low). There’s an issue here for how comparisons can be made between the reported financial results from organisations using the different standards. 
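The gap between the two day-one allowances can be made concrete with a stylised expected-credit-loss calculation (probability of default × loss given default × exposure). All of the figures below are hypothetical, chosen only to illustrate the comparability point:

```python
# Hypothetical figures for a newly originated loan portfolio.
exposure = 100.0           # exposure at default
loss_given_default = 0.40  # fraction of exposure lost if a borrower defaults
pd_12_month = 0.02         # probability of default in the next 12 months
pd_lifetime = 0.08         # probability of default over the contractual life

# FASB-style day-one allowance: full-contractual-life expected losses.
allowance_fasb = round(exposure * loss_given_default * pd_lifetime, 2)

# IASB-style day-one allowance (no credit deterioration yet): 12-month
# expected losses only; a lifetime allowance applies after deterioration.
allowance_iasb = round(exposure * loss_given_default * pd_12_month, 2)

print(allowance_fasb)  # 3.2
print(allowance_iasb)  # 0.8
```

On identical facts, the FASB-style allowance here is four times the IASB-style one, which is precisely the comparability concern between the two standards.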

My co-authors, Dr Noor Hashim and Dr Weijia Li, and I undertook research for the Institute of Chartered Accountants in England and Wales (ICAEW) into the process involved in finding the new solutions. We looked at more than 1,500 comment letters written in response to FASB, IASB and FASB/IASB proposal documents from 2009 to 2013 plus FASB and IASB publications and meeting records as a means of understanding the implications of each suggested approach. 

An interesting feature of the comment letters was commentators' attitudes to day-one recognition of initially-expected losses: lifetime expected losses in the case of the FASB and 12-month expected losses in the case of the IASB. Day-one losses don't make much sense in the context of accounting. It might be readily acceptable to take immediate account of expected losses at day one, and all sorts of other things, for the purpose of determining the required level of capital that a corporation should hold, but it is not so readily acceptable to take account of expected losses at day one for the purposes of measuring the assets and the yearly profits reported to shareholders and others in financial statements.

For example, a bank might make loans totalling £100 million to a number of borrowers at a contractual interest rate of 10% per year, which is the going market rate for loans of that type. The bank expects that there will be some as-yet-unidentified defaults (credit losses). This is expected to result in the return that the bank will actually earn on these loans (net of the credit losses) being 6% rather than the contractual rate of 10%. The bank knows that it is making the loans in the expectation that it will earn 6% and not 10%. A full day-one-loss requirement would mean that all of the shortfall between the cash receivable by the bank according to the loan contracts and the cash it expects to receive (net of expected defaults) would have to be recognised as a loss as soon as the loans are made (day one). There are a number of ways in which the problem with day-one losses can be expressed. All of these could be found in comment letters that we examined, in particular those commenting on the more conservative FASB proposals:

•  Immediately writing a loan down to below the amount lent, which for a loan made on market terms is its fair value, is inconsistent with the economics of the lending transaction. It wrongly implies that the bank has just lent more than it expects to recover.

•  Initially-expected losses are being counted twice: once in the pricing at fair value of the amount lent and again in the day-one-loss write-down.

•  Credit losses are being recognised in advance of the associated credit-premium-inclusive interest.

•  Day-one losses provide opportunities for shifting reported profits from one year to the next: make some loans in one year, booking a day-one loss that reduces the book value of the loans to below fair value and reduces that year's reported profit; sell the loans at fair value in the next year, reversing the day-one loss to increase that year's reported profit.

•  Having to book losses as soon as loans are made could constrain lending, particularly for longer-dated, higher-credit-risk loans.
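The arithmetic behind these objections can be sketched with a one-period simplification of the earlier £100m example (scaled to 100, discounting ignored for clarity; the figures are hypothetical):

```python
# Loans of 100 made on market terms, so day-one fair value equals the
# amount lent. Contractual interest is 10%; expected defaults reduce the
# cash the bank actually expects to receive.
amount_lent = 100.0       # also day-one fair value (market-terms lending)
contractual_cash = 110.0  # principal plus 10% contractual interest
expected_cash = 106.0     # net of expected defaults (6% expected yield)

# Full day-one-loss requirement: the whole expected shortfall is booked
# as a loss as soon as the loans are made.
day_one_loss = contractual_cash - expected_cash  # 4.0
book_value = amount_lent - day_one_loss          # 96.0, below fair value

# Profit-shifting opportunity: selling at fair value soon afterwards
# (assumed unchanged here) reverses the day-one loss into a gain.
gain_on_sale = amount_lent - book_value          # 4.0

print(day_one_loss, book_value, gain_on_sale)
```

The write-down to 96 is the double-counting objection in miniature: the expected defaults are already priced into the 10% contractual rate, yet they reduce the carrying amount again at day one.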

Our examination of letters revealed a substantial degree of objection from commentators on the FASB's final proposal document (largely carried forward to the final FASB standard) to the requirement for lifetime loss allowances recognised at day one. Commentators on the IASB's final proposal document (largely carried forward to the final IASB standard) were noticeably happier. Many of them disagreed on conceptual grounds with the IASB's day-one loss requirement, but they were largely willing to accept it on the grounds that it applied only to 12-month expected losses and was likely to give an acceptable pragmatic approximation to a spreading-type outcome.

Misgivings about day-one losses extended to members of the standard-setting bodies themselves. Some IASB members saw the IASB's 12-month allowance as conceptually flawed but acceptable as a route to a practically implementable approach. Two of the seven FASB members were sufficiently concerned that they included a dissenting opinion in the final FASB standard. In support of their objection to the FASB's requirement for day-one recognition of initially-expected losses, they wrote that “they are unaware of any other area of financial reporting for which a loss and a related valuation allowance are immediately established to reduce the value of a recognized asset that is purchased or originated on market terms”.

The results of our analysis suggest a number of possible problems that might emerge when the new standards become operational. For FASB constituents: will the requirement to recognise allowances for full-contractual-life losses be easily implementable and provide useful information? Will day-one recognition of full-contractual-life losses cause problems, for example by disincentivising lending? For IASB constituents: will the day-one recognition of 12-month losses cause problems, for example by disincentivising lending? And for both groups, will the potentially significant differences in loss allowances (including those established at day one) between the two standards for a given set of circumstances cause comparability problems?

Too Radical?

Our overall impression is that things might have worked out better if the standard setters had taken a less radical route to addressing the pressures arising from the turbulence of the crisis. Beneficial changes in accounting for credit losses might have been achieved in a more straightforward and more converged manner by taking a broader-information-set-based, modified-incurred-loss route rather than the more radical expected-loss route that they took.

The converged broadening of the information set that the two standard setters agreed should enable reported credit losses to reflect new information in a more timely way. We think that such broadening of the information set within an incurred-loss framework might also have addressed, maybe not perfectly but well enough, the need for initially-expected losses to be recognised in a significantly more timely manner, achieving better matching with associated interest revenue than under pre-existing, more restrictive incurred-loss practice. At the very least, day-one losses and their associated problems might have been avoided.