You've booked the trade/priced the policy/granted the rating/set the reserves. So, the analytical heavy lifting is done, right? Not quite.
Most major non-agency mortgage market participants would agree that some degree of loan-level analysis has become the norm. Traditionally, most (if not all) granular analyses were completed pre-trade during price/rating/reserve discovery. After the transaction closed, many market participants were content to employ risk and surveillance systems that consisted primarily of periodic, canned, batch report distribution. As HPI hummed along at an 8% to 10% per annum clip, no one paid much attention to these monthly distribution-list mailings, and many institutions unwittingly slipped into the role of passive portfolio observer while the risk profiles of their exposures evolved over time...often with disastrous results.
Fortunately, both capital markets and risk management departments of those institutions that have survived the fallout are placing a renewed emphasis on active portfolio management for both inherited legacy assets and newly purchased assets often acquired at deep discounts. There are a variety of datasets and data analysis tools that allow users - including bond investors, whole loan buyers, servicers, insurers, rating agencies and regulatory entities - to analyze vast amounts of granular portfolio data, benchmark results against relevant cohorts, and take appropriate action based on interpretation of results. Specifically, many institutions have discovered the power of ongoing loan-level analysis as a tool for a number of active portfolio management activities, including: risk ranking of portfolio assets, loss mitigation of controlled/serviced assets, and fraud detection.
Risk ranking of portfolio assets
One of the most pressing priorities for managers of impaired portfolios (or raters/insurers) is prioritizing assets to sell (or downgrade/set reserves against).
During this process, asset managers attempt to identify holdings that they believe may perform more poorly in the coming months/years relative to current market expectations. Generally speaking, managers can maximize the competitive advantage of their analytic systems by analyzing traditional data sources faster and more efficiently than their competitors, or by incorporating novel and useful sources of data which are still, to a large extent, untapped by the broader market.
In terms of speed, if a manager who seeks to liquidate positions is able to process and uncover performance signals buried in the latest monthly release of a loan-level or econometric time series faster than competitors, he or she will have a small window of time in which to set offer levels appealing to counterparties whose analyses haven't yet caught up to the data.
Regarding new information, a variety of new data sources are available that provide loan-level insight into every aspect of the borrower's current credit health. This is worlds away from the traditional approach to credit analysis for seasoned assets, which is based primarily on the borrower's credit health at the time of loan origination in conjunction with the payment history of the subject asset. Current consumer credit data can provide users with loan-level bimonthly updates of the borrower's entire credit profile, including current credit score, performance on other credit accounts, revolver utilization, credit inquiry activity, etc. This information allows risk stratification of loans that may have identical origination characteristics and pay histories. Armed with this information, a portfolio manager who is prioritizing asset liquidations might examine two bonds in his portfolio that look very similar to anyone using only traditional analytics tools (i.e., having the same loan originator, servicer, LTV at close and current HPI-inferred LTV, geographic concentrations, vintage, percentage 30/60/90+, similar credit subordination and excess spread, etc.) and see radically different risk profiles. Perhaps for the current loans in Deal 1, credit score has improved since origination, no one is missing auto payments or running up credit card balances, and credit inquiry activity is limited; but inspection of Deal 2's current loans tells a different story: downward credit score drift, marked deterioration in performance on other debt accounts, ramping revolver utilization and significant recent credit inquiry activity. Potential counterparties for today's trade who do not possess this insight may not detect the difference in the credit health of current borrowers between the two deals until the delinquency profiles begin to diverge.
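As a loose illustration, the deal-level comparison described above can be sketched in a few lines of Python. The loan records, field names and thresholds below are invented for illustration only; they are not drawn from any actual credit bureau feed or vendor schema:

```python
# Hypothetical loan-level credit records for the current (non-delinquent)
# loans in two otherwise similar-looking deals. All values are invented.
deal_1 = [
    {"orig_fico": 680, "cur_fico": 705, "revolver_util": 0.30, "inquiries_6mo": 0},
    {"orig_fico": 660, "cur_fico": 672, "revolver_util": 0.42, "inquiries_6mo": 1},
]
deal_2 = [
    {"orig_fico": 680, "cur_fico": 610, "revolver_util": 0.88, "inquiries_6mo": 4},
    {"orig_fico": 660, "cur_fico": 598, "revolver_util": 0.95, "inquiries_6mo": 5},
]

def credit_drift_summary(loans):
    """Average credit score drift, revolver utilization and recent inquiry
    activity across a deal's current loans."""
    n = len(loans)
    return {
        "avg_fico_drift": sum(l["cur_fico"] - l["orig_fico"] for l in loans) / n,
        "avg_revolver_util": sum(l["revolver_util"] for l in loans) / n,
        "avg_inquiries_6mo": sum(l["inquiries_6mo"] for l in loans) / n,
    }
```

With these invented inputs, Deal 1's summary shows positive score drift and modest utilization, while Deal 2's shows the deteriorating profile the paragraph describes, even though both deals' origination characteristics are identical.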
Loss mitigation for controlled/serviced assets
Investors who are fortunate enough (or cursed, depending on your point of view) to control the loss mitigation process as loan servicer, credit risk manager, or investor with some semblance of control rights can extract even more utility from the types of analyses described above since they can incorporate the results of loan level analyses into their loss mitigation activities.
To begin with, granular analysis is the only way for a servicer, special servicer or credit risk manager (or the 'passive' whole loan or bond investor whose investment to a large extent relies on the expertise of the servicer) to effectively benchmark and measure performance relative to peers and the market as a whole.
Dynamic, linked analyses can provide a more apples-to-apples comparison by allowing corrections for geographic or econometric variations; i.e., simply comparing principal-reduction modification recidivism rates between two special servicers might yield some insight, but if we layer in additional information to correct for geography-specific HPI and rent-vs.-own affordability measures, a truer comparative value emerges.
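One minimal way to make that correction is direct standardization: re-weight each servicer's geography-level redefault rates to a common market-wide geographic mix before comparing. The sketch below assumes invented mod-performance records and an arbitrary two-MSA mix; none of the names refer to real servicers or datasets:

```python
from collections import defaultdict

# Invented records: (servicer, msa, re_defaulted) for completed modifications.
records = [
    ("ServicerA", "hard_hit_msa", True), ("ServicerA", "hard_hit_msa", True),
    ("ServicerA", "hard_hit_msa", False), ("ServicerA", "stable_msa", False),
    ("ServicerB", "hard_hit_msa", True), ("ServicerB", "stable_msa", False),
    ("ServicerB", "stable_msa", False), ("ServicerB", "stable_msa", False),
]

def adjusted_recidivism(records, market_mix):
    """Standardize each servicer's MSA-level redefault rates to a common
    market-wide MSA mix (a crude direct standardization)."""
    counts = defaultdict(lambda: [0, 0])  # (servicer, msa) -> [redefaults, mods]
    for servicer, msa, redefaulted in records:
        counts[(servicer, msa)][1] += 1
        counts[(servicer, msa)][0] += int(redefaulted)
    out = {}
    for servicer in {s for s, _, _ in records}:
        rate = 0.0
        for msa, weight in market_mix.items():
            redefaults, mods = counts[(servicer, msa)]
            rate += weight * (redefaults / mods if mods else 0.0)
        out[servicer] = rate
    return out
```

With these invented inputs, ServicerA's raw recidivism rate (2 of 4) looks worse than ServicerB's (1 of 4), but after standardizing to an even MSA mix the ranking flips, because ServicerA's mods were concentrated in the hard-hit geography.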
Beyond benchmarking, servicers who are able to perform ad-hoc analyses of large loan-level datasets can look across the current DQ/FC/REO pipeline of the non-agency market, compare the characteristics of pipeline loans with their own portfolio and adjust asset-level exit strategy with some knowledge of likely future competitive supply. Taken a step further, an examination of timeline trends for those servicers with loans in the pipeline can give a savvy servicer insight into likely timing of future supply. In addition, the loan level consumer credit data described in the preceding paragraphs provides a real-time window into consumer credit trends at zip code granularity. This allows REO managers to refine block-by-block HPI estimates when contemplating inventory liquidation strategies.
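One narrow slice of the competing-supply idea above can be sketched as a zip-code overlap check: what share of a servicer's own REO inventory sits in zip codes where the market-wide pipeline is heavy? The data and the "heavy" threshold below are invented for illustration:

```python
from collections import Counter

def competing_supply_overlap(my_reo_zips, market_pipeline_zips, heavy_at=2):
    """Share of my REO inventory located in zip codes where the market-wide
    DQ/FC/REO pipeline meets an (arbitrary, illustrative) 'heavy' threshold."""
    pipeline = Counter(market_pipeline_zips)
    heavy_zips = {z for z, n in pipeline.items() if n >= heavy_at}
    return sum(1 for z in my_reo_zips if z in heavy_zips) / len(my_reo_zips)
```

A real version would weight by balance and expected liquidation timing rather than counting loans, but even this toy form shows how a servicer might rank exit strategies by likely future competitive supply.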
Fraud detection
Virtually all parties in residential mortgage capital markets have some interest in detecting fraud, particularly investors vulnerable to certificate writedowns from nonperforming fraudulent loans, and insurers (at the certificate and loan level) who may be asked to pay claims on fraudulent loans.
Investors and insurers who have the ability to identify outlier events in a sea of data have the greatest chance of successfully combating fraud and minimizing the resulting losses and payouts.
For example, a loan liquidation in a securitized deal might result in a loss severity that is consistent with that particular deal/vintage/cohort's recent history. Taking the analysis a step further, if an HPI series or comparison vs. similar property types is applied, does the severity look reasonable or should the origination appraisal be questioned?
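The appraisal-reasonableness check just described can be reduced to a simple comparison of actual loss severity against an HPI-implied severity. The function and tolerance below are a sketch under invented assumptions, not any standard industry test:

```python
def severity_flag(orig_appraisal, loan_balance, hpi_factor, net_liquidation,
                  tolerance=0.15):
    """Flag a liquidation whose loss severity is far worse than the severity an
    HPI-implied property value would suggest -- a prompt to revisit the
    origination appraisal. The 15% tolerance is arbitrary and illustrative."""
    implied_value = orig_appraisal * hpi_factor          # appraisal rolled forward by HPI
    expected_severity = max(0.0, (loan_balance - implied_value) / loan_balance)
    actual_severity = max(0.0, (loan_balance - net_liquidation) / loan_balance)
    return actual_severity - expected_severity > tolerance
```

For a $380,000 balance against a $400,000 appraisal in a market down 25% since origination, a $270,000 net liquidation looks consistent with HPI, while a $150,000 one does not and would warrant a second look at the original appraisal.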
If the investor is armed with loan-level linked credit data, a multitude of attributes which might indicate increased fraud potential are available for every asset in the pool prior to any credit event. Does the borrower receive mail at the property address for loans which are reported as owner occupied? Does the borrower have multiple first liens? This type of data effectively gives bond investors access to the type of loan level insight previously available only to whole loan market participants or servicers. Granular analyses allow bond investors to engage in an informed and active dialogue with a deal's servicer who, to a large extent, controls the economic outcome of their investment.
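The pre-credit-event screens mentioned above amount to simple rule-based flags over linked credit attributes. The attribute names below are hypothetical stand-ins for whatever fields a credit data vendor actually supplies:

```python
def fraud_flags(loan):
    """Rule-based pre-event fraud screens over linked loan-level credit data.
    Field names are illustrative, not from any actual vendor schema."""
    flags = []
    # Stated owner-occupied, but borrower receives mail elsewhere.
    if loan["stated_owner_occupied"] and loan["mailing_addr"] != loan["property_addr"]:
        flags.append("occupancy_mismatch")
    # Borrower carries more than one first lien.
    if loan["first_lien_count"] > 1:
        flags.append("multiple_first_liens")
    return flags
```

Run across a pool, such flags let a bond investor rank assets for scrutiny, and raise specific questions with the servicer, before any credit event occurs.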
Residential mortgage capital markets participants of all types - investors, dealers, servicers, rating agencies, insurers and regulators - should realize that current technology enables anyone with an Internet connection to perform all the types of analyses described above at the office, on a train, or in the airport. The user can analyze the same datasets with the same database architecture used by the world's largest financial institutions with virtually no setup time or infrastructure/hardware capital investment.
While the end users of these types of analyses may differ in their interpretation of analysis results, it is clear that market participants who choose to ignore the advances in granular data analyses will find themselves on the wrong side of an ever-widening asymmetry of information.
(c) 2009 Asset Securitization Report and SourceMedia, Inc. All Rights Reserved.