The call for loan-level mortgage data sounded soon after the industry began to pick up the pieces from the housing market crash. It has grown louder since then, and the information is slowly becoming more available. But without a way to identify and decipher what it means, the data are worthless. Fortunately, new technology coming to market gives lenders, investors and servicers the ability to monitor the loans being originated, evaluate their quality and track repayment performance.
Currently, the secondary mortgage market in the United States consists almost entirely of loans purchased by Fannie Mae and Freddie Mac and those insured by the Federal Housing Administration and sold to Ginnie Mae. Private-label investment is limited because of concerns about loan quality. For those private-label investors to return to the mortgage market, there has to be trust in the product that's originated.
While investors remain leery of mortgage debt, technology vendors are scrambling to provide new tools and methods to collect and analyze loan-level data on all new mortgage originations — not just a sampling from a pool of mortgages for sale — a bet that could prove farsighted. Other tools give servicers and investors the ability to review individual files within portfolios of existing loans to predict future performance.
BlackBox Logic of Denver has spent more than two years aggregating loan-level data on nonagency residential mortgage-backed securities. Today, the BBx Data service covers more than 90% of the nonagency RMBS structured since 1999. BlackBox's database includes more than 300 origination and performance data points on more than 21 million mortgages that make up more than 7,500 securities.
"The crisis has been a big blow to people's trust in the securitization market in general," said Wyck Brown, a former investment fund manager and BlackBox's chief marketing officer. "In order to rebuild that trust so that securitization is considered acceptable as a financing choice, I think the data has to be considered clean and comprehensive."
In addition to the scope involved in collecting data on 21 million loans, BlackBox had to overcome the inconsistencies in how the data are recorded among trustees, servicers and lenders.
When a small mortgage company or community bank originates a mortgage, the loan data are recorded in the process of selling the loan to an aggregator or investor and then typically sold to a Wall Street investment bank, Brown explained.
"That Wall Street bank will use a trustee to manage the data for the deal and that trustee becomes the repository for the information," he said. "Over time, that trustee ends up having various different data sources. Even at the aggregate level, you're talking about typically over 100 formats for just 14 trustees. And that's just at the trustee level. Each one of those trustees is collecting data from several servicers."
BlackBox customers, generally secondary market participants like large institutional investors, use the company's data as a performance benchmark for securitizations and other loan portfolios they own, manage or are evaluating for possible acquisition. To do this, investors build costly and complicated custom platforms. It's a cumbersome process that requires staff with the expertise to build models and is often inaccessible to smaller industry participants.
To address this need, BlackBox developed a browser-based platform to handle the manipulation and analysis of its data. The Crystal Logic platform is currently in beta testing for a scheduled July launch.
The platform comes preloaded with the securitization deals in the BBx database, organized by Committee on Uniform Security Identification Procedures (CUSIP) numbers. Investors can also upload additional loan portfolios. From there, Crystal Logic users can slice and dice the loan data by metrics including location, performance, loan type and security, among others, to create customized reports.
"You have to have that telescoping capability to look at the data aggregately or drilled down to the loan-level data," Brown said. "The loan-level data allows you to analyze deals, but it also allows you to analyze the entire U.S. residential mortgage market in general."
The implications of this capability extend beyond the buying and selling of mortgage debt on the secondary market; it also has applications for lenders and servicers.
With loan-level data, a servicer can compare the performance of its loan portfolios with the broader mortgage market or with other portfolios that have similar characteristics, benchmarking performance against competitors and identifying areas for improvement. Similar performance metrics can also be evaluated for originators.
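As a rough illustration of that kind of benchmarking (the loans and the 90-plus-day delinquency metric below are hypothetical examples, not any vendor's actual methodology), a servicer might compare its portfolio's serious-delinquency rate against a market-wide figure computed from the same loan-level data:

```python
# Illustrative sketch of loan-level portfolio benchmarking.
# The loan records and the single metric shown are hypothetical.

def serious_delinquency_rate(loans):
    """Share of loans 90 or more days past due, by loan count."""
    late = sum(1 for loan in loans if loan["days_past_due"] >= 90)
    return late / len(loans)

portfolio = [
    {"days_past_due": 0}, {"days_past_due": 120},
    {"days_past_due": 30}, {"days_past_due": 0},
]
market = [{"days_past_due": 0}] * 9 + [{"days_past_due": 95}]

own_rate = serious_delinquency_rate(portfolio)   # 0.25
market_rate = serious_delinquency_rate(market)   # 0.10
print(f"portfolio {own_rate:.0%} vs market {market_rate:.0%}")
# prints "portfolio 25% vs market 10%"
```

In practice the comparison set would be filtered to portfolios with similar characteristics (geography, vintage, loan type) rather than the market as a whole.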
"Right now there's more focus than ever on originators retaining the risk of their originations or at least being interested in how those originations perform," Brown said. "Even if they sell those originations into the market, they know there's going to be a focus on how those originations perform with respect to their origination practices and they're going to have an interest in being able to do peer-to-peer comparisons."
Fannie Mae may be one of the first mortgage investors doing just that. In September, Fannie released EarlyCheck, one of the first pieces of its Loan Quality Initiative. Lenders can use EarlyCheck to run documents through Fannie Mae's data verification system before the loan closes.
The service is available for free via a browser-based interface to approved Fannie Mae sellers that use Desktop Underwriter, Fannie's automated underwriting platform that reviews and approves eligible lender-submitted loan documents. Some loan origination system providers have built integrations that connect users to the portal without leaving the LOS environment.
Following the release of EarlyCheck, two other services have emerged to provide lenders more insight into loan portfolios and investors and aggregators the ability to conduct quality control checks on the loans they're purchasing.
New services offered by RealEC Technologies, a subsidiary of Lender Processing Services, and Ellie Mae play off the respective companies' positions in the loan underwriting process.
When data verification services like appraisals, credit checks and income verification are ordered during underwriting, the data are stored in a central location where they can be reviewed during the origination process by quality control employees and, later, by loan aggregators and investors.
"What we're envisioning at some point in the near future is that there will be other investors in the market besides Fannie, Freddie and FHA," said Dan Sogorka, the president of RealEC. "At that point in time, we think we'll have a great tool to enable them to configure their own rule sets so both people can see what's going on as they deliver the loans."
Investors have typically performed quality reviews of only a sample of loans in a portfolio by reordering some, but not all, of the underwriting reports for them, said Jonathan Corr, chief strategy officer at Ellie Mae.
"Because there are two different parties involved and in this dynamic of reps and warrants, the correspondent lender is doing all the checks, running compliance, income, fraud, asset checks, etc., and passing that closed loan on to the investor," he said.
"The investor may trust most of the folks out there, but they have to be consistent in their process and have to go and do it again. They can't be comfortable with a loan file with copies of the verification," Corr added.
But the new technology preserves the original verification reports ordered by the lender. Investors can access those documents and run quality reviews on all the loans with all the verification data, confident that the documents haven't been tampered with.
"It creates a whole bunch of automation benefits for the correspondent lender in the manufacturing process and creates a sped-up process to provide access to that information to investors," Corr said.
Sogorka concurs. "Five to eight years ago, if you had 1,000 loans, you looked at five of them and you made your decision based on that," he said. "In an automated data-centric environment, you're looking at all 1,000 and know exactly what you're getting."
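The shift Sogorka describes, from sampling a handful of files to checking every loan, is simple to picture in code. This sketch is hypothetical; the single rule shown (the closed loan's stated income must match the verified income report) stands in for whatever rule sets an investor actually configures:

```python
import random

# Illustrative sketch: sample-based vs. full-population quality review.
# The loan records and the single income-match rule are hypothetical.

def passes_qc(loan):
    """Stand-in rule: stated income must match the verified income report."""
    return loan["stated_income"] == loan["verified_income"]

# 1,000 loans, three of which would fail the check.
loans = [{"stated_income": 80000, "verified_income": 80000} for _ in range(997)]
loans += [{"stated_income": 80000, "verified_income": 52000} for _ in range(3)]
random.seed(7)
random.shuffle(loans)

# The old approach: review five loans and extrapolate.
sample_failures = [loan for loan in random.sample(loans, 5) if not passes_qc(loan)]

# The data-centric approach: review all 1,000.
full_failures = [loan for loan in loans if not passes_qc(loan)]

# A five-loan sample will usually miss all three bad files;
# the full review always finds exactly three.
print(len(sample_failures), len(full_failures))
```

The point is not the rule itself but the coverage: with every loan's original verification data preserved centrally, running the check on the full population costs little more than running it on a sample.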
"The goal is to create 2011 to 2012 vintages that are perfect," he said.