Risk Management: A Differential Diagnosis
This paper examines the context and management of two types of Complex Data Set clients. Supervisors should define a common model of Transparency Requirements and require every bank to apply it, so that disclosures become consistent and comparable; then Basel II Pillar 3 disclosures might actually tell us something about the quality of a bank’s loan book.
In short, this paper is calling for:
- a standardised methodology for benchmark accounting,
- standardised reporting, including risk measurements (a minimal sketch of such a disclosure record follows this list),
- ultra-fast technology for transparency and visibility of data,
- controlled compliance and regulation,
- Model Development Environments (MDEs) for quants, with data quality, granularity and management,
- buying rather than building wherever possible, which provides economies and consistency and shortens implementation cycles,
- supporting transparency of assets in wholesale credit markets.
It’s a no-brainer really!
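As a rough illustration of the kind of standardised, comparable disclosure record this list is calling for, the sketch below shows one possible line item with its risk measurements. It is illustrative only: the field names and the expected-loss convention are assumptions, not a Basel II Pillar 3 template.

```python
from dataclasses import dataclass, asdict
import json

# Illustrative sketch only: not a regulatory schema, just one way a
# standardised disclosure line item could be expressed so that every
# bank's report carries the same fields with the same meaning.

@dataclass
class DisclosureItem:
    reporting_date: str   # ISO date, e.g. "2010-06-30"
    portfolio: str        # e.g. "wholesale credit"
    exposure: float       # exposure at default, in a common reporting currency
    pd: float             # one-year probability of default
    lgd: float            # loss given default
    expected_loss: float  # pd * lgd * exposure, stated explicitly for comparability

def build_item(reporting_date, portfolio, exposure, pd, lgd):
    """Build a disclosure item, deriving expected loss from the inputs."""
    return DisclosureItem(reporting_date, portfolio, exposure, pd, lgd,
                          expected_loss=pd * lgd * exposure)

item = build_item("2010-06-30", "wholesale credit", 1_000_000_000.0, 0.02, 0.45)
print(json.dumps(asdict(item), indent=2))  # same fields, same units, from every bank
```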
Fully transparency-capable financial institutions are going to be the winners in the future: that is the value proposition, and transparency capability is the hurdle.
You can’t play in a post-Credit Rating Agency landscape, where everything is internally rated (IR), unless your IR is fully up to scratch: full transparency entails ETE (Exchange Traded Everything), and you have to be prepared to make your IR public to join an exchange.
The salient point is that the real constraint on future transparency in banking, the limit of the validity of even the best transparency concepts, is technology. The key issue I want to share lies in the questions surrounding true transparency in Banking and Financial Markets. The exchange-traded approach is, by definition, transparent as a result of Standards. Banking transparency is a genuinely scary, very-large-scale complex data set management problem of our time: an ETE philosophy by definition entails a massive data set that has to be managed, because the universe of financial instruments implies an instrument-level data set that satisfies transparency. Consolidate that up and, even conceptually, you have a massive data set.
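To make the scale point concrete, here is a minimal sketch of what “consolidating up” an instrument-level data set looks like: instrument-level records rolled into disclosure buckets while the underlying detail is kept for drill-down. The field names (ISIN, asset class, notional, internal rating) are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass
from collections import defaultdict

# Illustrative sketch: the fields below are assumptions, not a standard.

@dataclass(frozen=True)
class Instrument:
    isin: str             # instrument-level identifier
    asset_class: str      # e.g. "RMBS", "corporate loan"
    notional: float       # exposure in a common reporting currency
    internal_rating: str  # the bank's own IR grade

def consolidate(instruments):
    """Roll instrument-level records up into (asset_class, rating) buckets,
    keeping the underlying instruments available for drill-down."""
    buckets = defaultdict(list)
    for inst in instruments:
        buckets[(inst.asset_class, inst.internal_rating)].append(inst)
    summary = {key: sum(i.notional for i in group) for key, group in buckets.items()}
    return summary, buckets  # summary for disclosure, buckets for drill-down

book = [
    Instrument("XS0000000001", "RMBS", 25_000_000.0, "BB"),
    Instrument("XS0000000002", "RMBS", 10_000_000.0, "A"),
    Instrument("XS0000000003", "corporate loan", 5_000_000.0, "BB"),
]
summary, detail = consolidate(book)
print(summary)  # e.g. {('RMBS', 'BB'): 25000000.0, ('RMBS', 'A'): 10000000.0, ...}
```

Even this toy example shows the problem: transparency means keeping the instrument-level detail, not just the consolidated summary, and doing that across a whole universe of instruments is where the technology constraint bites.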
Essentially, the CRAs (Credit Rating Agencies) were a commercial utility response to a massive data set problem. They managed that massive data set for a fee and broadcast cryptic summaries when they felt like it. But since the Bear Stearns collapse it has been patently obvious that the well-documented inherent conflict of interest in the CRAs had become a social issue, exhibiting conscious predatory behavior. It was in the systemic DNA. So financial institutions must DIY: manage that data set themselves and understand how to report, transparently and at instrument level, the subset of data that restores confidence in those instruments.
The current regulatory framework is a total mess, and until requirements become clearer (which will take a couple of years in some cases) banks and financial institutions need sufficient systems flexibility to accommodate the changes that are inevitably coming. There are strong regulatory incentives to remove the over-reliance on CRAs and let investors undertake their own due diligence.
In order to do so, investors will need data, loads of it, ideally loan-level data, and the recent SEC proposal that would require issuers to post deal data in XML is a big step in the right direction. In the real world of risk management in financial institutions, the client applications of the complex data set are, for the most part, either appliance-model systems or Model Development Environments (MDEs).
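As a hedged sketch of what that due diligence could look like once loan-level deal data is posted in XML, the example below parses a small invented deal file and computes one simple pool statistic. The tag and attribute names are hypothetical, not taken from the SEC’s actual proposed schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical loan-level deal file; tag and attribute names are illustrative.
DEAL_XML = """
<deal id="EXAMPLE-2010-1">
  <loan id="L001" balance="250000" ltv="0.82" fico="640" state="FL"/>
  <loan id="L002" balance="410000" ltv="0.65" fico="720" state="CA"/>
</deal>
"""

def load_loans(xml_text):
    """Parse loan-level records into plain dicts so an investor can run
    their own analytics on the pool."""
    root = ET.fromstring(xml_text)
    loans = []
    for node in root.findall("loan"):
        loans.append({
            "id": node.get("id"),
            "balance": float(node.get("balance")),
            "ltv": float(node.get("ltv")),
            "fico": int(node.get("fico")),
            "state": node.get("state"),
        })
    return loans

loans = load_loans(DEAL_XML)
# Share of the pool balance sitting in high-LTV loans (LTV above 80%).
high_ltv_share = (sum(l["balance"] for l in loans if l["ltv"] > 0.8)
                  / sum(l["balance"] for l in loans))
print(f"High-LTV share of pool: {high_ltv_share:.1%}")
```

The point is not the particular statistic: once the data is granular and machine-readable, any investor can run whatever checks give them confidence, rather than taking a rating on trust.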