Saturday, November 30, 2013

European Market Infrastructure Regulation, Collateral Highway and Bank Analyzer.

Dear,

As you know, we’re in the middle of a transition period of the Financial System, from a business model based on Volume to a business model based on Efficient Capital Management.

Some weeks ago, we introduced the concept of Collateral mobilization, and today we’re going to elaborate on it a bit more.


As Capital is scarce, the main priority is managing it efficiently; wasting capital is not acceptable, and regulation will drive the change by increasing the Banks’ Capital Requirements.

Two of the most important sources of regulation driving the new model are the European Market Infrastructure Regulation (EMIR) in Europe and the Dodd–Frank Wall Street Reform and Consumer Protection Act in the US.



They have some differences, but they share the same general objectives:

- Reporting obligation for OTC derivatives

- Clearing obligation for eligible OTC derivatives

- Measures to reduce counterparty credit risk and operational risk for bilaterally cleared OTC derivatives

- Common rules for central counterparties (CCPs) and for trade repositories

- Rules on the establishment of interoperability between CCPs

And the consequence of both is the same: making the collateral squeeze visible.


As collateral becomes scarce, financial agents demand new sources of this critical resource.

The market has seen the opportunity, and new services oriented to cover the collateral shortage and improve the inefficiencies in collateral management are being developed.

A very interesting example is the Collateral Highway, a joint initiative by Euroclear and The Depository Trust & Clearing Corporation (DTCC).


The Collateral Highway is an electronic marketplace that connects financial agents, providing collateral lending and borrowing functionalities.

This way the market can unlock collateral pools that otherwise would be under-utilized, increasing liquidity in the financial system and mitigating the effects of collateral scarcity.

But as in any other marketplace, offering eligible collateral is critical for successful trading. There’s no better way of making financial assets eligible than disclosing their value, and we don’t have a better tool for disclosing the value of financial instruments than Bank Analyzer.

But successful collateral mobilization confronts Banks with other challenges.

Communication is key. SWIFT is a real-time network, but the communication flow can be very complex, involving the investor, the global custodian, the central securities depository (CSD), the international central securities depository (ICSD) and a third party.

Remember that Banking systems have been built as silo-style architectures, with point-to-point connections in multiple flows like the example above, sometimes even including manual steps. Efficient capital management is also about re-engineering and reducing complexity in the communication flow; we need a single, homogeneous and centralized repository of the Bank’s assets, another core value of SAP Bank Analyzer.

This central vision of the Bank’s collateral lays the foundation of Enterprise Collateral Management, another critical activity in Collateral Optimization.

There are other techniques, like securitization, which will play a very important role in increasing capital mobilization under the new model; we’ll talk about them in a future post.

Looking forward to reading your opinions.
K. Regards,

Ferran.

Saturday, November 23, 2013

Understanding the Bank Analyzer - Results Data Layer. Chapter II.

Dear,
Last week, we looked at the open architecture of the Results Data Layer, in terms of the opportunity it represents for integrating non-Bank Analyzer data which is required for Accounting or Capital Requirements purposes.

At the current stage of the transition period, this is a very important feature. There are many reasons (economic, strategic, technical ...) why we have heterogeneous landscapes in Banking Systems, with SAP and non-SAP components coexisting in the same technological infrastructure. In this environment, open architectures that facilitate integration between former silo-style components are a necessary requirement.

But on the other hand, in the middle of the systemic change, the open architecture of the Bank Analyzer RDL is an opportunity for implementing new integrated scenarios that were not feasible some years ago.

This year I've been collaborating part-time in an advisory role for a European client who implemented AFI some years ago and is now considering a Profit Analyzer implementation.

In this particular case, we analyzed the possibilities of calculating the process costs of impaired loans.
Simplifying: in addition to the process costs of a performing loan, impaired loans also generate additional process costs for the bank (dunning and collection costs, collateral liquidation costs, etc.).

The probability of a Loan becoming impaired is represented by the probability of default of the counterparty. This means that the standard process costs are related to the counterparty's probability of default: the higher the probability of default, the higher the expected process costs.
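The relation above can be sketched as a simple expected-value calculation. This is a minimal illustration, not the Bank Analyzer calculation itself; the function name and all figures are assumptions for the example.

```python
# Expected process cost of a loan, weighting the impaired-only costs
# (dunning, collection, collateral liquidation, ...) by the probability
# of default (PD) of the counterparty. Purely illustrative figures.

def expected_process_cost(performing_cost, impaired_extra_cost, pd):
    """Expected process cost = performing cost + PD * additional impaired cost."""
    return performing_cost + pd * impaired_extra_cost

# Example: a counterparty with a 5% probability of default
cost = expected_process_cost(performing_cost=120.0,      # standard servicing cost
                             impaired_extra_cost=900.0,  # dunning, collection, ...
                             pd=0.05)
print(cost)  # 165.0
```

The higher the PD, the larger the weighted share of the impaired-loan costs, which is exactly why the counterparty rating belongs in the costing model.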

Collecting the historical real process costs in the Controlling modules (Activity Based Costing, Cost Center Accounting, Profitability Analysis, etc.) is a well-known functionality of SAP ECC.

The Bank has a very detailed analysis of its process costs, including Activity Based Costing models, supported with data collected by their CRM System and transferred to the SAP ECC Controlling components. Among other parameters, the Bank is capable of capturing most of the process costs by client type.

By including the Rating of the Counterparty as a reporting dimension in Profitability Analysis, we will get accurate tracking of the real process costs according to the rating of the counterparties, including dunning costs, collection costs, etc.

In the proposed model, we would transfer the historical real costs to SAP Business Planning and Consolidation, and from there build planning models for estimating the evolution of future process costs by counterparty rating.

Finally, the calculated plan costs and the estimated standard costs would be transferred to the RDL for two purposes:

- Providing the basis of the standard costs, scaled by counterparty rating, in the AFI sub-ledger.

- Tracking the dispersion between estimated planned process costs, estimated standard costs and real costs, and consequently the accuracy of the standard costs.
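The second purpose, tracking dispersion, can be sketched in a few lines. The ratings and figures below are assumptions for illustration, not data from the client’s system.

```python
# A minimal sketch of the dispersion tracking described above:
# comparing planned, standard and real process costs per counterparty rating.

def variances(planned, standard, real):
    """Return (plan variance, standard-cost variance) against real costs."""
    return real - planned, real - standard

costs_by_rating = {
    # rating: (planned, standard, real) process costs, in EUR (illustrative)
    "AA":  (100.0, 105.0,  98.0),
    "BB":  (250.0, 240.0, 270.0),
    "CCC": (600.0, 630.0, 710.0),
}

for rating, (planned, standard, real) in costs_by_rating.items():
    plan_var, std_var = variances(planned, standard, real)
    print(f"{rating}: plan variance {plan_var:+.1f}, standard variance {std_var:+.1f}")
```

A persistently large standard-cost variance for a given rating would signal that the standard costs loaded into the AFI sub-ledger need recalibration.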

This is just an example of the integration capabilities of the SAP Banking business suite. I’m sure that as the market comes up with new requests, we’ll find new opportunities to meet them.

K. Regards,
Ferran.

Saturday, November 16, 2013

Understanding the Bank Analyzer - Results Data Layer. Chapter I.

Dear,

One of the last components to join the layered architecture of Bank Analyzer has been the Results Data Layer (RDL), available since Version 5.

When I was working on my first Bank Analyzer project 7 years ago, the system didn't have a Results Data Layer. The function of storing the results data coming from the Process and Methods Layer was performed by the Results Data Base (RDB), an evolution of the component with the same name in SEM Banking.

At the time, we heard that with the new Bank Analyzer 5.0 the RDB was going to be replaced by a new component called the RDL, whose main difference from the RDB is that the RDL supports the storage of data not originated in Bank Analyzer.

My first Bank Analyzer project was a Credit Risk/Basel II implementation on Bank Analyzer 4.2. When the project team heard about the RDL, we saw it could be very useful for storing the risk parameters (Loss Given Default, Exposure at Default, etc.) of risk exposures not included in the scope of the project, whose risk parameters were not calculated by Bank Analyzer.

As the RDL was not available at the time, we followed the workaround of storing the data directly in the Operational Data Stores and InfoCubes of BIW used for Basel II regulatory reporting; obviously with a much weaker level of integration.

The main priority for the Bank’s executives at the time was to get approval from the audit authorities of the central bank. For the audit authorities there was not much difference in using the RDL or not, as they did not have the capacity to check the source of the data provided in the reporting results.

Integrated systems, capable of offering reconciliation functionalities between the Transactional and Analytical Banking systems, were not available at the time, and the capacity of the auditors to request these reconciled reports was very limited.

This is one of the reasons why the Financial Situation is what it is: reconciliation capabilities between Analytical and Transactional Banking information are “control-oriented” functionalities. We come from a Banking system driven by volume, in which capital consumption control was not the priority, with the catastrophic consequences we witnessed in 2008 and are still suffering 5 years later.

By the way, if we want to make SAP Banking a successful business, we should educate the audit authorities about its reconciliation and reporting capabilities.

Remember: solvency is not only what it is; it is also what it looks like.

http://sapbank.blogspot.com.es/2010/07/stress-testing-what-solvency-is-and.html

After this first Bank Analyzer project, I have always had the Results Data Layer available for integrating accounting or credit risk data that, for some reason (project scope, technical limitations, etc.), could not be managed by the standard Bank Analyzer flow (Source Data Layer -> Process and Methods Layer -> Results Data Layer).

A typical example is uploading provision postings for Impaired Loans into the Bank Analyzer RDL. As previous versions of Bank Analyzer couldn't calculate provisions for impaired loans, we would calculate those provisions in the Reserves for Bad Debts module of IS-Banking and upload the accounting postings of those provisions directly into the RDL. Consequently, the Bank enjoyed in the AFI sub-ledger the full vision of the Loans valuation, including impairment provisions.
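To make the flow concrete, here is a purely illustrative sketch of an externally calculated provision posting being prepared and checked before upload. All field names, identifiers and amounts are assumptions for the example; they are not the actual RDL interface.

```python
# Hypothetical externally calculated provision posting (field names assumed,
# not the real RDL structure), as it might be staged for upload.

provision_posting = {
    "contract_id": "LOAN-000123",          # hypothetical loan identifier
    "posting_date": "2013-11-30",
    "result_type": "IMPAIRMENT_PROVISION",
    "source_system": "IS-B-RBD",           # calculated in Reserves for Bad Debts
    "amount": -15000.00,                   # provision reduces the loan's book value
    "currency": "EUR",
}

def validate_posting(posting):
    """Minimal consistency check before a record is sent to the sub-ledger."""
    required = {"contract_id", "posting_date", "result_type", "amount", "currency"}
    missing = required - posting.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return True

print(validate_posting(provision_posting))  # True
```

The point of the sketch is the design choice: the provision is calculated outside Bank Analyzer, but once it lands in the RDL it enjoys the same integration with the AFI sub-ledger as internally calculated results.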

Another example is the calculation of Hedge Accounting adjustments: valuations of Derivative products and Hedge Accounting adjustments are initially calculated outside Bank Analyzer and uploaded to the RDL; from there they enjoy the standard integration with the Financial Statements and the General Ledger.

But the open architecture of the RDL opens the gate to more challenging integration scenarios; we’ll talk about some of them next week.

K. Regards,
Ferran.

Saturday, November 9, 2013

Collateral Mobilisation and Bank Analyzer Value Proposition.

Dear,

In a new era of Capital Scarcity, efficient management of any form of Capital is the most critical activity.

Collateral is a form of Capital; over-collateralized assets mean non-allocated capital. As the new model comes with very limited growth potential and little contribution to GDP growth, reducing capital consumption will be the priority.
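A toy calculation makes the over-collateralization point concrete. The function and all figures below are assumptions for illustration: collateral pledged beyond the haircut-adjusted requirement of the exposure is capital that cannot be allocated elsewhere.

```python
# Illustrative only: capital locked up by over-collateralization.
# Collateral above the haircut-adjusted requirement is non-allocated capital.

def excess_collateral(exposure, collateral_value, haircut):
    """Collateral pledged above what the exposure requires after the haircut."""
    required = exposure / (1.0 - haircut)   # haircut-adjusted requirement
    return max(collateral_value - required, 0.0)

# Example: 100 of exposure secured with 130 of collateral at a 10% haircut;
# about 18.9 of collateral value is locked up without being needed.
print(round(excess_collateral(100.0, 130.0, 0.10), 1))
```

In the new model, that excess is exactly the kind of idle capital that collateral mobilization aims to free up and redeploy.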

Bankers will be incentivized to manage efficiently any form of Capital, including collateral, by constantly higher capital requirements and limited growth and revenue potential.

We can already see this tendency, but it will grow as we move through the systemic crisis towards the new model.

A very interesting example of this tendency is a new discipline called Collateral Mobilisation:

http://www.youtube.com/watch?v=mVcgRLqBOsM

We’ll look in detail at Collateral Mobilisation and its characteristics in a future post, but today I’d like to focus on the important role SAP Bank Analyzer must play in this critical activity of Collateral Management efficiency.

As you probably remember, we’ve commented in the past that we’re in a transition period of the Systemic Crisis.

Since Hank Paulson’s rescue package of 2008, followed by the unconventional monetary policies of the Central Banks, Financial markets have been flooded with Massive Liquidity Injections.

http://en.wikipedia.org/wiki/Emergency_Economic_Stabilization_Act_of_2008

http://en.wikipedia.org/wiki/Quantitative_easing

Apparently, those liquidity injections have prevented the world from falling into a Depression, maintaining some level of stability in the Capital Markets.

But some authoritative voices are warning that those measures have inflated another huge bubble whose burst could bring catastrophic consequences.

http://www.huffingtonpost.com/2013/10/14/nobel-prize-bubble-housing_n_4098409.html

We’ve already mentioned here last June’s speech by Mr Jaime Caruana, General Manager of the Bank for International Settlements: “Making the most of borrowed time”.

https://www.bis.org/speeches/sp130623.htm

The Quantitative Easing of the U.S. Federal Reserve and the bond-buying programmes of the European Central Bank and the Bank of Japan have only one purpose: maintaining some level of stability in the Capital Markets, giving time for the construction of the new Financial System. In Mr. Jaime Caruana’s words, they are just borrowing time.

But those measures have also accelerated the unsustainability of the debt bubble, bringing the unconventional monetary policies close to their end.

http://www.economist.com/content/global_debt_clock

Either the end of the unconventional monetary policies brings the burst of the Capital Markets bubbles, or the authorities succeed in deflating them; in both cases, we’ll see how the lack of solvency in the Financial System becomes visible, draining massive amounts of liquidity from it.

In this scenario, it will be necessary to have a system for mobilising any form of Capital in order to allocate it efficiently.

The key word is eligibility: mobilising capital (or collateral) means transporting it from where it lies to where it is needed; that is, trading collateral for cash or another underlying.

But solvency also means confidence; consequently, capital scarcity will come with a lack of confidence.

In an environment of distrust, making collateral eligible for mobilisation will require proving its core value, and here comes the value proposition of Bank Analyzer: certifying and improving the eligibility of the Bank’s collateral.

Once again, Bank Analyzer is not only about providing regulatory reporting; its competitive advantage comes from its disclosure capabilities, and they are going to be very valuable in the new model.

Looking forward to reading your opinions.

K. Regards,

Ferran.