Sunday, July 28, 2019

SAP Profitability and Performance Management, Analytical Accounting and Data Governance.

Dear All,
Some months ago, we shared some impressions about the new SAP Profitability and Performance Management (formerly known as FS-PER).


Since then, some relevant updates deserve another look at this very interesting product.

The first one is the new name: the old SAP Financial Services Performance Management solution is now called SAP Profitability and Performance Management.

Along with the new name, SAP Profitability and Performance Management comes with broader Business Content, covering a wider scope of industries, including Telecom, Public Sector, Oil & Gas, Healthcare, Utilities, Transportation and Logistics Providers, Consumer Products, Automotive, Retail, etc.

Additionally, SAP Profitability and Performance Management is now available in both On-Premise and Cloud delivery.

In my opinion, there is some overlap between the functionalities offered by SAP Profitability and Performance Management and SAP Business Planning & Consolidation for S/4HANA, particularly outside of the Financial Services and Insurance industries.


On the other hand, since the release of the SAP Financial Products Subledger (SAP FPSL), the border between SAP Financial Services solutions and Non-Financial Services solutions (other industries) is much narrower, and the capacity for integrating Financial Services and Non-Financial Services Business Processes is much greater.

Since the release of Smart-AFI and SAP FPSL, we have missed an Accounting for Financial Instruments solution including strong Funds Transfer Pricing and Internal Costs (Standard, Operational and Capital Costs) functionalities. With Classic-AFI we had Profitability Analysis as an Integrated Financial and Management Accounting solution for Financial Instruments, but that functionality was not carried over to Smart-AFI and SAP FPSL.

SAP Profitability and Performance Management covers this gap. The Funds Transfer Pricing Business Content of SAP PaPM gives us the possibility of determining the Funding Costs of Financial Instruments, but its open architecture and its capacity for connecting agnostically to any SAP and non-SAP data provider and repository (FI-CO, FPSL-SDL, FPSL-RDL, SAP-BW, HANA tables, etc.) also offer the possibility of leveraging all its calculation functionalities in practically any Business Process.
The same logic applies to the calculation of Capital and Operational Costs; the SAP PaPM Business Content comes with many scenarios for the management, allocation and distribution of Direct and Indirect Costs.
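
To make these two ideas concrete, here is a minimal sketch in plain Python (the funding curve, rates and cost drivers are made up for illustration; this is not PaPM configuration or Business Content). It splits a loan's yearly client revenue into a matched-maturity funding cost and a commercial margin, and distributes a shared indirect cost pool in proportion to a driver.

# Minimal sketch, not PaPM functions: matched-maturity funds transfer pricing
# and a simple driver-based allocation of indirect costs.

# Hypothetical funding curve: tenor in years -> annual funding rate
FUNDING_CURVE = {1: 0.010, 3: 0.015, 5: 0.020, 10: 0.028}

def ftp_split(principal, tenor_years, client_rate):
    """Split the yearly client revenue into funding cost and commercial margin."""
    ftp_rate = FUNDING_CURVE[tenor_years]   # matched-maturity transfer rate
    return {"funding_cost": principal * ftp_rate,
            "commercial_margin": principal * (client_rate - ftp_rate)}

def allocate_indirect_costs(cost_pool, drivers):
    """Distribute a shared cost pool to receivers in proportion to a driver."""
    total = sum(drivers.values())
    return {receiver: cost_pool * value / total for receiver, value in drivers.items()}

print(ftp_split(principal=1_000_000, tenor_years=5, client_rate=0.035))
print(allocate_indirect_costs(cost_pool=200_000,
                              drivers={"Retail": 120, "Corporate": 60, "Treasury": 20}))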

But this flexibility also brings a challenge: Data Governance is a big opportunity for improvement in most banks, and implementing SAP PaPM without a previous redefinition of a centralized data model can increase the risk of adding complexity to the bank's IT landscape.

Very few banks have a robust, reconcilable central repository of Operational and Analytical Data. As we mentioned in a previous blog, the SAP Banking data model offers this capacity, but the reality is that only a few banks have already seen the competitive advantage of the SAP Integrated Financial and Risk Architecture.


In my opinion, a very hard exercise of redefining the banks' IT Architecture, focused on improving Data Governance, is becoming mandatory, and the regulator is going to make it clear very soon.


Looking forward to reading your opinions.

K. Regards,

Ferran.


Join the SAP Banking Group at: https://www.linkedin.com/groups/92860

Visit my SAP Banking Blog at: http://sapbank.blogspot.com/

Let's connect on Twitter: @FerranFrancesGi

Tuesday, June 11, 2019

Get ready for the Targeted Review of Internal Models with SAP Bank Analyzer – Credit Risk.

Dear all,
In December 2015 the European Central Bank decided that it would carry out a project to assess whether the internal models currently used by banks comply with regulatory requirements, and whether their results are reliable and comparable.

Since the 2008 Financial Crisis, regulators have shown concerns about the use of internal models to determine Risk Weighted Assets and Regulatory Capital Requirements, mainly for two reasons:

- The complexity of the models, which makes it difficult to assess whether risks are being mapped correctly and consistently.

- The high variability and potential inconsistencies between the Risk Weighted Assets and Regulatory Capital Requirements calculated by different banks with similar portfolios, when each of them uses its own internal models.

The Basel II agreement trusted the banks' capacity to develop their own internal risk models for the calculation of credit, operational and market risk exposures and the corresponding Capital Requirements. Unfortunately, in my experience, many banks rely on lightly formalized and difficult-to-audit risk models, including desktop-based spreadsheets.

In January 2013 the Basel Committee on Banking Supervision issued the document "BCBS 239: Principles for effective risk data aggregation and risk reporting", establishing the 14 principles for risk data aggregation and risk reporting that banks should follow.
To some extent, these 14 principles established the generic guidelines that, some years later, were formalized in TRIM.
I shared some details about it four years ago.
https://www.linkedin.com/pulse/bcbs-239-principles-sap-bank-analyzer-ferran-frances-gil/

Another alert came from the European Banking Authority, which in March 2015 issued a discussion paper called "Future of the IRB Approach", concluding that there are significant differences in the application of the IRB requirements across EU banks, and consequently divergences in risk estimates and capital requirements that cannot be explained by differences in risk profiles.

https://eba.europa.eu/documents/10180/1003460/EBA-DP-2015-01+DP+on+the+future+of+IRB+approach.pdf

TRIM presents important challenges to banks.
Most banks have heterogeneous and very low-quality data spread across different silos, lacking traceability back to the banks' Operational Systems.

Although banks' IT architects are trying to improve the situation by building Central Data Hubs, I am pessimistic about the final result.

Central Data Hubs are designed following an ad-hoc/on-demand approach: data architects collect data requirements from the consumer systems, design the data repository according to those requirements, and then define the interfaces for collecting data from the source systems and populating it into the destination systems.

To some extent, this approach implies reinventing the wheel, relying on the bank's own experience and on the limited integration capabilities of the bank's information architecture.

Additionally, regulation evolves, and new requirements force the data architects to enhance the multiple repository tables and in/out interfaces, compromising the integrity of the initial design.

You can find some details here.
https://www.linkedin.com/pulse/central-data-hubs-sap-finance-risk-platform-ferran-frances/

SAP Bank Analyzer – Credit Risk provides a strong framework of tools for fulfilling the Targeted Review of Internal Models.

- The Source Data Layer provides a Central Repository of homogeneous Operational Data (Master and Transaction Data).

The Primary Data Objects of the Bank Analyzer Source Data Layer have been designed to cover the Analytical and Regulatory requirements, independently of the capacities or limitations of the Operational Banking Systems. Banks' architects can take advantage of these standard templates as a basic reference, trusting that the regulatory requirements will be fulfilled, and performing a gap analysis that helps them identify data inaccuracies and redundancies.

- The Extract, Transformation and Loading capabilities of SAP Smart Data Integration, on premise and in the cloud, combined with the high-performance capabilities of SAP HANA for storing and managing very high volumes of data, reducing intermediate tables and assuring the referential integrity of the database.

- The Historical Database provides functions for collecting data from the Operational Banking Systems, calculating the Default Rates, storing data for the calibration of the models, and audit and historization capabilities (see the sketch after this list).

- The Credit Exposure calculation determines the risk key figures for exposures and their collateral, meeting the requirements of the Basel III Accord.
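
To make the default-rate point concrete, here is a generic sketch with made-up data (not the Bank Analyzer Historical Database model or API): the observed one-year default rate per rating grade is simply the number of defaulted obligors divided by the obligors observed at the start of the period, which is the kind of figure used to back-test and calibrate an internal PD model.

# Generic sketch, not Bank Analyzer: observed one-year default rates per rating grade.
from collections import defaultdict

def observed_default_rates(observations):
    """observations: one record per obligor and year, e.g. {"grade": "BB", "defaulted": False}."""
    counts = defaultdict(lambda: [0, 0])        # grade -> [defaults, observed obligors]
    for obs in observations:
        counts[obs["grade"]][0] += 1 if obs["defaulted"] else 0
        counts[obs["grade"]][1] += 1
    return {grade: defaults / total for grade, (defaults, total) in counts.items()}

sample = [{"grade": "BB", "defaulted": False}] * 97 + [{"grade": "BB", "defaulted": True}] * 3
print(observed_default_rates(sample))           # {'BB': 0.03}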

At the same time that the TRIM audits are taking place, other regulatory developments are happening, particularly the implementation of the IFRS 9 accounting framework.

TRIM demands coherence between the results of the Expected Loss calculations under Basel IV (Solvency) and IFRS 9 (Accounting), which requires effective collaboration between the Finance and Risk divisions of the banks. In my experience, banks' information systems lack a common architecture and a holistic data model facilitating this reconciliation.
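
To illustrate why this reconciliation matters, the sketch below uses the same PD/LGD/EAD inputs for both a one-year regulatory expected loss and a deliberately simplified lifetime expected credit loss (all parameters are made up, and this is not the prescribed Basel or IFRS 9 methodology); TRIM expects the two results to be explainable against each other.

# Illustrative sketch with made-up parameters: the same risk inputs drive both
# the one-year regulatory expected loss and a simplified lifetime ECL.

def regulatory_el(pd_1y, lgd, ead):
    """One-year expected loss: PD x LGD x EAD."""
    return pd_1y * lgd * ead

def lifetime_ecl(annual_pds, lgd, ead, discount_rate):
    """Simplified lifetime ECL: yearly expected losses, conditional on having
    survived the previous years, discounted to today."""
    survival, ecl = 1.0, 0.0
    for year, pd_y in enumerate(annual_pds, start=1):
        ecl += survival * pd_y * lgd * ead / (1 + discount_rate) ** year
        survival *= (1 - pd_y)
    return ecl

print(regulatory_el(pd_1y=0.02, lgd=0.45, ead=1_000_000))                              # 9000.0
print(lifetime_ecl([0.02, 0.025, 0.03], lgd=0.45, ead=1_000_000, discount_rate=0.03))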

This is the foundation of the Integrated Financial and Risk Architecture of Bank Analyzer. And this integration has improved significantly with version 9 and Smart-Accounting.


This was just a short description; you can find more details about TRIM in the guide issued by the ECB in February 2017, and we will talk about this topic again in future articles.
https://www.bankingsupervision.europa.eu/ecb/pub/pdf/trim_guide.en.pdf

Looking forward to reading your opinions.
K. Regards,
Ferran Frances.
www.capitency.com

Join the SAP Banking Group at: https://www.linkedin.com/groups/92860

Visit my SAP Banking Blog at: http://sapbank.blogspot.com/

Let's connect on Twitter: @FerranFrancesGi

Ferran.frances@capitency.com

Thursday, May 23, 2019

Capital Optimization with SAP Financial Products Subledger.

Dear all,

The Financial System is under a big stress, coming from two forces.

1) Excess of Debt ($250 Trillion). Debt consumes Capital, and historically high levels of Debt reduce the banks' Capital available for Investing and Lending.

https://www.bloomberg.com/graphics/2019-decade-of-debt/

2) Limited Economic Growth. Capital is generated by economic growth and slow economic growth means weak Capital generation.

As we are in an economic environment of Limited Capital Generation and persistent Capital Consumption, Capital has become scarce; and don’t forget that Capital is the most important Resource of the Financial System.

Since the 2008 Crisis, Bailouts and Quantitative Easing Cycles have produced the illusion that Capital was available, delaying the effects of the Capital scarcity and the necessary transformation of the Financial System.

https://www.reuters.com/article/us-eurozone-ecb-qe/the-life-and-times-of-ecb-quantitative-easing-2015-18-idUSKBN1OB1SM

I have been an SAP Analytical Banking consultant since 2006, and after the 2008 Financial Crisis I thought that the time of the Financial System's transformation had arrived; I was wrong.

For a decade, ultra-aggressive monetary policies have delayed the problem, pushing down yields and artificially inflating the value of assets.

Again, this has delayed the problem of the Capital scarcity but it has not solved anything.

This week the IMF urged the German banking sector to accelerate restructuring.

https://www.reuters.com/article/us-germany-economy-imf/imf-urges-german-banking-sector-to-accelerate-restructuring-idUSKCN1SN14X

Reducing Operational Costs by restructuring banks is just the first step of a much deeper transformation; the Financial System must change from a model based on Volume to a model based on Efficient Management of Capital, and this is a much more complicated challenge.

Setting Efficient Management of Capital as the main priority requires redesigning the banks' Information Systems Architecture; three objectives need to be fulfilled.

1) Planning the bank's Sales (Lending & Investment), maximizing the Profit weighted by Capital consumption.

2) Applying Risk Hedging strategies in a timely and efficient manner.

3) Developing a Capital Allocation model which assures that Actual Sales Operations follow Sales Planning.

Planning the bank's sales while maximizing the Profit weighted by Capital Consumption requires a centralized and holistic modeling of the bank's exposures and collaterals, and an integrated calculation of Portfolio Valuation, Capital Consumption and Provisioning.
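
As a minimal numeric sketch of "Profit weighted by Capital consumption" (the figures, risk-weighted assets and the 10.5% target capital ratio below are assumptions for illustration, not Bank Analyzer output), candidate deals can be ranked by the return they generate per unit of capital they would consume:

# Illustrative sketch: rank candidate deals by return on the regulatory capital
# they would consume (a simple RAROC-style figure).

def return_on_capital(margin, expected_loss, operating_cost, rwa, capital_ratio=0.105):
    """Profit after expected loss and costs, per unit of capital consumed.
    Capital consumed is approximated as risk-weighted assets * target capital ratio."""
    capital = rwa * capital_ratio
    return (margin - expected_loss - operating_cost) / capital

deals = {
    "Mortgage":       return_on_capital(12_000, 2_000, 3_000, rwa=350_000),
    "Corporate loan": return_on_capital(30_000, 9_000, 6_000, rwa=1_000_000),
}
# Plan sales starting with the deals that pay the most per unit of capital consumed.
print(sorted(deals.items(), key=lambda kv: kv[1], reverse=True))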

You can find a presentation of the concept in the following YouTube link: https://www.youtube.com/watch?v=GkcVF5CWVrU&t=1s

Applying Risk Hedging strategies in a timely and efficient manner requires detailed (Financial Transaction granularity) and fast reporting and simulation capabilities, something that can only be achieved with the Integrated Financial and Risk Architecture of Bank Analyzer, in combination with the high-performance computing capabilities of SAP HANA.

Finally, developing a Capital Allocation model which assures that Actual Sales Operations follow Sales Planning requires a bidirectional and seamless integration between the Operational and Analytical banking systems. SAP Bank Analyzer 9 offers a simpler interface with SAP Banking Services than previous releases, and as the simplification effort pays off, the required bidirectional communication will become feasible. I will elaborate more on this topic in future articles.

Looking forward to reading your opinions.

K. Regards,

Ferran Frances.

www.capitency.com

Join the SAP Banking Group at: https://www.linkedin.com/groups/92860

Visit my SAP Banking Blog at: http://sapbank.blogspot.com/

Let's connect on Twitter: @FerranFrancesGi

Ferran.frances@capitency.com

Tuesday, May 7, 2019

Collateral Management with SAP S/4HANA for Financial Products Subledger Data Platform.

Dear all,
As you probably know, SAP recently released the new SAP S/4HANA for Financial Products Subledger: https://www.youtube.com/watch?v=-veOZgkxllQ

There are many advantages to SAP S/4HANA for Financial Products Subledger, and I would need several articles to describe them, but today I will focus on its capacity for the efficient management of a bank's collaterals.

One of the main consequences of the 2008 Financial Crisis was the acknowledgement that the Financial System was severely under-capitalized; the issue was tackled with two complementary approaches:

- Governments and Central Banks recapitalized the Financial System with Bail-Outs, Troubled Asset Relief Programs and Quantitative Easing Cycles.
- Regulators focused on increasing the Capital Requirements and making them visible, with new Solvency and Accounting regulations (IFRS 9, IFRS 15, IFRS 16, IFRS 17, Basel III, Basel IV, etc.).

Recapitalizing the Financial System has been a temporary measure. Global debt has kept growing and weak economic growth has generated new capitalization tensions, which have become more visible as the end of the Quantitative Easing Cycles has been announced in Europe.

With the normalization of monetary policy, Non-Performing Assets will become illiquid, pushing down their value and again increasing the capitalization issues.

In this scenario of capital scarcity all forms of capital need to be managed efficiently, and collaterals are probably the form of capital with the poorest representation in the Bank's Information Systems.

Banking Information Systems are General Ledger-centered, and as Collaterals are not represented in the Balance Sheet, banks' IT Architects have not paid much attention to their modeling.

As with any other bank right or obligation, collateral management has an Operational and an Analytical component.

The Operational management of collaterals focuses on the technical details of the collateral and its contractual relationship with the asset whose risk it hedges.

The Analytical management of collaterals focuses on the sustainable value of the collateral and its capacity for reducing the capital consumption and limiting the impairment provision of the asset whose risk it hedges.

In Bank Analyzer and S/4HANA for Financial Products Subledger, collaterals are modelled as two different objects:

- As an SDL Financial Transaction, representing the contractual relationship between the collateral and the asset.
- As an RDL entry, representing the effective capacity of the collateral for reducing Credit Risk exposures and limiting impairment provisions.

Although a collateral has a single Nominal Value, it can have several different Credit Risk mitigation capacities, depending on the Solvency calculation approach the bank follows: Simplified Standardized, Comprehensive Standardized, Foundation IRB or Advanced IRB.
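
The Comprehensive (haircut-based) approach makes this visible: the same collateral nominal value provides different mitigation depending on the haircuts applied. Below is a minimal sketch of the standard formula E* = max(0, E*(1+He) - C*(1-Hc-Hfx)); the haircut values are illustrative, not the regulatory supervisory haircuts.

# Sketch of the comprehensive approach for financial collateral.

def adjusted_exposure(exposure, collateral, he, hc, hfx=0.0):
    """E* = max(0, E*(1+He) - C*(1-Hc-Hfx)): remaining exposure after collateral,
    with haircuts for exposure volatility (He), collateral volatility (Hc)
    and currency mismatch (Hfx)."""
    return max(0.0, exposure * (1 + he) - collateral * (1 - hc - hfx))

# The same 800,000 of collateral mitigates less when it is volatile or in another currency.
print(adjusted_exposure(1_000_000, 800_000, he=0.0, hc=0.04))            # e.g. a liquid bond
print(adjusted_exposure(1_000_000, 800_000, he=0.0, hc=0.15, hfx=0.08))  # e.g. equity with an FX mismatch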

Many times a group of collaterals covers a group of exposures; determining the most efficient distribution of the collaterals across the exposures reduces the capital consumed, which is the foundation of the Dynamic Management of Collaterals, one of the main Capital Optimization techniques.
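
A toy version of that optimization (illustrative figures and a generic linear program, not the Bank Analyzer implementation) distributes a pool of collaterals across exposures so that the risk-weighted unsecured amount, and with it the capital consumed, is minimized:

# Toy sketch: allocate collaterals to exposures to minimize the risk-weighted
# unsecured amount. Figures are illustrative.
import numpy as np
from scipy.optimize import linprog

exposures    = np.array([1_000_000, 600_000])   # exposure amounts E_j
risk_weights = np.array([1.0, 0.5])             # risk weight of the unsecured part of each exposure
collaterals  = np.array([400_000, 300_000])     # usable collateral values C_i (after haircuts)

n_c, n_e = len(collaterals), len(exposures)
# Decision variables x[i, j] = amount of collateral i assigned to exposure j, flattened row by row.
# Minimizing -risk_weight_j * x[i, j] maximizes the risk-weighted relief.
c = np.tile(-risk_weights, n_c)

A_ub, b_ub = [], []
for i in range(n_c):                            # each collateral can only be used up to its value
    row = np.zeros(n_c * n_e); row[i * n_e:(i + 1) * n_e] = 1
    A_ub.append(row); b_ub.append(collaterals[i])
for j in range(n_e):                            # coverage beyond the exposure brings no extra relief
    row = np.zeros(n_c * n_e); row[j::n_e] = 1
    A_ub.append(row); b_ub.append(exposures[j])

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=(0, None))
allocation = res.x.reshape(n_c, n_e)
print(allocation)
print("risk-weighted unsecured amount:", risk_weights @ (exposures - allocation.sum(axis=0)))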

The high-performance in-memory computing capabilities of HANA facilitate building simulation scenarios and stress-testing, but before banks can take advantage of them, they must improve the representation of collaterals in their Information Systems.

SAP Bank Analyzer and S/4HANA for Financial Products Subledger provide a centralized repository of collaterals, facilitating regulatory reporting, the calculation of Risk Weighted Assets and Impairment provisions, stress-testing of collateral values and simulation scenarios for Capital Optimization.

This is just a brief description of some of the advantages of the Data Model of the Integrated Financial and Risk Architecture of SAP Bank Analyzer and S/4HANA for Financial Products Subledger. We’ll continue in future blogs.

Looking forward to reading your opinions.

K. Regards,

Ferran.

www.capitency.com

Join the SAP Banking Group at: https://www.linkedin.com/groups/92860

Visit my SAP Banking Blog at: http://sapbank.blogspot.com/

Let's connect on Twitter: @FerranFrancesGi

Ferran.frances@capitency.com