
Bank Analyzer


  • Bank Analyzer (FS-BA)

    Purpose Bank Analyzer supports risk and return management by calculating, measuring, and analyzing financial products. The structure of Bank Analyzer is based on the Integrated Finance and Risk Architecture (IFRA) and meets today's requirements (International Accounting Standards (IAS), Basel II, Risk Adjusted Performance Measurement, and Sarbanes-Oxley) for financial products.

    Bank Analyzer is a family of products that consists of the following components:

    Data Load Layer (FS-BA-DL)

    Source Data Layer (FS-BA-SD)

    Processes and Methods (FS-BA-PM)

    Results Data Layer (FS-BA-RD)

    Analytics (FS-BA-AN)

    Infrastructure (FS-BA-IF)

    Tools (FS-BA-TO)

    Data Load Layer (FS-BA-DL)

    Purpose This component contains the functions for importing source data and results data from SAP

    NetWeaver Business Intelligence (BI) to the specific interfaces in the Source Data Layer (SDL) or

    Results Data Layer (RDL) in Bank Analyzer. This is part of the general extraction, transformation and

    loading process (ETL process) that you can use to transfer data from your own source systems to

    Bank Analyzer.

    Integration The following graphic shows the components that are part of the ETL process:


    1. Extraction

    The system extracts data from operational systems (full load or delta load) and saves the extracted

    data in SAP NetWeaver BI. The data is stored in DataStore objects, which have the same structure as

    the data from the feeder system.

    2. Transformation

    In SAP NetWeaver BI, the system transforms the extracted operational data into an analytical format,

    and saves this as the result of the transformation process. The analytical format is largely the same

    as the format used in the inbound interfaces for the Source Data Layer and Results Data Layer.

    3. Loading

    The system loads the transformation results from SAP NetWeaver BI as InfoProviders into Bank

    Analyzer.
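    The three steps above can be pictured with a minimal, technology-neutral sketch. The record structures and function names below are purely illustrative assumptions; the real process runs on SAP NetWeaver BI DataStore objects, InfoProviders, and the SDL/RDL inbound interfaces, not on Python dictionaries.

```python
# Conceptual sketch of the three ETL steps; all names are illustrative,
# not actual Bank Analyzer or SAP NetWeaver BI APIs.

# 1. Extraction: pull records from an operational (feeder) system.
def extract(feeder_records, delta_since=None):
    """Full load when delta_since is None, otherwise a delta load."""
    if delta_since is None:
        return list(feeder_records)
    return [r for r in feeder_records if r["changed_on"] > delta_since]

# 2. Transformation: map the operational format to the analytical format
#    expected by the SDL/RDL inbound interfaces.
def transform(record):
    return {
        "primary_object_id": record["contract_no"],
        "business_date": record["posting_date"],
        "amount": record["amount"],
        "currency": record["currency"],
    }

# 3. Loading: hand the transformed records to an inbound interface.
def load(analytical_records, inbound_interface):
    for rec in analytical_records:
        inbound_interface(rec)

feeder = [
    {"contract_no": "C1", "posting_date": "2024-03-01",
     "amount": 1000.0, "currency": "EUR", "changed_on": "2024-03-02"},
]
load([transform(r) for r in extract(feeder)], inbound_interface=print)
```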

    Features

    The load process

    The Data Load Layer connects the transformed data within SAP NetWeaver BI and the storage

    locations in Bank Analyzer, and reads the data from the InfoProviders in SAP NetWeaver BI. It calls the relevant interfaces in the Source Data Layer and Results Data Layer.

    Since the volume of data may be large, the data load process can be run as a parallel job.

    Custom key figures and characteristics can be transformed flexibly during the data load process

    if appropriate Customizing settings are made.

    Process control

    Process control is part of the Data Load Layer and is also integrated in the SAP NetWeaver BI technology. This ensures that the complete ETL process is subject to standard process control and monitoring.

    The function is integrated into BI technology, which provides the process chain category FS Data Load Function for use in the definition of a BI process chain. The process is scheduled and monitored in BI.

  • The status of the process is written back to BI.

    Tracking of changes

    Each object that was changed during the transformation process in BI is included in the loading process. The changes are handled as change pointers in the Change Notification Service (CNS). This tool collects all the changes made to an object (in this case the Bank Analyzer primary object) so that all the changes can be applied at once.

    The change indicators, which are created in BI and stored in Bank Analyzer, are the starting point for the loading process. The loading process updates all the objects in Bank Analyzer that were changed in SAP NetWeaver BI (the update BAPIs are called for SDL objects, and the APIs are called for RDL data).

    A log is created of all the primary objects that were changed.
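    As a rough illustration of the change-pointer idea (collect every change made to a primary object, then apply them in one pass and log the object), here is a hedged sketch. The class and method names are invented and do not correspond to the actual Change Notification Service API.

```python
from collections import defaultdict

class ChangePointerCollector:
    """Toy stand-in for the change-pointer handling described above."""

    def __init__(self):
        self._pointers = defaultdict(list)  # primary object ID -> changes

    def record_change(self, primary_object_id, change):
        # Each change made in BI is registered as a pointer for its object.
        self._pointers[primary_object_id].append(change)

    def apply_all(self, update_object):
        """Apply all collected changes per object in one pass and log the objects."""
        log = []
        for object_id, changes in self._pointers.items():
            update_object(object_id, changes)   # one update call per object
            log.append(object_id)
        self._pointers.clear()
        return log                              # objects that were changed

# Usage sketch with a hypothetical primary object ID and a stand-in update call
collector = ChangePointerCollector()
collector.record_change("LOAN-4711", {"field": "amount", "new": 250000})
collector.record_change("LOAN-4711", {"field": "rate", "new": 0.031})
changed = collector.apply_all(lambda oid, changes: None)
print(changed)  # ['LOAN-4711']
```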

    Constraints

    The Data Load Layer does not contain data checks. The system sends data that has been

    transformed and mapped directly to the inbound interface of the Bank Analyzer system.

    Each load process can supply the last version of an object only. It is not possible to process

    more than one version for each business day.

    The system does not load business partner data. The only way that the system can load business partner data into the Bank Analyzer system is by means of an existing interface for business partners.

    Source Data Layer (FS-BA-SD)

    Purpose You use this component to manage original data for the Bank Analyzer system.

    The system uses the Data Load Layer component to load original data from other operational systems or source systems into the Source Data Layer (SDL) by means of an extraction, transformation, and loading process (ETL process). The SDL saves, consolidates, and manages the original data. At the same time it provides interfaces to additional operational systems.

    The primary objects of the Source Data Layer (SDL) and their scenario versions are a flexible way of saving master data and flow data. They also group this data into units that belong together logically from a business perspective. This ensures that the Bank Analyzer components that are linked to the SDL have a standard, consistent data source.

    In addition to storing primary object data, the SDL provides the following functions for the applications linked to it:

    Access to Source Data

    General Functions for Source Data Layer

    Methods for Source Data

    General Access to Corrections

    Tools

  • Integration The SDL provides both the central original data basis and a part of the underlying infrastructure for linked applications. It is therefore a key element in ensuring the consistency of data and results.

    Processes & Methods (FS-BA-PM)

    Purpose You can use this component to carry out all financial and risk calculations for Bank Analyzer. Unlike

    Methods, Processes combine the selection, checking, and processing of data into one step.

    The system generates the calculation results using either original data from the Source Data Layer

    (SDL) or existing results data. Existing results data comes from either source systems or previous

    calculation steps. The system then stores data that has been completely valuated in the Results Data

    Layer (RDL).

    General Calculation and Valuation Methods (FS-BA-PM-GM)

    General calculation and valuation functions provide you with various methods for upstream

    processing.

    Various Bank Analyzer components can use the results data from these methods.

    Determination of Net Present Values and Calculation Bases (FS-BA-PM-EIC)

    You use this process to calculate net present values and other key figures that you can use as input

    for calculating funding costs and standard costs. This component calculates funding results, standard

    cost rates and the effective capital over time, for instance.

    Accounting for Financial Products (FS-BA-PM-AFP)

    Accounting Processes

    Accounting processes comprise business transaction processing and financial position

    management in Accounting for the subledger scenario.

    Cost Accounting Processes

    Cost Accounting Processes contain the functions for profitability analysis.

    Hedge Processes (FS-BA-PM-HP)

    Hedge processes provide various functions for IAS and Basel II. In particular, you can use these

    service functions for key date valuations and hedge accounting.

    Credit Risk (FS-BA-PM-CR)

    Credit risk provides up-to-date control instruments for the simulation, planning, and analysis of the

    overall bank with its different levels. Risk management reflects the reporting obligations imposed by

    the banking supervisory authorities.

  • Results Data Layer (FS-BA-RD)

    Purpose You can use this component to store, display, and edit results data. This results data is based on accounting-related or risk-related analyses of financial transactions or financial instruments in Bank Analyzer (Basel II, IAS Financial Reporting), or on analyses using other analysis tools. Results data is stored in the Results Data Layer (RDL) in results data areas in the form of result types.

    The RDL is part of the Integrated Finance and Risk Architecture (IFRA). By means of common dimensions (for example, financial transaction ID, financial instrument ID, or legal entities) that are shared by results within a results data area, the RDL provides a basis for the integration of results data. It stores data in an infrastructure that is semantically and technically standardized, which enables standardized usage for existing and future applications that are integrated in the system.

    The RDL provides the following functions:

    Storage of results in results data areas

    Aggregation

    Versioning

    Archiving

    External interfaces

    User Interfaces

    Example You have started a process that generates documents for financial transactions and financial instruments. These documents are stored in the RDL. The financial reporting process reads the data from the RDL and generates results that are used for the profit and loss statement or for the annual financial statement.

    Analytics (FS-BA-AN)

    Purpose This component contains analytical applications that call results data for Processes and Methods from

    the Results Data Layer (RDL) and, if necessary, continue to process this data.

    The Regulatory Reporting Interface, for example, gets data from the RDL and transfers this to the

    reporting functions in SAP NetWeaver Business Intelligence (BI). The Historical Database gets data

    from the Source Data Layer (SDL) and processes it as part of data storage based on a time series in

    accordance with Basel II.

    Features

    Components Relevant for Accounting

    General Ledger Connector (FS-BA-AN-GL)

    If you use the subledger scenario, the General Ledger Connector reads the subledger documents

    from the RDL and transfers results data to a connected general ledger.

  • Financial Statement Preparation (FS-BA-AN-FSP)

    Financial statement preparation includes Balance Object Manager, Balance Processing, and

    Aggregated Transactions. In Balance Object Manager you create balance objects (BO) that define the

    processing level for processes in Balance Processing, in particular the object that is to be included in

    reporting. Balance Processing loads results data from the RDL and prepares the period-end

    processing for financial products, such as the balance sheet and income statement including notes to

    the financial statements.

    Merge Scenario (FS-BA-BA)

    The merge scenario processes only those financial instruments and transactions whose IAS valuation differs from their valuation according to local GAAP. The merge component converts local

    GAAP data to IAS data. The system creates a complete IAS balance sheet, including an income

    statement and notes to the financial statements.

    The merge scenario stores the results data not in the RDL but in the Result Database (RDB).

    Components Relevant for Basel II

    Historical Database (FS-BA-AN-HDB)

    The Historical Database is a time-based data store and meets the Basel II requirements for

    managing historical data. The system can provide the HDB with data from the Source Data Layer

    (SDL), RDL, or another source system.

    Disclosure and Reporting (FS-BA-AN-DR)

    The Disclosure and Reporting component provides utilities for selecting and extracting reporting

    data and meets Basel II requirements of the Capital Accord. The Disclosure and Reporting

    component supports external disclosure and internal reporting, and provides support for supervisory

    investigations and stress test reports.

    Regulatory Reporting Interface (FS-BA-RR)

    The Regulatory Reporting Interface ensures connection to external reporting tools in accordance

    with Basel II. It loads data from the SDL and RDL, converts it into a fixed format, and makes it available to reporting tools.

    Additional Components

    Limit Manager (FS-BA-AN-LM)

    Limit Manager provides support when determining, analyzing, and limiting counterparty/issuer

    risks, country risks, or Basel II-specific key figures. Banks set different maximum risk amounts in order

    to limit the potential harm caused by the insolvency of a business partner. Limit Manager also

    provides operational functions and supports both internal and external reporting.

    Profit Analyzer (FS-BA-AN-PA)

    Profit Analyzer ensures that costs and revenues are assigned to the individual bank transactions,

    customers, or other segments that gave rise to them. During the profitability analysis, the system

    updates results as single items and evaluates them in terms of various criteria. You can use Profit

    Analyzer to carry out sales planning based on custom characteristics.

    Strategy Analyzer (FS-BA-AN-STA)

    Strategy Analyzer provides a net present value analysis and a gap analysis for market risk

    management. The net present value analysis shows the value of a portfolio on a key date. You can

  • use the gap analysis to examine your portfolio with regard to interest rate risks by creating incoming

    and outgoing payments, liquidity, and net interest income for a longer period of time, for example.

    Infrastructure (FS-BA-IF)

    You can use this component to call functions that provide central services to the various Bank Analyzer components.

    Infrastructure contains the following functions:

    Data Load Layer (FS-BA-DL)

    Communication and Worklist Services

    Calculation and Valuation Process Management

    Extraction and Reporting Services

    Correction Services

    General Scenario Management

    Settings for XI Services

    Tools (FS-BA-TO)

    Purpose You can use this component to call functions that are used in various places in Customizing

    for Bank Analyzer.

    In addition, the following tools are available:

    Garbage Collector

    Schedule Manager

    Segmentation Service

    Features

    Derivation Tool (FS-BA-TO-DE)

    The derivation tool enables you to control how the system derives characteristics and key

    figures from other characteristics and key figures, and how it derives the fixed fields of a field

    catalog. In Bank Analyzer the system calls derivations from the coding or by using a

    secondary data source. You can create this secondary data source with the module editor in

    Customizing for Bank Analyzer.

    You can state the derivation environment for deriving the validity of a hedging relationship, for example, in Customizing for Bank Analyzer by choosing Processes and Methods → Hedge Processes → Portfolio Fair Value Hedge → Configuration → Derivation of Validity. You use

    this derivation process in the secondary data source in order to use the characteristics of a

    transaction to derive whether the transaction is one of the qualified positions or unqualified

    positions in hedge accounting.
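    The following sketch shows the general idea of such a derivation: source characteristic values of a transaction are mapped to a derived value (here, whether a position qualifies for hedge accounting). The rule table, field names, and values are purely hypothetical; the actual derivation is configured in Customizing rather than coded.

```python
# Hypothetical derivation rule: source characteristics -> derived characteristic.
DERIVATION_RULES = [
    # (product_type, accounting_category) -> hedge qualification
    (("SWAP", "FAIR_VALUE"), "QUALIFIED"),
    (("BOND", "FAIR_VALUE"), "QUALIFIED"),
    (("LOAN", "AMORTIZED_COST"), "UNQUALIFIED"),
]

def derive_hedge_qualification(transaction):
    """Derive one characteristic from other characteristics of the transaction."""
    key = (transaction["product_type"], transaction["accounting_category"])
    for source_values, derived_value in DERIVATION_RULES:
        if key == source_values:
            return derived_value
    return "UNQUALIFIED"  # fallback if no rule matches

print(derive_hedge_qualification(
    {"product_type": "SWAP", "accounting_category": "FAIR_VALUE"}))  # QUALIFIED
```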

    Module Editor (FS-BA-TO-ME)

    The module editor generates modules that contain a sequence of processing steps. The

    modules are used to enrich user-defined information and provide the system with secondary

    data sources.

    An application makes entries into the fields of an input structure and calls the module. The

    system applies each processing step of the module in the sequence defined in Customizing.

  • The system can call function modules, derivations, or primary data sources within the module.

    The system then makes entries into fields of the output structure.

    Modules can have various functions. The selection module of the Strategy Analyzer, for

    example, selects data using the Primary Data Source processing step. The calculation

    module of Profit Analyzer carries out complex calculations using the processing steps formula, derivation, and function module.

    You can find the settings for the module editor, for example, in Customizing for Bank Analyzer by choosing Bank Analyzer → Analytics → Profit Analyzer → Profit Engine → Calculation → Edit Modular Costing.
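    A simplified sketch of the module concept follows: an application fills an input structure, the module applies its processing steps in the configured order, and the result lands in an output structure. The step names and fields are invented assumptions; the actual module editor works on Customizing-defined steps and ABAP structures.

```python
# Toy module: an ordered sequence of processing steps applied to an input
# structure, producing an output structure. Step names are hypothetical.
def formula_step(data):
    data["margin"] = data["revenue"] - data["funding_cost"]
    return data

def derivation_step(data):
    data["segment"] = "RETAIL" if data["customer_type"] == "PRIVATE" else "CORPORATE"
    return data

class Module:
    def __init__(self, steps):
        self.steps = steps  # processing steps in their configured order

    def execute(self, input_structure):
        output = dict(input_structure)
        for step in self.steps:
            output = step(output)  # each step enriches the structure
        return output

costing_module = Module([formula_step, derivation_step])
print(costing_module.execute(
    {"revenue": 120.0, "funding_cost": 80.0, "customer_type": "PRIVATE"}))
```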

    Result Database (FS-BA-TO-RDB)

    The Result Database (RDB) is a database in which the system saves results data

    permanently. Results are then available for further processing, by reporting, for example, or

    for additional calculation runs.

    The RDB and the Results Data Layer (RDL) are two different results databases in which the system can store results data. Each database is based on different principles. The RDB is found in a variety of forms in Bank Analyzer. These forms depend on the various areas (Financial Accounting, Basel II). The RDL is a standardized results data store for accounting and risk-based analyses of financial transactions or financial instruments.

    For the long-term we recommend that you use the central RDL to store results data in a standardized way. In Customizing for Bank Analyzer you can choose whether the system is to store Basel II-specific results data in the RDB or the RDL.

    Processing Framework (FS-BA-TO-PFW)

    The Processing Framework supplies the processing rules with data from various data

    source categories. The calculation and allocation processing rules are available in Profit

    Analyzer, for example. The system uses suitable selection conditions to create a worklist. The

    system can also add further information using a secondary data source. The result records

    generated by the processing rules are forwarded to the temporary buffer. The system provides verification lists which you can use to check whether the result records are plausible from a business perspective. The result records are then updated in data drains.

    You can also start the processing steps manually. In a typical scenario, you include the

    processing steps in the Schedule Manager which then carries out an automatic month-end

    processing on the basis of this.
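    The flow just described can be sketched under stated assumptions: selection conditions build a worklist, a secondary data source enriches it, processing rules produce result records that are buffered, checked for plausibility, and then written to data drains. None of the names below are actual Processing Framework objects.

```python
def processing_framework_run(source_records, selection, enrich, rule, data_drain):
    """Hedged sketch of one Processing Framework run."""
    worklist = [r for r in source_records if selection(r)]   # build the worklist
    worklist = [enrich(r) for r in worklist]                 # secondary data source
    buffer = [rule(r) for r in worklist]                     # result records in buffer
    # Plausibility check; in the real system, verification lists are reviewed by a user.
    plausible = [r for r in buffer if r["amount"] >= 0]
    data_drain.extend(plausible)                             # update the data drains
    return plausible

drain = []
processing_framework_run(
    source_records=[{"id": "A1", "amount": 10.0}, {"id": "A2", "amount": -5.0}],
    selection=lambda r: True,
    enrich=lambda r: {**r, "cost_center": "CC01"},
    rule=lambda r: {**r, "allocated": r["amount"] * 0.1},
    data_drain=drain,
)
print(drain)  # only the plausible result record for A1 reaches the data drain
```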

    Run Administration (FS-BA-TO-RUN)

    Run administration provides you with various processing functions for the runs in the individual Bank Analyzer applications, and thus enables standard, uniform run administration.

    Aggregation Tool (FS-BA-TO-AGT)

    The aggregation tool is used to aggregate data from primary and secondary data sources,

    BAPIs, and the Data Processing Framework. The aggregation type is determined using

    granularities such as the branch or the business partner. Possible aggregation functions are

    determining minima, maxima, totals, or the number of occurrences of a certain value.

    You can find the settings for aggregation, for example, in Customizing for Bank Analyzer by choosing Analytics → Historical Database → General Settings for the Historical Database → General Settings for Data Selection → Settings for Aggregation Processes. You

    can use the Aggregation Business Add-In (BAdI) to override the results from the aggregation

    process you defined in the IMG activity Edit Aggregation. This enables you to change

    individual results.
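    For illustration only, here is how aggregation by a granularity (for example, business partner) with the functions minimum, maximum, total, and count could look. The record layout and field names are assumptions; the real aggregation tool is configured in the IMG activity mentioned above rather than coded.

```python
from collections import defaultdict

def aggregate(records, granularity, value_field):
    """Group by the granularity fields and compute min, max, total, and count."""
    groups = defaultdict(list)
    for rec in records:
        key = tuple(rec[g] for g in granularity)
        groups[key].append(rec[value_field])
    return {
        key: {"min": min(vals), "max": max(vals),
              "total": sum(vals), "count": len(vals)}
        for key, vals in groups.items()
    }

exposures = [
    {"business_partner": "BP1", "branch": "B01", "exposure": 100.0},
    {"business_partner": "BP1", "branch": "B01", "exposure": 250.0},
    {"business_partner": "BP2", "branch": "B01", "exposure": 40.0},
]
print(aggregate(exposures, granularity=("business_partner",), value_field="exposure"))
```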

  • Data Processing Framework (FS-BA-TO-DPF)

    The Data Processing Framework provides selection processes that supply data to the Historical Database, the Limit Manager, and, Bank Analyzer-wide, to generic BI data extraction and generic ad hoc reporting. For example, you determine the selection settings in Customizing for the Historical Database in the Edit Basic Settings for Data Sources section. Every selection is assigned to a fixed context (an application of the Data Processing Framework), which is in turn assigned to a certain application of the module editor. Data processing that is triggered either by a report or by generic data extraction, for example, can contain selection BAdIs as well as aggregations and general selection criteria.

    Configurator (FS-BA-TO-CON)

    The configurator reads characteristics and key figures and generates customer-specific database tables and field structures for further processing. The system calls these processes "generation". Currently, only the two Bank Analyzer accounting scenarios use generation. For more information, see the documentation about Generation.

    The division of the components ensures that data is stored in an integrated and consistent way. The

    system loads original data from operational systems or source systems into the Source Data Layer

    (SDL). The SDL is the original data basis for the processes and methods of Bank Analyzer. The

    valuation results of processes and methods are stored in the Results Data Layer (RDL). This structure

    ensures that original data, methods, and valuation results are clearly separated. The open, modular

    structure of Bank Analyzer supports a gradual implementation into existing system landscapes.

    Bank Analyzer provides a consistent view of a bank's operational data and enables you to process

    data promptly so that you are always in a position to provide current financial and risk information.

    Results data is therefore always available for decision-making and for day-to-day business.

    The figure below shows the structure of Bank Analyzer:


  • 1. The SDL manages the basic data for the measurement of financial products. This data is

    loaded from the operational source systems by means of extraction, transformation, and

    loading (ETL) processes.

    The SDL is the source for semantically integrated data for all valuation processes that are based on financial products, and is also a central consolidated source for analyses.

    The SDL is not used to store data that has already been analyzed completely. Instead, this data is stored in the RDL.

    2. The RDL manages consistent and reusable financial and risk data from various calculation

    and valuation processes for financial instruments and financial transactions.

    3. Reporting and Analytics read results data from the RDL. The Analytics layer contains

    analytical applications that call results from the RDL and process them as required. This means

    that results data is analyzed specifically for each application.

    4. Infrastructure and Tools provide central services and utilities for the various Bank Analyzer

    components.

    In addition to the RDL, Bank Analyzer also has a Result Database (RDB). RDL and RDB are two different results databases where the system can store results data. The RDB is found in a variety of forms in Bank Analyzer. These forms depend on the various areas (Financial Accounting, Basel II). The RDL is a standardized results data store for accounting and risk-based analyses of financial transactions or financial instruments.

    Integration The integrated data store for product-based source and results data is based on SAP NetWeaver

    Business Intelligence technology. SAP NetWeaver is the basis for integrating Bank Analyzer in

    various IT environments and internal bank solutions.

    Features Bank Analyzer contains the following solutions:

    SAP Financial Database

    The SAP Financial Database solution offers an extensive database infrastructure for analytical data

    and accompanying data processing systems. It is technically compatible with other SAP applications

    and with third-party applications.

    SAP Financial Database uses the following Bank Analyzer components:

    SDL (FS-BA-SD)

    RDL (FS-BA-RD)

    Cash Flow Generation (FS-BA-PM-GM-CFG)

    Correction Server (FS-BA-IF-CS)

    The system uses ETL processes to load original data from other systems or source systems into the

    SDL in the form of primary objects. Primary objects are a flexible way of storing master data and flow

    data in entities that belong together logically from a business perspective.

    Results data from financial calculations and valuations are stored in the RDL in results data areas in

    the form of result types. The SAP Financial Database uses the SDL and RDL to support the extensive

    versioning and authorization concept. In the SDL it provides functions to support the principle of dual

    control. This means that you can define special release rules to protect certain processes.

    Cash flow generation generates cash flows that are made up of a number of flows (for example,

    disbursement, interest, payment).

  • The correction server enables data flow management and records corrections so that inconsistencies can be found and displayed. Provided the relevant system settings are made, it can also find and display any entities belonging to these corrections.

    SAP Basel II

    The SAP Basel II solution supports the Basel II regulations for risk and capital adequacy management

    as well as new supervisory review and disclosure processes. The solution integrates both internal and

    external credit risk management on a central platform. Bank Analyzer supports all methods for

    calculating credit risk, from the standardized approach to the IRB advanced approach.

    In addition, the software covers the local requirements for the EU Directive and the German Solvency

    Regulation. You can use Customizing settings to define whether the calculation is for Basel II, the EU

    Directive, or the German Solvency Regulation.

    The system runs the calculation not only for real data, but also for stress data (for example, changes

    in the ratings of sovereigns or business partners).

    The SAP Basel II solution uses the following Bank Analyzer components:

    Account Pooling (FS-BA-PM-GM-AP)

    Free Line (FS-BA-PM-GM-FL)

    Determination of Default (FS-BA-PM-GM-DD)

    Credit Exposure (FS-BA-PM-CR-CE)

    Historical Database (FS-BA-HDB)

    Disclosure and Reporting (FS-BA-DR)

    Regulatory Reporting Interface (FS-BA-RR)

    SAP Accounting and Financial Instruments

    The SAP Accounting and Financial Instruments solution supports compliance with the International

    Financial Reporting Standards (IFRS) and local accounting standards.

    Subledger scenario

    In this scenario you use Bank Analyzer as a subledger for the accounting of financial instruments.

    You transfer financial instrument data to the Bank Analyzer system here. You can then post and price

    the related business transactions, aggregate documents, and transfer them to the general ledger. You

    can also create the financial statements for the end of the period. You can link the hedging

    relationships between financial instruments, test the effectiveness of the hedging relationships as per

    the accounting rules, and create accounting documents for the hedged items.

    In addition to the SDL and the RDL, the subledger scenario uses the following components:

    Accounting Processes

    Hedge Processes (FS-BA-PM-HP)

    General Ledger Connector (FS-BA-AN-GL)

    Financial Statement Preparation (FS-BA-AN-FSP)

  • SAP Accounting for Financial Instruments is released for volumes of up to 1 million financial transactions only. If the volume of your business exceeds 1 million transactions, a fit/gap analysis is required. For more information, contact your SAP account executive, or create an OSS message under component FS-BA.

    Merge scenario

    You can use this scenario to process financial instruments in accordance with IFRS, determine

    financial reporting data, consolidate data from individual companies, and create company reports. The

    system merges the calculated IFRS data with the local GAAP (Generally Accepted Accounting

    Principles) data and calculates the required financial statement items. You can link the fair value

    hedging relationships between financial instruments, test the effectiveness of the hedging

    relationships as per the accounting rules, and create accounting documents for the hedged items.

    You can display the results in reporting.

    The merge scenario stores results data in the RDB.

    SAP Hedge Management

    The SAP Hedge Management solution handles all hedging activities in line with IAS 39. Bank

    Analyzer covers fair value hedges, cash flow hedges, and portfolio fair value hedges. The system

    identifies hedged objects and hedging instruments, and maps these as hedging relationships in line

    with IFRS. Bank Analyzer provides prospective and retrospective effectiveness tests, and extensive

    functions for hedge accounting.

    SAP Profitability & Management Accounting

    This solution comprises scenarios for profitability analysis. Profitability analysis measures the indirect

    costs and income generated by each transaction in the bank's retail business. These include cash-

    flow-based financial transactions such as loans and accounts that can be measured on the basis of

    periodic volume information. The indirect costs and income to be measured are funding costs, funding

    revenue, and the standard costs for the following components: process costs, risk costs, and the cost

    of equity.

    Profitability analysis with accounting function (integrated accounting for financial products)

    You can use this scenario in conjunction with the subledger scenario for financial products only. It allows you to integrate financial accounting and management accounting. The integrated accounting scenario allows you to create income statements and balance sheets for organizational units such as business units or profit centers.

    Profitability analysis without accounting function

    In this scenario, you supply direct costs from source systems and use the profitability analysis functions without the Bank Analyzer component for accounting processes.

    SAP Profitability Analysis & Management Accounting and SAP Limit Manager are released only for volumes not exceeding 300 000 transactions. If the volume of your business exceeds this, a fit/gap analysis is required. For more information, contact your SAP account executive, or create an OSS message under component FS-BA.

    Additional Components

    Limit Manager (FS-BA-AN-LM)

    See the note under SAP Profitability Analysis & Management Accounting.

    Strategy Analyzer (FS-BA-AN-STA)

  • Profit Analyzer (FS-BA-AN-PA)

    Counterparty Risk

    Country Risk

    Limit Manager (FS-BA-AN-LM)

    Purpose To meet the requirements of risk management regulations and business considerations, Bank Analyzer contains functions for measuring, limiting, and analyzing default risks.

    Banks set different maximum risk amounts in order to limit the potential harm caused by the insolvency of a business partner.

    This function helps you manage defaults by means of limits and the online monitoring of these limits. These functions can be used to produce comprehensive reports for management purposes and for external purposes.

    Integration Limit Manager is part of Bank Analyzer. It uses the attributable amounts calculated from Credit Exposure, for example, and allocates them to the limits you define. You can display the results of the limit utilization runs using the SAP List Viewer (ALV) or SAP NetWeaver Business Intelligence (BI).

    For more information, see Architecture of Limit Manager.

    Features You use Limit Manager to manage risks by defining limits and monitoring them continuously to ensure that these limits are observed. Limits can be managed flexibly, since the limit characteristics that are available can be combined in any way.

    Limit Manager enables you to define different levels for the limitation of default risks. The limit area represents the highest level, and is used to separate different areas that are logically independent. There are different limit types for each limit area. You assign defined limit characteristics, such as an organizational unit, a business partner, or currency, to the limit types. Within a limit, you define specific limit amounts that are related to the characteristic values of a limit type.

    You can create a limit for each combination of limit characteristics and limit characteristic values. The limit is a maximum amount for limit utilizations that is defined in relation to certain values of the limit characteristics of a limit type.
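    To make the limit hierarchy concrete, the sketch below defines limits per combination of characteristic values and checks the attributable amounts against them. The structures and values are simplified stand-ins, not Limit Manager objects.

```python
# Hypothetical limit check: a limit is keyed by the characteristic values of a
# limit type (here: organizational unit, business partner, currency).
limits = {
    ("UNIT_01", "BP_4711", "EUR"): 1_000_000.0,
    ("UNIT_01", "BP_4712", "EUR"): 250_000.0,
}

def limit_utilization(attributable_amounts, limits):
    """Sum attributable amounts per characteristic combination and compare to the limit."""
    utilization = {}
    for key, amount in attributable_amounts:
        utilization[key] = utilization.get(key, 0.0) + amount
    report = []
    for key, used in utilization.items():
        limit = limits.get(key)
        report.append({
            "characteristics": key,
            "utilization": used,
            "limit": limit,
            "exceeded": limit is not None and used > limit,
        })
    return report

amounts = [(("UNIT_01", "BP_4711", "EUR"), 600_000.0),
           (("UNIT_01", "BP_4711", "EUR"), 500_000.0)]
for line in limit_utilization(amounts, limits):
    print(line)  # BP_4711 exceeds its 1,000,000 EUR limit
```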

    Architecture of Strategy Analyzer

    Integration Strategy Analyzer is one of the Bank Analyzer applications. As is the case with the other applications,

    Strategy Analyzer is also provided with data from the Source Data Layer (SDL). Reporting

    functions are provided by SAP NetWeaver Business Intelligence (BI) or directly in Bank Analyzer by

    the SAP List Viewer (ALV).

    SAP provides fixed key figures for NPV analysis and gap analysis in Strategy Analyzer; you cannot

    change these key figures. SAP provides pricing models for the valuation of financial transactions and

  • instruments. You can add your own pricing models in Customizing, and you can also connect external

    price calculators.

    Strategy Analyzer uses the General Calculation and Valuation Methods component in Bank Analyzer,

    which contains cash flow refinement methods, derivation strategies for preparing selected transaction

    data, and the price calculator for pricing transactions and positions.

    Data Flows Strategy Analyzer uses the same architecture for the net present value analysis and the gap analysis.

    For this reason, Strategy Analyzer is divided into two runs: the valuation run and the aggregation run.

    The valuation run prices transactions, and the aggregation run consolidates cash flows and net

    present values across a maturity band. In net present value analysis, you start the valuation run only.

    For gap analysis, however, you start both the valuation run and the aggregation run, except for the

    aggregation of single records in gap analysis, in which the results of a valuation run are displayed

    without being consolidated.

    NPV and gap analyses can be started online or as batch jobs. We recommend you start them in

    online mode only if the volume of data is small. In batch processing, Strategy Analyzer uses the

    Result Database (RDB) for interim results (IntR-RDB) and final results (FinR-RDB).

    In online processing, only the main memory is used and not the RDB. Moreover, reporting can only be carried out in the SAP List Viewer (ALV).

    You can write the results of the valuation runs to a file. You make this setting in Customizing for Strategy Analyzer for each valuation run type. If you select File as the data drain, the system writes the results of the valuation run to the application server in the form of a file. This file is then also available to other systems, as well as Bank Analyzer. The administrator of the application server has to ensure that only authorized users can access the data. We also recommend that you encrypt the data.

    Dependencies Not all valuation run results can be saved in file form on the application server. This is possible for split cash

    flows only.

  • Valuation Run

    Valuation runs are started for net present value analyses and gap analyses. In order to improve performance, a valuation run is usually divided into subvaluation runs that are started separately and that are processed in parallel. Each subvaluation run involves the following steps:

    Creation of a worklist

    The system uses InfoSets and selection characteristics to select the object IDs of the transactions and positions that are to be analyzed from the SDL.

    You can use selection criteria to restrict the worklist of a valuation run or its subvaluation runs. You might need to do this if, for example, a valuation run has multiple subvaluation runs that are supplied by the same InfoSet but that you want to process in different worklists. The selection criteria must not overlap, but together they must cover the entire valuation run worklist (see the sketch after this list).

    Selection of transactions and positions

    The transactions and positions are selected in the secondary data source.

    Formatting of cash flows

    In the secondary data source, the system calls up the Cash Flow Engine. The Cash Flow Engine contains multiple cash flow refinement methods that the system uses to change the valuation structure of transactions and positions in order to prepare the data for the analysis.

    Measurement of transactions and positions

    The system calculates the key figures of the selected key figure family (net present value or gap).

    Summarization of the segments

    In order to improve performance and reduce the volume of data, the system summarizes the results before it writes them to the Result Database and displays them there. Summarization is carried out for the segments defined in Customizing for Strategy Analyzer.
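    Since the selection criteria of the subvaluation runs must neither overlap nor leave gaps, a simple partition check can illustrate the constraint. The filters below are hypothetical single-characteristic selections, not actual selection criteria objects.

```python
def check_partition(worklist, subvaluation_filters):
    """Each worklist object must match exactly one subvaluation filter."""
    overlaps, gaps = [], []
    for obj in worklist:
        matches = [name for name, f in subvaluation_filters.items() if f(obj)]
        if len(matches) > 1:
            overlaps.append((obj["id"], matches))
        elif not matches:
            gaps.append(obj["id"])
    return overlaps, gaps

worklist = [{"id": "T1", "currency": "EUR"}, {"id": "T2", "currency": "USD"}]
filters = {
    "sub_eur": lambda o: o["currency"] == "EUR",
    "sub_usd": lambda o: o["currency"] == "USD",
}
print(check_partition(worklist, filters))  # ([], []) -> a valid partition
```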

    Aggregation Runs

    The aggregation run is started for gap analysis only, and involves the following steps:

    Maturity band summarization

    The system summarizes the interim results along the maturity band.

    Calculation of the net interest income

    Segment hierarchy summarization

    The system summarizes the interim results across the specified segment hierarchy along the maturity band.

    Currency translation

    The system translates the results into the display currency.

    Interpretation

    The system formats the aggregated gap analysis results and the net interest income in such a way that a complete result is available for each maturity band date. The system carries out this step for all the reporting settings that were determined in Customizing for the aggregation run.
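    A hedged sketch of the maturity band summarization step, assuming simplified interim results (due date plus amount) and user-defined band boundaries; the real run also handles segments, scenarios, and currency translation:

```python
from datetime import date
import bisect

def summarize_maturity_band(interim_results, band_ends):
    """Assign each cash flow to the first maturity band ending on or after its
    due date and total the amounts per band."""
    totals = [0.0] * (len(band_ends) + 1)   # last bucket: beyond the final band
    for flow in interim_results:
        idx = bisect.bisect_left(band_ends, flow["due_date"])
        totals[idx] += flow["amount"]
    return totals

band_ends = [date(2024, 3, 31), date(2024, 6, 30), date(2024, 12, 31)]
flows = [{"due_date": date(2024, 2, 15), "amount": 100.0},
         {"due_date": date(2024, 5, 1), "amount": -40.0},
         {"due_date": date(2025, 1, 10), "amount": 10.0}]
print(summarize_maturity_band(flows, band_ends))  # [100.0, -40.0, 0.0, 10.0]
```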

  • Net Present Value (NPV) Analysis

    Purpose To obtain an objective view of the financial and risk position of a bank, it has to be possible to value all financial assets by the sales price realizable on the market, and all financial liabilities by the redemption price demanded by the market. The net present value analysis in Strategy Analyzer is used for this purpose. This analysis enables the mark-to-market values of individual items or of a portfolio, for example, to be calculated.

    In addition to the mark-to-market valuation, financial transactions and financial instruments can also be valued at theoretical prices. This is particularly useful if you do not want to carry out a mark-to-market valuation of the items, or cannot do so because market data is missing.

    In the net present value analysis, you can enter any horizon you want so that the system can carry out evaluations for the current date and for future dates. You can also specify market data scenarios that the system is to use. This results in the following options for carrying out the net present value analysis:

    Evaluation today based on current market data

    All future cash flows are priced using the specified current market data, and the net present value is discounted to the horizon.

    Evaluation using scenario data

    All future cash flows are priced using the specified market data scenarios, and the net present value is discounted to the horizon date.

    Evaluation in the future using forward rates

    Transactions and positions are priced for a horizon in the future. Here the system calculates forward rates for the horizon from the current market data or market data scenarios on the evaluation date. It uses these forward rates to price all cash flows after the horizon date by discounting the net present values for the horizon date.

    You can also carry out the net present value analysis for historical dates. In this analysis, the system also uses the market data that is valid on the evaluation date (here, the historical market data).

    The transactions are selected from the Source Data Layer (SDL) by using selection characteristics, which you can define as required. A large number of settings are provided for the NPV analysis. These settings can be used to define how the net present values are displayed in reporting and include cash flow splitting and cash flow view settings.

    The relevant bid/ask spreads quoted on the market can be used for the financial positions in the NPV analysis. The system also prices transactions that are traded in different markets (German federal bonds or mortgage bonds) using yield curves that are specific to these markets. Likewise, the system uses different volatility curves to calculate the prices of standard options and exotic options.
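    The core of the evaluation options described above is discounting future cash flows with the applicable market data and expressing the result at the horizon date. A minimal sketch with a flat zero rate follows; the real analysis uses yield curves, spreads, and volatility curves from the market data, so the function and its inputs are simplifying assumptions.

```python
def npv(cash_flows, zero_rate, horizon=0.0):
    """Discount the cash flows occurring after the horizon and express the
    net present value as of the horizon date (times in years)."""
    pv_today = sum(cf / (1.0 + zero_rate) ** t
                   for t, cf in cash_flows if t > horizon)
    # Compounding up to the horizon turns the present value into a horizon value.
    return pv_today * (1.0 + zero_rate) ** horizon

# Example: a 3-year bullet loan with a 5% coupon, valued today and at a 1-year horizon.
flows = [(1.0, 50.0), (2.0, 50.0), (3.0, 1050.0)]
print(round(npv(flows, zero_rate=0.04), 2))               # value today
print(round(npv(flows, zero_rate=0.04, horizon=1.0), 2))  # value at the horizon
```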

    Process Flow Depending on the volume of the data that is to be analyzed, you should either start the NPV analysis immediately (online processing) or schedule it for a later point in time (batch job).

    Online analysis

    The analysis is called immediately, and the report is generated straight away. This type of analysis is suitable for small volumes of data only.

    Batch evaluation

  • The NPV analysis and the reporting of the results of the analysis are scheduled to start at a later point in time. This method is recommended for large volumes of data.

    You can display the results of the NPV analysis in reporting.


    Gap Analysis

    Purpose Gap analysis enables banks to monitor and manage interest rate risks from transactions so they can make strategic decisions with regard to gap positions for defined points in time. Liquidity analysis and the cash flow evaluation enable banks to manage their liquidity requirements and NPV risks.

    In contrast to NPV analysis, where risks are recorded using NPVs and future values, in gap analysis, position and maturity volumes as well as cash flows and liquidities are displayed on key dates or for periods. The gap positions, interest rate risk, currency risk, and liquidity risk that are disclosed in this way are then displayed.

    You can carry out gap analysis for single transactions or for user-defined segments in a segment hierarchy. In reporting, you can switch between different segment hierarchy levels and display the results by different cash flow views, market data scenarios, and currencies.

    The Strategy Analyzer gap analysis includes the following evaluations:

    Position evaluation

    The system compares the development of lending and borrowing positions from both the balance sheet and off-balance-sheet areas. You can carry out both a key date position evaluation and an average position evaluation.

    Maturity evaluation

    The system shows the NPV interest rate risk by using the fixed-rate cash flows. You can restrict the evaluation to particular currencies.

    Cash flow evaluation

    The system displays the NPV interest rate risk; the cash flows are displayed only up to the point in time at which the interest rate was fixed. You can restrict the evaluation to particular currencies.

    Liquidity evaluation

    The system depicts the incoming and outgoing payments for the capital tie-up. In contrast to the cash flow evaluation, only incoming and outgoing payments that are expected to be realized are displayed.

    NPV evaluation

    The system displays the NPVs of a portfolio or the associated cash flows in the maturity band. You can also use market data scenarios in the analysis. You can calculate full scenarios and delta scenarios.

    Net interest income evaluation

    The system calculates the potential net interest income for each maturity band. The capital tie-up is used as the basis for this. For variable items, the interest revenue or interest expense that has not yet been determined is calculated using the forward interest rate.

  • If the default setting is used, the system does this in all evaluations. In gap analysis, you can specify that the system does this for certain evaluations only in order not to impair system performance. For more information, see Creating Valuation Runs.

    You can use gap analysis as follows:

    To display the interest rate risk as a potential negative deviation in the net interest income per

    period from the expected net interest income per period

    To display position volumes for key dates and for periods and maturity volumes for key dates

    and periods in terms of their fixed interest rates and capital tie-up, and to display fixed-rate cash flows and incoming and outgoing liquidity

    To display gap positions as a comparison of the volume of lending and borrowing positions, and

    maturity volumes, as well as incoming and outgoing cash flows or liquidity flows

    To analyze positions, maturity, and cash flows from fixed-rate items for any subportfolio on a

    daily basis

    To display the net interest income for old business whilst using scenarios

    To include variable items without a fixed-interest period by means of due date scenarios

    (demand deposits and savings deposits) and forwards (for example, floaters, the variable side of swaps and forward rate agreements) in the analyses

    To include non-interest-bearing items without a fixed-interest period by using due date

    scenarios (for example, equity, provisions, land, and buildings) in the analyses

    To include optional interest rate instruments and their underlyings or delta-weighted underlyings

    (for example, forward swaps for swaptions, (fictitious) bonds for OTC interest rate options, options on futures) in the analyses

    To display the results distributed over maturity bands, which can be subdivided into any time

    period, for example, day, month, quarter, half-year, and year

    Example An interest rate risk exists, for example, if a fixed interest rate gap exists in the lending positions for a particular currency. The diagram below illustrates this:

    In the closed fixed interest rate block area, there is no risk because the product interest rates of the assets and liabilities are not affected by the market interest rates. The net interest income is therefore not affected by changes in the market interest rate. In the closed variable-rate block, it is assumed

  • that the changes in the market interest rates are reflected in both the asset-side and the liability-side items, meaning that the final net interest income is unchanged in this block too.

    Therefore, the actual risk is seen in the area of the fixed interest rate gap; in the area under Assets in this example. If, for example, the interest calculated for the variable-rate liabilities increases as a result of increases in the market interest rate, then you expect a decrease in the net interest income.
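    The effect described in this example can be reproduced with a small calculation: the fixed interest rate gap is the surplus of fixed-rate assets over fixed-rate liabilities, and a parallel rise in market rates reduces the net interest income roughly by the rate shift applied to the variable-rate funding of that gap. The numbers below are illustrative only.

```python
def net_interest_income(fixed_assets, fixed_rate_a, var_liabilities, var_rate_l):
    """Toy net interest income: fixed-rate lending partly funded at variable rates."""
    return fixed_assets * fixed_rate_a - var_liabilities * var_rate_l

fixed_assets = 800.0       # fixed-rate lending positions
fixed_liabilities = 500.0  # fixed-rate funding
gap = fixed_assets - fixed_liabilities   # fixed interest rate gap = 300.0
var_liabilities = gap                    # the gap is funded at variable rates

nii_before = net_interest_income(fixed_assets, 0.05, var_liabilities, 0.03)
nii_after = net_interest_income(fixed_assets, 0.05, var_liabilities, 0.04)  # +100 bp
print(gap, nii_before, nii_after)  # NII falls by gap * 0.01 = 3.0
```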

    Prerequisites Settings have to be made for the gap analysis in Customizing for the General Calculation and Valuation Methods and for Strategy Analyzer. For information about this, see Strategy Analyzer Architecture.

    Process Flow Depending on the volume of the data you want to analyze, you should either start the gap analysis immediately (online processing) or schedule it for a later date (batch processing).

    Online evaluation

    The analysis is called immediately, and the report is generated straight away. This type of analysis is suitable for small volumes of data only.

    Aggregation of valuation runs

    The aggregation run is called immediately on the basis of a valuation run that has already been carried out. The results are displayed straight away.

    Batch evaluation

    The gap analysis and the reports are scheduled to run at a later point in time. This method is recommended for large volumes of data.

    The system stores the results of the gap analysis in the Result Database (RDB). Reporting is carried out in SAP NetWeaver Business Intelligence (BI) or the SAP List Viewer (ALV).

    Run Administration

    Definition Run administration includes the following functions:

    Execute or create run

    Display an overview of runs

    Display application log

    Edit run

    Manage run

    Replace run

    Select run for archiving

    Delete run

    Log of deletion function

  • The above functions are not all available for each application. For more information, see the application-specific documentation.

    Use The following overview lists the runs available for each application:

    General Methods in Bank Analyzer: Account Pooling; Facility Distribution; Determination of the Free Line; Collateral Distribution; Determination of Default; stress tests (stress test for account pooling, stress test for facility distribution, stress test for the determination of the free line, stress test for collateral distribution, stress test for default determination)

    Credit Risk: Credit Exposure Run; Country Risk Run; stress test (Stress Test in Credit Exposure)

    Historical Database: version management (Historization Run for Data Layers, Historization Run for Banks' In-House Models, Uploading of Files); calculation functions (Determining Default Rates, Determining Average Default Rates, Determining Default Figures, Calculation of Migration Matrices); data retrieval (Exporting Data to In-House Models, Downloading of Files); stress runs (Stress Run for Supplying Models with Data, Generation of Scenario Data in the Source Data Layer)

    Generic BI Data Extraction: Testing the BI Extractor; BI Extraction Run. Extraction runs are created and executed in SAP NetWeaver Business Intelligence (BI); the system displays information about extraction runs in the run administration of Bank Analyzer.

    Regulatory Reporting Interface: Data Extraction Runs

    Limit Manager: Limit Utilization Run

    Strategy Analyzer: Valuation Run; Subvaluation Run; Aggregation Run

    Fair Value Effectiveness Test for Hedging Relationships: Fair Value Effectiveness Test Run

    Cash Flow Hedge Analysis: Creating Valuation Runs; Subvaluation Run; Creating Aggregation Runs

    Portfolio Fair Value Hedge: Initial Generation Run; Portfolio Item Run

    For some of the Bank Analyzer components, you can use the Schedule Manager to schedule and control jobs. If you use multiple applications, you can define the sequence in which the runs are to be carried out. For more information, see Schedule Manager.

    See also: Status Overview for Run Administration

    Tools

    In order to provide an overview of the evaluation bases while the system is in operation, you

    can display the individual Customizing settings. You have the following options:

    Displaying Field Instances

    Editing Secondary Data Sources

    Current Settings You can change the following Customizing settings in your operational system:

  • Create Maturity Band

    Edit Due Date Scenario

    Edit Scenarios and Scenario Progressions

    To set up scenarios, on the SAP Easy Access screen choose Bank Analyzer → Processes and Methods → Hedge Processes → Cash Flow Hedge Analysis → Current Settings → Edit Scenarios or Bank Analyzer → Analytics → Strategy Analyzer → Current Settings → Edit Market Data Scenarios.

    To set up scenario progressions, on the SAP Easy Access screen choose Bank Analyzer → Processes and Methods → Hedge Processes → Cash Flow Hedge Analysis → Current Settings → Edit Scenario Progressions or Bank Analyzer → Analytics → Strategy Analyzer → Current Settings → Edit Scenario Progressions.

    For information about other functions, see the document Market Data Scenarios in the Source Data Layer (SDL) documentation.

    The Strategy Analyzer contains the function Edit Filter.


    Profit Analyzer (FS-BA-PA)

    Purpose

    This component provides a costing and allocation system that allows costs and revenues to be

    assigned to individual bank transactions, customers, profit centers, or other definable segments in a

    way that reflects their true cause.

    The results are updated as line items as part of a profitability analysis and can be evaluated in

    accordance with various user-defined criteria. The results can be evaluated on the basis of market

    segments, such as products, customers, regions, or organizational units, for example, a profit center.

    In this way, Profit Analyzer allows you to cost, for example, a product, a customer, or a profit center.

    Profit Analyzer can also be used to plan sales on the basis of user-definable characteristics and key

    figures.

    Features

    Profit Analyzer is divided into the following components:

    ...

    1. Profit Engine

    2. Profitability Analysis

    3. Profitability Planning

    ...

    1. Profit Engine

    In the Profit Engine, individual contracts, or any other segments, are costed by means of modular

    costing. A variety of valuation functions that can be combined are provided for this purpose. The

    allocation module carries out allocations between individual segments. The processing

    framework provides data, manages and logs processing, and updates the results.

    2. Profitability Analysis

    All the results determined by the Profit Engine are consolidated in Profitability Analysis. In terms of

    processes, Profitability Analysis is responsible for the following subprocesses:

    Depicting completed periodic contribution margin accounting and Profitability Analysis.

    Structuring and updating line items

    Providing data at any aggregation level

    Providing results data for internal and external access

    Data flow and controlling through Profitability Analysis

    Complete profitability analysis means period-specific contribution margin calculation after all

    allocations have been carried out.

    Profitability Analysis is part of Business Accounting (B-Accounting). For more information, see the

    relevant documentation.

    3. Profitability Planning

    Profitability Planning in Profit Analyzer supports the overall process of sales planning as an instrument of reporting for financial institutions. User-defined key figures are planned. They are classified by user-defined characteristics.

    In order to carry out operative sales planning, Profit Analyzer uses the SAP SEM-BPS application. This application is shipped separately and is not integrated in Profit Analyzer. For more information, see the documentation on the SEM-BPS application.

    Profitability Management

    Definition Business Accounting is both the most important data sink for Profit Analyzer data and a Profit Analyzer data source. To

    enable Profit Analyzer to use Business Accounting, you have to make specific settings for Profit

    Analyzer (Profitability Management) in addition to the basic accounting settings.

    These settings concern in particular:

    Variant (profitability management view)

    Line items

    Realignment

    Special key figures

    Use

    Set Up a Variant

    A profitability management view is a variant of a set of basic data (the data basis). The data basis is

    the highest entity in Business Accounting. The accounting systems are provided with the key figures

    and characteristics of the data basis. The variant contains the key figures and characteristics of a data

    basis that are relevant for Profit Analyzer and comprises a consistent analysis of profitability

    (calculation/contribution margin accounting) in Profitability Management (not to be confused with the

    entry variant for line items).

    Only one variant can be active for each data basis. The active variant is the central data store for

    Profit Analyzer. You use the variant to first store the Profit Analyzer data as line items in Business

    Accounting, and then as totals records (aggregated line items) in an InfoCube in SAP NetWeaver

    Business Intelligence (BI). From this InfoCube, Analyzers can request the data via a primary or

    secondary data source; see also: Data Storage for Accounting Views.
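    The step from line items to totals records is essentially an aggregation over the characteristic combinations of the variant. The following minimal Python sketch illustrates the idea only; the field names and record layout are assumptions, not the actual variant or InfoCube definition.

        from collections import defaultdict

        # Hypothetical line items posted through the active variant (field names are assumptions).
        line_items = [
            {"org_unit": "1000", "product": "LOAN", "period": "2006-03", "amount": 120.0},
            {"org_unit": "1000", "product": "LOAN", "period": "2006-03", "amount": -20.0},
            {"org_unit": "2000", "product": "DEPOSIT", "period": "2006-03", "amount": 55.0},
        ]

        def totals_records(items, characteristics=("org_unit", "product", "period")):
            """Aggregate line items into one totals record per characteristic combination."""
            totals = defaultdict(float)
            for item in items:
                totals[tuple(item[c] for c in characteristics)] += item["amount"]
            return dict(totals)

        print(totals_records(line_items))
        # {('1000', 'LOAN', '2006-03'): 100.0, ('2000', 'DEPOSIT', '2006-03'): 55.0}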

    Line Items

    You can create line items manually if data was not supplied from the source systems on time or

    correctly.

    This is a delta correction, in which missing values (such as key figures) are added, and existing

    documents are not overwritten.

    Example:

    The nominal volume of a transaction has been incorrectly entered as 1 million instead of 1.2

    million. You have to create a new line item with the same characteristic values and a nominal

    volume of 0.2 million.

    If you need to change the characteristic values of a posted document, you first have to cancel the

    original document and then create a new document that contains the correct characteristic values.

    Example:

    A business transaction was assigned to the wrong organizational unit. You have to cancel the

    original document and then post a new document that contains the correct organizational unit.
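    The two correction patterns described above can be summarized in a short Python sketch. It is illustrative only and does not reflect the Bank Analyzer posting interface; the record layout and helper functions are assumptions.

        def delta_correction(posted_amount, correct_amount, characteristics):
            """Key figure error: post an additional line item for the difference; the original stays."""
            return {"characteristics": characteristics, "amount": correct_amount - posted_amount}

        def cancel_and_repost(original_item, corrected_characteristics):
            """Characteristic error: cancel the original document, then post a corrected one."""
            cancellation = {**original_item, "amount": -original_item["amount"]}
            corrected = {"characteristics": corrected_characteristics, "amount": original_item["amount"]}
            return [cancellation, corrected]

        # Nominal volume entered as 1.0 million instead of 1.2 million: post a delta of 0.2 million.
        print(delta_correction(1_000_000, 1_200_000, {"org_unit": "1000"}))  # amount: 200000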

    The posting date of the new document can be either in the past or in the future. The system displays

    all the characteristics and key figures of this data basis variant. You use the entry variant to determine

    whether fields can be maintained or whether they are predefined. Note that when you enter a

    currency, the key figure currency of all the key figures that refer to this currency field contains the new

    currency.

    Realignments

    Realignment is the process in which you change the structure of a company, template hierarchy, or

    organization, for example. During this process, postings that have already been made are adjusted

    retroactively. Two InfoCubes are available for this purpose: The first InfoCube (As Posted view)

    contains the data originally posted. The other InfoCube (By Current Structure view) contains the

    changed data as if the new structure had always existed in this form.
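    The relationship between the two views can be pictured with a minimal sketch: the As Posted data is left unchanged, while the By Current Structure view applies the new assignment retroactively. The mapping and field names below are assumptions for illustration.

        # Hypothetical realignment: branch "B1" moves from profit center "PC-OLD" to "PC-NEW".
        as_posted = [
            {"branch": "B1", "profit_center": "PC-OLD", "revenue": 300.0},
            {"branch": "B2", "profit_center": "PC-2", "revenue": 150.0},
        ]
        realignment = {"B1": "PC-NEW"}  # branch -> new profit center

        # As Posted keeps the original postings; By Current Structure re-derives the characteristic.
        by_current_structure = [
            {**rec, "profit_center": realignment.get(rec["branch"], rec["profit_center"])}
            for rec in as_posted
        ]
        print(by_current_structure[0]["profit_center"])  # PC-NEW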

    Special Key Figures

    You use BI technology to calculate key figures at runtime. These calculated key figures (special key

    figures) are to be used in addition to the updated key figures, and can be defined in Profitability

    Management. You can define your own aggregation processes in addition to using the BI logic for

    aggregating values.

    Activities ...

    1. Set Up a Variant

    To set up a variant, in Customizing for Bank Analyzer choose Analytics → Profit Analyzer → Profitability Management → Set Up Variant.

    When you set up a variant, you have to consider the following issues:

    Which basis key figures, calculated key figures, and characteristics do you want to use for costing/contribution margin accounting?

    Are any realignments planned? If so, which characteristics are affected?

    The more characteristics and characteristic values you include in the

    variant/InfoCube, the more time the system requires for the analyses.

    2. Line Items

    In order to enter line items later, you first have to create an entry variant. To do so, in Customizing for Bank Analyzer choose Analytics → Profit Analyzer → Profitability Management → Line Items → Characteristic and Key Figure Groups/Entry Variants.

    An entry variant is the form that you use to update line items for corrections, for example, in

    Profitability Management. Entry variants are therefore a selection of characteristics,

    characteristic values, and key figures that define the part of the variant of the data basis that

    you want to correct. You can create any number of entry variants.

    When you create an entry variant, you have to consider the following issues:

    Which characteristics and key figures are to be entered?

    Which fields should be required entry fields?

    Which fields should contain default values? If required entry fields contain default

    values, can these default values be overwritten?

    Can the calculation module be used to fill additional fields that are locked for entry?

    When you enter line items, you can use an authorization concept based on characteristics, and you can apply a calculation module to the entered data, for example, to check whether the data is plausible or to perform calculations.

    To assign a calculation module and a characteristic profile to an entry variant, in Customizing for Bank Analyzer choose Analytics → Profit Analyzer → Profitability Management → Line Items → Assign Costing Module and Characteristic Profile to a Screen Variant. You can determine whether a calculation module is to be used and if so, which one. If no calculation module is run, the data is forwarded directly to the data store in order to be updated.

    To enter a line item, on the SAP Easy Access screen choose Bank Analyzer → Analytics → Profit Analyzer → Profitability Management → Line Item Entry for Corrections.

    To use the document you have just posted as a template, choose Transfer Template. You can

    change this template.

    Two additional options are also provided for filling a new document

    (you can choose New Line Item to empty the fields):

    To use an existing document as a template, choose Environment → Line Item Entered Manually. You can select a document and choose the appropriate pushbutton to use it as a template.

    To display and cancel the source document, choose Environment → Line Item Entered Manually → Source Document.

    To call a calculation module and to carry out a valuation, choose Valuation. The result of the

    valuation is displayed, but not updated. You must have already set up the calculation module

    and assigned it to an entry variant.

    You can also choose Simulation to carry out a valuation. In this case, however, the documents

    are also displayed in the form in which they would appear if they were posted in Business

    Accounting.

    To post the documents, choose Save. When you post the documents, the system checks the

    authorization in accordance with the characteristic profile that you have assigned to the entry

    variant.

    The valuation is also carried out when you post the documents. Once you have posted the

    documents, the system automatically notifies the correction server.

    Two IMG activities are required for this purpose:

    In Customizing under Bank Analyzer → Infrastructure → Communication and Worklist Services → Data Sources → Primary Data Sources → Edit Primary Data Sources.

    In Customizing under Bank Analyzer → Infrastructure → Correction Services → Edit Correction Components.

    See also: Entry of Line Items.

    3. Realignment

    To define a realignment, on the SAP Easy Access screen choose Bank Analyzer → Analytics → Profit Analyzer → Profitability Management → Edit Realignments.

    Create a realignment request. When you do so, the data affected by the realignment is selected

    for a data basis. The actual realignment is executed in the realignment run.

    To execute a realignment, on the SAP Easy Access screen choose Bank Analyzer → Analytics → Profit Analyzer → Profitability Management → Execute Realignments.

    You use a derivation strategy or an externally defined method to execute the realignment. You

    can define how the data is realigned for each characteristic.

    For more information, see Realignment in the Business Accounting documentation. You can define

    that the realignment process is to be subject to user authorization checks based on characteristics. ...

    4. Assign Calculated Key Figures

    InfoObjects are usually used for the communication of the data for characteristics and key

    figures between individual Analyzers and the Source Data Layer.

    However, there are no InfoObjects for calculated key figures. This means that you have to

    assign each calculated key figure to a key figure in the environment catalog (SDL).

    To do so, in Customizing for Bank Analyzer choose Analytics → Profit Analyzer → Profitability Management → Special Key Figures → Assign Calculated Key Figures.

    5. Assign Special Aggregation

    Special logic (average calculation, last value) is assigned to the key figures. This involves

    enhancing the logic that is already available in BI. You can use this logic for primary or

    secondary data sources of the Profit Analyzer category.

    Example:

    The system contains an entry for the months January to March. No income was obtained for

    the months April to November in this area. You want to calculate the average for the calendar

    year at the start of December, including November. The total income is divided by 11, using the

    AVG aggregation category.

    LAS delivers the last value. In this example, the last value is not the last posted value

    (revenue from March). The last value is the value for November, which is zero.
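    The example can be reproduced with a few lines of Python. AVG divides the total by the number of periods considered (January through November, that is, eleven months); LAS returns the value of the last period considered, which is zero here. The data layout is an assumption.

        # Monthly income; only January to March carry postings (illustrative values).
        income = {"2006-01": 100.0, "2006-02": 120.0, "2006-03": 80.0}
        months = [f"2006-{m:02d}" for m in range(1, 12)]  # January through November

        values = [income.get(month, 0.0) for month in months]
        avg = sum(values) / len(values)  # AVG: total income divided by 11
        las = values[-1]                 # LAS: last value considered (November) -> 0.0
        print(round(avg, 2), las)        # 27.27 0.0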

    In Customizing for Bank Analyzer choose Analytics → Profit Analyzer → Profitability Management → Special Key Figures → Assign Special Aggregation.

    Profit Engine (FS-BA-PA-PE)

    Purpose You use this component to calculate bank-specific costs and revenue, in particular the revenue

    components of the asset, liability, and service transactions in banks, as well as the standard unit costs

    incurred at different levels.

    The results components of the costed transactions can be neutralized at different hierarchy levels or

    distributed to various items. This enables a previously costed bonus or premium that was allocated to

    one customer service representative to be removed (neutralized) at overall bank level, for example. If

    a results component is distributed, a revenue component is assigned to two customer service

    representatives in a particular ratio, for example.

    The results data records are forwarded to Profitability Analysis, where line items are generated from

    the data records and consolidated in a user-defined contribution margin scheme.

    Features The Profit Engine component is divided into the following subcomponents: ...

    1. Processing framework

    2. Modular costing

    3. Allocation

    4. Value determination

    5. Derivation

    6. Verification lists

    Processing framework

    The processing framework reads data from a data source and provides it for costing or allocation

    purposes. The data records generated are transferred to Profitability Analysis for the purpose of line

    item generation. The data records can also be transferred to a file or table. Status management for

    the costing or allocation processes is carried out within the processing framework.

    Modular costing

    Modular costing generates new costing components by carrying out various valuation and retrieval

    functions. Modular costing consists of elementary functions that can be combined for particular

    processes.

    Value determination

    In modular costing, it must be possible to derive currency amounts, percentages, or quantities on the

    basis of characteristics:

    The values are determined depending on any combination of characteristic values.

    These currency amounts, percentages, and quantities are determined using a multi-step access logic. For example, the system first searches for a percentage for a particular combination of customer group and product. If none is available, it searches for a valid percentage first at product group level and then at organizational area level.

    The value determination tool determines the above values for modular costing.
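    A minimal sketch of this multi-step access logic, written as a fallback search over increasingly general keys; the access sequence, keys, and values are assumptions for illustration.

        # Hypothetical value table: most specific key first, then coarser fallback levels.
        percentages = {
            ("CUSTGRP-A", "PROD-1"): 0.015,   # customer group + product
            ("PRODGRP-LOANS",):      0.020,   # product group
            ("ORG-RETAIL",):         0.025,   # organizational area
        }

        def determine_value(*access_keys):
            """Return the value for the first access level that yields a hit."""
            for key in access_keys:
                if key in percentages:
                    return percentages[key]
            return None

        # Search order: customer group/product, then product group, then organizational area.
        rate = determine_value(("CUSTGRP-B", "PROD-1"), ("PRODGRP-LOANS",), ("ORG-RETAIL",))
        print(rate)  # 0.02, found at product group level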

    Allocations

    Allocations are:

    The distribution of profitability values

    The neutralization of imputed results figures at aggregated level

    Distribution:

    Distribution is a transfer of profitability values (in particular costs or revenues) from one

    or more senders to one or more receivers.

    Neutralization:

    Costing results are determined in both real (for example, actual costs) and imputed

    results figures (for example, bonus/premium, standard unit costs). These imputed results

    figures are identified on lower levels (single transaction, for example) but have to be

    taken out of the figures at higher levels (overall bank, for example) so that the overall

    bank result is correct.

    New data records are generated during the allocation process.
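    Both allocation types generate new data records, which the following sketch illustrates. The sender/receiver structures, ratios, and record layout are assumptions; this is not the delivered allocation module.

        def distribute(sender, receivers_with_ratios, amount):
            """Distribution: move a profitability value from a sender to receivers in a given ratio."""
            records = [{"segment": sender, "amount": -amount}]
            total_ratio = sum(ratio for _, ratio in receivers_with_ratios)
            records += [{"segment": receiver, "amount": amount * ratio / total_ratio}
                        for receiver, ratio in receivers_with_ratios]
            return records

        def neutralize(imputed_amount, level="OVERALL_BANK"):
            """Neutralization: offset an imputed figure (e.g. a bonus) at an aggregated level."""
            return {"segment": level, "amount": -imputed_amount}

        # A bonus of 100 costed for one customer service representative, split 60:40 to two others.
        print(distribute("REP-1", [("REP-2", 60), ("REP-3", 40)], 100.0))
        print(neutralize(100.0))  # offsetting record at overall bank level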

    Derivation

    In the derivation tool, additional, logically dependent characteristics are determined on the basis of

    particular characteristics. The derivation can be carried out in several steps.

    For example, the characteristic branch is determined on the basis of the characteristic branch office, and the characteristic business area is then determined on the basis of the branch.
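    A two-step derivation of this kind can be sketched as chained lookups; the derivation tables below are assumptions for illustration.

        # Step 1: branch office -> branch; step 2: branch -> business area (illustrative tables).
        branch_by_office = {"OFFICE-01": "BRANCH-NORTH", "OFFICE-02": "BRANCH-SOUTH"}
        business_area_by_branch = {"BRANCH-NORTH": "RETAIL", "BRANCH-SOUTH": "CORPORATE"}

        def derive(record):
            """Derive logically dependent characteristics in several steps."""
            record["branch"] = branch_by_office[record["branch_office"]]
            record["business_area"] = business_area_by_branch[record["branch"]]
            return record

        print(derive({"branch_office": "OFFICE-01"}))
        # {'branch_office': 'OFFICE-01', 'branch': 'BRANCH-NORTH', 'business_area': 'RETAIL'}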

    Verification lists

    You can display the results of modular costing and of the allocations in verification lists before the

    data records are updated in Profitability Analysis. In Profitability Analysis, the data records that have

    been processed without errors can be checked for business accuracy. To enable comparisons to be

    made between the result records and the results from previous periods, the data records can be

    extracted from the verification lists to the Business Information Warehouse (BW).

    Profitability Planning

    Purpose Profitability Planning in Profit Analyzer supports the overall process of sales planning as an instrument of reporting for financial institutions. User-defined key figures are planned. They are classified by user-defined characteristics.

    In order to carry out operative sales planning, Profit Analyzer uses the SAP SEM-BPS (Business Planning and Simulation) application. This application is shipped separately and is not integrated in Profit Analyzer. For more information, see the documentation on the SEM-BPS application.

    Integration Sales planning is based on actual values, from which plan values are generated during the planning process, as well as data that is loaded from Profitability Analysis or non-SAP systems, for example.

    Data from the individual systems is merged within planning using SAP NetWeaver Business Intelligence (BI), which BPS uses for data storage purposes. Note that the granularity level at which planning is to be carried out can be generated when data is extracted to BI by means of simply aggregating the actual data records. If several Cubes are to be merged, all characteristics must be identical and filled.

    Sales planning is carried out at branch office level, for example, while profitability analysis data is available at account level. The data records in Profitability Analysis also contain the branch office characteristic, which enables the single records to be aggregated at branch office level.
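    The precondition for merging Cubes can be expressed as a simple consistency check: identical characteristics that are filled in every record. The following sketch is an illustration only; the data layout is an assumption.

        # Illustrative check: Cubes to be merged must expose identical, filled characteristics.
        def can_merge(cubes):
            """cubes: list of (characteristics, records) pairs; the layout is an assumption."""
            reference = set(cubes[0][0])
            if any(set(chars) != reference for chars, _ in cubes):
                return False
            return all(rec.get(c) not in (None, "")
                       for _, records in cubes for rec in records for c in reference)

        actuals = (["account", "branch_office"],
                   [{"account": "A1", "branch_office": "OFFICE-01"}])
        plan    = (["account", "branch_office"],
                   [{"account": "P1", "branch_office": "OFFICE-01"}])
        print(can_merge([actuals, plan]))  # True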


    Portfolio Credit Risk

    Purpose This component enables you to measure, analyze, and control default risks. Default risk is the potential loss incurred from a financial transaction in the event of the business partner being unable to meet contractual obligations due to specific economic or political causes. Default risks are classified as follows:

    Counterparty risk describes the danger of a loss in the value of a receivable due to a worsening of the creditworthiness of the business partner.

    Country risk describes the risk of a loss in value due to a worsening of the credit standing of the country risk country. This is the country whose situation affects the business payments.

    Portfolio Credit Risk contains functions for counterparty risk only.

    Counterparty Risk

    Definition The risk of an unexpected loss in the value of a receivable in a contract due to a worsening of the

    credit standing of a business partner.

    Use Counterparty Risk identifies risks and provides key figures to measure and control credit risk as part

    of the bank management process.

    Structure Counterparty risk is calculated as follows: ...

    1. The input data that is needed to calculate the counterparty/issuer risk is selected

    (see Selection Management in the Source Data Layer). The main types of input data are:

    Business partner data

    Contract data

    Collateral data

    2. Counterparty risk is calculated at business partner level, or for a group of business partners

    and their contracts. It is calculated as follows:

    The balances of contracts are netted off against one another on the basis of legal or

    economic aspects (see Account Pooling).

    Business partner data is aggregated on the basis of legal or economic aspects, or as

    required for specific models, or for system performance reasons (see Summarization

    Schema).

    Summarized business partner data is transferred to a credit risk model (such as

    CreditMetrics, or CreditRisk+), which returns the calculated risk key figures

    (see Interface to Portfolio Models).

    3. Risk key figures are saved along with their characteristics, and made available to other

    business applications and processes.

    Since risk key figures cannot usually be returned at contract level, some business processes have to redistribute the key figures back to the individual contracts (see Redistribution); a sketch of these calculation steps follows below.
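    The calculation steps above can be pictured with a compact Python sketch: contracts are netted per pool, exposures are summarized per business partner group, and a group-level risk key figure is redistributed to the contracts, here simply in proportion to exposure. All structures and the proportional rule are assumptions for illustration, not the delivered algorithms.

        from collections import defaultdict

        # Hypothetical contracts: (contract id, netting pool, business partner group, balance).
        contracts = [("C1", "POOL-A", "GRP-1", 100.0),
                     ("C2", "POOL-A", "GRP-1", -40.0),
                     ("C3", "POOL-B", "GRP-2", 70.0)]

        # 1. Account pooling: net the contract balances within each pool.
        pooled = defaultdict(float)
        for _, pool, _, balance in contracts:
            pooled[pool] += balance

        # 2. Summarization: aggregate exposures per business partner group; this is the
        #    granularity a portfolio model (e.g. CreditMetrics, CreditRisk+) receives.
        exposure_by_group = defaultdict(float)
        for _, _, group, balance in contracts:
            exposure_by_group[group] += balance

        # 3. Redistribution: push a group-level risk key figure back to the contracts,
        #    here in proportion to each contract's share of the group exposure.
        risk_by_group = {"GRP-1": 12.0, "GRP-2": 7.0}  # returned by the model (illustrative)
        risk_by_contract = {cid: risk_by_group[group] * balance / exposure_by_group[group]
                            for cid, _, group, balance in contracts}

        print(dict(pooled), risk_by_contract)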

    Integration Counterparty risk, or credit risk, is by far the greatest risk borne by banks. It is a risk they have borne

    since their inception. Yet new developments on the capital market and advanced methods for

    measuring and controlling credit risks present banks with new requirements in terms of business

    processes and technical systems for assessing credit risk. These requirements are increased by

    prospective changes to the banking supervisory regulations aimed at limiting banks' default risk. Bank

    Analyzer aims to provide suitable solutions to meet the changing requirements of banks for processes

    and methods to measure and manage counterparty risk.

    Risk Calculation Counterparty credit risks can be calculated externally (see also External Calculation of Risk) or internally (see also Internal Calculation of Risk). When risks are calculated externally, the basic data is selected from the Source Data Layer (SDL) and transferred to an external counterparty risk processor, where the risk is then calculated. The data is transferred to the administration of counterparty/issuer risk runs, and then to the Results Database (RDB). At present, the interface for external risk calculation is provided only for external counterparty/issuer risk processors of the pilot customer.

    Internal risk calculation takes place almost exclusively within the SAP system. If required, certain counterparty/issuer risk key figures can be calculated in an external portfolio model. However, internal risk calculation can currently be used as a prototype function only.

    Country Risk

    Architecture of Country Risk

    Calculations in the Country Risk Component

    Characteristics and Key Figures in Country Risk

    Data Selection

    Figures used in the Calculation of Attributable Amounts

    Starting Country Risk Runs

    Reporting

    Master Data Report for Facilities

    Tools

    Run Administration

    Features of Run Administration in Country Risk

    Schedule Manager

    Schedule Manager (CA)

    Postprocessing

    Archiving of Country Risk Data

    Data Archiving (CA-ARC)

    Interaction Between Country Risk and Limit Manager

    Country Risk

    Purpose This component provides an infrastructure for calculations and can be defined by the customer as required. Calculations are primarily used to determine attributable amounts for individual transactions.

    Integration Country Risk is part of Bank Analyzer. In Country Risk, you can use the results generated by the upstream General Calculation and Valuation Methods. You can process the attributable amounts

    calculated in Country Risk in Limit Manager.

    For more information, see the following documents:

    Architecture of Country Risk

    Interaction Between Country Risk and Limit Manager

    Features Since in practice a large number of methods are used to determine the exposure to default risk, a flexible and customizable interface is provided in Country Risk for the analysis of financial transactions such as loans and facilities. For each transaction entered in the system, the system calculates attributable amounts that disclose the risk content of each transaction. Formulas are assigned for each combination of determination procedure and default risk rule defined in Customizing. The formulas are stored in each transaction.
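    The formula assignment can be pictured as a lookup keyed by determination procedure and default risk rule; the keys and formulas below are purely illustrative assumptions, not delivered Customizing content.

        # Hypothetical formula registry keyed by (determination procedure, default risk rule).
        formulas = {
            ("NOMINAL", "DRAWN"):    lambda txn: txn["drawn_amount"],
            ("NOMINAL", "FACILITY"): lambda txn: txn["drawn_amount"] + 0.5 * txn["free_line"],
        }

        def attributable_amount(transaction, procedure, risk_rule):
            """Evaluate the formula assigned to this combination for the given transaction."""
            return formulas[(procedure, risk_rule)](transaction)

        loan = {"drawn_amount": 800_000.0, "free_line": 200_000.0}
        print(attributable_amount(loan, "NOMINAL", "FACILITY"))  # 900000.0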