Risk Management Summary

A brief summary of the main topics in Risk Management.


Giulio Laudani, Cod. 20163 - RISK MANAGEMENT

Contents:
- Interest risk
  o Methods to Compute Interest Risk
  o A Proper Scheme to Manage Internal Transfer Rate Operations
- Credit risk
  o The Expected Loss Estimates
  o The Measure of the Unexpected Loss (tools used to estimate UL; comments on the UL tools employed)
  o Applications and General Comments on Credit Risk Methods
- Market risk
  o Tools Used to Estimate Market Risk
  o Different Applications of the Market Risk Model
- Measure of Volatility
- The General Limits of the VaR Approach
- Basel Committee Framework (Basel I; Basel Committee II)
- Insurance business (Solvency II: the formation process and directive structure, some examples; Difference between Banks and Insurers; A New Risk Paradigm: problems related to the crisis, what these problems have caused, how the new RM activity looks after the crisis)

Interest risk

The source of this risk is the transformation of maturities and the mismatch between assets and liabilities.
Those situations produce an imbalance that will lead to:
- Refinancing risk, whenever the maturity of assets is longer than that of liabilities;
- Reinvestment risk, in the reverse situation;
- Changes on the demand/supply side for liabilities and call loans, i.e. the risk linked to the elasticity of the demand curve (not studied here).

These imbalances show up in all items affected by any change in interest rates, so we should consider not only the trading book but the broader banking book, including any derivative whose value depends on market interest rates. The general rules on how to treat this risk are set out in the Basel Committee framework, whose general principles are meant to help/offer guidelines to the national authorities in charge of supervising banking institutions on how best to estimate and manage this risk. The main areas described in the Basel Committee framework are:
- Significant involvement of senior management, to overcome the traditional independence of the risk management unit from the other operational divisions; this active role of senior management grants uniformity of criteria, objectivity and proper procedures. Concretely, this general provision means:
  o The board must be informed regularly and must approve the policies adopted
  o It must ensure that the risk management function has the competence to deal with this risk
  o All the monitoring activity must be put in place, and operational limits must exist
  o The bank must communicate its risk both to the supervisor and to the public
- Create an independent unit which helps senior management in the decision process, providing a technical point of view and avoiding potential conflicts of interest
- Bring the measurement of this risk to a consolidated level, so that it is adequately appraised and managed
- Integrate this risk measure into the day-to-day management of the bank, to steer corporate policy in the way business is led

Methods to Compute Interest Risk:

There exist several models to capture/measure this risk, from the simplest to the more advanced ones; each of them can provide good guidance to the risk manager, since they analyze different angles of the same problem:
- The interest gap is an income-based approach: it considers the effect of a change in market interest rates on net interest income (a numerical sketch follows this item).
  o There are several definitions of gap, from the simplest to the most complicated/accurate; all of them depend on the time window considered, hence it is a time-dependent measure.
    - The first is simply the difference between rate-sensitive assets and liabilities, where by "sensitive" we mean all those items that will be re-priced/rolled over in the time frame considered. Using the gap over different windows leads to two macro definitions: the marginal gap, computed for a specific window inside the horizon considered, and the cumulative gap, computed over the whole time horizon. They are related in two ways: the sum of all the marginal gaps equals the cumulative gap at time T, and the difference between two cumulative gaps with different horizons gives the marginal gap between the two horizons.
    - The second is the maturity-adjusted gap, a weighted average whose weights reflect the exposure of each instrument within its gap window; it overcomes the over-simplistic assumption of an instantaneous rate change on all sensitive items. This procedure is computationally demanding, hence it is simplified using marginal gaps: items are aggregated into time buckets, the relevant values are computed per bucket, and the maturity is simply the average of the bucket's beginning and ending dates.
    - The standardized gap was introduced to overcome the assumption that a rate change has the same effect on every balance-sheet category: it consists of computing a beta for each category.
    - To take into account the change in demand for those items that have no automatic indexing mechanism (the underlying assumption is that such items respond to rate changes with some delay), we use the average delay to allocate the instrument to the proper marginal gap window. A similar approach can be used to compute the price and quantity interaction effect.
  o The intrinsic limitation of this approach is that the gap considers only the effect on net income, ignoring the effect on the market value of the fixed-rate part of the balance sheet, whose value is affected as well.
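As a minimal sketch of the repricing-gap logic above, with hypothetical bucket data: the marginal and cumulative gaps are computed per bucket, and the maturity-adjusted effect on net interest income weights each gap by the fraction of the one-year horizon still exposed after repricing.

```python
# Minimal sketch of a repricing-gap calculation (hypothetical data).
# Each bucket holds rate-sensitive assets (RSA) and liabilities (RSL)
# repricing within that window.

buckets = [  # (label, RSA, RSL, avg repricing time in years)
    ("0-3m", 400.0, 550.0, 0.125),
    ("3-6m", 300.0, 250.0, 0.375),
    ("6-12m", 200.0, 150.0, 0.750),
]

delta_i = 0.01  # parallel +100bp shift in market rates

marginal_gaps = [(lbl, rsa - rsl) for lbl, rsa, rsl, _ in buckets]
cumulative_gap = sum(g for _, g in marginal_gaps)

# Maturity-adjusted effect on net interest income over a 1y horizon:
# each gap earns/pays the new rate only for (1 - t) of the year.
delta_nii = sum((rsa - rsl) * (1.0 - t) * delta_i for _, rsa, rsl, t in buckets)

print("marginal gaps:", marginal_gaps)
print("cumulative gap:", cumulative_gap)
print(f"estimated change in net interest income: {delta_nii:.2f}")
```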
- The duration gap is an equity-based approach, more consistent with the new mark-to-market accounting principle. We use the duration, which is a measure of the residual life of the instrument, and the modified duration (MD), which is the first derivative of the price with respect to the interest rate and can therefore be used as a first approximation of the change in value of a financial instrument (see the sketch after this item).
  o The duration gap is estimated by computing the MD of assets and liabilities, where the latter is multiplied by the ratio of the market value of liabilities to that of assets (financial leverage): DG = MD_A - MD_L x (MV_L / MV_A).
  o Limits of this approach: the immunization lasts a very short time, because duration changes over time and a change in interest rates changes the MD; it is a linear approximation; it assumes uniform changes in interest rates (basis risk), i.e. parallel shifts across maturities; and it is a costly strategy.
  o Some of these limitations can be overcome by wise modifications of the general approach: basis risk can be addressed with a sensitivity measure; the cost can be managed with derivative instruments; and the linear approximation error can be reduced with the convexity parameter (second derivative, distribution of cash flows around the duration).
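A minimal sketch of the duration-gap effect on equity, with hypothetical balance-sheet figures; the linear approximation is the one described above, delta_E = -DG x MV_A x delta_i.

```python
# Minimal sketch of the duration-gap effect on equity (hypothetical data).
mv_assets = 1000.0
mv_liabilities = 920.0
md_assets = 4.2        # modified duration of assets (years)
md_liabilities = 2.1   # modified duration of liabilities (years)
delta_i = 0.01         # parallel +100bp shift

leverage = mv_liabilities / mv_assets
duration_gap = md_assets - md_liabilities * leverage

# First-order (linear) approximation of the change in equity value:
delta_equity = -duration_gap * mv_assets * delta_i
print(f"duration gap: {duration_gap:.3f} years")
print(f"approx. change in equity: {delta_equity:.2f}")
```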
- Cash-flow mapping (clumping) is the model that enables asymmetric shifts of the yield curve to be considered; it is also the model suggested by the Basel Committee. The method consists of splitting every instrument into its individual cash flows and converting them into zero-coupon positions, to which the appropriate zero-coupon rate for each maturity is applied.
  o In setting up this model several choices must be made:
    - We need to choose the vertices (nodes) of the zero-coupon curve (usually consistent with the maturities available for hedging instruments), taking into account typical bank features: changes in short-term rates are greater than in long-term ones, volatility decreases as maturity increases, and banks' cash flows are more concentrated in the short term.
    - We need to choose discrete intervals, since an item-by-item approach is too demanding; items are aggregated using their residual life, or using MD to take into account the different sensitivity to rate changes (note that the degree of risk does not depend exclusively on residual life, so this is a simplification).
  o This method is suggested by the Basel Committee to calculate a synthetic risk measure, applying the following scheme:
    - All assets and liabilities are allocated to 13 nodes according to their modified residual life
    - Each node has an associated risk coefficient, given by the product of the MD and the assumed change in interest rates; it is multiplied by the net position of the node
    - Banks cannot completely offset gaps across nodes, to account for asymmetric movements
    - The resulting measure is compared with the capital requirement
  o The Basel method has several limits:
    - Use of book values instead of market values as reference
    - Instruments that repay capital before maturity bias the modified-residual-life approach ("duration drift")
    - Instruments without a fixed renegotiation date, or linked to market interest rates, are hard to allocate
    - The method is not adequate for derivatives
    - No compensation is allowed across different currencies
    - The Basel Committee does not provide a unique solution to these problems; it leaves the national supervisory authorities free to choose their own models
  o The clumping procedure breaks a real cash flow into two virtual ones so as to better match the node vertices, improving the cash-flow mapping results; it requires in-depth knowledge of all cash flows, which is why it can be applied only to a small portion of the bank's portfolio.
    - The two virtual cash flows must guarantee a constant portfolio value (same market value) and the same risk, to ensure the same change against interest rate movements; the real flow is split between the node immediately before and the node immediately after its maturity.
    - A variant bases the clumping on price volatility, to take the risk correlation into account: the two virtual cash flows must sum up to the real volatility.
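A sketch of the split just described, in its simplest duration-matching version (the volatility-matching variant mentioned above would replace maturities with price volatilities); the flow, maturities and nodes are hypothetical.

```python
# Minimal sketch of clumping: a real cash flow falling between two curve
# nodes is split into two virtual zero-coupon flows that preserve present
# value and duration.

def clump(pv, t, t1, t2):
    """Split a flow of present value `pv` at maturity t (t1 < t < t2)
    into PVs at nodes t1 and t2, matching total PV and duration."""
    alpha = (t2 - t) / (t2 - t1)   # share of PV mapped to the earlier node
    return alpha * pv, (1 - alpha) * pv

# Example: PV 100 at 3.4y mapped onto the 3y and 4y nodes
pv1, pv2 = clump(100.0, 3.4, 3.0, 4.0)
print(pv1, pv2)                  # 60.0 and 40.0: PV preserved by construction
print(pv1 * 3.0 + pv2 * 4.0)     # duration x PV preserved: 340 = 100 * 3.4
```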
A Proper Scheme to Manage Internal Transfer Rate Operations:

The ITR is a series of virtual transactions between the branches and the central risk unit. It aims to centralize all decisions regarding risk taking, to better evaluate the profitability of each branch, to relieve the branches from the need to care about the funding process, and to transfer all the risk to a central unit.

Two schemes are possible. The Single ITR is the easier to implement, but it is criticized because of the unique (arbitrary, not related to market conditions) rate used in the transactions, and because it handles only net flows, so the branches are still affected by part of the risk. The Multiple ITR overcomes these limitations: it allows multiple interest rates for different maturities (close to those observed in the market, even on the bid and ask side, which facilitates risk hedging) and gross flows, so each operation is considered one by one, with no netting (otherwise the branches would still retain some risk).

Some examples of peculiar transactions and how the ITR deals with them:
- Fixed-rate transaction: the risk unit takes all the risk; the branch is insured by locking in its financing rate.
- Floating-rate transaction: the same as above, plus a premium that pays for the higher risk.
- Transactions not indexed at market rates pose two problems: there is no risk-hedging instrument, and there is a basis risk; if we hedge with a derivative on a similar underlying, the spread between the similar asset and the real one may widen. The bank can decide who bears the basis risk, since neither the treasury nor the branch is able to handle it. To avoid arbitrage against the treasury, a premium can be added over the internal rates to give an incentive to choose the most favorable rate.
- Derivatives and special payoffs (options): the case of caps or floors or early repayment; a price must be paid by the client through a spread, charged or paid up-front. It can be borne by the branch or by the treasury; the first possibility is simply handled using options.

When building an ITR system we need to address all the desirable features, to ensure a proper risk management system within the institution: any change in branch profitability should be due only to credit risk, and the bank's total profitability should not change; we must protect ourselves from interest rate risk and embedded options; we should use a multiple-rate ITR with gross-flow methods; and we should use this system together with a cash-flow mapping process.

Credit risk

Credit risk represents the risk of default of the counterparty (insolvency risk) or of a deterioration in its creditworthiness (migration risk); it arises whenever the discount rate of future cash flows does not reflect the risk of the transaction. Besides these main sources of risk there are other components: spread risk (the market spread required for the same risk increases because risk aversion increases), recovery risk, country risk and pre-settlement risk (a forward contract is cancelled because of the insolvency of the counterparty and the bank is forced to replace it at unfavorable conditions). The positions accounted for in determining credit risk are both balance-sheet items and off-balance-sheet ones (OTC transactions).

The real source of risk is its unexpected component: the predictable one is incorporated into the interest rate spread/premium and thus totally eliminated. This risk has two components, expected loss (EL) and unexpected loss (UL); our task is to estimate the first to properly price the instrument, and to measure the second to raise capital to cover the position (Basel requirement). Estimating this risk and its impact is not an easy task: banks' credit exposures are recorded at historical value and there is no secondary market from which to easily determine their market value (to estimate pure default risk book values suffice; market values are needed to estimate the effect of migration risk), hence we need to use internal asset pricing formulas.

A set of rules to be implemented to properly set up adequate governance standards:
- Establish an appropriate credit risk environment: the board must approve and review credit risk strategies, pointing out the bank's risk profile and the profitability required; senior managers have the responsibility to adequately implement those strategies for all products.
- Use a sound credit-granting process: establish credit limits and new-client (or amendment) policies, and renew credit lines with particular care.
- Maintain an appropriate credit control process.
- Ensure adequate credit risk control: independence, IT instruments, bad-loan recovery facilities and properly prudential levels in assessing risk.
- Supervisors should check that those requirements are met and pose limits on risk exposure.

The Expected Loss Estimates:

The EL is a function of three components (referring only to insolvency risk) that must be estimated: PD, EAD and LGD (EL = PD x LGD x EAD).

The PD estimation:

PD is the probability of default. It can be defined more subjectively or objectively (depending on the criterion chosen the PD will be greater or lower), and it can be assessed by backward-looking methods (human-based decisions or automated algorithms) or by forward-looking ones (expectations on future developments, market data).
It is estimated by applying several models:
- Credit scoring models are statistical approaches based on economic variables and financial indicators. They are used to forecast default and risk levels of borrowers one by one or by discrete grades (the latter procedure is better since it reduces the error).
- Linear discriminant analysis consists of classifying the data using different variables, defining a discriminant value and drawing a boundary line that separates, as far as possible, healthy from bad companies. One possible variant is Altman's Z-score, a multivariate model that already suggests some key variables (a sketch follows this item).
  o The score can be seen as a weighted average of the scores of a set of variables, where unimportant variables get weights close to zero, important ones get high weights, and counterproductive variables get negative coefficients.
  o Weights are chosen to discriminate as far as possible good firms' scores from bad firms' ones: mathematically, we maximize the distance between the two centroids (the average values of the historical company data for each sample category), taking their variance into account (the variance is assumed equal across categories; this assumption can be relaxed).
  o To test the significance of the model we can use a parameter called Wilks' Lambda, which basically compares the distance between the two centroids: it is the ratio of the variation of the healthy and abnormal scores to the total variation (similar to an R-squared).
  o From the score we can assess the PD through a formula based on the assumption of normally distributed independent variables; it depends on the score, on the cut-off level (which can be simple or more sophisticated) and on the prior probability of default.
  o The cut-off between companies is computed using past observations: it can simply be the midpoint between the two centroids (simple rule), or a formula can be used to ensure a given PD for the accepted companies. The problem is that we may refuse good companies that fall in the grey/overlapping area. A correction term can take into consideration the average quality of the sample, or the costs of type I and type II errors: it aims to minimize the first by adding a term that uses the ratio between type I and type II errors as information.
  o These approaches suffer from several limits: the normal-distribution assumption (variants allowing heteroskedasticity exist, but they require a lot of data) and the so-called sample bias.
  o To select the meaningful variables there are two possible approaches, backward elimination or forward selection, keeping in mind the rationale of the model.
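As an illustration of the scoring idea, a sketch of Altman's Z-score with the published 1968 coefficients for listed manufacturing firms; the input ratios below are hypothetical.

```python
# Sketch of Altman's original (1968) Z-score for listed manufacturing firms.
# Inputs are balance-sheet ratios; the coefficients are the published ones
# (the X5 coefficient is 0.999, rounded here to 1.0).

def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, sales_ta):
    return (1.2 * wc_ta        # X1: working capital / total assets
            + 1.4 * re_ta      # X2: retained earnings / total assets
            + 3.3 * ebit_ta    # X3: EBIT / total assets
            + 0.6 * mve_tl     # X4: market equity / book value of liabilities
            + 1.0 * sales_ta)  # X5: sales / total assets

z = altman_z(0.15, 0.20, 0.10, 1.1, 1.3)
# Conventional zones: Z < 1.81 distress, Z > 2.99 safe, in between grey area.
print(f"Z = {z:.2f}")
```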
- Regression models require defining a sample size, the independent variables and how to estimate the coefficients. Several variants exist:
  o Simple (linear), which poses the problem that the dependent variable is not bounded
  o Logit, which substitutes the linear relationship with an exponential (logistic) one
  o Probit, which uses the normal cumulative density function (thinner tails compared to the logit)
- Heuristic inductive models:
  o Neural networks, which try to mimic the human learning approach (a "black box": the procedure is obscure, we obtain results without knowing how it works inside)
  o Genetic algorithms, based on survivorship competition (only the best solutions transfer their "genes" to future generations)
  o General limits of this family of models: they do not consider migration or qualitative issues; the meaningfulness of the variables used is questionable, since they should be industry-specific (which is usually not the case); and they need large samples to avoid the bias caused by the presence of too many healthy companies in the sample.
- Capital market models are similar to a VaR approach (a sketch follows this item).
  o Spot or forward rates are used to estimate the default probability within one year and beyond. The key parameters are the spread, the PD, the recovery rate and the risk-free rate. The PD is computed by comparing the corporate yield curve with the risk-free one; using continuous compounding (the e^ form) and imposing indifference between the risk-free investment and the risky one weighted by its survival probability, a spread d over horizon t with recovery rate k implies:
    - spot case (cumulative PD up to t): PD_cum(t) = (1 - e^(-d_t * t)) / (1 - k)
    - forward case (marginal PD between t-1 and t): PD_marg(t) = (1 - e^(-f_t)) / (1 - k), where f_t is the forward spread
  o The spot spread directly gives the cumulative probability, while the forward spread gives the marginal one. The cumulative can be obtained from the marginals as PD_cum(t) = 1 - prod_s (1 - PD_marg(s)), and the marginal from the cumulatives as PD_marg(t) = (PD_cum(t) - PD_cum(t-1)) / (1 - PD_cum(t-1)).
  o Limits of this approach:
    - It is applicable only to listed companies that have listed bonds for all the relevant maturities, and it also suffers from market illusion
    - It assumes a risk-neutral approach (it starts from the equality between the risk-free rate and the risky asset weighted by its survival probability), hence a premium should be considered/added
    - It relies on the expectations theory of the term structure, whereas liquidity premia exist
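A minimal sketch of the spread-implied PD extraction just described, with a hypothetical spread curve and recovery rate; the marginal PDs are backed out of the cumulative curve with the relation given above.

```python
# Sketch of PD extraction from credit spreads (hypothetical curve data),
# following the risk-neutral relation PD_cum(t) = (1 - e**(-d*t)) / (1 - k).
import math

recovery = 0.4                                   # assumed recovery rate k
spot_spreads = {1: 0.012, 2: 0.015, 3: 0.018}    # spread d_t by maturity (yrs)

def cumulative_pd(d, t, k):
    return (1 - math.exp(-d * t)) / (1 - k)

cum = {t: cumulative_pd(d, t, recovery) for t, d in spot_spreads.items()}

# Marginal PDs implied by the cumulative curve:
marg, prev = {}, 0.0
for t in sorted(cum):
    marg[t] = (cum[t] - prev) / (1 - prev)
    prev = cum[t]

for t in sorted(cum):
    print(f"t={t}y  cumulative PD={cum[t]:.4f}  marginal PD={marg[t]:.4f}")
```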
- The Merton model is an example of a structural model, i.e. a model that recognizes default as the consequence of some intrinsic feature of the company, unlike reduced-form models, which simply recognize it as a possible event and reduce the problem to estimating its likelihood.
  o Characteristics: it is based on the idea that the shareholders' position is similar to an option to default (exercised when the value of the assets is lower than that of the liabilities).
    - It assumes that the value of the company follows a Brownian motion, with increasing uncertainty over time
    - At maturity, it measures the frequency of simulated paths ending below the default threshold
    - This probability is a function of asset volatility, the nominal value of debt, the starting company value and the debt maturity
  o Contingent claim analysis: since this risk can be hedged with a put option, the risky investment plus the put can be seen as a risk-free investment, and the put premium is a proxy of the default probability. Thanks to this relationship we obtain the risk-neutral default probability, the value of the debt and the interest rate required by the bank.
  o Analyzing the spreads and default probabilities computed with the Merton model, we notice that the spread curve of riskier companies is negatively sloped, while the safer ones behave the opposite way; this is a direct consequence of survivorship bias for the riskier companies (over time the pool of weak companies gets stronger and stronger, through debt repayment and the death of the worst among them).
  o Limitations (a sketch follows this item):
    - It assumes a unique zero-coupon debt repayment
    - It assumes a Brownian motion to describe the evolution of the firm value
    - It does not consider early default before maturity, i.e. paths that cross the threshold before time T
    - It assumes a constant risk-free rate (easily relaxed)
    - It does not consider migration risk
    - Many of the variables are not observable and need to be estimated
    - It is an arbitrage-free approach, which presumes those assets can be traded, but this is not the case
- The KMV model is an attempt to overcome some of Merton's limits, namely the single-debt assumption and the estimation problem.
  o Characteristics: it values equity as a call option with maturity equal to the residual life of debt. Thanks to this idea it can estimate the value of the company and its volatility: it uses the price of a European call and the Ito's-formula transformation to obtain the second equation (two unknowns, two equations).
  o The KMV approach takes two steps:
    - A risk index is computed using the innovative concept of distance to default. Thanks to this innovation it acknowledges the existence of short-term and long-term debt: the company may default only if the asset value drops below the short-term debt
    - The index is then converted into a probability (EDF) using an empirical law
  o Benefits: it adapts quickly to changes in financial conditions, it is stable over the economic cycle, and it assigns a specific EDF to each firm.
  o Limits: it can be used only for traded companies (this can be solved by using comparables) and it relies on the efficiency of the market.
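A minimal sketch of the Merton-style default probability with hypothetical inputs; PD = N(-d2) is the standard risk-neutral result, and d2 itself is a distance-to-default index of the kind KMV builds on.

```python
# Sketch of a Merton-style default probability (hypothetical inputs):
# PD = N(-d2), the risk-neutral probability that the asset value ends
# below the face value of debt at maturity T.
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

V0 = 120.0      # current asset value
F = 100.0       # face value of (zero-coupon) debt, due at T
sigma = 0.25    # asset volatility
r = 0.03        # risk-free rate
T = 1.0         # debt maturity in years

d1 = (math.log(V0 / F) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
d2 = d1 - sigma * math.sqrt(T)

pd_risk_neutral = norm_cdf(-d2)
distance_to_default = d2   # in standard-deviation units
print(f"distance to default: {distance_to_default:.2f} sd")
print(f"risk-neutral PD: {pd_risk_neutral:.4f}")
```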
EAD Estimation:

EAD is the exposure at default. It is generally assumed to be a deterministic number, but it may be stochastic if the borrower has the right to change the size of the exposure; in this case we need to assess the drawn and undrawn parts (the undrawn part must be estimated considering the portion that will be drawn at default; to ease this task banks charge fees on the undrawn part, which allows more rational pricing, common in Anglo-Saxon countries).

Closely linked to EAD is the recovery rate, computed with different formulas referring to gross recoveries, actual (discounted) recoveries and the duration of the workout, where RA is the recovered amount, AD the administrative and legal costs, r the appropriate discount rate, and d the duration of the process taking intermediate flows into account:
  o Gross method: RR = (RA - AD) / EAD
  o Actual method: each flow is discounted for the time t elapsed between the default and its occurrence, RR = sum_t (RA_t - AD_t) / (1 + r)^t / EAD
  o Loading-duration method: d is the difference between the duration of the recovered amounts and the EAD duration

LGD Estimation:

LGD is the expected loss rate in case of default; like PD, it depends on the definition of default chosen by the bank: the narrower the definition, the lower the LGD. Its value depends on these drivers: characteristics of the borrower (industry sector, country, speediness of the recovery procedure, financial ratios); characteristics of the exposure (presence of collateral or guarantees, priority level); characteristics of the bank (facilities to recover loans and out-of-court settlements); and external factors, such as the economic cycle (shared with PD) and the level of interest rates.
- There are different procedures to estimate it; all of them must take into consideration the indirect costs borne by the bank to recover the proceeds and the time needed (a sketch follows this list):
  o Market LGD is similar to the PD estimate from market rates: knowing the other parameters, we can infer the implied LGD.
  o Workout LGD consists of building a historical database from which the LGD for a given type of borrower can be extracted. This is the only method applicable to bank loans, which have no secondary market from which a post-credit-event market value can be extracted. Note that this is the preferred model.
    - We need an appropriate discount factor for the recovered amounts, which must be forward-looking since the procedure will start in the future (after the beginning of the credit line)
    - We need to estimate the duration of the process itself (the book gives an example at page 350)
- In risk management it is common to look for the key variables that explain the recovery rates observed empirically. This topic is important for understanding recovery risk, given the high volatility found in the databases: the distribution is concentrated in the tails; the industry sector is a key element in explaining differences; the priority level is important, but not stable over time; and the presence of collateral is important.
- Recovery risk is the risk of achieving an LGD different from the expected one; it is usually quite sizable and fluctuates over time, with a bimodal distribution concentrated in the tails.
  o This risk arises from the uncertainty of these variables: the amount to be recovered (EAD), the administrative costs to be borne, the discount rate for future cash flows at future dates, and the duration of the whole process.
  o It is calculated with two alternative formulas, assuming either a non-stochastic or a stochastic LGD.

The link between LGD and PD must be taken into account, since they share common systematic factors. The relationship between recovery rates and default rates is negative, as empirical evidence based on junk bonds has proven; not considering this risk may lead to an underestimation of risk. It is affected by: chain effects (an economic downturn may reduce the value of assets); financial assets and interest rates, together with real estate; and industry-specific effects (inventory may lose more value in certain industries).
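A minimal sketch of the workout-LGD computation described above, with hypothetical recovery flows and costs; each flow is discounted for the time elapsed since default.

```python
# Sketch of a workout-LGD computation (hypothetical recovery data):
# LGD = 1 - PV(recoveries net of costs) / EAD.

ead = 100.0
discount_rate = 0.08          # forward-looking rate for recovery flows
recoveries = [(0.5, 20.0), (1.5, 35.0), (2.5, 15.0)]  # (years since default, amount)
admin_costs = [(0.5, 2.0), (1.5, 3.0)]                # legal/administrative outflows

pv = sum(a / (1 + discount_rate) ** t for t, a in recoveries)
pv -= sum(c / (1 + discount_rate) ** t for t, c in admin_costs)

recovery_rate = pv / ead
lgd = 1.0 - recovery_rate
print(f"recovery rate: {recovery_rate:.3f}  workout LGD: {lgd:.3f}")
```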
This kind of exposurecan be effectively reduced by setting up a diversification policy.We need to choose appropriate time horizon and confidence level: Time horizon is arbitrary chosen equal to 1 y, because both subjective andobjective criterions are not applicable due to the presence of illiquid market, of contract without explicit maturity and so on. However this decisionhas a great operational fit:The bank use yearly budget, hence having yearly measure ease the budgeting work out; Usually 1 y it the correct timeelapse to raise new capital; The turnover of the asset is usually close to one year andFurthermore some model will account for long term riskincluding migration risk in their computation. Confidence level may consider that the data cannot be explained by a zero mean normal distribution, 7 8. Giulio Laudani #23 Cod 20163since the mean is below zero and data are strongly asymmetrical.Hence the choice of Confidence level need to be credible, because it is less stableand change a lot at different levelTools used to estimate UL:It is estimated using portfolio models that apply different definition of default (including or not migration) and level of loan correlation (explicitly orimplicitly modeled) as well as EL Creditmetrics is a multinomial approach it considers all borrowers change in rating as a credit eventso The model basically consists on deriving the empirical distribution for any rating class movements, and byusing the respective spread for each maturity to compute the expected value of the assets, and on doing the difference between this value and the face value to compute the expected lossWe should compute the VAR percentile using the probability for each possible movement in the rating level to have a measure of UL, since the distribution is not normal ( we cannot use the normal standard deviation time percentile approach)o To overcome the Wilsons critique regarding the accountability of the economic cycle it has been introduced the credit portfolio view which is taking into consideration the economic variablesNote that if the time horizon used is a point in time there is no need to adjust by economic cycle, in this case there would be a double countingo To use the model for portfolio of assets we need to estimate the correlation, by using those steps:A modified version of the Merton model, where we find out in the normal distribution, all the thresholds for default and change in rating. 
- Credit Risk+ is based on an insurance (actuarial) approach, using the Poisson distribution to estimate the number of defaults (a sketch follows this item).
  o The idea behind this model is the equivalence between banking and insurance risk; the major difference is the correlation among a bank's clients. It aims to assess the portfolio risk; it does not give us PDs. Its inputs are the PDs and their volatility.
  o A banding method links default probabilities with losses. It consists of creating categories (bands) of similar expected loss and computing the expected number of defaults for each band, each of which is used in a Poisson distribution. The categories are created by dividing each exposure L_i (net of recovery, which is used to determine the exposure deterministically) by a base unit L and rounding up, obtaining standardized values v_i. To sum up within a band, it is advisable to weight the default numbers by the ratio between the j-th amount and the band's expected loss.
  o The per-band Poisson distributions must then be aggregated. The basic model assumes independence between obligors and works only for small PDs. To estimate correlation, the PDs are assumed stochastic: n scenarios are simulated across all the bands, the conditional distributions are averaged with the scenario probabilities as weights, and the resulting unconditional distribution accounts for the asset correlation.
  o Benefits:
    - PDs and exposures (book values) net of recovery are enough; the correlated version also requires sensitivities to the economic-cycle factors
    - An analytical solution exists, hence it is fast to implement
    - The loss distribution can be obtained without resorting to simulation techniques
  o Limits:
    - It only looks at default risk, so it does not consider migration risk
    - It assumes constant exposures and does not consider recovery risk
    - It is not a dynamic model, meaning it cannot be used for a changing portfolio without redoing all the calculations
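A minimal sketch of the banding idea in its independent-PD version, with a hypothetical portfolio: exposures are standardized into bands and the number of defaults per band is Poisson with intensity equal to the sum of the PDs.

```python
# Sketch of the Credit Risk+ banding idea (hypothetical portfolio).
import math
from collections import defaultdict

L = 10.0  # base exposure unit
loans = [(12.0, 0.01), (18.0, 0.02), (22.0, 0.015), (39.0, 0.01)]  # (net exposure, PD)

bands = defaultdict(float)  # standardized size v -> expected number of defaults
for exposure, pd_ in loans:
    v = math.ceil(exposure / L)   # standardized exposure (in units of L)
    bands[v] += pd_               # Poisson intensity adds up within a band

def poisson_pmf(lam, n):
    return math.exp(-lam) * lam**n / math.factorial(n)

# Probability of at least one default in each band (independent-PD version):
for v, lam in sorted(bands.items()):
    print(f"band {v} (exposure {v * L:.0f}): lambda={lam:.3f}, "
          f"P(>=1 default)={1 - poisson_pmf(lam, 0):.4f}")
```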
Comments on the UL tools employed:

Some comments on the major characteristics of these models:
- Default-mode versus multinomial: only Credit Risk+ belongs to the first group.
- Future values vs. loss rates: a model can be based on the distribution of possible values or of possible future losses; the first uses the spread curve by maturity as input, while in the second the spread need not be known. Creditmetrics is a typical market-value model, Credit Risk+ a loss-rate one.
- Conditional vs. unconditional: CreditPortfolioView belongs to the first group; however, this distinction is useful only if the model works through the cycle.
- Monte Carlo vs. analytical solution.
- Asset correlation vs. default correlation: this distinction is less important than the others, since in practice the two are close. Creditmetrics uses asset correlation, Credit Risk+ default correlation.

Some major limits:
- Treatment of recovery risk as non-random (except in Credit Risk+) and independent from PD
- Assumption of independence between exposure risk (usually treated as known) and default, while empirically they are positively related, so the assumption leads to an underestimation
- Assumption of independence between credit risk and market risk
- Impossibility of backtesting: at a yearly frequency there is not enough data

Applications and General comments on Credit Risk methods:
- Loan pricing: assessing EL and UL through a transparent process, to properly consider the cost of the capital absorbed; there is the problem of attributing the marginal benefits.
  o The ELR, the expected loss rate, is the spread that compensates for the EL
  o VaR enters as a relative measure, multiplied by the cost of equity
- Risk-adjusted performance measurement is used to decide whether to undertake a specific investment: the RAROC of the loan, driven by the lending rate applied, is compared with the bank's target RAROC (a sketch follows this list).
- Setting limits on the risk taking of the different business units: it is crucial to properly define the appropriate level of aggregation of the units, the VaR limits, the frequency with which they are checked, and their involvement in the budgeting process; inverting the limit formula gives the maximum loan amount that can be granted.
- Optimizing the portfolio composition: this is limited by the characteristics of bank loans (geographic concentration, no secondary market, limited rotation). However, thanks to recent derivatives developments and secondary markets it is now possible; the best solution is to separate the risk-management optimization from the origination process.
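A minimal sketch of the risk-adjusted pricing check described above; all figures (rates, costs, the capital-absorption ratio and the hurdle rate) are hypothetical.

```python
# Sketch of a risk-adjusted loan-pricing check (hypothetical figures):
# the loan's RAROC is compared with the bank's target RAROC.

exposure = 1_000_000.0
lending_rate = 0.065                 # rate applied to the client
funding_rate = 0.040                 # internal transfer rate / cost of funds
expected_loss_rate = 0.008           # ELR, the spread compensating for EL
operating_costs = 4_000.0
capital_absorbed = 0.06 * exposure   # UL-based capital (e.g. VaR at 99%)

risk_adjusted_income = (exposure * (lending_rate - funding_rate - expected_loss_rate)
                        - operating_costs)
raroc = risk_adjusted_income / capital_absorbed

target_raroc = 0.12                  # bank's cost of equity / hurdle rate
print(f"RAROC = {raroc:.2%} -> {'grant' if raroc >= target_raroc else 'reprice'}")
```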
Market risk

Market risk is usually identified with the risk inherent in the trading book (short term), but it should be extended to investments intended to stay on the financial statements for longer periods. Nowadays it has gained more importance, due to the new accounting principles, the securitization process and the growth of derivative instruments. The key elements of this type of risk are: exchange risk; interest rate risk (different from the previous one, because it relates only to securities that have a secondary market and affects only a limited part of the balance sheet); equity risk; volatility risk; and commodity risk.

The traditional approaches were:
- Nominal value, which considers risk as proportional to the nominal value. It has severe limitations: the nominal value does not reflect market value, it does not capture the different degrees of sensitivity to a change in the risk factor, and it does not consider volatility and correlation.
- Sensitivity analysis, based on coefficients representing the risk of each security category. Drawbacks remain: the coefficients cannot be aggregated across categories (even though each position has a unique measurement that is easy to communicate between divisions and senior management), and, like the nominal-value method, it does not consider volatility and correlation; ignoring the volatility of the risk factors basically means ignoring the real risk of the position.

Tools used to estimate market risk

VaR models are characterized by a confidence level, the maximum potential loss a portfolio can suffer, and a certain time horizon. The measure is comparable across all security classes. These models aim to define the risk factors and their probability distribution, and to summarize that information in one risk parameter.
- The easiest is the parametric one: changes in market factors are normally distributed, all the information is summarized by the Var-Cov matrix, the possible losses are linked to the risk factors by a linear function, and the VaR is simply a multiple of the standard deviation (a numerical sketch follows this item).
  o The most crucial assumptions gloss over known problems: empirical data show that risk-factor distributions are skewed and fat-tailed.
  o It works by inverting the normal distribution to find the equivalent percentile: VaR = MV x delta x z_alpha x sigma, i.e. market value times sensitivity times the percentile multiple of volatility.
  o The linear relationship is represented by the sensitivity coefficient, which (for example) is the modified duration for bonds: this is the delta-normal approach. Alternatively, we can assume that prices are log-normally distributed, so there is a one-to-one correspondence: this is the asset-normal approach.
  o Typical issues:
    - The confidence level represents the degree of risk aversion of the institution, or the target capital requirement that supports the creditworthiness investors expect from the balance sheet; empirically there is a positive relationship between the two
    - The time horizon is usually short (daily); the choice of the appropriate length is crucial, since the longer the horizon the higher the VaR. The bank must take into account the liquidity of its position, its size, and a subjective view of how long the instrument will stay in the trading book. It is important to have enough observations to ensure significance, which is a problem for longer horizons; to overcome it, the square-root-of-time rule (under i.i.d. assumptions) can be used to scale daily data to weekly or monthly horizons
  o Applied to a portfolio, we need the correlations between assets: portfolio VaR is obtained by pre- and post-multiplying the correlation matrix by the vector of individual VaRs. This formula grants diversification benefits, and gives rise to the sub-additivity property.
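A minimal sketch of the delta-normal computation just described, with hypothetical positions, volatilities and correlations (z is the one-tailed 99% normal percentile); the mapping of multi-factor positions is discussed next.

```python
# Sketch of delta-normal portfolio VaR (hypothetical positions):
# VaR_i = MV_i * sigma_i * z, then VaR_p = sqrt(V' C V).
import math

z_99 = 2.326                      # one-tailed 99% normal percentile
mv = [500.0, 300.0]               # market values of two positions
sigma = [0.012, 0.020]            # daily volatilities of the risk factors
corr = [[1.0, 0.3],
        [0.3, 1.0]]

var_i = [m * s * z_99 for m, s in zip(mv, sigma)]   # stand-alone VaRs

# Portfolio VaR: pre- and post-multiply the correlation matrix by the VaR vector
var_p_sq = sum(var_i[i] * corr[i][j] * var_i[j]
               for i in range(2) for j in range(2))
var_p = math.sqrt(var_p_sq)

print("stand-alone VaRs:", [round(v, 2) for v in var_i])
print(f"diversified portfolio VaR: {var_p:.2f}  (undiversified sum: {sum(var_i):.2f})")
```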
  o For multiple risk factors, each security is broken down into elementary components that depend on a single risk factor, and these are then aggregated as a portfolio. The suggested approach is the mapping of risk positions:
    - A foreign currency bond has two risk components; we compute each single VaR and then combine them using their correlation
    - A forward currency position maps into three positions: a spot exchange rate position, a spot investment and a spot borrowing
    - A forward rate agreement consists of two components: a spot debt position with maturity at the start of the contract period and a spot investment position with maturity at its end
    - Stock positions: we can consider each stock as a risk factor, or use fictitious positions toward the relevant stock indexes, mapping each position to its beta; this works only for well-diversified portfolios without specific (idiosyncratic) components
    - Bonds are instead mapped cash flow by cash flow (clumping)
  o Limitations of this model (note that it works best for a single risk profile):
    - Risk-factor changes are assumed to be normally distributed
    - Since risk factors are assumed to follow a Brownian motion, volatility is assumed stable over time and serially independent across risk factors, which is not empirically true
    - The relationship with losses is assumed to be linear
  o Solutions:
    - For the normality assumption: a mixture of Gaussian distributions (which addresses the fat-tail problem, not the empirical asymmetry) or a t-Student distribution
    - Non-linearity can be handled with a curvature parameter, but this poses the problem that the underlying changes are no longer normal, since the curvature coefficient follows a chi-square; furthermore, the extra parameter increases the model error, it presumes the payoffs are differentiable (otherwise it is inapplicable), and it is less effective for joint shocks. Another solution is the full valuation approach
- Simulation models allow different assumptions from the parametric one: risk factors may have different distributions; since the impact of each risk-factor change is valued by full valuation, the relationship between loss and risk factor can be non-linear; and they ensure flexibility. These models are best used for non-linear payoffs and for valuing extreme events (stress tests).
  o Full valuation is a way of estimating price variation, not a simulation model per se: it consists of re-pricing the securities with a pricing formula that must be defined a priori.
  o All distributions are empirical, i.e. based on the simulation.
Here there is the difference between historical and Monte Carlo where the fist use empirical distribution obtained by the observation while the second define a parametric distributiono Use of the Percentile logic, meaning that we will generate scenario by applying a given distribution and on those scenario ordered we will compute the interesting percentile, hence the VaR will be the difference between the asset current value and its percentile valueo There is great flexibility on defining the market risk change behaviorHistorical simulation transforms the historical data into future possible behavior, hence it uses the past distribution to predict future onewithout changeo Merits of this model are: It is simple to be understand and communicable It doesnt require hp about distribution or correlation (not Var-Cov required) Non linearity required between risk factor and price changeo Limits and possible solution are: It assumes a stable and stationarity distribution given by past data It may not applicable if the time series are limited or if there is the problem of the usual tradeoff between time length and meaningfulness, however using ahybrid approach putting together the exponential weights and the simulation approach we can use longer sample ensuring meaningfulness. Furthermore this hybrid approach allows the model not to have stable distribution, note that is an approximation, since the current data have higher weights any past change in behavior is smoothedo The methodology of bootstrapping and path generation are used to obtain bigger sample size Bootstrapping is made to avoid loss of observation when we move from daily to weekly or monthly frequencies. It consist of an extraction (allowing for reinterring) of observations, in this way we will built X paths (useful in case of exotic option pricing when we need to know the path is important in defining the price) from whom we will extract the distribution. The underling hp is that observation must be i.i.d To try to overcame the hp of i.i.dwe can used something similar to the hybrid approach or there are two better proposals:Hull and White suggest adjusting data by weighting with current volatility, meaning by proportional move returnsvalue. Heteroskedasticity can be incorporate by rescaling the time series with the available information at time TFiltered historical simulation is based on G-ARCH models which are used to filter data and to obtain residualswhich are used to create scenario. Each residual is standardized (hoping that doing so they are i.i.d) and used in apath generation process where the first data is the filtered return, not the sample one, and it is multiply with heestimated conditional value of next period (predicted volatility by G-ARCH times previously residual)Monte Carlo simulation is based on the definition of an a priori parametric distribution, which should be consistent with the future databehavior. This model is more computing efficient since by increasing the parameters involvedthere is a proportional increase of variablesnumber, but still demandingo Differently from historical simulation, it requires to compute or to assess the correlation between risk factor a priori (otherwise is like assuming independency), but it allows to know the path evolutiono To take into account the relationship we need a Corr matrix which is decomposed into two triangular matrices and used them to build up scenarios 11 12. 
- Monte Carlo simulation is based on defining an a priori parametric distribution, which should be consistent with the future data behavior. This model is more computationally efficient, since increasing the number of parameters involved increases the number of variables only proportionally, but it is still demanding.
  o Unlike historical simulation, it requires computing or assessing the correlations between risk factors a priori (otherwise independence is implicitly assumed), but it allows the path evolution to be known.
  o To take the relationships into account we need a correlation matrix, which is decomposed into two triangular matrices and used to build the scenarios.
  o Its limit is the need for an asset-based estimate of the Var-Cov matrix to obtain the joint distribution.
- Stress tests aim to estimate the effects connected with extreme events; the simulation is carried out in a predominantly arbitrary and subjective manner, for just a few scenarios:
  o Use past shocks and simulate them
  o Factor-push analysis: push factors by several standard deviations; the problem is that large variations of insignificant risk factors are not significant
  o The Derivatives Policy Group suggests a uni-dimensional process (one variable at a time): shifts or slope changes in interest rates, as well as moves in other macro indicators
  o Multidimensional scenarios can be implemented in two ways: simple, where only some risk factors are changed while the others are kept constant; predictive, the same as the previous one, but the other factors are changed according to their correlation with the moving ones
  o This instrument should be a complement to the previous ones and must be followed up by practical actions to reduce the risk/vulnerability it reveals. It also allows liquidity risk to be tested.

Different applications of the Market risk model:
- A unique risk measure for both horizontal and vertical communication between divisions
- Portfolio analysis is possible: there are several methods to aggregate risk factors
- It is useful to determine risk exposure limits, as nominal value, as market-value exposure (remembering that those measures are affected by changes in volatility) and as maximum tolerable variation
- It can be used to risk-adjust returns through RAROC (both ex-ante and ex-post), a profit/risk ratio; thanks to this instrument it is possible to monitor each unit's return and to set incentive procedures properly

Measure of Volatility

Volatility estimation models can be divided into two groups: historical models, with constant or changing parameters, and implied (option-based) volatility. Volatility is a crucial parameter of the Var-Cov model. For each of the models below, the time horizon/frequency must be defined.
- The simplest is the computation of the Var-Cov matrix using an equally weighted moving average, or an exponential one (RiskMetrics), with historical data as the source (a sketch follows this item):
  o The simple moving average poses two problems: the sample size (the longer, the more information, but the less reliable for prediction, since it does not reflect the current situation) and the echo effect (any shock strongly affects the value both when it enters and when it exits the sample).
  o The exponential moving average overcomes these problems, but poses the problem of choosing the appropriate decay factor (which should depend on the data behavior, i.e. how fast they react to change, and itself changes over time), and the number of past observations must be evaluated; this is addressed by increasing the frequency, so as to maximize the information content and minimize the sampling error.
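A minimal sketch of the exponential (RiskMetrics-style) volatility update, with the canonical daily decay factor 0.94 and a simulated return series as a stand-in for market data.

```python
# Sketch of an EWMA (RiskMetrics-style) volatility update on hypothetical
# returns: sigma2_t = lambda * sigma2_{t-1} + (1 - lambda) * r_{t-1}**2.
import math
import random

random.seed(2)
returns = [random.gauss(0.0, 0.01) for _ in range(250)]

lam = 0.94                       # RiskMetrics decay factor for daily data
sigma2 = returns[0] ** 2         # initialize with the first squared return
for r in returns[1:]:
    sigma2 = lam * sigma2 + (1 - lam) * r ** 2

print(f"current EWMA daily volatility: {math.sqrt(sigma2):.4%}")
```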
- GARCH stands for Generalized AutoRegressive Conditional Heteroskedasticity. It uses the maximum likelihood criterion, hence it uses market data to get a better estimate of the decay factor; however, it needs a lot of data points (a sketch follows this item). The two common factors are:
  o Past volatility, which indicates the rate of persistence (its coefficient is typically above 0.7)
  o The past squared prediction error, which indicates how quickly volatility adapts to new market shocks (its coefficient is lower than the first)
  o Benefits of this methodology:
    - It recognizes the serial correlation
    - It gives adequate importance to new information
    - The decay factor is directly determined by market data
  o It uses a normal distribution to describe the behavior of the prediction errors, hence it is a poor method in case of skewness or leptokurtosis. The latter problem is overcome by using a t-Student distribution, while the former (which comes from the fact that this model gives the same importance to the sign of shocks, against the empirical evidence) is solved by models that recognize the asymmetry, i.e. that negative shocks have a higher impact:
    - IGARCH requires the coefficients to sum to one; with the constant set to zero it is basically the exponential moving average formula
    - EGARCH models the natural log of the variance (allowing the equation to give negative outputs), using absolute and actual values instead of squares (so the response to good and bad news can differ)
    - AGARCH keeps the squares, but centers the data with a parameter that amplifies the negative effects and reduces the positive ones
  o It gives a good measure for the immediately following period, but it is less informative for subsequent ones: since the parameters must guarantee mean reversion (a converging series), the forecast converges to the long-term value.
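A minimal sketch of a GARCH(1,1) one-step variance forecast; the coefficients below are assumed (in practice they come from maximum-likelihood estimation), and the long-run level illustrates the mean reversion noted above.

```python
# Sketch of a GARCH(1,1) one-step variance forecast with assumed (not
# estimated) parameters: sigma2_{t+1} = omega + a * r_t**2 + b * sigma2_t.

omega, a, b = 0.000002, 0.08, 0.90   # hypothetical ML-estimated coefficients
long_run_var = omega / (1 - a - b)   # mean-reversion level (requires a + b < 1)

r_t = -0.021       # today's return (a large shock)
sigma2_t = 0.0001  # today's conditional variance (1% daily vol)

sigma2_next = omega + a * r_t**2 + b * sigma2_t
print(f"long-run vol: {long_run_var ** 0.5:.4%}")
print(f"next-day vol forecast: {sigma2_next ** 0.5:.4%}")
```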
- Implied volatility is basically the volatility backed out of derivative instruments, hence it represents the market expectation over that period; it is not really used in risk management.
  o It may be affected by counterparty risk or by the liquidity level
  o It changes depending on the contract chosen (at/in/out of the money), the maturity and the model used to price the option (the model must be reliable and the instrument liquid, to ensure efficiency)
  o The option maturity and the risk-management time horizon should coincide, to ensure consistency
  o Computing implied covariances is complex and we may not have the data; moreover, we need to check that the numbers are consistent with each other (the matrix must be at least positive semi-definite). There are two ways: using derivatives with more than one underlying, such as quanto options, or the approach at page 182

The general limits of the VaR approach are:
- It does not describe the behavior beyond the threshold in case of extreme events (although it recognizes them); however, the purpose of VaR is not to make the bank completely safe from bankruptcy, but to reduce this risk to an acceptable level. Remember that a higher capital requirement lowers net income.
  o Still, VaR fails to give a measure (probability) that discriminates between portfolios' excess losses: it is impossible to tell which one is higher
- VaR does not take into consideration customer relationships or any other qualitative indicator; however, the bank is not forced to use VaR as the only method to grant credit lines, and management may also use other qualitative methods.
- Some of the assumptions behind VaR models are questionable; however, once understood they can be overcome, and VaR remains a good tool to allocate resources between units according to risk, giving more insight than other tools.
- Different VaR approaches can lead to different values, which can be seen as a weakness; however, since the results depend heavily on the model, the cause may be that the underlying assumptions of some of those approaches are not appropriate. Furthermore, any bias is uniform across all securities, so the relative risk allocation/perception is unchanged.
- VaR is too cyclical a measure (differently for each method); however, traders' behavior follows this trend during financial crises.
- VaR comes too late, meaning it is not able to predict crashes; however, the purpose of VaR is not to predict crashes (which is impossible for any historically based predictive method), but to generate consistent and uniform risk measures under normal conditions for the day-by-day regulation of the business.
- VaR is assumed to be sub-additive (diversification reduces aggregate VaR), but this is not always the case (e.g. for non-parametric models), hence it may strongly underestimate risk.
  o This last problem can be solved by a different measure: the Expected Shortfall, the average of the excess losses beyond the VaR threshold. It gives an idea of the possible losses in excess of the VaR loss, it ensures sub-additivity, and it gives a measure of how much money would be needed to bail out the bank (for the supervisory authority), or of the expected payment an insurer would face for bearing this risk (a sketch follows).
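A minimal sketch of expected shortfall next to VaR on a simulated P&L distribution (hypothetical, normal here only for illustration); ES averages the tail beyond the VaR cutoff, so ES >= VaR by construction.

```python
# Sketch of expected shortfall vs. VaR on simulated P&L (hypothetical data).
import random

random.seed(3)
pnl = sorted(random.gauss(0, 10_000) for _ in range(10_000))  # simulated P&L

alpha = 0.01
k = int(alpha * len(pnl))        # number of tail scenarios
var_99 = -pnl[k]                 # 99% VaR: loss at the tail cutoff
es_99 = -sum(pnl[:k]) / k        # 99% ES: mean loss inside the tail

print(f"VaR(99%) = {var_99:,.0f}   ES(99%) = {es_99:,.0f}")
```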
Using the backtesting methodology we can compare the performance of different VaR models and check their consistency with the stated confidence level. The method basically counts the exceptions, testing both their frequency against the confidence level and their independence; two tests are proposed:
o The Kupiec test: the null hypothesis is that the model performs well. We compute an LR test between two binomial likelihoods: the numerator without conditioning (hence using the theoretical coverage implemented), and the denominator conditional on the empirical observations. The test is distributed according to a chi-square with one degree of freedom (unconditional coverage)
  - Limits of the test: it does not account for serial dependence (meaning it cannot value a model for its capability to avoid time concentration of excess losses, hence it does not value the quality of the model's reaction to changes in market conditions); it requires lots of data; and its power is weak (it worsens as the confidence level rises and as the number of observations falls)
o The Christoffersen test: it checks that the probability of an exception (or non-exception) is not correlated with the previous occurrence. The LR statistic is the ratio between the likelihood function assuming independence and the one allowing dependence on the previous day; it is possible to build a joint test by summing this LR statistic and the Kupiec one, and the new statistic is distributed according to a chi-square with two degrees of freedom (conditional coverage)
o The paper by Saita and Sironi provides an empirical analysis using 7 equally weighted portfolios (equity, global) to test a GARCH(1,1) model, a historical simulation and an exponentially weighted moving average (lambda = 0.94) at three confidence levels:
  - All three models failed to grant a consistent confidence level at 0.99 and 0.97; at 0.95 the EWMA was more conservative, while the other two still underestimated
  - The Kupiec test performed well (only at 0.95) in some cases: three for the EWMA, one for the historical simulation and two for the GARCH
  - The extreme events showed a higher value than the theoretical one
  - The Christoffersen test failed in basically all cases; it performed better at 0.99
  - Internal models based on these approaches would have been more capital demanding; the historical simulation is the most conservative, due to its abrupt changes
  - The analysis was not able to declare a winner, and it was limited to equity portfolios with a daily horizon
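A sketch of the Kupiec unconditional-coverage test described above; the 0/1 exception vector, the 1% coverage and the counts are illustrative assumptions (the formula needs 0 < x < T to avoid log(0)).

    import numpy as np
    from scipy.stats import chi2

    def kupiec_lr(exceptions, p=0.01):
        # exceptions: 0/1 flags of days on which the loss exceeded the VaR estimate
        T, x = len(exceptions), int(np.sum(exceptions))
        pi = x / T  # empirical exception frequency (assumes 0 < x < T)
        ll_null = (T - x) * np.log(1 - p) + x * np.log(p)    # binomial LL at theoretical coverage
        ll_emp = (T - x) * np.log(1 - pi) + x * np.log(pi)   # binomial LL at observed frequency
        lr = -2.0 * (ll_null - ll_emp)
        return lr, chi2.sf(lr, df=1)  # small p-value -> reject the model

    exc = np.zeros(250, dtype=int)  # illustrative: 3 exceptions in 250 days at 1% VaR
    exc[[10, 97, 180]] = 1
    print(kupiec_lr(exc))

The Christoffersen independence statistic could be added to this LR and the sum compared against a chi-square with two degrees of freedom, as in the joint test above.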
Basel Committee Framework:
This agreement is the result of the meetings of the G10 central bank governors to set up a common banking standard, ensuring the solvency of the industry and avoiding unfair competition based on different capital requirements imposed by national supervisory authorities. It therefore aims to make bank capital requirements uniform (it applies to consolidated accounts, hence it fosters the soundness of institutions controlled by foreign banking groups) and to prevent future bank crises (reverting the banks' trend to reduce capital). It is an ongoing work toward a best practice ensuring global stability for all financial institutions.

Basel I:
- Capital requirement computation rests on three different tiers, which must sum to at least 8% of risk-weighted assets, i.e. the minimum requirement bound. The measure is net of goodwill and of investments in non-consolidated banks or financial institutions
  o Tier 1 is the most important and most valued part of the capital. It consists of an upper tier (paid-up shares, disclosed reserves and certain general provisions; in the EU the use of these items is limited to specific risks, and they must be after tax and immediately available, with size, distribution and provisions shown separately) and a lower tier of innovative instruments, which must be: un-redeemable, permanent, callable at the issuer's will (after 5 years), junior to all other instruments, able to absorb losses without a liquidation procedure, and with remuneration that can be deferred or, if impossible, forfeited
  o Tier 2 is the first category of supplementary capital; it can be used for at most 50% of total capital. It consists of:
    - Undisclosed reserves, meaning reserves created using revenues from off-balance-sheet items; they must be as solid as the disclosed ones
    - Revaluation reserves (sizable in Germany and Japan): the difference between historical and current market value (they must come from a revaluation process), usable at 45% of their value (due to possible negative variations)
    - General provisions; they have been limited by the new accounting principles and can account for at most 1.25% of total capital
    - Hybrid capital instruments: they absorb losses without putting the bank under liquidation, remuneration can be waived or reduced, and they cannot be redeemed by the creditor (only with supervisory authorization). They must be unsecured (as all these instruments) and fully paid up
    - Subordinated term debt: the main difference from the previous item is that a liquidation procedure is needed for it to absorb losses. The conditions to be met are: maturity longer than 5 years, depreciation at 20% per year, junior status
  o Tier 3 cannot account for more than 250% of the Tier 1 allocated to market risk and cannot exceed 50% of the total capital requirement. It includes short-term subordinated debt and can be used to cover market risk only (it must have a maturity of at least 2 years, cannot be redeemed, carries a lock-in if the capital ratio falls below the minimum plus 20%, and must be reduced by loan-loss forecasts and by securities held for trading)
- Risk weights in the first Basel agreement are quite simple and linear:
  o 0% for cash, government bonds of OECD countries, claims on central banks
  o 20% for claims on central banks or countries outside the OECD, claims on banks with maturity below 1 year, and multilateral development banks
  o 50% for loans secured by mortgages on residential property
  o 100% for the others
  o For the OTC items, see page 555
- The major limits are:
  o Focus on credit risk only, with no consideration for currency or other risks (solved in 1996)
  o Poor differentiation of risk: the weights do not discriminate enough, and the classes they define aggregate too much, allowing regulatory arbitrage
  o Limited recognition of the link between maturity and credit risk, as well as of risk-mitigation instruments
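A minimal sketch of the Basel I calculation implied by the weights above: risk-weighted assets times the 8% minimum. The bucket names and exposure figures are invented for illustration.

    # simplified Basel I risk buckets from the list above; exposures in millions (invented)
    weights = {"oecd_government": 0.00, "bank_under_1y": 0.20,
               "residential_mortgage": 0.50, "corporate_and_other": 1.00}
    exposures = {"oecd_government": 300.0, "bank_under_1y": 150.0,
                 "residential_mortgage": 400.0, "corporate_and_other": 250.0}

    rwa = sum(weights[k] * exposures[k] for k in exposures)
    print(rwa, 0.08 * rwa)  # risk-weighted assets and the 8% minimum capital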
Basel committee II
- Pillar 1 (capital requirement) absorbed all the Basel I criticism and tried to improve the overall capital system. Note that the capital requirements are not fixed, but can vary following supervisory expectations both on the capital and on the risk perceived by internal rating methods (by multiplying them by a scalar factor)
  o Risk weights, under the standard approach, are tied to external agency ratings, with some improvements over Basel I
  o Internal ratings: certain banks are allowed to use internal procedures to identify and estimate the main components of risk; two versions exist (a sketch of the resulting capital charge formula is given after this section):
    - Foundation allows the bank to estimate only the PD, while EAD, LGD and maturity are set by the national authority
    - Advanced gives the right to use own estimates for all the risk components
    - It can be used if these conditions are met: at least 7 rating classes, with the first having a PD of 0.03%, and a default category
- Pillar 2 was designed to strengthen the supervisory capability to control the banking system. This approach ensures a proactive prudential behavior based on these principles:
  o Banks must set up a system of processes and techniques aimed at establishing overall capital adequacy
  o Supervisory authorities must review those processes, ensure the respect of the minimum capital requirement and promptly intervene to avoid capital deterioration
- Pillar 3 was designed to reduce opacity, to force the market to discipline unfair behavior and to promptly penalize banks taking more risk. Banks are thus forced to publish information regarding their economic and financial results, financial structure, risk management strategies, exposure to risks, accounting policies and corporate governance.
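As an illustration of how the internal-ratings approach turns PD, LGD and maturity into a requirement, here is a hedged sketch of the Basel II IRB capital formula for corporate exposures, written from the Committee's published specification (the supervisory correlation and maturity-adjustment constants below are recalled from that document, and the inputs are invented).

    import numpy as np
    from scipy.stats import norm

    def irb_corporate_k(pd_, lgd, m):
        # supervisory asset correlation, decreasing in PD
        w = (1 - np.exp(-50 * pd_)) / (1 - np.exp(-50))
        r = 0.12 * w + 0.24 * (1 - w)
        # conditional PD at the 99.9% systemic percentile
        cond = norm.cdf((norm.ppf(pd_) + np.sqrt(r) * norm.ppf(0.999)) / np.sqrt(1 - r))
        b = (0.11852 - 0.05478 * np.log(pd_)) ** 2  # maturity adjustment coefficient
        return lgd * (cond - pd_) * (1 + (m - 2.5) * b) / (1 - 1.5 * b)

    k = irb_corporate_k(pd_=0.01, lgd=0.45, m=2.5)  # illustrative inputs
    print(k, k * 12.5)  # capital charge per unit of EAD and the implied risk weight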
Insurance business

Solvency II
It will take 8 years to fully implement the new framework, started in 2005. There have been 5 quantitative impact studies (QIS) with wide participation. The main features are:
- The framework aims to enforce more consistent standards across the EU and to ensure that capital requirements are more reflective of the risks undertaken by insurers
  o Market-consistent approach for valuing all assets, with explicit provisions against pro-cyclicality:
    - Assets and liabilities [non-insurance ones] should be valued at arm's-length transaction prices, with reference to IFRS and no recognition of changes in own credit standing for financial liabilities
    - Assets and liabilities [insurance ones] should be valued with a model (best estimate)
    - The price must be the same irrespective of the investment strategy
    - Use of the risk-free rate plus a risk margin
  o The pricing model consists in computing technical provisions based on:
    - Best estimate: the present value of future cash flows, computed under prudent and realistic assumptions, including possible future management and consumer actions
    - No further prudential provisions are allowed (forbidden); all guarantees, discretionary benefits and options must be valued
    - Recoverables from reinsurance must be valued separately, with an allowance for credit risk
    - The choice of the risk-free rate poses some problems [EIOPA provides guidelines], such as:
      - Different currencies: solved using swap rates adjusted for credit risk, with QIS5 providing the spot rate curve for the main currencies
      - Different maturities need appropriate rates: for illiquid assets the duration is capped at 15 years for EUR and 30 for GBP, linearly reduced to 0 in 5 years, with adjusted rates; the spot curve is calculated using a certain basket of corporate bonds; the liquidity premium is capped at the maximum available on the market for similar cash flows without risk, and it should account for the nature of the liabilities (it is provided by the EU institution with the same frequency as interest rates); the risk-free rates are extrapolated using macroeconomic models to find the unconditional long-term rate
  o Better recognition of diversification, risk mitigation and loss-absorbing items
  o New supervisory approach, more proactive and EU-coordinated across countries and the economic reality of groups:
    - Complementary group supervision with primary responsibility (internal models can be used); the methods are accounting consolidation (to eliminate double counting) and deduction-and-aggregation, with recognition of diversification
    - Third-country relationships are articulated in three aspects: reinsurance supervision from equivalent third countries implies no difference in treatment; the group capital requirement from equivalent third countries can be used as is; for group solvency from equivalent third countries, their authority is accepted. Transitional arrangements are put in place for important countries that are not equivalent to EU regulation
- The reform aims to promote confidence and transparency in the insurance market
- It applies equally to reinsurers
- It wants to overcome the Solvency I limits:
  o Lack of harmonization
  o Inconsistency with the new IFRS accounting principles
  o Capital requirements neither transparent nor adequate to risks; furthermore, the focus was on backward-looking aspects instead of governance issues (good risk management)
  o No recognition of the economic reality of groups (only additional requirements, never reductions)

The formation process and directive structure:
The formation process and authority involvement is basically EU-led, applying the Lamfalussy structure: level 1 sees the involvement of the Parliament, Ecofin and the Commission helped by special committees, while levels 2 and 3 see the joint committees [ESMA, EIOPC, EBA] checking the law. The directive structure reflects Basel II/III's pillar approach.

- Pillar I covers the risk calibration of the financial requirements, hence it focuses on the quantitative issues
  o The SCR is calculated as the potential loss of value at a 99.5% confidence level over a 1-year time horizon, considering all quantifiable risks:
    - It should (for internal models) account even for the liquidity premium risk
    - Each risk is modeled individually with a modular approach, using factor or scenario methods
    - The individual risks are aggregated using a correlation matrix (provided by QIS5); a sketch of this aggregation is given after the Pillar I description
    - The loss-absorbing capacity of technical provisions (meaning future benefits: changes in assumptions) and of deferred taxes must be taken into account: the gross SCR is computed without considering benefits, the net SCR assumes bonuses reduce/absorb losses, and the difference between the two is the FDB (future discretionary bonuses)
    - Mitigation techniques, collateral and segregated assets are allowed under certain conditions (legally binding, actual transfer of risk)
    - Proportional recognition of techniques in force for less than 12 months is allowed under certain conditions (no rollover risk, counterparty risk taken into account)
  o The MCR (minimum) is kept within a corridor of 25%/45% of the SCR:
    - Legal certainty, auditability and safety-net protection
    - It is based on percentages applied to combinations of premiums and technical provisions
  o Solvency II allows choosing the level of simplicity or sensitivity used to assess risk, from the simplified method to the internal model:
    - The insurer must grant a sound framework for managing, controlling and measuring risk
    - The authority can still request specific parameters for internal models
  o The own-funds categorization:
    - Basic own funds: the excess of assets over liabilities, subordinated liabilities and adjustments (expected profits, net deferred taxes and restricted reserves); they can be used for all tiers of the capital requirements
    - Ancillary own funds (with prior supervisory approval): off-balance items that can be called up to absorb losses, such as unpaid shares, letters of credit and other legally binding documents; they can be used for tiers 2 and 3
  o The capital eligible to meet the SCR must be at least 50% tier 1; hybrid instruments can be at most 20% of tier 1, and tier 3 cannot account for more than 15%
  o The capital eligible to meet the MCR must be at least 80% tier 1, with no tier 3 and no ancillary funds available
  o The criteria used to allocate own funds into tiers are: subordination, loss absorbency, duration, freedom to redeem, absence of encumbrances and of mandatory servicing costs
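A minimal sketch of the square-root aggregation of stand-alone module SCRs through a correlation matrix, as referenced above; the three-module example and the correlation values are invented placeholders, not the QIS5 calibration.

    import numpy as np

    def aggregate_scr(scr_modules, corr):
        # SCR = sqrt(s' C s): diversified requirement across risk modules
        s = np.asarray(scr_modules)
        return float(np.sqrt(s @ corr @ s))

    scr = [120.0, 80.0, 40.0]              # e.g. market, default, life (illustrative)
    corr = np.array([[1.00, 0.25, 0.25],
                     [0.25, 1.00, 0.25],
                     [0.25, 0.25, 1.00]])  # placeholder correlations
    print(aggregate_scr(scr, corr))        # below sum(scr): the diversification benefit

Because the correlations are below one, the aggregate requirement is smaller than the sum of the stand-alone charges, which is how the standard formula recognizes diversification.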
- Pillar II covers the new supervisory relationship and the set-up of governance
  o Authorities are entitled to take action to restore critical situations when the SCR gets close to the MCR, so that even in extreme situations policyholders' interests are preserved:
    - There is a convergence of supervisory standards
    - Capital add-ons are authority demands to adjust the risk assessment
    - Authorities can draft and implement measures, so that they can promote fair conduct, act against breaches of EU law and act in emergency situations
  o To take cyclical effects into account, the authority can extend the recovery period, allow the inherent rebalancing between SCR and available capital, and rely on the liquidity premium and the equity symmetric dampener
  o There are quantitative risk-management standards, so that risk management will play a central role in the company:
    - Internal control system, internal audit and actuarial functions
    - ORSA (own risk and solvency assessment): specific risk profile, compliance with the financial requirements, and significance of the deviation between the two
  o There are new disclosure requirements bringing market discipline to bear on insurers
- Pillar III covers the opening up to market discipline and the reporting to the authority
  o The disclosure to the market must be at least annual and cover business performance, risk profile, system of governance, capital management and valuation purposes
  o The regular supervisory report must be narrative with quantitative data, every 3 years, but it can be requested annually
  o Annual and quarterly quantitative templates, with a lag of 14 weeks and 5 weeks respectively

The main implementation problems and open issues are: the liquidity premium, the currency curves and how to allocate them to liabilities; too many uncertainties and different interpretations; QIS5 recognized 92% of own funds as tier 1; group diversification frees about 20% of own funds on average, which creates a huge difference with non-EU entities; eliminating excessive complexity, reducing excessive volatility and reducing the penalization of long-term business with guarantees (revising the stresses for interest and spread risk at long durations); improving the calculation criteria (contract boundaries, deferred taxes and liabilities); finding compromises on political aspects (illiquidity premium, future profits, diversification in the risk margin and at group level); designing appropriate transitional provisions (disclosure and reporting, internal models); and properly managing the new tasks of the supervisory authority (new tools, culture and powers). The impact of the new discipline will be huge: the overall business changes, with extensive impact on governance, strategy, expectations of the authority and a new competitive scenario.

Some example:
- Example of the aggregation of modules for SCR calculation purposes, based on the market risk case, which is divided into several categories:
  o Market investments must have a minimum rating of BBB as an overall rule
  o The equity risk module:
    - Speculative holdings are treated depending on the market where they are listed and on the CEIOPS advice
    - A duration-based approach (22%) applies to certain lines of business
    - Symmetric adjustments are made by EIOPA on a benchmark of listed companies and are sufficiently public. The adjustment is capped between -/+10% and is computed with a formula that takes into consideration the current level and the weighted average of daily data over a window of 36 months (a sketch of this adjustment is given after the module list)
    - Strategic participations are stressed at 22%
  o The interest rate module:
    - All interest-sensitive assets should be stressed, as well as the insurance liabilities in their present-value change
    - There is no stress test on the volatility of interest rates
  o The spread risk module covers any change in the credit spread over the risk-free rate:
    - It is calculated on separate items (bonds, credit derivatives and ABS on loans)
    - Derivatives used for risk mitigation are excluded
    - Bonds and ABS are valued using a table based on rating and a conversion factor (larger for ABS)
    - Instruments of EU issuers with AAA or AA rating are excluded, and the others have lower capital requirements, as do AAA-rated corporates
  o The currency risk module:
    - It can be calculated on net or gross positions
    - It considers a maximum variation of 25%
    - No diversification benefits, but the EU currencies are advantaged with lower stress levels
  o The property risk module covers the risk related to real estate:
    - 25% for all direct property investments; investments in real-estate companies are subject to the equity module
    - No geographical diversification
  o The concentration risk module considers the risk related to a lack of sufficient diversification or to the exposure to a large default risk (single issuer):
    - It applies to all categories of assets, considering only the exposures in excess of certain thresholds
    - It depends on the exposure and on the rating of the counterparty
    - Intra-group participations are excluded, as are risks borne by policyholders or already considered in the counterparty default module
    - It does not consider geographical concentration
    - EU issuers with AAA and AA ratings carry no capital requirement
  o The liquidity risk module is new and allows for changes in the illiquidity premium:
    - Negative correlation with spread risk (-50%)
  o The counterparty default risk module considers the possibility of unexpected losses, or of a deterioration in credit standing, not already covered by the spread risk module:
    - Type I: undiversified and rated exposures; collateral reduces the risk (adjusting for risk), and LGD and PD are given
    - Type II: diversified and unrated exposures (policyholder debtors, mortgage loans)
    - The correlation granted between these two types is 75%
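A sketch of the equity symmetric adjustment referenced in the equity module above, under the commonly cited specification: half the deviation of the current index level from its 36-month weighted average, net of a reference percentage, capped at +/-10%. The 8% reference level and the index values below are assumptions for illustration.

    def symmetric_adjustment(current_index, avg_36m, reference=0.08):
        # equity dampener: half the gap between current level and 36-month average,
        # net of a reference percentage, capped at +/-10% (assumed specification)
        raw = 0.5 * ((current_index - avg_36m) / avg_36m - reference)
        return max(-0.10, min(0.10, raw))

    # illustrative: market 20% above its 36-month average
    print(symmetric_adjustment(1200.0, 1000.0))  # 0.06 -> equity stress raised by 6 points

The symmetric design is what makes the dampener counter-cyclical: the stress rises after rallies and falls after crashes.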
Difference between Banks and Insurer
The relationship between insurers and banks is getting closer and closer thanks to: new fashionable strategies to manage the reserves, taken from the asset management industry (actively managing the reserves and increasing the diffusion of unit/index-linked instruments; these activities are usually outsourced); bigger cross-selling activity, thanks to innovative products sold by banks and insurers (life segment, where the insurer can use the larger bank customer base to increase its own business), to the creation of new institutions providing insurance, or to the introduction of financial features into classic insurance policies; and accounting principles getting closer (financial conglomerates directive).

However, the nature of the risks and liabilities of each type of financial institution is still different, even after a great convergence, due to specific aspects: demographic issues, scale of operations and structure of liabilities:
- The main point is the difference between speculative and pure risk, where the latter is the one borne by insurers and is the most characteristic difference
- The perspective of the clients is different: sacrificing a small certain wealth to avoid the possibility of a big uncertain loss. The source of uncertainty in this industry is typical, because it is discontinuous in nature and the typical individual cannot hedge it on his own
  o This idea comes from the functional perspective, whereby any institution exists to respond to specific needs; solving those needs requires a specific function (less volatile), which requires a structure to be performed, and competition forces this structure toward the most efficient one
  o According to this theory, insurance facilitates the entrance of risk-averse investors into riskier markets and guarantees wealth in specific states of nature
- The accounting principles still show some differences: the leading principle in insurance is cost basis, not market price; furthermore, many scholars believe that a full harmonization would have adverse impacts on the insurance business
- The securitization process has touched the insurance industry less, due to the specific know-how required
- The supervisory authorities are different, even if the new framework, with the creation of a unique joint committee, aims to reduce/eliminate this difference of treatment
- The distribution channels are different and need different expertise; even if a unique channel could be very efficient, the two distribution forces are still too different (especially for non-life instruments) and might require a training cost bigger than the possible synergies in some fields
- The liabilities have structural differences due to the underlying risk (the pure one) and to the uncertainty on both the timing and the future exposure. Furthermore, the theoretical tools used by insurers for managing and transforming their own liabilities are different, since it is not possible to replicate the cash flows of the insurer's liabilities or to exploit the diversification theorem to reduce risk

The tools used are based on statistical rules, namely risk pooling (law of large numbers and central limit theorem) [the accidental losses to which the group is subjected become predictable within limits], to reduce the underwriting risk (the risk that the actual payoff looks different from the expected one; it is a speculative risk), divided into:
o Cash-flow risk: related to any error in the estimation of probabilities or losses
o Timing risk: related to errors in estimating the actual timing
- The goal of the insurer is to build a good liabilities portfolio, not a good asset one; hence the core risk is the underwriting risk, not the mismatch one (as the crisis has shown, the key problem in banks is related to valuation)
- The hypothesis behind this solution is that by collecting a big sample we ensure normality and a convergence of the first and second moments between sample and population
  o However, the sample may not have a unique payoff diagram (homogeneity), the loss distribution may not be known, and there may be correlation among the risks (reducing the benefits of pooling)
- The presence of underwriting risk (different from the financial one) is crucial and rests on two new risk factors: the insurance dimension and the timing of the future outflows
  o The bigger the insurer, the lower the risk, especially in the presence of low correlation among classes of risks (which is the opposite for financial risk)
  o The timing problem is reduced by taking bigger samples to narrow the window; but since this is not perfect, insurers cannot fully hedge their exposure (investment risk): all their calculations are computed over expectations
- The insurer can use risk spreading (insuring different groups) to reduce the risk exposure, or reinsurance agreements
- There is no secondary market, due to structural characteristics (the holder cannot sell his policy and will receive the payment only from the issuer); furthermore, liabilities and assets are not perfectly matched, hence the insurer cannot sell at any time the asset with the related liability
  o This problem is related to the lower rate of securitization in this industry (lack of know-how); however, some risks can be pooled together and sold in the market, providing instruments lowly correlated with the market
  o As a consequence of less securitization, the insurance industry did not develop an originate-to-distribute business; on the contrary, insurers have relied less on external capital for daily activities
- The usage of hedging instruments can destroy value, since it is impossible to define an a-priori hedging strategy; it is better to manage the valuation process of insurance by taking specific assumptions about future dynamics
- The VaR approach performs poorly in insurance, due to the lack of a linear relationship between risk and maturity (the rollover and the expected duration pose severe problems in assessing risk)
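A minimal simulation of the pooling argument above: the standard deviation of the average claim shrinks roughly as 1/sqrt(n) when risks are independent, while positive correlation caps the benefit. The Gaussian claim distribution and all parameters are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(3)

    def avg_claim_std(n, rho=0.0, mu=100.0, sigma=30.0, n_sims=5000):
        # equi-correlated Gaussian claims via a one-factor construction
        common = rng.normal(size=(n_sims, 1))  # shared shock
        idio = rng.normal(size=(n_sims, n))    # individual shocks
        claims = mu + sigma * (np.sqrt(rho) * common + np.sqrt(1 - rho) * idio)
        return claims.mean(axis=1).std()

    for n in (10, 100, 1000):
        print(n, round(avg_claim_std(n), 2), round(avg_claim_std(n, rho=0.2), 2))
    # independent risks shrink like sigma/sqrt(n); correlation leaves a floor of sigma*sqrt(rho)

This is exactly the point made above: correlation among the pooled risks erodes the benefit of the law of large numbers, which is the opposite of what larger scale does for purely idiosyncratic risks.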
Some definitions regarding the premium and its calculation:
o Fair premium: the one that covers the underwriting risk, the opportunity cost of the allocated capital and the administrative costs. It is discounted back, since the premium is paid in advance. There is a trade-off between the need to be competitive, so as to collect enough data, and the need to compensate the capital for the unexpected loss
o Pure premium: the one that covers the expected loss
o Expected claim cost: the weighted average of the future payoffs [remember the limitations compared with a utility-function approach]
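A sketch of the decomposition just defined: the pure premium is the expected loss, and the fair premium adds the administrative costs and the cost of the allocated capital, discounted because the premium is paid in advance. All probabilities, losses and loadings below are invented.

    def premiums(probs, losses, admin_cost, capital, capital_cost, rf, horizon=1.0):
        # pure premium = expected claim cost; fair premium adds loadings and discounts
        pure = sum(p * l for p, l in zip(probs, losses))
        loading = admin_cost + capital * capital_cost  # costs + return on allocated capital
        return pure, (pure + loading) / (1 + rf) ** horizon

    pure, fair = premiums(probs=[0.95, 0.04, 0.01], losses=[0.0, 5_000.0, 50_000.0],
                          admin_cost=60.0, capital=2_000.0, capital_cost=0.10, rf=0.02)
    print(pure, round(fair, 2))  # 700.0 and the discounted fair premium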
A new risk paradigm
Problems related to the crisis:
- Risk transparency inside the corporations, between risk management (RM) and the board, was insufficient:
  o Weaknesses in the basic risk infrastructure: significant data availability and quality challenges, with an overreliance on complex mathematical models rather than insight into potential future risks
  o Non-exhaustive quantification of risks and a silos view of risk types; especially in the board room there was limited understanding of the liquidity, capital and accounting implications
- Risk ownership: some financial institutions were unable to understand and properly manage their risks. This inadequacy was caused by:
  o Major flaws in strategy: decisions to move into new businesses not based on a profound understanding of the potential risks inherent to those businesses
  o Own skills in the organization not considered or overestimated, leading to a false assessment of risk appetite/risk-taking capacity
  o First and foremost: lack of top-management understanding in the board
  o Market bias: "if my peers engage in this it cannot be wrong" — anxious not to miss the opportunity
- Governance and structure limitations blocked institutions in their reaction to the crisis:
  o Insufficient information flow: parts of the organization did expect severe repercussions but did not inform the board in time
  o No clear risk responsibilities for key topics around capital (too much socialization of the responsibility); liquidity falling between CRO and CFO, and limited integration with the business units
  o Businesses not ready to act: since accountability for risk evaporated, the decision process was too slow
  o Skill deficiencies across the entire organization, most prominently including the board
- Culture and incentives facilitated adverse actions:
  o Misaligned incentives at all levels, from shareholders to the individual desk, incentivizing risk taking in general and market conformity in particular, rather than mitigation
  o A culture of fear to "take personal risk" and expose oneself through bad messages and/or action
  o No consistent and explicit risk culture: the lines of defense, from front-line mitigation to capital, were not laid out, established or working
  o In particular, a lack of appetite early on to take bold managerial action and divert from the mean
- Regulations were treated as adversaries rather than facilitators of stable markets:
  o Compliance behavior focused on the fulfillment of requirements (e.g., Basel II) rather than on proper risk management overall
  o Regulators' skills kept at a distance, "trained to the extent deemed helpful" but not viewed as a partner in stability for effective markets
  o Systematic arbitrage of regulation, especially the optimization of regulatory capital
  o Reactive approach to regulation: shaping both policies and skill building with regulators for the short-term benefit of the bank

All these problems have caused:
- A loss of confidence among risk managers, due to the lack of understanding of discontinuities and the lack of a general idea of which model should be preferred (which balance between bold and conservative?)
- A new business environment where the authorities will reshape everything done, with increasing pressure to bring risk management to best-practice levels
- A new RM approach is emerging, more focused on efficiency in communicating and assessing risk; there will be a new risk paradigm

How the new RM activity looks after the crisis:
o Risk transparency and insight: from models to insight and foresight, and from daily VAR to structural risks. Greater levels of risk transparency will not come from more complex modeling, but from understanding the positions and the portfolio, taking a forward-looking approach to risk assessment and its accounting implications, and exercising management judgment on market behaviors. Structural risks will constitute a completely new risk category:
  - Risk identification & understanding: build insight into all relevant risks to form a comprehensive view, ensure an adequate aggregation using properly estimated correlation matrices, and understand the accounting and P&L consequences
  - Risk foresight: develop early-warning KPIs for structural risks and bubbles
  - Risk modeling: test the resilience of the business against specific scenarios
  - Risk tracking: build a new MIS focused on decision support
  - Risk infrastructure: ensure adequate robustness
o Risk ownership: management needs to build an explicit strategy accounting for which risks are undertaken, how to mitigate their effects and what the institution's risk appetite is:
  - Natural ownership: it is crucial to undertake only risks that are understood and to participate only in markets where there is a competitive advantage in managing risk; this requires a comprehensive analysis at business-unit level across all types of relevant risks, a gap analysis against the business's inherent risks, and regular benchmarking against competitors
  - Lines of defense: it is crucial to have an in-house process to assess risk and to immediately take action in stressed situations, considering the resilience of the business (diversification and flexibility), the own capability to assess risk (front-line skills are central), and the financial strength to withstand losses (forward-looking and peer comparison)
  - Risk capability and appetite: define specific action programs to increase robustness by implementing no-regret moves, to become a risk leader in selected areas, and to improve resilience by creating extraordinary flexibility to benefit from risk events; select those risks that are generally acceptable against the target business model and the actual/potential risk-ownership advantages; dynamically set the acceptable risk exposure and the necessary mitigation depending on the market environment, to avoid unwanted risks and exploit opportunities; and define actions to shape the risk/return profile for optimal upside while reducing the
downside impact in different market situations
o Governance and structure must be shaped to mirror the new paradigm, creating formal mechanisms to foster debate about the evolution of market scenarios and to fully leverage the information within the organization:
  - At the board level: there must be an active (and well-defined) role in risk management, properly checking that the overall risk stays within the target
  - At the managerial level: the decision-making process must be less deterministic and look forward with a scenario approach; it should aim to optimize the risk-taking process toward a higher transparency in risk identification and a clear strategic view on the evolution, to properly plan/budget the business
  - At the level of daily institutional activity: it should be less cyclically dependent, hence more forward looking (more macro oriented) and to be