BYJ commercial bank has $100 million of retail exposures. The 1-year probability of default averages 2% and the recovery rate averages 60%. If the correlation parameter is estimated at 0.1, what will be the 1-year unexpected loss at 99.9% confidence?

A $7.68 million
B $8.01 million
C $4.32 million
D $12.8 million

The correct answer is: C

The 99.9% worst-case default rate (WCDR) is obtained from the Vasicek single-factor model:

$$ \text{WCDR} = V(0.999, 1) = N\left[ \frac{N^{-1}(0.02) + \sqrt{0.1}\, N^{-1}(0.999)}{\sqrt{1-0.1}} \right] = 0.128 $$

This shows that the 99.9% worst-case default rate is 12.8%. The 1-year unexpected loss at 99.9% confidence is then:

$$ \text{UL} = (\text{WCDR} - \text{PD}) \times \text{LGD} \times \text{EAD} = (0.128 - 0.02) \times (1 - 0.6) \times 100 = \$4.32 \text{ million} $$

*User Question: Can you please clarify the nature of this underlying formula V(0.999, 1) = N[{N⁻¹(0.02) + √0.1 × N⁻¹(0.999)} / √(1 − 0.1)] = 0.128, and what does WCDR represent in the latter UL equation?
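The underlying formula is the Vasicek one-factor (Gaussian copula) model used in the Basel IRB approach, and WCDR is the worst-case default rate it produces. A minimal numerical sketch (variable names are ours) reproducing the figures above:

```python
from statistics import NormalDist

N = NormalDist()  # standard normal: N.cdf is N(.), N.inv_cdf is N^-1(.)

pd_ = 0.02      # average 1-year probability of default
rho = 0.1       # copula correlation parameter
conf = 0.999    # confidence level
lgd = 1 - 0.60  # loss given default = 1 - recovery rate
ead = 100.0     # exposure at default, $ millions

# Worst-case default rate: the default rate that will not be
# exceeded with 99.9% confidence under the one-factor model
wcdr = N.cdf((N.inv_cdf(pd_) + rho**0.5 * N.inv_cdf(conf)) / (1 - rho)**0.5)

# Unexpected loss = loss beyond the expected loss (PD x LGD x EAD)
ul = (wcdr - pd_) * lgd * ead

print(round(wcdr, 3))  # 0.128
print(round(ul, 2))    # 4.33 (4.32 in the solution, which rounds WCDR to 12.8% first)
```

Note that computing UL with the unrounded WCDR gives 4.33; the solution's 4.32 comes from rounding WCDR to 12.8% before the final multiplication.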

A random sample of 50 FRM exam candidates was found to have an average IQ of 125. The standard deviation among candidates is known (approximately 20). Assuming that IQs follow a normal distribution, carry out a statistical test (5% significance level) to determine whether the average IQ of FRM candidates is greater than 120. Compute the test statistic and give a conclusion.

A Test statistic: 1.768; Reject H0
B Test statistic: 2.828; Reject H0
C Test statistic: 1.768; Fail to reject H0
D Test statistic: 1.0606; Fail to reject H0

The correct answer is: A

The first step is to formulate H0 and H1:

H0: μ = 120
H1: μ > 120

Note that this is a one-sided test because H1 explores a change in one direction only. Under H0,

$$ \frac{\bar{x} - 120}{\sigma / \sqrt{n}} \sim N(0, 1) $$

Next, compute the test statistic:

$$ \frac{125 - 120}{20 / \sqrt{50}} = 1.768 $$

Since P(Z > 1.6449) = 0.05, the critical value is the upper 5% point of the standard normal distribution, i.e., 1.6449. Since 1.768 is greater than 1.6449, it lies in the rejection region. As such, we have sufficient evidence to reject H0 and conclude that the average IQ of FRM candidates is indeed greater than 120.

Alternatively, we could go the "p-value way":

P(Z > 1.768) = 1 − P(Z < 1.768) = 1 − 0.96147 = 0.03853, or 3.853%

This probability is less than 5%, meaning that we have sufficient evidence against H0. This approach leads to the same conclusion.

*User Question: Can you please explain how we got P(Z > 1.6449) = 0.05?
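On the user's question: P(Z > 1.6449) = 0.05 is simply the defining property of the upper 5% point of the standard normal distribution; 1.6449 is the value z at which the CDF reaches 0.95, i.e. the inverse CDF evaluated at 0.95. A quick sketch verifying all the numbers above using Python's standard library:

```python
from math import sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal

xbar, mu0, sigma, n, alpha = 125, 120, 20, 50, 0.05

z = (xbar - mu0) / (sigma / sqrt(n))  # test statistic
crit = N.inv_cdf(1 - alpha)           # upper 5% point: P(Z > crit) = alpha
p_value = 1 - N.cdf(z)                # one-sided p-value

print(round(z, 3))        # 1.768
print(round(crit, 4))     # 1.6449
print(round(p_value, 4))  # 0.0385 -> less than 0.05, so reject H0
```

Because sigma is known (rather than estimated from the sample), the z-test applies rather than a t-test.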

The standard deviation of the daily portfolio value changes is given as 0.0231, and its mean as 0.0012. Given that there are 250 trading days in a year, what is the annual 99% VaR for the portfolio according to the delta-normal model?

A 0.78
B 0.83
C 0.58
D 0.63

The correct answer is: B

According to the delta-normal model, VaR is given by:

$$ \text{VaR}={ \mu }_{ \text{P} }+{ \sigma }_{ \text{P} } { \text{U} } $$

Since we are dealing with 99% VaR, it implies that

$$ \text{U}=-2.326 $$

So the daily VaR is given by:

$$ \begin{align*} \text{VaR}=& 0.0012+0.0231\times-2.326=-0.05253 \\ \text{Annual VaR}=& \sqrt{250}\times-0.05253=-0.8306 \end{align*} $$

Reported as a positive loss figure, the annual 99% VaR is 0.83.

*User Question: While scaling up to find an annual VaR, shouldn't the mean be multiplied by 250 and the standard deviation by the square root of 250? We would use the property that the sum of n IID variables with mean R and standard deviation D has mean n × R and standard deviation √n × D. This gives a normal distribution with mean 250 × 0.0012 = 0.3 and standard deviation √250 × 0.0231 = 0.3652. Applying U = −2.33, we get a VaR of 0.5509. How can we scale up a value of a normal distribution at a particular percentile directly by the square root of time, as is done in the solution?
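On the user's point: the two scalings differ only in how the mean drift is handled. The solution's shortcut scales the entire daily VaR (mean term included) by √250, which is exact only when the mean is zero; scaling the mean by 250 and the volatility by √250, as the user suggests, is the statistically consistent treatment of i.i.d. daily changes. A sketch comparing the two, using the values from the question:

```python
from math import sqrt

mu_d, sigma_d, days, z = 0.0012, 0.0231, 250, -2.326

# Solution's shortcut: scale the whole daily VaR by sqrt(250)
var_daily = mu_d + sigma_d * z          # about -0.0525
var_shortcut = sqrt(days) * var_daily   # about -0.8306

# User's approach: mean scales with time, volatility with sqrt(time)
mu_a = days * mu_d                      # 0.30
sigma_a = sqrt(days) * sigma_d          # about 0.3652
var_consistent = mu_a + sigma_a * z     # about -0.5496

print(round(var_shortcut, 4))
print(round(var_consistent, 4))
```

The nonzero daily mean makes the two answers differ materially over a year; many exam solutions nonetheless apply the square-root-of-time shortcut to the whole VaR figure, as this one does.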

A fund manager has the option to buy the following bonds:

I. A bond with a 10% coupon and a tenure of 15 years
II. A bond with a 10% coupon and a tenure of 10 years

The fund manager expects interest rate volatility to increase and wants to compose a portfolio that will help him generate maximum return from the volatility. The fund manager must buy:

A The bond with a tenure of 15 years
B The bond with a tenure of 10 years
C Both bonds, since they react in a similar manner to interest rate volatility
D Both bonds, since the diversification effect will help him generate maximum return

The correct answer is: A

The larger the duration, the greater the impact of interest rate volatility on the portfolio. Holding the coupon constant, bonds with longer tenures have higher durations. Therefore, the bond with a tenure of 15 years will have a higher duration than the bond with a tenure of 10 years, and in order to generate maximum return from the interest rate volatility, the fund manager must invest in the bond with a tenure of 15 years.

*User Question: A bit confused between the earlier Q1178 and this Q1179. Q1178 stated that a longer duration (locked-in tenure and fixed coupon) will minimize the impact of an interest rate change, i.e., lower risk. Q1179 asks about a portfolio that can maximize return from interest rate volatility; a longer tenure or duration would mean the return of the 10% coupon rate is locked in and the investor shouldn't be exposed to any interest rate fluctuation until the bond matures in 15 years. So how do we maximize return from interest rate volatility?
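The tenure-duration relationship can be made concrete numerically. A sketch computing modified durations for the two bonds, assuming annual coupon payments and an illustrative flat yield of 10% (the yield is our assumption; it is not given in the question, but at 10% both bonds price at par):

```python
def modified_duration(coupon_rate, years, ytm, face=100.0):
    """Macaulay duration divided by (1 + ytm), annual compounding."""
    cashflows = [coupon_rate * face] * years
    cashflows[-1] += face  # add principal repayment at maturity
    pvs = [cf / (1 + ytm) ** t for t, cf in enumerate(cashflows, start=1)]
    price = sum(pvs)
    macaulay = sum(t * pv for t, pv in enumerate(pvs, start=1)) / price
    return macaulay / (1 + ytm)

d15 = modified_duration(0.10, 15, 0.10)  # ~7.61
d10 = modified_duration(0.10, 10, 0.10)  # ~6.14
print(round(d15, 2), round(d10, 2))  # the 15-year bond has the higher duration
```

The 15-year bond's price therefore moves roughly 7.6% per 1% yield change versus about 6.1% for the 10-year bond, which is why it offers the larger gains (and losses) when rates move.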

Jack Marconi is an equity strategist at Gandhara Investment and is evaluating the performance of four large-cap equity portfolios: Azgard, Lambda, Tricky, and Jackpot. As part of his analysis, Jack computes the Sharpe ratio and the Treynor measure for all four funds. Based on his findings, the ranks assigned to the four funds are as follows:

$$ \begin{array}{|c|c|c|} \hline Fund & Treynor\quad Measure\quad Rank & Sharpe\quad Ratio\quad Rank \\ \hline Azgard & 1 & 4 \\ \hline Lambda & 2 & 3 \\ \hline Tricky & 3 & 2 \\ \hline Jackpot & 4 & 1 \\ \hline \end{array} $$

The difference in rankings for Funds Azgard and Jackpot is most likely due to:

A Different benchmarks used to evaluate each fund's performance
B A difference in risk premiums
C A lack of diversification in Azgard Fund as compared to Jackpot Fund
D None of the above

The correct answer is: C

The most likely reason for the difference in rankings is the lack of diversification in Azgard Fund. The Sharpe ratio measures excess return per unit of total risk, while the Treynor measure measures excess return per unit of systematic risk. Since Azgard Fund performed well on the Treynor measure (rank 1) but poorly on the Sharpe measure (rank 4), the fund appears to carry a large amount of unsystematic risk, meaning it is not well diversified and systematic risk alone is not the relevant risk measure for it.

*User Question: Hello, I believe the explanation is not correct (the answer is correct, though). Azgard is the fund that performed poorly on the Treynor ratio and performed well on the Sharpe ratio (not vice versa, as you state it). Therefore, Azgard carries a greater amount of systematic risk (its beta is greater, therefore its Treynor measure is low), or it was not well diversified. Please comment, because the current explanation is confusing; e.g., if a portfolio performs well on the Treynor measure, it means it is WELL diversified.
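Note that, per the table, Azgard is ranked 1 (best) on Treynor and 4 (worst) on Sharpe, so the explanation's direction is consistent with the data. A toy numerical example (the figures are ours, not from the question) shows how a fund with a modest beta but large total, mostly idiosyncratic, volatility can rank first on Treynor yet last on Sharpe, which is exactly the Azgard pattern:

```python
rf = 0.02  # risk-free rate (assumed)

# (return, beta, total volatility) - illustrative numbers only
ret_a, beta_a, vol_a = 0.12, 0.8, 0.30  # "Azgard": low beta, high total risk
ret_j, beta_j, vol_j = 0.12, 1.2, 0.15  # "Jackpot": higher beta, well diversified

treynor_a = (ret_a - rf) / beta_a  # 0.125
treynor_j = (ret_j - rf) / beta_j  # ~0.083
sharpe_a = (ret_a - rf) / vol_a    # ~0.333
sharpe_j = (ret_j - rf) / vol_j    # ~0.667

# "Azgard" wins on Treynor (systematic risk only) but loses on Sharpe
# (total risk), because unsystematic risk inflates its total volatility.
print(treynor_a > treynor_j, sharpe_a < sharpe_j)  # True True
```

Ranking well on Treynor alone says nothing about diversification; it is the gap between a good Treynor rank and a poor Sharpe rank that signals a large unsystematic risk component.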