This one-day workshop was held at UNSW on Monday 2nd July, 2012.
Besides the 50+ registered participants, colleagues and students from the School of Mathematics and Statistics, the School of Economics and the School of Risk and Actuarial Studies attended selected talks on the day. Feedback confirms that the workshop was highly successful. Participant comments include “We all really enjoyed the day”, “thank you so much for organising such a wonderful event”, and “the workshop today was very impressive”.
The organisers believe that the format of the workshop (a small number of talks of fixed duration, with ample breaks for participants to meet and interact) was ideal, and we intend to organise similar events on quantitative aspects of risk in the future.
Professor Darinka Dentcheva (Stevens Institute of Technology, NJ, USA)
Professor Ben Goldys (The University of New South Wales/ The University of Sydney)
Associate Professor Spiridon Penev (The University of New South Wales)
Dr Gareth Peters (The University of New South Wales)
Dr Pavel Shevchenko (Principal Research Scientist at CSIRO, Adjunct Professor at The University of New South Wales)
Professor Andrzej Ruszczynski (Rutgers University, NJ, USA)
Professor Marek Rutkowski (The University of Sydney)
A talk, "Challenges in everyday credit risk management", was delivered by James O'Donnell, Senior Manager - Quantitative Analytics, The Westpac Group.
Defining suitable risk measures, and evaluating and estimating risk from past data, are current issues for many industries. Assessing risk is a central activity in areas as diverse as the financial industry, environmental safety, and medicine. Whenever decisions must be taken under uncertainty, risk is involved, and the decisions should be taken only after the risk has been evaluated and estimated. Risk is broadly defined as a quantitative description of one’s preferences with respect to a set of uncertain (random) outcomes.
Recently, new risk measures and methods for their evaluation and optimization have been introduced. In particular, risk measures used in finance are required to be coherent. An interesting class of such measures are the so-called higher-order coherent tail risk measures, which have advantages over their predecessors such as expected shortfall. New methods for forecasting time series in an environment of high volatility have also been developed. Optimizing risk under constraints leads to challenging optimization problems; these constraints are often of stochastic dominance type and require special treatment. Further statistical problems arise when confidence regions are constructed and hypotheses are tested. Because of the nonlinearity of the new coherent risk measures, the solutions to the resulting statistical problems are more delicate. To a great extent, such solutions and their statistical properties are new to both optimization and statistics.
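To make the notion of a coherent risk measure concrete, the following is a minimal sketch of expected shortfall, the best-known example mentioned above. The function name and the simulated loss sample are purely illustrative, not part of any workshop talk.

```python
import numpy as np

def expected_shortfall(losses, alpha=0.95):
    """Expected shortfall (ES) at level alpha: the average of the worst
    (1 - alpha) fraction of losses.  ES is a coherent risk measure,
    unlike value-at-risk (VaR), which can fail subadditivity."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)   # VaR threshold at level alpha
    tail = losses[losses >= var]       # losses at or beyond the threshold
    return tail.mean()

# Illustration on a simulated standard-normal loss sample.
rng = np.random.default_rng(0)
sample = rng.standard_normal(100_000)
es = expected_shortfall(sample, alpha=0.95)
```

For a standard normal loss, the 95% ES is about 2.06, noticeably larger than the 95% VaR of about 1.64, reflecting that ES averages over the whole tail rather than reading off a single quantile.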
A closely related problem is that of risk allocation. The allocation mechanism should provide an incentive to manage risk better. It is also desirable that the allocation procedure share the same risk factors/drivers across the different levels. In the same way that a coherent risk measure is defined through a set of axioms, a coherent allocation principle (the so-called Euler principle) can be defined and used in practice.
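As a sketch of the Euler principle under expected shortfall: each sub-portfolio's contribution is its average loss on the scenarios where the total loss exceeds the portfolio VaR, and the contributions sum exactly to the portfolio ES (the "full allocation" property). All names and the simulated data below are illustrative assumptions.

```python
import numpy as np

def euler_es_allocation(X, alpha=0.95):
    """Euler allocation of expected shortfall across sub-portfolios.
    X has shape (n_scenarios, n_units); the total loss is the row sum.
    A unit's contribution is its mean loss on the tail scenarios where
    the total exceeds the portfolio VaR; contributions add up to the
    portfolio ES."""
    total = X.sum(axis=1)
    var = np.quantile(total, alpha)
    tail = total >= var
    return X[tail].mean(axis=0)

# Two correlated business units, the second riskier (hypothetical numbers).
rng = np.random.default_rng(1)
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 2.0]], size=200_000)
contrib = euler_es_allocation(X, alpha=0.95)
portfolio_es = contrib.sum()
```

The riskier unit receives the larger contribution, which is exactly the incentive property the allocation mechanism is meant to deliver.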
Another important recent activity in inference about risk is the development of approaches that combine different data sources to estimate risk. This is important in practical applications where data is often scarce but additional information exists in the form of expert opinions. Bayesian inference provides a natural framework for combining data with expert opinions.
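A minimal sketch of this idea, assuming a conjugate normal-normal model: the expert's opinion enters as a prior on the mean loss, the observed losses as the likelihood, and the posterior mean is a precision-weighted average of the two, so scarce data pulls the estimate toward the expert view. The function and all numbers are hypothetical illustrations.

```python
import numpy as np

def combine_expert_and_data(prior_mean, prior_sd, data, obs_sd):
    """Conjugate normal-normal Bayesian update.  The expert prior
    N(prior_mean, prior_sd^2) is combined with n observations whose
    sampling sd is obs_sd; the posterior mean weights each source by
    its precision (inverse variance)."""
    n = len(data)
    prior_prec = 1.0 / prior_sd**2
    data_prec = n / obs_sd**2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * np.mean(data))
    return post_mean, np.sqrt(post_var)

# Hypothetical case: expert believes the mean loss is 10 (sd 2);
# only five loss observations are available.
data = [12.0, 9.5, 11.0, 13.0, 10.5]
post_mean, post_sd = combine_expert_and_data(10.0, 2.0, data, obs_sd=3.0)
```

With only five observations the posterior mean lands between the expert's 10 and the sample mean of 11.2; as more data arrive, the data precision grows and the expert opinion is gradually weighted out.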
A further challenge is to develop a methodology for optimization and inference about risk in dynamic settings. This is necessary in order to analyse the evolution of, for example, financial data over time. The concept of a dynamic risk measure is of crucial importance here; a key property of such measures is time consistency. Using dynamic risk measures, dynamic optimization problems for Markov models can be formulated and solved.
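A toy sketch of a time-consistent dynamic risk measure: compose the same coherent one-step risk mapping backward through a scenario tree. Time consistency then holds by construction, because the value at each node depends on the future only through the risk values of its subtrees. The one-step mapping and the tree below are illustrative assumptions, not a construction from any particular talk.

```python
def one_step_risk(a, b, lam=0.5):
    """One-step risk mapping on two equally likely outcomes: a convex
    combination of the expectation and the worst case.  lam=0 recovers
    the plain expectation; lam=1 is fully worst-case."""
    return (1 - lam) * 0.5 * (a + b) + lam * max(a, b)

def dynamic_risk(node, lam=0.5):
    """Evaluate the dynamic risk measure on a binary scenario tree by
    backward recursion, composing the same one-step mapping at every
    node.  A leaf is a terminal loss; an internal node is a pair of
    subtrees."""
    if isinstance(node, (int, float)):
        return node
    left, right = node
    return one_step_risk(dynamic_risk(left, lam),
                         dynamic_risk(right, lam), lam)

# Two-period tree of terminal losses (hypothetical numbers).
tree = [[1.0, 3.0], [2.0, 6.0]]
rho = dynamic_risk(tree, lam=0.5)
```

Setting lam=0 collapses the recursion to the plain expected loss of 3.0, while any lam > 0 yields a larger, risk-averse value; this nesting of one-step mappings is the same mechanism used in risk-averse dynamic programming for Markov models.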
A fairly general mathematical framework has been developed recently that aims to furnish effective credit value adjustment (CVA) computations for a contract with bilateral counterparty risk; it will be presented at the workshop.
Recently, in the area of credit risk, there has been significant interest and activity in the study of multi-asset derivatives. Extensions of classical models lead to Wishart dynamics for the stochastic volatility matrix. Some recent results on modelling the stochastic volatility matrix will also be presented at the workshop.
The workshop aimed to introduce and discuss aspects of risk modelling, optimization, and inference about risk. It was suitable for postgraduate (and advanced undergraduate) students, as well as for researchers and practitioners interested in learning more about the mathematical, statistical and operations research aspects of modern risk theory.
Presentations were given by leading experts in the above areas, many of whom have published recent monographs, or hold current contracts to publish monographs, on risk, risk-averse optimization, Bayesian and Monte Carlo methods for inference about risk, and related topics. Bringing them together at this workshop was of great benefit to all participants.
For detailed workshop abstracts, please click here.
There was no registration fee for this workshop. However, for catering purposes, it was necessary to register interest in participating by contacting the organisers, Spiro Penev (S.Penev@unsw.edu.au) or Pavel Shevchenko (email@example.com).