Greetings,
I am working on an application in finance involving Markov transition
matrices. For this application I have collected a "time series"
of transition matrices over the past twenty years. Specifically, for each
calendar year, I have a single 8 x 8 transition matrix corresponding to the
behavior of a set of financial securities in that year.
My objective is to develop a simple Monte Carlo procedure for simulating
these transition matrices in a way that reflects certain key characteristics
of the time series of matrices. In particular, the simulation should
generate variates for each cell that reflect both the marginal distribution
of each individual transition and the correlation between
individual transitions. One approach I've explored is modeling the marginal
distribution of each cell as a gamma or beta distribution, the parameters
of which I estimate with maximum likelihood techniques. What I am looking
for is a way to introduce correlation between the cells that reflects the
correlation seen historically. The background for this is as follows. The
transition matrices reflect the credit ratings migration for a portfolio of
bonds. That is, each cell reflects the probability of a bond moving from
one rating category to another, e.g., AAA to AA, etc. If one examines the
historical time series for any specific cell, there is evidence that
credit transitions reflect movements in the underlying business cycle.
During economic downturns the probability of moving to lower rating
categories increases, while the opposite is true during economic
expansions. This implies correlation between sets of probabilities. As
such, I think it is reasonable to try to reflect this behavior. The
simplest way to accomplish this, I believe, is by simulating a set of
correlated random variates.
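To make the idea concrete, here is a minimal sketch of one standard way to do this: a Gaussian copula, which draws correlated normals and maps them through each cell's fitted marginal. The beta parameters and the target correlation below are purely illustrative placeholders, not estimates from my data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical beta marginals for two transition-probability cells
# (illustrative parameters, not fitted values).
beta_params = [(2.0, 50.0), (3.0, 40.0)]

# Illustrative target correlation between the two cells.
corr = np.array([[1.0, 0.7],
                 [0.7, 1.0]])

# Gaussian copula: draw correlated standard normals, map them to
# uniforms via the normal CDF, then to each cell's beta marginal
# via the inverse CDF (ppf).
z = rng.multivariate_normal(mean=np.zeros(2), cov=corr, size=10_000)
u = stats.norm.cdf(z)
cells = np.column_stack([
    stats.beta.ppf(u[:, i], a, b) for i, (a, b) in enumerate(beta_params)
])

# The simulated cells keep their beta marginals while being
# positively correlated across draws.
print(np.corrcoef(cells, rowvar=False)[0, 1])
```

This preserves the marginals exactly and controls the dependence separately, which seems close to what I want, though I don't know whether it is the best-practice choice here.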
Can this be accomplished using a Dirichlet distribution?
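My understanding of what a Dirichlet draw would give, sketched below with made-up concentration parameters for a single 8-state row (in practice they would have to be fitted to the historical row):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical concentration parameters for one 8-state row of the
# transition matrix (illustrative only, not fitted to data).
alpha = np.array([1.0, 40.0, 5.0, 1.0, 0.5, 0.3, 0.2, 0.1])

rows = rng.dirichlet(alpha, size=5_000)

# Each simulated row is a valid probability vector (sums to one)...
print(rows.sum(axis=1)[:3])

# ...but the sum-to-one constraint forces the covariance between any
# two cells of the same row to be negative, so the Dirichlet alone
# cannot reproduce an arbitrary (e.g. positive) historical
# correlation between cells.
print(np.corrcoef(rows[:, 0], rows[:, 1])[0, 1])
```

So my question is really whether the Dirichlet's built-in dependence structure is flexible enough for this purpose, or whether it needs to be combined with something else.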
Thanks,
Mark