
The fHMM package is an implementation of the hidden Markov model with a focus on applications to financial time series data. This vignette introduces the model and its hierarchical extension. It closely follows Oelschläger and Adam (2021).

The Hidden Markov Model

Hidden Markov models (HMMs) are a modeling framework for time series data in which a sequence of observations is assumed to depend on a latent state process. The peculiarity is that, unlike the observation process, the state process cannot be observed directly. However, the latent states carry information about the environment to which the model is applied.

The connection between the hidden state process and the observed state-dependent process arises as follows: Let $N$ be the number of possible states. We assume that for each point in time $t = 1, \ldots, T$, an underlying process $(S_t)_{t = 1, \ldots, T}$ selects one of those $N$ states. Then, depending on the active state $S_t \in \{1, \ldots, N\}$, the observation $X_t$ from the state-dependent process $(X_t)_{t = 1, \ldots, T}$ is generated by one of the $N$ distributions $f^{(1)}, \ldots, f^{(N)}$.

Furthermore, we assume $(S_t)_t$ to be Markovian, i.e. the current state depends only on the previous state. Hence, we can characterize the process by its initial distribution $\delta$ and its transition probability matrix (t.p.m.) $\Gamma$. Moreover, by construction, we require the process $(X_t)_{t = 1, \ldots, T}$ to satisfy the conditional independence assumption, i.e. the current observation $X_t$ depends on the current state $S_t$, but not on previous observations or states. The following graphic visualizes the dependence structure:

Dependence structure of the HMM.
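This generative mechanism can be illustrated with a short simulation. The following is a minimal sketch in base R with hypothetical parameter values (not the fHMM API): a Markov chain is simulated from a t.p.m. $\Gamma$, and each observation is then drawn from the corresponding state-dependent normal distribution.

```r
# Minimal sketch (base R, not the fHMM API): simulate a 2-state Gaussian HMM
set.seed(1)
N <- 2                                    # number of states
T <- 500                                  # number of time points
Gamma <- matrix(c(0.95, 0.05,
                  0.10, 0.90), nrow = N, byrow = TRUE)  # t.p.m. (hypothetical)
mu <- c(0.001, -0.002)                    # state-dependent means (hypothetical)
sigma <- c(0.01, 0.03)                    # state-dependent sds (hypothetical)

S <- numeric(T)
S[1] <- sample(1:N, 1)                    # initial state (here: uniform)
for (t in 2:T) {
  S[t] <- sample(1:N, 1, prob = Gamma[S[t - 1], ])  # Markov transition
}
X <- rnorm(T, mean = mu[S], sd = sigma[S])  # X_t depends only on S_t
```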

In the context of financial data, the different states can serve as proxies for the current market situation, e.g. calm or nervous. Even though these moods cannot be observed directly, price changes or trading volumes, which clearly depend on the current mood of the market, can be observed. Using an underlying Markov process, we can thereby detect which mood is active at any point in time and how the different moods alternate. Depending on the current mood, a price change is generated by a different distribution. These distributions characterize the moods in terms of expected return and volatility.

Following Zucchini, MacDonald, and Langrock (2016), we assume that the initial distribution $\delta$ equals the stationary distribution $\pi$, which satisfies $\pi = \pi \Gamma$, i.e. the stationary and hence the initial distribution is determined by $\Gamma$. This is reasonable from a practical point of view: On the one hand, the hidden state process has been evolving for some time before we start to observe it and hence can be assumed to be stationary. On the other hand, setting $\delta = \pi$ reduces the number of parameters that need to be estimated, which is convenient from a computational perspective.
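A standard way to compute $\pi$ from $\Gamma$ (see, e.g., Zucchini, MacDonald, and Langrock 2016) is to solve the linear system $\pi (I_N - \Gamma + U) = \mathbf{1}'$, where $U$ denotes the $N \times N$ matrix of ones, which combines $\pi = \pi \Gamma$ with the normalization $\sum_i \pi_i = 1$. A minimal sketch in base R (illustrative, not the fHMM API):

```r
# Stationary distribution pi solving pi = pi %*% Gamma with sum(pi) == 1
stationary <- function(Gamma) {
  N <- nrow(Gamma)
  A <- t(diag(N) - Gamma + 1)  # transpose of (I_N - Gamma + U), U = matrix of ones
  solve(A, rep(1, N))          # solves pi %*% (I_N - Gamma + U) = 1'
}

Gamma <- matrix(c(0.95, 0.05,
                  0.10, 0.90), nrow = 2, byrow = TRUE)
stationary(Gamma)  # 2/3 1/3: state 1 is occupied two thirds of the time
```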

Adding a Hierarchical Structure

The hierarchical hidden Markov model (HHMM) is a flexible extension of the HMM that can jointly model data observed on two different time scales. The two time series, one on a coarser and one on a finer scale, differ in the number of observations, e.g. monthly observations on the coarser scale and daily or weekly observations on the finer scale.

Following the concept of HMMs, we can model both state-dependent time series jointly. First, we treat the time series on the coarser scale as stemming from an ordinary HMM, which we refer to as the coarse-scale HMM: At each time point $t$ of the coarse-scale time space $\{1, \dots, T\}$, an underlying process $(S_t)_t$ selects one state from the coarse-scale state space $\{1, \dots, N\}$. We call $(S_t)_t$ the hidden coarse-scale state process. Depending on which state is active at $t$, one of the $N$ distributions $f^{(1)}, \dots, f^{(N)}$ realizes the observation $X_t$. The process $(X_t)_t$ is called the observed coarse-scale state-dependent process. The processes $(S_t)_t$ and $(X_t)_t$ have the same properties as before, namely $(S_t)_t$ is a first-order Markov process and $(X_t)_t$ satisfies the conditional independence assumption.

Subsequently, we segment the observations of the fine-scale time series into $T$ distinct chunks, each of which contains all data points that correspond to the $t$-th coarse-scale time point. Assuming that we have $T^*$ fine-scale observations at every coarse-scale time point, we obtain $T$ chunks of $T^*$ fine-scale observations each.
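In code, this segmentation amounts to reshaping the fine-scale series into a $T \times T^*$ matrix whose $t$-th row holds the $t$-th chunk. A minimal sketch in base R (illustrative; the variable names are hypothetical):

```r
# Segment a fine-scale series of length T * T_star into T chunks of length T_star
T <- 100; T_star <- 20
x_fine <- rnorm(T * T_star)  # hypothetical fine-scale observations
chunks <- matrix(x_fine, nrow = T, ncol = T_star, byrow = TRUE)
# chunks[t, ] contains the fine-scale observations of coarse-scale time point t
```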

The hierarchical structure now evinces itself as we model each of the chunks by one of $N$ possible fine-scale HMMs. Each of the fine-scale HMMs has its own t.p.m. $\Gamma^{*(i)}$, initial distribution $\delta^{*(i)}$, stationary distribution $\pi^{*(i)}$, and state-dependent distributions $f^{*(i,1)}, \dots, f^{*(i,N^*)}$. Which fine-scale HMM is selected to explain the $t$-th chunk of fine-scale observations depends on the hidden coarse-scale state $S_t$. The $i$-th fine-scale HMM explaining the $t$-th chunk of fine-scale observations consists of the following two stochastic processes: At each time point $t^*$ of the fine-scale time space $\{1, \dots, T^*\}$, the process $(S^*_{t,t^*})_{t^*}$ selects one state from the fine-scale state space $\{1, \dots, N^*\}$. We call $(S^*_{t,t^*})_{t^*}$ the hidden fine-scale state process. Depending on which state is active at $t^*$, one of the $N^*$ distributions $f^{*(i,1)}, \dots, f^{*(i,N^*)}$ realizes the observation $X^*_{t,t^*}$. The process $(X^*_{t,t^*})_{t^*}$ is called the observed fine-scale state-dependent process.

The fine-scale processes $(S^*_{1,t^*})_{t^*}, \dots, (S^*_{T,t^*})_{t^*}$ and $(X^*_{1,t^*})_{t^*}, \dots, (X^*_{T,t^*})_{t^*}$ satisfy the Markov property and the conditional independence assumption, respectively, as well. Furthermore, it is assumed that the fine-scale HMM explaining $(X^*_{t,t^*})_{t^*}$ only depends on $S_t$. This hierarchical structure is visualized in the following:

Dependence structure of the HHMM.
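To make the two-level generative mechanism concrete, the following minimal sketch in base R simulates an HHMM with hypothetical parameters (again, an illustration rather than the fHMM API): the coarse-scale chain selects, for each coarse-scale time point, which fine-scale HMM generates the corresponding chunk.

```r
# Minimal sketch (base R, not the fHMM API): simulate an HHMM
set.seed(1)
T <- 100; T_star <- 20                    # coarse- and fine-scale lengths
N <- 2; N_star <- 2                       # coarse- and fine-scale state spaces
Gamma <- matrix(c(0.9, 0.1,
                  0.2, 0.8), nrow = N, byrow = TRUE)          # coarse-scale t.p.m.
Gamma_star <- list(matrix(c(0.8, 0.2, 0.3, 0.7), N_star, byrow = TRUE),
                   matrix(c(0.6, 0.4, 0.4, 0.6), N_star, byrow = TRUE))
mu_star <- list(c(0.002, -0.001), c(0.005, -0.010))  # fine-scale means (hypothetical)
sd_star <- list(c(0.01, 0.02), c(0.02, 0.05))        # fine-scale sds (hypothetical)

# Simulate a Markov chain of given length from a t.p.m.
sim_markov <- function(len, tpm) {
  s <- numeric(len)
  s[1] <- sample(nrow(tpm), 1)
  for (t in 2:len) s[t] <- sample(nrow(tpm), 1, prob = tpm[s[t - 1], ])
  s
}

S <- sim_markov(T, Gamma)                 # hidden coarse-scale state process
X_star <- matrix(NA, T, T_star)           # fine-scale observations, one chunk per row
for (t in 1:T) {
  i <- S[t]                               # coarse state selects fine-scale HMM i
  S_star <- sim_markov(T_star, Gamma_star[[i]])                    # hidden fine states
  X_star[t, ] <- rnorm(T_star, mu_star[[i]][S_star], sd_star[[i]][S_star])
}
```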

References

Norris, J. R. 1997. “Markov Chains.” Cambridge University Press.
Oelschläger, L., and T. Adam. 2021. “Detecting Bearish and Bullish Markets in Financial Time Series Using Hierarchical Hidden Markov Models.” Statistical Modelling. https://doi.org/10.1177/1471082X211034048.
Zucchini, W., I. L. MacDonald, and R. Langrock. 2016. “Hidden Markov Models for Time Series: An Introduction Using R, 2nd Edition.” Chapman and Hall/CRC.