A study on association in time of a finite semi-Markov process

We derive three equivalent sufficient conditions for association in time of a finite-state semi-Markov process in terms of transition probabilities and crude hazard rates. This result generalizes the earlier results of Esary and Proschan (1970) for a binary Markov process and of Hjort, Natvig and Funnemark (1985) for a multistate Markov process.


1. Introduction
Esary and Proschan (1970) (EP) provided a sufficient condition for association in time of a binary Markov process and used it to derive a lower bound on the reliability of a binary repairable coherent system. It was not until Hjort, Natvig and Funnemark (1985) that the EP condition was simplified in terms of the transition intensities of a finite Markov process. It is well known that the class of semi-Markov processes (SMPs) contains the class of Markov processes, and an SMP with finite state space is widely used to model real-life situations in reliability, risk theory, queueing and inventory.
The purpose of this paper is to derive a sufficient condition for association in time of a finite-state SMP. The results require a representation of the conditional intensities of the SMP. Throughout the paper we use "increasing" in place of "nondecreasing" and "decreasing" in place of "nonincreasing". The layout of the paper is as follows. In the next section we briefly summarize the required results on product integrals and SMPs, and show that the transition probabilities of an SMP can be represented by product integrals. Our main results are in Section 3: a sufficient condition for association of a finite-state SMP is derived in the main theorem, and equivalent versions of it are also given; to verify these conditions one needs to know only the kernel of the SMP under study. Lastly, in Section 4 it is shown that the sufficient conditions for association in time obtained by Esary and Proschan (1970) for a binary Markov process and by Hjort et al. (1985) for a finite Markov process are special cases of our condition.

2. Product Integrals and Semi-Markov Processes
Definition 2.2 An interval function α(s, t), 0 ≤ s ≤ t < ∞, with values in the k × k matrices is additive if
α(s, u) + α(u, t) = α(s, t) for all s ≤ u ≤ t,
α(s, s) = 0 for all s, where 0 is the k × k null matrix, and
α(s, t) → 0 as t ↓ s for all s.
Definition 2.3 An interval function β(s, t), 0 ≤ s ≤ t < ∞, with values in the k × k matrices is multiplicative if
β(s, u)β(u, t) = β(s, t) for all s ≤ u ≤ t,
β(s, s) = I for all s, where I is the k × k identity matrix, and
β(s, t) → I as t ↓ s for all s.
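The scalar (k = 1) case gives a quick sanity check on these definitions. The following numerical sketch is our own illustration, not part of the paper: it takes α(s, t) = ∫_s^t 2u du = t² − s², which is additive, and β(s, t) = exp(α(s, t)), which is multiplicative.

```python
import math

# Hypothetical scalar (k = 1) illustration, not from the paper:
# alpha(s, t) = integral of a(u) = 2u over [s, t], an additive interval
# function; beta(s, t) = exp(alpha(s, t)), a multiplicative one.

def alpha(s, t):
    # Closed form of the integral of 2u on [s, t].
    return t**2 - s**2

def beta(s, t):
    return math.exp(alpha(s, t))

s, u, t = 0.5, 1.0, 2.0
# Additivity: alpha(s, u) + alpha(u, t) = alpha(s, t).
assert abs(alpha(s, u) + alpha(u, t) - alpha(s, t)) < 1e-12
# Multiplicativity: beta(s, u) * beta(u, t) = beta(s, t).
assert abs(beta(s, u) * beta(u, t) - beta(s, t)) < 1e-9
# Boundary conditions: alpha(s, s) = 0 and beta(s, s) = 1 (the 1x1 identity).
assert alpha(s, s) == 0.0 and beta(s, s) == 1.0
```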
Theorem 2.4 Let α(s, t) be any additive interval function and let β(s, t) be any multiplicative interval function which is right-continuous with left-hand limits in both variables and of bounded variation. Then β satisfies the forward integral equation
β(s, t) = I + ∫_(s,t] β(s, u−) α(du)
if and only if β is the product integral of α,
β(s, t) = ∏_{u ∈ (s,t]} (I + α(du)).
Semi-Markov process. We define the SMP through a Markov renewal process as in Cinlar (1975). Let (X, T) be a Markov renewal process having kernel
Q_ij(t) = P[X_{n+1} = j, T_{n+1} − T_n ≤ t | X_n = i] = P_ij H_ij(t), i, j ∈ E, t ≥ 0,
where P_ij = lim_{t→∞} Q_ij(t), and H_ij(·) is the cumulative distribution function of the random variable (r.v.) Z_ij, the holding time in state i given that the next transition is to state j. Define
H_i(t) = Σ_{j∈E} Q_ij(t) = P[Z_i ≤ t],
where Z_i is the waiting time in state i. We assume that none of the Z_i's is degenerate at zero.
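A Markov renewal pair (X, T) with kernel Q_ij(t) = P_ij H_ij(t) can be simulated directly from this definition. The sketch below is illustrative only: the embedded transition matrix P, the rates, and the choice of exponential distributions for H_ij are our assumptions, not the paper's.

```python
import random

# Illustrative simulation of a 3-state Markov renewal process / SMP.
# Kernel factored as Q_ij(t) = P_ij * H_ij(t): P is the embedded jump
# chain's transition matrix; the holding time Z_ij given "next state j"
# is drawn here as exponential with rate rate[i][j] (our assumption).
P = [[0.0, 0.7, 0.3],
     [0.5, 0.0, 0.5],
     [0.4, 0.6, 0.0]]
rate = [[1.0, 2.0, 0.5],
        [1.5, 1.0, 2.5],
        [0.8, 1.2, 1.0]]

def simulate_smp(start, horizon, rng):
    """Return the jump chain (X_n) and jump times (T_n) up to `horizon`."""
    X, T = [start], [0.0]
    while T[-1] < horizon:
        i = X[-1]
        j = rng.choices(range(3), weights=P[i])[0]  # next state ~ row P_i.
        z = rng.expovariate(rate[i][j])             # holding time Z_ij ~ H_ij
        X.append(j)
        T.append(T[-1] + z)
    return X, T

rng = random.Random(0)
X, T = simulate_smp(0, 50.0, rng)
assert all(T[k] < T[k + 1] for k in range(len(T) - 1))   # jump times increase
assert all(X[k] != X[k + 1] for k in range(len(X) - 1))  # here P_ii = 0
```

The state of the SMP at time t is Y(t) = X_n for the unique n with T_n ≤ t < T_{n+1}.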
Define, for i, j ∈ E and 0 ≤ u ≤ s < t,
P_ij(u, (s, t)) = P[Y(t) = j | Y entered state i at time u and Z_i > s − u].
In words, P_ij(u, (s, t)) is the probability that the SMP {Y(t), t ≥ 0} is in state j at time t given that it entered state i at time u and the waiting time in state i is larger than s − u. An alternative expression for P_ij(u, (s, t)) is
P_ij(u, (s, t)) = P[Y(t) = j, Z_i > s − u | Y entered state i at time u] / P[Z_i > s − u].
Assuming that Q_ij(·) has density q_ij(·), define, for j ≠ i, the crude hazard rate
µ_ij(u, s) = q_ij(s − u) / (1 − H_i(s − u)),
and define µ_ii(u, s) = − Σ_{j≠i} µ_ij(u, s). In matrix form, let v(u, s) = ((µ_ij(u, s)))_{i,j∈E}.
Then v(u, s) is the transition intensity matrix, for each u < s, corresponding to the SMP {Y(t), t ≥ 0}.
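For a concrete kernel the intensity matrix v(u, s) can be computed explicitly. The sketch below is a hypothetical two-state example with exponential holding times, not taken from the paper; it assumes the crude-hazard form µ_ij(u, s) = q_ij(s − u)/(1 − H_i(s − u)) for j ≠ i and checks that each row of v(u, s) sums to zero.

```python
import math

# Hypothetical 2-state kernel (our choice): Q_ij(t) = P_ij*(1 - exp(-r_ij*t)),
# so q_ij(t) = P_ij * r_ij * exp(-r_ij * t) and H_i(t) = sum_j Q_ij(t).
# Crude hazard rates (the form assumed for this illustration):
#   mu_ij(u, s) = q_ij(s - u) / (1 - H_i(s - u)),  j != i,
#   mu_ii(u, s) = -sum_{j != i} mu_ij(u, s).
P = [[0.0, 1.0], [1.0, 0.0]]
r = [[1.0, 2.0], [3.0, 1.0]]

def q(i, j, t):
    return P[i][j] * r[i][j] * math.exp(-r[i][j] * t)

def H(i, t):
    return sum(P[i][j] * (1.0 - math.exp(-r[i][j] * t)) for j in range(2))

def mu(i, j, u, s):
    t = s - u
    if i != j:
        return q(i, j, t) / (1.0 - H(i, t))
    return -sum(mu(i, k, u, s) for k in range(2) if k != i)

v = [[mu(i, j, 0.0, 0.5) for j in range(2)] for i in range(2)]
# Each row of the intensity matrix v(u, s) sums to zero.
assert all(abs(sum(row)) < 1e-12 for row in v)
assert v[0][1] > 0 and v[0][0] < 0
```

Here state 0 always jumps to state 1 at rate 2, so µ_01(0, 0.5) recovers the constant exponential hazard 2.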
Lemma 2.5 For 0 ≤ u ≤ s ≤ w ≤ t, the matrices P(u, (s, t)) = ((P_ij(u, (s, t)))) satisfy:
(i) P(u, (s, s)) = I;
(ii) P(u, (s, t)) = P(u, (s, w))P(w, (w, t));
(iii) the forward equation
∂/∂t P_ij(u, (s, t)) = P_ij(u, (s, t)) v_jj(w, t) + Σ_{v≠j} P_iv(u, (s, t)) µ_vj(w, t),
or, in matrix notation, ∂/∂t P(u, (s, t)) = P(u, (s, t)) v(w, t).
Proof: Statements (i) and (ii) follow from the definitions. To prove (iii), consider the increment P_ij(u, (s, t + h)) − P_ij(u, (s, t)) for h > 0, divide by h and let h ↓ 0; collecting the terms for a transition into j from a state v ≠ j and for remaining in j gives the stated equation.
To summarize the consequence of Lemma 2.5: P(u, (s, t)) is a multiplicative matrix-valued interval function with initial condition (i) which satisfies the forward equation (iii). Hence, using Theorem 2.4, the solution of (iii) is given by the product integral of a matrix-valued additive interval function.
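In the time-homogeneous Markov special case the intensity matrix is constant and the forward equation reduces to ∂/∂t P = Pv, whose product-integral solution is lim_{n→∞} (I + vh)^n with h = (t − s)/n. The following numerical sketch, with an arbitrary 2 × 2 intensity matrix of our choosing, checks stochasticity and the multiplicative property (ii) in this special case.

```python
# Euler product scheme for dP/dt = P v with a constant intensity matrix v
# (time-homogeneous Markov special case). Numbers are illustrative only.
v = [[-1.0, 1.0], [2.0, -2.0]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def euler_product(v, t, n):
    """Approximate P(0, t) by the n-step product (I + v h)^n, h = t/n."""
    h = t / n
    step = [[(1.0 if i == j else 0.0) + v[i][j] * h for j in range(2)]
            for i in range(2)]
    P = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(n):
        P = mat_mul(P, step)
    return P

P1 = euler_product(v, 1.0, 2000)
# Stochasticity: each row of the approximate transition matrix sums to 1.
assert all(abs(sum(row) - 1.0) < 1e-6 for row in P1)
# Multiplicative property: P(0, 1) = P(0, 0.5) P(0.5, 1) (same matrix here
# by time homogeneity).
half = euler_product(v, 0.5, 1000)
prod = mat_mul(half, half)
assert all(abs(P1[i][j] - prod[i][j]) < 1e-3
           for i in range(2) for j in range(2))
```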
Since the intensities µ_ij(u, y) are continuous in y on [s, t], uniformly in u < s, they are uniformly continuous on the compact set [s, t]. Therefore,
P(u, (s, t)) = lim_{n→∞} ∏_{k=1}^{n} (I + v(u, y_k)h), where h = (t − s)/n and y_k = s + kh.
Hence it follows that, for small enough non-negative h, I + v(u, y)h is a stochastic matrix satisfying (i). Since the intensities are uniformly continuous on [s, t], they are bounded, and we may choose h independent of u and y in [s, t]. Also, monotone matrices are closed under multiplication; hence M_n = ∏_{k=1}^{n} (I + v(u, y_k)h) is a monotone matrix for all large enough n ∈ N. In turn, it follows that P(u, (s, t)) = lim_{n→∞} M_n satisfies (i), since pointwise limits of monotone matrices are monotone.
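The closure property invoked above can also be checked numerically. In the sketch below, which is purely illustrative, "monotone" is taken in the usual sense of stochastic monotonicity: the row tail sums Σ_{j≥m} P_ij increase in i. The paper's argument is analytic; this only exercises the closure-under-multiplication step on random examples.

```python
import random

def is_monotone(P, tol=1e-12):
    """Stochastically monotone: tail sums over columns >= m increase in i."""
    n = len(P)
    for m in range(n):
        tails = [sum(P[i][j] for j in range(m, n)) for i in range(n)]
        if any(tails[i] > tails[i + 1] + tol for i in range(n - 1)):
            return False
    return True

def random_monotone(n, rng):
    """Crude generator for a sketch: draw a random stochastic matrix, sort
    rows by mean, and retry until the result is monotone."""
    while True:
        P = []
        for _ in range(n):
            w = [rng.random() for _ in range(n)]
            s = sum(w)
            P.append([x / s for x in w])
        P.sort(key=lambda row: sum(j * row[j] for j in range(n)))
        if is_monotone(P):
            return P

rng = random.Random(1)
for _ in range(20):
    A, B = random_monotone(3, rng), random_monotone(3, rng)
    AB = [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
          for i in range(3)]
    assert is_monotone(AB)  # the product of monotone matrices is monotone
```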