Introduction to Stochastic Processes (Dover Books on Mathematics)
1%
Our approach capitalizes on this happy harmony between the methods of the better mathematician and the intuition of the honest engineer.
6%
It is important to remember always that an event is a collection of outcomes, while a random variable is a function. A random variable assigns a value to each outcome; a probability measure assigns a value to each event. One talks of the probability of an event, never of the probability of an outcome.
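The distinction in this highlight can be made concrete with a tiny sketch (my own toy example, not from the book): a two-coin-toss sample space in which an event is a set of outcomes, a random variable is a function on outcomes, and the probability measure assigns a value to each event.

```python
from fractions import Fraction

outcomes = ["HH", "HT", "TH", "TT"]           # the sample space
prob = {w: Fraction(1, 4) for w in outcomes}  # probability mass on outcomes

def num_heads(w):                             # a random variable: outcome -> value
    return w.count("H")

# An event is a collection of outcomes:
event_at_least_one_head = {w for w in outcomes if num_heads(w) >= 1}

def P(event):                                 # the measure assigns a value to each event
    return sum(prob[w] for w in event)

print(P(event_at_least_one_head))             # Fraction(3, 4)
```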
11%
Especially in view of the monstrous looks of the last section, it seems all too advisable to inquire how the reader’s patience is holding out and to assure him that he will in time come to appreciate the true friendliness of these concepts.
23%
You are right in feeling annoyed; so is the company in their claim. If it is any comfort, look at it this way: it takes a larger than average interval to contain an instant distinguished enough to be your arrival time.
28%
The theory of Markovian processes comprises the largest and the most important chapter in the theory of stochastic processes; this importance is further enhanced by the many applications it has found in the physical, biological, and social sciences as well as in engineering and commerce.
29%
Equation (1.8) is called the Chapman-Kolmogorov equation; it states that starting at state i, in order for the process X to be in state j after m + n steps, it must be in some intermediate state k after the mth step and then move from that state k into state j during the remaining n steps.
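In matrix form the Chapman-Kolmogorov equation is just P^(m+n) = P^m P^n, with the sum over intermediate states k performed by the matrix product. A minimal numerical check, using a small transition matrix of my own (not from the book):

```python
import numpy as np

# A toy 3-state transition matrix (rows sum to one).
P = np.array([[0.5,  0.5,  0.0],
              [0.25, 0.5,  0.25],
              [0.0,  0.5,  0.5]])

m, n = 3, 4
lhs = np.linalg.matrix_power(P, m + n)
# Chapman-Kolmogorov: P^{m+n}(i, j) = sum_k P^m(i, k) * P^n(k, j),
# i.e. the (i, j) entry of the product P^m @ P^n.
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n)
assert np.allclose(lhs, rhs)
```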
29%
In some applications, the powers Pᵐ of the transition matrix P are desired for many values of m. Then it may be worth considering the various matrix-theoretic methods available for such computations. Some such results may be found in the appendix at the end of this book.
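One such matrix-theoretic shortcut (a sketch of the standard technique, not necessarily the one in the book's appendix) is diagonalization: if P = V diag(w) V⁻¹, then Pᵐ = V diag(wᵐ) V⁻¹, so all powers come from a single eigendecomposition.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Diagonalize once: P = V diag(w) V^{-1}; eigenvalues here are 1 and 0.7.
w, V = np.linalg.eig(P)
V_inv = np.linalg.inv(V)

m = 50
# P^m = V diag(w**m) V^{-1}; (V * w**m) scales column k of V by w[k]**m.
Pm = (V * w**m) @ V_inv
assert np.allclose(Pm, np.linalg.matrix_power(P, m))
```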
29%
Note that in this example, not only each row, but also each column sums to one. Such matrices are called doubly stochastic.
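A useful consequence of a matrix whose rows and columns each sum to one (doubly stochastic) is that the uniform distribution is invariant. A quick check, with a toy matrix of my own rather than the book's example:

```python
import numpy as np

# A doubly stochastic matrix: rows AND columns each sum to one.
P = np.array([[0.2, 0.3, 0.5],
              [0.5, 0.2, 0.3],
              [0.3, 0.5, 0.2]])
assert np.allclose(P.sum(axis=0), 1) and np.allclose(P.sum(axis=1), 1)

# The uniform distribution is invariant: pi P = pi.
pi = np.full(3, 1 / 3)
assert np.allclose(pi @ P, pi)
```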
34%
This chain is called a random walk, and is used to describe the walk of a sufficiently intoxicated person: if he is at position i after step n, his next step leads him to either i + 1 or i − 1 with respective probabilities p and q except that at i = 0 there is a barrier and when he hits it he is sure to step back to 1.
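The walk described above is easy to simulate; here is a short sketch (parameter values and function name are my own choices):

```python
import random

def reflected_random_walk(steps, p=0.6, start=5, seed=0):
    """Walk on {0, 1, 2, ...}: from i > 0 step to i+1 w.p. p or i-1 w.p. q = 1-p;
    at the barrier i = 0 the next step is to 1 with certainty."""
    rng = random.Random(seed)
    path, i = [start], start
    for _ in range(steps):
        if i == 0:
            i = 1                                   # sure step back from the barrier
        else:
            i = i + 1 if rng.random() < p else i - 1
        path.append(i)
    return path

path = reflected_random_walk(20)
assert all(x >= 0 for x in path)                    # the barrier keeps the walk nonnegative
```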
35%
(a) Show that X is a Markov chain with state space E = {0, 1, …, 8} and transition matrix
Roberto Rigolin F Lopes
Give this one a try ...
36%
Limiting Behavior and Applications of Markov Chains
36%
We will first consider the computation of the expected number R(i, j) of visits to j and the probability F(i, j) of ever reaching j, both starting at i. Considering the recurrent states we will show how to compute the probability Pn(i, j) of being in state j at time n, starting at i, for large n. Studying the periodic states is easily reduced to the aperiodic case, and we will show how to do that. Then we will take up the computation of the probability of remaining in a set of transient states forever. Finally, we will give a brief treatment of two important models in queueing theory and an …
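For an irreducible aperiodic chain, the limiting behavior mentioned here takes a simple numerical form: every row of Pⁿ converges to the same limiting distribution π, which is invariant under P. A small illustration with a toy matrix (mine, not the book's):

```python
import numpy as np

P = np.array([[0.1, 0.9],
              [0.6, 0.4]])

# Rows of P^n converge to the limiting distribution as n grows
# (the second eigenvalue is -0.5, so convergence is geometric).
Pn = np.linalg.matrix_power(P, 100)
pi = Pn[0]
assert np.allclose(Pn[1], pi)   # the limit does not depend on the start state
assert np.allclose(pi @ P, pi)  # and pi is invariant: pi P = pi
```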
48%
Potentials, Excessive Functions, and Optimal Stopping of Markov Chains
54%
optimal stopping problem
57%
The processes we will introduce in this chapter will remedy this by allowing us to take into account not only the changes of state, but also the actual times spent in between.
64%
The differential equations (4.10) and (4.11) are called, respectively, Kolmogorov’s backward and forward equations. The fact that (Pₜ) is defined by its derivative at t = 0 makes these results interesting. For this reason A is called the generator of the process Y.
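The relation between the generator and the transition function can be checked numerically: Pₜ = e^{tA}, and both the backward equation dPₜ/dt = A Pₜ and the forward equation dPₜ/dt = Pₜ A hold. A minimal sketch with a toy two-state generator of my own:

```python
import numpy as np

# A toy generator A: nonnegative off-diagonal rates, each row summing to zero.
A = np.array([[-2.0,  2.0],
              [ 1.0, -1.0]])

def transition_function(t, A=A):
    # P_t = e^{tA}, computed via eigendecomposition (A is diagonalizable here,
    # with real eigenvalues 0 and -3).
    w, V = np.linalg.eig(A)
    return (V * np.exp(t * w)) @ np.linalg.inv(V)

t = 0.7
Pt = transition_function(t)
assert np.allclose(Pt.sum(axis=1), 1.0)  # each P_t is a stochastic matrix

# Backward and forward equations, checked by a central difference in t.
h = 1e-6
dPt = (transition_function(t + h) - transition_function(t - h)) / (2 * h)
assert np.allclose(dPt, A @ Pt, atol=1e-4)  # backward: dP_t/dt = A P_t
assert np.allclose(dPt, Pt @ A, atol=1e-4)  # forward:  dP_t/dt = P_t A
```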
68%
If r ≥ 1, arrivals occur at least as fast as the server can handle them.
68%
Another application is to the kinetic theory of gases. Then the arrivals are gas molecules entering a fixed region, and service times are the times spent in that region by the molecules. If the temperature is low, molecules do not interact very much and the model fits well.
70%
If the reader enjoyed this chapter, then he might want to ask what lies further ahead and how he might prepare himself for a life of randomness. If so, I would advise him first to rebuild the foundations of his knowledge of probabilities by studying a book such as NEVEU [1]; then study the general theory of stochastic processes through a book such as DELLACHERIE [1]; then read the theory of Markov processes in a general setting, say through BLUMENTHAL and GETOOR [1]. And this is a most straightforward Christian piece of advice.
78%
Renewal theory is one of the main tools in the elementary theory of probability. A renewal process, by itself, does not have a rich enough structure to be of much interest. The main importance of renewal theory is, therefore, largely due to its applications to regenerative processes and to the elegant formalism of the renewal equation. In this light, the importance of the renewal function itself is due to its use as a potential operator and not to its connection with the expectations of the numbers of renewals.
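The renewal function m(t) = E[N(t)], the expected number of renewals in [0, t], is easy to estimate by simulation. A sketch under an assumption of my own (exponential lifetimes, so the process is Poisson and m(t) = t divided by the mean lifetime):

```python
import random

def renewal_count(t, mean_life=2.0, rng=random):
    """Number of renewals in [0, t] for i.i.d. exponential lifetimes (a toy choice)."""
    s, n = 0.0, 0
    while True:
        s += rng.expovariate(1.0 / mean_life)  # next lifetime; rate = 1/mean
        if s > t:
            return n
        n += 1

# Monte Carlo estimate of the renewal function m(t) = E[N(t)];
# for exponential lifetimes with mean 2, m(10) = 10 / 2 = 5.
rng = random.Random(1)
t = 10.0
est = sum(renewal_count(t, rng=rng) for _ in range(5000)) / 5000
assert abs(est - t / 2.0) < 0.3
```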
78%
Markov Renewal Theory
88%
In Examples (1.19) and (1.20) we showed the existence of Markov renewal processes embedded in the queue size processes of M/G/1 and G/M/1 queueing systems. In Chapter 6 we studied the underlying Markov chains. We now bring together these results to obtain the time-dependent behavior of the queue size processes.