For an inhomogeneous system
$$\frac{du}{dt} = Au + b, \quad \text{where } A \text{ is invertible and } b \text{ is constant},$$
the steady state (particular) solution is $u = -A^{-1}b$, so the general solution is
$$u(t) = c_1 e^{\beta_1 t}(\cos\omega_1 t + i\sin\omega_1 t)v_1 + c_2 e^{\beta_2 t}(\cos\omega_2 t + i\sin\omega_2 t)v_2 - A^{-1}b.$$
For the eigenvalues in Fig. 7.1, the trajectories are translated, and converge to $-A^{-1}b$.
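For a concrete check, the following sketch (the matrix $A$, vector $b$, initial condition, and step size are illustrative choices, not from the text) integrates $du/dt = Au + b$ with forward Euler and confirms that the trajectory spirals in to $-A^{-1}b$:

```python
# A minimal sketch: integrate du/dt = A u + b with forward Euler and
# check that the trajectory converges to the steady state -A^{-1} b.

# A has eigenvalues -0.5 +/- 2i (negative real part), so trajectories
# spiral in toward the fixed point.
A = [[-0.5, -2.0],
     [2.0, -0.5]]
b = [1.0, 0.5]

# Steady state: solve A u = -b for the 2x2 case by Cramer's rule.
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
u_star = [(-b[0] * A[1][1] + b[1] * A[0][1]) / det,
          (-b[1] * A[0][0] + b[0] * A[1][0]) / det]

u = [3.0, -2.0]                # arbitrary initial condition
dt = 0.001
for _ in range(20_000):        # integrate to t = 20
    du = [A[0][0] * u[0] + A[0][1] * u[1] + b[0],
          A[1][0] * u[0] + A[1][1] * u[1] + b[1]]
    u = [u[0] + dt * du[0], u[1] + dt * du[1]]

print(f"u(20) = ({u[0]:.4f}, {u[1]:.4f})")
print(f"-A^(-1) b = ({u_star[0]:.4f}, {u_star[1]:.4f})")
```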
7.3 DISCUSSION OF PAPERS
AVA and fluid approximations were used in some of the papers we have discussed so far, as well
as in another four that we introduce below.
StreamJoins [21]
In Table 1, the number of tuples $B$ is a random variable for a logical window, and the number of hash buckets $|B|$ is also not a constant. One can therefore view $B/|B|$ in Eq. (8) as an example of AVA.
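AVA here replaces $E[B/|B|]$ by $E[B]/E[|B|]$. Here is a minimal Monte Carlo sketch of how good such an approximation can be (the tuple and bucket distributions below are hypothetical, chosen only for illustration):

```python
import random

# AVA (Average Value Approximation) replaces a function of random
# variables by the same function of their averages:
#   E[B / |B|]  is approximated by  E[B] / E[|B|].

random.seed(1)
N = 100_000
ratios, Bs, buckets = [], [], []
for _ in range(N):
    B = random.randint(50, 150)        # tuples in a logical window (hypothetical)
    nb = random.randint(8, 16)         # non-constant number of hash buckets (hypothetical)
    Bs.append(B)
    buckets.append(nb)
    ratios.append(B / nb)

exact = sum(ratios) / N                          # E[B/|B|]
ava = (sum(Bs) / N) / (sum(buckets) / N)         # E[B]/E[|B|]
print(f"E[B/|B|] = {exact:.3f},  AVA = {ava:.3f}")
```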
GPRS [32]
The expression $\tilde{X} = \tilde{U}/\tilde{Q}$ in Eq. (10) is a similar example of AVA.
StorageAvailability [13]
The authors use the machine repairman model to derive Eq. (1), but the formula suggests that there is a combinatorial derivation. In fact, one can use the following AVA argument: The failure rate is $\lambda$ and the recovery rate is $\mu$, so the average up-time between consecutive failures is $1/\lambda$, and the average down-time for recovery is $1/\mu$. By AVA, the availability for an SE (i.e., the probability that it is up) is
$$\frac{1/\lambda}{1/\lambda + 1/\mu} = \frac{1}{1 + \lambda/\mu}.$$
It is clear that this solution holds even if $\lambda > \mu$ (so $\lambda/\mu > 1$). Since the SEs are assumed to be independent, the probability that at least $Q$ SEs are available is binomial, i.e.,
$$\sum_{j=Q}^{N} \binom{N}{j} \left(\frac{1}{1 + \lambda/\mu}\right)^{j} \left(\frac{\lambda/\mu}{1 + \lambda/\mu}\right)^{N-j},$$
which is equivalent to Eq. (1). This argument does not assume the distributions are exponential, and may be more readily accepted by engineers. However, it still assumes independence of failures.
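The distribution-free claim is easy to check numerically. The sketch below (the uniform and deterministic up/down distributions and the values of $\lambda$, $\mu$, $N$, $Q$ are illustrative assumptions) alternates non-exponential up and down periods, compares the measured availability with $1/(1+\lambda/\mu)$, and then evaluates the binomial sum:

```python
import random
from math import comb

random.seed(7)
lam, mu = 0.5, 2.0          # hypothetical failure and recovery rates

# Alternate up/down periods drawn from NON-exponential distributions
# with means 1/lam and 1/mu; availability should still be 1/(1 + lam/mu).
up_time = down_time = 0.0
for _ in range(200_000):
    up_time += random.uniform(0, 2 / lam)    # uniform, mean 1/lam
    down_time += 1 / mu                      # deterministic, mean 1/mu

measured = up_time / (up_time + down_time)
predicted = 1 / (1 + lam / mu)
print(f"measured availability {measured:.4f} vs AVA {predicted:.4f}")

# Probability that at least Q of N independent SEs are up, as in Eq. (1).
N, Q = 10, 8
a = predicted
p_at_least_Q = sum(comb(N, j) * a**j * (1 - a)**(N - j) for j in range(Q, N + 1))
print(f"P(at least {Q} of {N} SEs up) = {p_at_least_Q:.4f}")
```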
TransactionalMemory [18]
The use of $L - \ell_w E(Q)$, instead of $L - \ell_w i$, in the denominators for Eq. (3) is an example of AVA. Eq. (12) has $E(Q)$ on both sides, so the model results in a fixed point approximation that is solved iteratively.
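A fixed point of this kind is typically solved by repeated substitution. Here is a minimal sketch, with a hypothetical contraction $g$ standing in for Eq. (12), which this sketch does not reproduce:

```python
# Fixed point iteration for an equation of the form E(Q) = g(E(Q)).

def g(q: float) -> float:
    # Hypothetical stand-in: the conflict rate grows with queue length q,
    # which in turn feeds back into q.
    return 1.0 + 0.5 * q / (1.0 + 0.1 * q)

q = 0.0                          # initial guess for E(Q)
for i in range(100):
    q_next = g(q)
    if abs(q_next - q) < 1e-9:   # converged
        break
    q = q_next

print(f"fixed point E(Q) = {q:.6f} after {i} iterations")
```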
SensorNet [38]
One can derive Eq. (3) with the following AVA argument: Given a random walk on a torus, with all directions equilikely, a MULE is equally likely to be anywhere along its path. If the expected path length is $E[R_i]$, then the probability of being at grid point $i$ is $\pi_i = 1/E[R_i]$.
One can do a similar derivation of Corollary 2.1: By Eq. (3), the average inter-arrival time of a specific MULE to a sensor is $N$, so the arrival rate is $1/N$; since there are $N_{\text{mules}}$, the arrival rate of MULEs to that sensor is $N_{\text{mules}}/N$; the inter-arrival time of MULEs is therefore $N/N_{\text{mules}}$, which gives the buffer occupancy in Eq. (17).
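The relation $\pi_i = 1/E[R_i]$ is the standard mean-recurrence-time identity, and it can be checked by simulation; in the sketch below (the grid size and step count are arbitrary choices), a random walk on an $n \times n$ torus has mean return time to a cell close to $N = n^2$:

```python
import random

random.seed(3)
n = 8
N = n * n                      # grid points on the n x n torus

# Random walk with all four directions equally likely; measure the
# mean return time to the origin cell.  Since the stationary
# distribution is uniform (pi_i = 1/N), the mean return time is N.
x = y = 0
returns, steps_since, sum_return = 0, 0, 0
for _ in range(2_000_000):
    dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    x, y = (x + dx) % n, (y + dy) % n
    steps_since += 1
    if (x, y) == (0, 0):
        returns += 1
        sum_return += steps_since
        steps_since = 0

print(f"mean return time {sum_return / returns:.1f} vs N = {N}")
```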
TCP [33]
The system in this paper refers to a bulk transfer in a TCP Reno connection over multiple routers. The central issue is how packet loss and timeout affect the throughput via the protocol (window adjustment, retransmission, etc.); see Eq. (31). The model uses the technique of discretizing time into rounds (Fig. 2).
Instead of making various assumptions (that are hard to verify or justify, considering the many approximations), one could just view some of the derivations as an application of AVA. For example, instead of considering $\{W_i\}_i$ to be a Markov regenerative process, we could just use AVA to get $B = EY/EA$ in Eq. (1). Similarly, one could forgo the assumption of mutual independence between $\{X_i\}$ and $\{W_i\}$ and derive Eq. (12) from Eq. (10) by AVA.
Rather than assume round-trip times to be independent of congestion window size and
round number, one could use AVA to derive Eq. (6):
$$EA_i = E\left[\sum_{j=1}^{X_i+1} r_{ij}\right] \approx E\left[\sum_{j=1}^{EX_i+1} r_{ij}\right] = \sum_{j=1}^{EX+1} Er_{ij} = (EX+1)Er,$$
where the approximation treats $EX$ as an integer.
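To see that this AVA step is benign even when the $r_{ij}$ depend on $X_i$, here is a small Monte Carlo sketch (the window and round-trip-time distributions are hypothetical, chosen only for illustration):

```python
import random

random.seed(5)

# Check the AVA step behind Eq. (6):  EA = E[sum of X+1 round-trip
# times] is approximated by (EX + 1) * Er, with r deliberately made
# dependent on X.

trials = 200_000
sum_A = sum_X = sum_r = n_r = 0.0
for _ in range(trials):
    X = random.randint(1, 12)                          # rounds (hypothetical)
    A = 0.0
    for j in range(X + 1):
        r = random.expovariate(1.0) * (1 + 0.05 * X)   # r depends on X
        A += r
        sum_r += r
        n_r += 1
    sum_A += A
    sum_X += X

EA = sum_A / trials
EX = sum_X / trials
Er = sum_r / n_r
print(f"EA = {EA:.3f}  vs  (EX + 1) * Er = {(EX + 1) * Er:.3f}")
```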
As the formulas become more complicated, it is no longer clear what assumptions are sufficient for the derivations, and the authors simply apply AVA in Eq. (25):
$$Q = E[\hat{Q}(W)] \approx \hat{Q}(EW).$$
Eq. (19) is an example of a fixed point approximation (Sec. 5.3): TCP throughput $B(p)$ is expressed in terms of loss probability $p$, which in turn depends on $B(p)$.
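Such a fixed point can again be solved by iteration. In the sketch below, $B(p)$ uses the familiar square-root approximation $\frac{1}{RTT}\sqrt{3/(2p)}$ rather than the paper's Eq. (31), and the loss model $p(B)$ is a hypothetical stand-in for the network:

```python
from math import sqrt

RTT = 0.1                      # round-trip time in seconds (assumed)

def B(p: float) -> float:
    # Square-root approximation to TCP throughput (packets/second).
    return (1.0 / RTT) * sqrt(3.0 / (2.0 * p))

def p(b: float) -> float:
    # Hypothetical loss model: loss probability grows with load.
    return min(0.5, 1e-4 * (1.0 + 0.01 * b))

b = 100.0                      # initial guess (packets/second)
for i in range(200):
    b_next = B(p(b))
    if abs(b_next - b) < 1e-6:   # converged
        break
    b = b_next

print(f"fixed point: B = {b:.1f} pkts/s, p = {p(b):.5f}")
```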