Brun’s Sieve

Let B1, …, Bm be events, Xi the indicator random variable for Bi and X = X1 + … + Xm the number of Bi that hold. Let there be a hidden parameter n (so that actually m = m(n), Bi = Bi(n), X = X(n)) which will define the following o, O notation.

Define S(r) = Σ Pr[Bi1 ∧ … ∧ Bir], the sum over all sets {i1, …, ir} ⊆ {1, …, m}.
Theorem 8.3.1 Suppose there is a constant μ so that E[X] = S(1) → μ and such that for every fixed r, E[X(r)/r!] = S(r) → μ^r/r!, where X(r) = X(X−1)⋯(X−r+1). Then Pr[X = 0] → e^{−μ} and indeed for every t,

Pr[X = t] → (μ^t/t!) e^{−μ}.
Pr[X = r] ≤ S(r) = Σ Pr[Bi1 ∧ … ∧ Bir], where {i1, …, ir} ⊆ {1, …, m}.

The Inclusion-Exclusion Principle gives that

Pr[X = 0] = Pr[B̄1 ∧ … ∧ B̄m] = 1 − S(1) + S(2) − … + (−1)^r S(r) + …

Bonferroni’s inequality: Let P(Ei) be the probability that Ei is true, and P(E1 ∨ … ∨ En) the probability that at least one of E1, …, En is true. Then

P(E1 ∨ … ∨ En) ≤ Σ_{i=1}^{n} P(Ei).
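The alternating over/underestimation is easy to check numerically. The following sketch (the sample space, events, and sizes are arbitrary choices of mine, not from the text) computes S(r) by brute force on a small uniform space and compares every truncation of inclusion-exclusion with the exact Pr[X = 0]:

```python
import itertools
import random

random.seed(7)
# Toy setup (illustrative choices): a uniform sample space of N points and
# m random events B_1, ..., B_m, each a random 12-point subset.
N, m = 60, 5
points = range(N)
B = [set(random.sample(points, 12)) for _ in range(m)]

def S(r):
    """S(r) = sum of Pr[B_{i1} ∧ ... ∧ B_{ir}] over r-subsets; S(0) = 1."""
    total = 0.0
    for idx in itertools.combinations(range(m), r):
        inter = set(points)
        for i in idx:
            inter &= B[i]
        total += len(inter) / N
    return total

# Exact Pr[X = 0]: the fraction of points lying in none of the events.
p0 = sum(all(w not in Bi for Bi in B) for w in points) / N

# Truncated inclusion-exclusion: sum_{r=0}^{k} (-1)^r S(r).
# Even k overestimates Pr[X = 0], odd k underestimates, k = m is exact.
for k in range(m + 1):
    T = sum((-1) ** r * S(r) for r in range(k + 1))
    side = ">=" if k % 2 == 0 else "<="
    print(f"k={k}: {T:.4f} {side} Pr[X=0] = {p0:.4f}")
```

The full sum (k = m) recovers Pr[X = 0] exactly, which is inclusion-exclusion itself.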
Proof. We do only the case t = 0. Fix ε > 0. Choose s so that

|Σ_{r=0}^{2s} (−1)^r μ^r/r! − e^{−μ}| ≤ ε/2.

The Bonferroni Inequalities state that, in general, the inclusion-exclusion formula alternately over- and underestimates Pr[X = 0]. In particular,

Pr[X = 0] ≤ Σ_{r=0}^{2s} (−1)^r S(r).

Select n0 (the hidden variable) so that for n ≥ n0,

|S(r) − μ^r/r!| ≤ ε/(2(2s + 1))   for 0 ≤ r ≤ 2s.
Proof (cont.) For such n,

Pr[X = 0] ≤ Σ_{r=0}^{2s} (−1)^r μ^r/r! + ε/2 ≤ e^{−μ} + ε.

Similarly, taking the sum to 2s + 1, we find n0 so that for n ≥ n0,

Pr[X = 0] ≥ e^{−μ} − ε.

As ε was arbitrary, Pr[X = 0] → e^{−μ}. ∎
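A standard illustration of Theorem 8.3.1 (my own choice of example, not from the text) is fixed points of a random permutation: with Bi the event π(i) = i, one gets S(r) = C(m,r)(m−r)!/m! = 1/r! exactly, so the hypotheses hold with μ = 1 and the number of fixed points should be asymptotically Poisson(1). The sketch below checks this by simulation:

```python
import random
from math import exp, factorial

random.seed(0)
# B_i is the event pi(i) = i for a uniform random permutation pi of
# {0, ..., m-1}.  Here S(r) = C(m,r) * (m-r)!/m! = 1/r!, so Theorem 8.3.1
# applies with mu = 1 and X = number of fixed points -> Poisson(1).
m, trials = 100, 10000
counts = {}
for _ in range(trials):
    pi = list(range(m))
    random.shuffle(pi)
    x = sum(pi[i] == i for i in range(m))  # X = number of fixed points
    counts[x] = counts.get(x, 0) + 1

# Compare the empirical distribution with Poisson(1).
for t in range(4):
    emp = counts.get(t, 0) / trials
    print(f"Pr[X={t}]  empirical {emp:.3f}  Poisson(1) {exp(-1) / factorial(t):.3f}")
```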
Let G ~ G(n,p), the random graph, and let EPIT represent the statement that every vertex lies in a triangle.

Theorem 8.3.2 Let c > 0 be fixed and let p = p(n), μ = μ(n) satisfy

C(n−1, 2) p^3 = μ,   lim_{n→∞} n e^{−μ} = c.

Then

lim_{n→∞} Pr[G(n,p) ⊨ EPIT] = e^{−c}.
Proof. First fix x ∈ V(G). For each unordered pair {y, z} ⊆ V(G) − {x} let Bxyz be the event that {x, y, z} is a triangle of G. Let Cx be the event ∧ B̄xyz and Xx the corresponding indicator random variable. We use Janson’s Inequality to bound E[Xx] = Pr[Cx]. Here p = o(1) so ε = o(1), and

μ = Σ Pr[Bxyz] = C(n−1, 2) p^3,

as defined above.
Proof (cont.) The dependency xyz ~ xuv occurs if and only if the sets overlap (other than in x). Hence

Δ = Σ_{xyz ~ xuv} Pr[Bxyz ∧ Bxuv] = O(n^3) p^5 = o(1),

since p = n^{−2/3+o(1)} (note n e^{−μ} → c forces μ ~ ln n, so p^3 ~ 2 ln n / n^2). Thus

E[Xx] = Pr[Cx] ~ e^{−μ} ~ c/n.

Now define

X = Σ_{x ∈ V(G)} Xx,

the number of vertices x not lying in a triangle. Then from Linearity of Expectation,

E[X] = Σ_{x ∈ V(G)} E[Xx] → c.
Proof (cont.) We need to show that the Poisson Paradigm applies to X. Fix r. Then

E[X(r)/r!] = S(r) = Σ Pr[Cx1 ∧ … ∧ Cxr],

the sum over all sets of vertices {x1, …, xr}. All r-sets look alike so

S(r) = C(n, r) Pr[Cx1 ∧ … ∧ Cxr] ~ (n^r/r!) Pr[Cx1 ∧ … ∧ Cxr],

where x1, …, xr are some particular vertices. But

Cx1 ∧ … ∧ Cxr = ∧ B̄xiyz,

the conjunction over 1 ≤ i ≤ r and all y, z.
Proof (cont.) We apply Janson’s Inequality to this conjunction. Again ε = p^3 = o(1). The number of {xi, y, z} is r C(n−1, 2) − O(n), the overcount coming from those triangles containing two (or three) of the xi. (Here it is crucial that r is fixed.) Thus

Σ Pr[Bxiyz] = p^3 (r C(n−1, 2) − O(n)) = rμ − O(n p^3) = rμ − o(1).

As before, Δ is p^5 times the number of pairs xiyz ~ xjy′z′. There are O(r n^3) = O(n^3) terms with i = j and O(r^2 n^2) = O(n^2) terms with i ≠ j, so again Δ = o(1). Therefore

Pr[Cx1 ∧ … ∧ Cxr] ~ e^{−rμ} ~ (c/n)^r

and

E[X(r)/r!] = S(r) ~ C(n, r)(c/n)^r ~ c^r/r!,

so Theorem 8.3.1 applies with limit constant c and Pr[X = 0] → e^{−c}. ∎
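Theorem 8.3.2 can be probed by simulation. The sketch below (helper names and all parameters are my own choices) samples G(n,p) at the theorem's scaling; note that at small n the error terms, Δ in particular, are far from negligible, so only rough agreement with e^{−c} should be expected:

```python
import random
from math import comb, exp, log

def sample_gnp(n, p, rng):
    """Sample G(n,p) as a list of adjacency sets."""
    adj = [set() for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def vertices_not_in_triangle(adj):
    """X = number of vertices x lying in no triangle of the graph."""
    count = 0
    for x in range(len(adj)):
        nbrs = sorted(adj[x])
        in_triangle = any(v in adj[u]
                          for i, u in enumerate(nbrs) for v in nbrs[i + 1:])
        count += not in_triangle
    return count

# Scaling of Theorem 8.3.2: C(n-1,2) p^3 = mu with n e^{-mu} = c.
rng = random.Random(1)
n, c = 60, 1.0
mu = log(n / c)                        # so that n * e^{-mu} = c
p = (mu / comb(n - 1, 2)) ** (1 / 3)
trials = 300
hits = sum(vertices_not_in_triangle(sample_gnp(n, p, rng)) == 0
           for _ in range(trials))
print(f"empirical Pr[EPIT] {hits / trials:.3f} vs e^-c = {exp(-c):.3f}")
```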
Large Deviations

Given a point in the probability space (i.e., a selection of R) we call an index set J ⊆ I a disjoint family (abbreviated disfam) if
- Bj holds for every j ∈ J;
- for no j, j′ ∈ J is j ~ j′.
If, in addition,
- whenever j′ ∉ J and Bj′ holds, then j ~ j′ for some j ∈ J,
then we call J a maximal disjoint family (abbreviated maxdisfam).
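The definitions can be made concrete in code. In the sketch below (the index set, outcome, and dependency relation are invented for illustration), a greedy left-to-right scan over the indices always produces a maxdisfam:

```python
def is_disfam(J, holds, dep):
    """J is a disfam: every B_j (j in J) holds, no two members are ~-related."""
    return (all(holds[j] for j in J)
            and not any(frozenset((j, k)) in dep for j in J for k in J if j < k))

def is_maxdisfam(J, holds, dep, indices):
    """Maximal: any j' outside J with B_{j'} holding is ~ some j in J."""
    return is_disfam(J, holds, dep) and all(
        not holds[j2] or j2 in J or any(frozenset((j2, j)) in dep for j in J)
        for j2 in indices)

def greedy_maxdisfam(holds, dep, indices):
    """Scan indices in order, adding j whenever B_j holds and j is not
    ~-related to anything chosen so far; the result is a maxdisfam."""
    J = []
    for j in indices:
        if holds[j] and all(frozenset((j, k)) not in dep for k in J):
            J.append(j)
    return J

# Invented example: B_0, B_1, B_3, B_4 hold, with dependencies 0 ~ 1 and 3 ~ 4.
indices = range(5)
holds = {0: True, 1: True, 2: False, 3: True, 4: True}
dep = {frozenset((0, 1)), frozenset((3, 4))}
J = greedy_maxdisfam(holds, dep, indices)
print(J)  # a maximal disjoint family for this outcome
```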
Lemma 8.4.1 With the above notation and for any integer s,

Pr[there exists a disfam J, |J| = s] ≤ μ^s/s!.

Proof. Let Σ* denote the sum over all s-sets J ⊆ I with no j ~ j′. Let Σ^o denote the sum over ordered s-tuples (j1, …, js) with {j1, …, js} forming such a J. Let Σ^a denote the sum over all ordered s-tuples (j1, …, js).
Proof (cont.)

Pr[there exists a disfam J, |J| = s]
  ≤ Σ* Pr[∧_{j∈J} Bj]
  = Σ* Π_{j∈J} Pr[Bj]   (the Bj, j ∈ J, being mutually independent since no j ~ j′)
  = (1/s!) Σ^o Pr[Bj1] ⋯ Pr[Bjs]
  ≤ (1/s!) Σ^a Pr[Bj1] ⋯ Pr[Bjs]
  = (1/s!) (Σ_{i∈I} Pr[Bi])^s = μ^s/s!. ∎
For smaller s we look at the further condition of J being a maxdisfam. To that end we let μs denote the minimum, over all j1, …, js, of

Σ Pr[Bi],

the sum taken over all i ∈ I except those i with i ~ jl for some 1 ≤ l ≤ s. In application s will be small (otherwise we use Lemma 8.4.1) and μs will be close to μ. For some applications it is convenient to set

ν = max_{j∈I} Σ_{i~j} Pr[Bi]

and note that μs ≥ μ − sν.
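A small numeric illustration of μs and ν (all probabilities and the dependency relation below are invented): a brute-force μs computed over all choices of j1, …, js respects the bound μs ≥ μ − sν.

```python
import itertools

# Invented data: Pr[B_i] for i in I, and a symmetric dependency relation ~.
pr = {0: 0.3, 1: 0.2, 2: 0.1, 3: 0.25, 4: 0.15}
dep = {frozenset((0, 1)), frozenset((1, 2)), frozenset((3, 4))}
I = list(pr)
mu = sum(pr.values())

def mu_s(s):
    """Minimum over j1, ..., js of the sum of Pr[B_i], taken over all i in I
    except those i with i ~ jl for some 1 <= l <= s."""
    return min(
        sum(p for i, p in pr.items()
            if not any(frozenset((i, j)) in dep for j in J))
        for J in itertools.combinations(I, s))

# nu = max over j of the total probability ~-adjacent to j; mu_s >= mu - s*nu.
nu = max(sum(p for i, p in pr.items() if frozenset((i, j)) in dep) for j in I)
print(mu, nu, mu_s(1), mu_s(2))
```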
Lemma 8.4.2 With the above notation and for any integer s,

Pr[there exists a maxdisfam J, |J| = s] ≤ (μ^s/s!) e^{−μs} e^{Δ/2} ≤ (μ^s/s!) e^{−μ+sν} e^{Δ/2}.

Proof. As in Lemma 8.4.1 we bound this probability by Σ* Pr[J is a maxdisfam], J = {j1, …, js}. For this to occur J must first be a disfam and then ∧ B̄i must hold, where the conjunction is over all i ∈ I except those with i ~ jl for some 1 ≤ l ≤ s.
Proof (cont.) We apply Janson’s Inequality to give an upper bound to Pr[∧ B̄i]. The associated values μ′, Δ′ satisfy μ′ ≥ μs and Δ′ ≤ Δ, the latter since Δ′ has simply fewer addends. Thus

Pr[∧ B̄i] ≤ e^{−μs} e^{Δ/2}

and

Pr[J maxdisfam] ≤ Σ* Pr[∧_{j∈J} Bj] e^{−μs} e^{Δ/2} ≤ (μ^s/s!) e^{−μs} e^{Δ/2}. ∎
When Δ = o(1) and νμ = o(1) or, more generally, μ_{3μ} = μ + o(1), Lemma 8.4.2 gives a close approximation to the Poisson Distribution since

Pr[there exists a maxdisfam J, |J| = s] ≤ (1 + o(1)) e^{−μ} μ^s/s!

for s ≤ 3μ, and the probability is quite small for larger s by Lemma 8.4.1.