Design and Analysis of Algorithms
(source: ailab.cs.nchu.edu.tw/course/Algorithms/103/AL10.pdf)

1

Design and Analysis of Algorithms

演算法設計與分析 (Design and Analysis of Algorithms)

Lecture 10, November 26, 2014

洪國寶

2

Homework #8
1. 20.2-1 (p. 488) / 19.2-1 (p. 518)
2. 20.4-1 (p. 496) / 19.4-1 (p. 526)
3. Show that if we start with an empty Fibonacci heap and do not perform cascading cuts, then it is possible for a sequence of Fibonacci heap operations to result in degree-k trees that have only k+1 nodes, for k >= 1.
4. Provide a sequence of operations on a Fibonacci heap such that, for any n >= 0, a binomial tree Bn is produced.
5. 21.3-2 (p. 504 / p. 572)

Due December 3, 2014

3

Outline

• Review• Data structures for disjoint sets• Elementary graph algorithms

4

Review: Complexity of Mergeable Heaps

• Extract-Min(H): deletes the node with the minimum key.
• Decrease-Key(H, x, k): assigns to node x the new key value k, which is assumed to be at most its current key value.

5

Binomial Trees

• Binomial tree.
  – Recursive definition: B0 is a single node; Bk is formed by linking two B(k-1) trees, making the root of one the leftmost child of the root of the other.

(Figure: the recursive construction of Bk from two B(k-1) trees, and the trees B0, B1, B2, B3, B4)

6

Review: Binomial Heap
• Binomial heap. Vuillemin, 1978.
  – Sequence of binomial trees that satisfy the binomial heap property:
    • each tree is min-heap ordered
    • 0 or 1 binomial tree of each order k

(Figure: a binomial heap consisting of trees B4, B1, and B0)

Review: Binomial Heap Properties
• Properties of an N-node binomial heap.
  – Min key is contained in a root of B0, B1, . . . , Bk (the root list).
  – Contains binomial tree Bi iff bi = 1, where <bk, ..., b2, b1, b0> is the binary representation of N.
  – At most floor(log2 N) + 1 binomial trees.
  – Height <= floor(log2 N).
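As a quick sanity check of the binary-representation property above, the following small sketch (a hypothetical helper, not from the slides) lists which binomial trees an N-node heap must contain:

```python
def binomial_trees(n):
    """Orders k with b_k = 1 in the binary representation of n;
    an n-node binomial heap contains exactly the trees B_k for these orders."""
    return [k for k in range(n.bit_length()) if (n >> k) & 1]

# N = 19 = 10011 in binary: trees B0, B1, B4 (3 trees, max order 4)
print(binomial_trees(19))       # → [0, 1, 4]
# After a union of 19- and 7-node heaps: 26 = 11010, trees B1, B3, B4
print(binomial_trees(19 + 7))   # → [1, 3, 4]
```

The output for N = 19 matches the slide's figure data (3 trees, binary 10011).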

(Figure: a binomial heap with trees B4, B1, B0 and
N = 19, # trees = 3, height = 4, binary = 10011)

8

Review: Binomial Heap Union

• Create heap H that is the union of heaps H' and H''.
  – Analogous to binary addition.
• Running time = O(log N)
  – Proportional to the number of trees in the root lists, <= 2(floor(log2 N) + 1).

Example: 19 + 7 = 26

    1 0 0 1 1   (19)
  + 0 0 1 1 1   ( 7)
  -----------
    1 1 0 1 0   (26)

9

3

37

6 18

55

45 32

30

24

23 22

50

48 31 17

448 29 10

H

Review: Binomial Heap Extract-Min Operation
• Delete the node with minimum key in binomial heap H.
  – Find root x with min key in the root list of H, and delete it.
  – H' <- broken binomial trees (the children of x).
  – H <- Union(H', H).
• Running time. O(log N)

10

(Figure: the heap H, with x marking the minimum root to be deleted)

Review: Binomial Heap Decrease Key Operation

• Decrease key of node x in binomial heap H.
  – Suppose x is in binomial tree Bk.
  – Bubble node x up the tree while x is smaller than its parent.
• Running time. O(log N)
  – Proportional to the depth of node x, which is <= floor(log2 N).

(Figure: a node at depth = 3 bubbling up)

11

Review: Fibonacci Heaps
• Fibonacci heap history.
  – Fredman and Tarjan (1986)
  – Ingenious data structure and analysis.
• Fibonacci heap intuition.
  – Similar to binomial heaps, but less structured.
  – Decrease-key and union run in O(1) amortized time.
  – "Lazy" unions and inserts.

• We do not attempt to consolidate trees in a Fibonacci heap when we unite two heaps or insert a new node.

12

Fibonacci Heaps: Structure

• Fibonacci heap: a set of min-heap ordered trees.

(Figure: heap H, with a min pointer into the root list and one marked node)

13

Fibonacci Heaps: Potential Function
• Key quantities.
  – degree[x] = degree of node x.
  – mark[x] = mark of node x (black or gray).
  – t(H) = # trees.
  – m(H) = # marked nodes.
  – Φ(H) = t(H) + 2·m(H) = potential function.

(Figure: heap H with t(H) = 5, m(H) = 3, so Φ(H) = 5 + 2·3 = 11;
the min root has degree = 3)

14

Fibonacci Heap Operation 5: Extract Min

• Extract min.
  – Delete min and concatenate its children into the root list.
  – Consolidate trees so that no two roots have the same degree.

(Figure: the root list after deleting the min; a "current" pointer scans the roots during consolidation)

15

Fibonacci Heap Extract Min Analysis

• Notation.
  – D(n) = max degree of any node in a Fibonacci heap with n nodes.
  – t(H) = # trees in heap H.
  – Φ(H) = t(H) + 2·m(H).

• Actual cost. O(D(n) + t(H))

• Amortized cost. O(D(n))
  – t(H') <= D(n) + 1, since no two trees have the same degree.
  – ΔΦ <= D(n) + 1 - t(H).

Scale up the units of potential to dominate the constant hidden in O(t(H)).

16

Fibonacci Heap Decrease Key

• Decrease key of element x to k.
  – Case 0: min-heap property not violated.
    • decrease key of x to k
    • change heap min pointer if necessary
  – Case 1: parent of x is unmarked.
    • decrease key of x to k
    • cut off link between x and its parent
    • mark parent
    • add tree rooted at x to root list, updating heap min pointer
  – Case 2: parent of x is marked.
    • decrease key of x to k
    • cut off link between x and its parent p[x], and add x to root list
    • cut off link between p[x] and p[p[x]], add p[x] to root list
      – If p[p[x]] is unmarked, then mark it.
      – If p[p[x]] is marked, cut off p[p[x]], unmark it, and repeat. (Cascading cuts)
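The cut cases above can be sketched in a few lines of Python. This is an illustrative fragment, not the textbook's code: nodes are plain dicts, the root list is a Python list, and min-pointer and degree maintenance are omitted.

```python
# Hypothetical minimal sketch of the decrease-key cut logic.
# A node is a dict with keys: 'key', 'parent', 'children', 'mark'.

def cut(x, roots):
    """Detach x from its parent and move it to the root list."""
    x['parent']['children'].remove(x)
    x['parent'] = None
    x['mark'] = False
    roots.append(x)

def cascading_cut(p, roots):
    """Walk up from p: mark an unmarked parent, cut a marked one and continue."""
    while p['parent'] is not None:
        if not p['mark']:
            p['mark'] = True          # first lost child: just mark (Case 1)
            return
        grand = p['parent']
        cut(p, roots)                 # second lost child: cut, go upward (Case 2)
        p = grand

def decrease_key(x, k, roots):
    assert k <= x['key']              # new key must not exceed current key
    x['key'] = k
    p = x['parent']
    if p is not None and x['key'] < p['key']:  # min-heap order violated
        cut(x, roots)
        cascading_cut(p, roots)

def node(key, parent=None, mark=False):
    return {'key': key, 'parent': parent, 'children': [], 'mark': mark}

# Chain a <- b (marked) <- c; decreasing c's key cuts c, then b cascades.
a = node(1); b = node(5, a, mark=True); c = node(9, b)
a['children'].append(b); b['children'].append(c)
roots = [a]
decrease_key(c, 0, roots)
print(len(roots))   # → 3 (a, plus the newly cut c and b)
```

Note how the marked parent b is cut and unmarked, exactly as in Case 2 above.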

17

Fibonacci Heap Decrease Key

• Notation.
  – t(H) = # trees in heap H.
  – m(H) = # marked nodes in heap H.
  – Φ(H) = t(H) + 2·m(H).

• Actual cost. O(c), where c = number of cuts.
• Amortized cost. O(1)
  – t(H') = t(H) + c
  – m(H') <= m(H) - c + 2
    • each cascading cut (except the last one) unmarks a node
    • the last cascading cut could potentially mark a node
  – ΔΦ <= c + 2(-c + 2) = 4 - c.

Scale up the units of potential to dominate the constant hidden in O(c).

18

Fibonacci Heaps: Bounding Max Degree

• Definition. D(N) = max degree in a Fibonacci heap with N nodes.

• Key lemma. D(N) <= floor(log_φ N), where φ = (1 + √5) / 2.

• Corollary. Delete and Extract-Min take O(log N) amortized time.

19

Outline

• Review• Data structures for disjoint sets• Elementary graph algorithms

20

Data structure for disjoint sets

• Discuss data structures to maintain a collection of pairwise disjoint dynamic sets.
  – Operations: Make-Set, Union, Find-Set
  – Implementations: linked lists, rooted trees
  – Applications: many

21

Disjoint Sets
• Maintain a collection S = {S1, …, Sk} of disjoint dynamic sets.
• Each set has a representative member.
• Supported operations:
  – Make-Set(x): make a new singleton set containing object x (x is the representative).
  – Union(x, y): like before (x and y are objects in the two sets to be merged).
  – Find-Set(x): returns a pointer to the representative of the set containing x.
• Complexity is analyzed in terms of:
  – n = # of Make-Set operations.
  – m = total # of operations.
  – Note: m >= n.

22

Applications for disjoint sets

• Fortran compilers (COMMON and EQUIVALENCE statements)
• Computational geometry problems
• Unification in logic programming
• Longest common subsequences
• Graph problems (minimum spanning trees, connected components, …)

23

Connected components

• Given a graph G = (V, E), a path of length k from a vertex u to a vertex u' is a sequence of vertices <v0, v1, …, vk> such that u = v0, u' = vk, and (v(i-1), vi) ∈ E for i = 1, 2, …, k.

• If there is a path p from u to u', we say that u' is reachable from u via p.

• The connected components of a graph are the equivalence classes of vertices under the is-reachable-from relation.

24

Compute the connected components

25

26

Determine whether two vertices are in the same component

Time complexity?
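The two procedures the slides above allude to can be sketched with a minimal dict-based union-find (illustrative names, not the textbook's pseudocode): make a set per vertex, union the endpoints of each edge, then compare representatives to answer same-component queries.

```python
def connected_components(vertices, edges):
    """Union the endpoints of every edge; return the parent map and a find function."""
    parent = {v: v for v in vertices}          # Make-Set for each vertex

    def find(v):                               # naive Find-Set (no compression)
        while parent[v] != v:
            v = parent[v]
        return v

    for u, v in edges:                         # Union for each edge
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    return parent, find

def same_component(find, u, v):
    return find(u) == find(v)

# The undirected graph of Figure B.2(b): components {1,2,5}, {3,6}, {4}
parent, find = connected_components(
    [1, 2, 3, 4, 5, 6], [(1, 2), (1, 5), (2, 5), (3, 6)])
print(same_component(find, 1, 5))   # → True
print(same_component(find, 1, 4))   # → False
```

Each query after the unions costs one find per argument; the total cost over a sequence of operations is exactly what the following slides analyze.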

27

Data structure for disjoint sets

• Discuss data structures to maintain a collection of pairwise disjoint dynamic sets.
  – Operations: Make-Set, Union, Find-Set
  – Implementations: linked lists, rooted trees
  – Applications: many

28

Linked Lists
Store set {a, b, c} as a linked list (a → b → c), with a representative pointer in each node and a tail pointer.

Make-Set and Find-Set are O(1).

Union(x, y): Append x's list onto the end of y's list; update the representative pointers in x's list (Figure 21.2).

• Time is linear in the length of x's list.

• A sequence of m operations can take Θ(m²) time (Figure 21.3). (Not very good.)

29

union

Union(x, y): Append x’s list onto the end of y’s list. Update representative pointers in x’s list.

30

Example

n make-set operations

n-1 union operations

Total time: next slide

31

Example (Cont.)

Operation      "Time"
M-S(x1)        1
M-S(x2)        1
…              …
M-S(xn)        1
U(x1, x2)      1
U(x2, x3)      2
U(x3, x4)      3
…              …
U(x(n-1), xn)  n - 1

m = 2n - 1 operations.

Total time:

• Θ(n) = Θ(m) for the Make-Set ops.

• Θ(Σ_{i=1}^{n-1} i) = Θ(n²) = Θ(m²) for the Union ops.

• Θ(m²) total.
• Θ(m) amortized per operation.

1i

32

Improvement: Weighted-Union Heuristic

• Keep track of the list length in the representative. (Time/space tradeoff)

• Modify Union so that the shorter list is appended to the longer one.

• Time for Union is now proportional to the length of the shorter list.
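A minimal sketch of the weighted-union heuristic on the linked-list representation. Python lists stand in for the linked lists, and the class and field names are illustrative, not the textbook's:

```python
class LLDisjointSets:
    """Linked-list representation with the weighted-union heuristic."""
    def __init__(self):
        self.rep = {}      # object -> its representative
        self.members = {}  # representative -> list of members (tracks length)

    def make_set(self, x):
        self.rep[x] = x
        self.members[x] = [x]

    def find_set(self, x):
        return self.rep[x]                  # O(1): direct pointer

    def union(self, x, y):
        rx, ry = self.rep[x], self.rep[y]
        if rx == ry:
            return
        if len(self.members[rx]) > len(self.members[ry]):
            rx, ry = ry, rx                 # ensure rx is the shorter list
        for z in self.members[rx]:          # pointer updates: O(shorter list)
            self.rep[z] = ry
        self.members[ry] += self.members.pop(rx)

s = LLDisjointSets()
for x in "abcd":
    s.make_set(x)
s.union('a', 'b'); s.union('c', 'd'); s.union('a', 'c')
print(s.find_set('b') == s.find_set('d'))   # → True
```

Only the members of the shorter list have their representative pointer rewritten, which is the fact Theorem 21.1 below exploits.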

33

Amortized Running Time of WUH
Theorem 21.1: A sequence of m operations takes O(m + n log n) time.

Proof:

M-S and F-S contribute O(m) total.

What about Union? (See the next slide)

The time is dominated by the total number of representative-pointer updates.

A given object's rep. pointer can change at most log n times.

34

What about Union?

        o1  o2  o3  o4  …  on   total
U1      ✓   ✓                    2
U2          ✓   ✓                2
U3      ✓   ✓   ✓                3
…
Uj      ✓   ✓   ✓   ✓   ✓       O(n)
Total                            O(?)

(✓ marks the objects whose representative pointer is updated by each Union)

35

Proof of Theorem 21.1 (Continued)

Note: n = no. of M-S's = no. of objects.

After object x's rep. pointer has been changed once, its set has >= 2 members.
…………………………………………… twice, >= 4 members.
…………………………………………… three times, >= 8 members.
…………………………………………… log k times, >= k members.

Since k <= n, x's rep. pointer can change at most log n times.

O(n log n) for all the Union ops.

O(m + n log n) total.

36

Data structure for disjoint sets

• Discuss data structures to maintain a collection of pairwise disjoint dynamic sets.
  – Operations: Make-Set, Union, Find-Set
  – Implementations: linked lists, rooted trees
  – Applications: many

37

Disjoint-Set Forests

M-S, F-S: Easy.
Union: As follows…

Will speed up sequences of Union, M-S, and F-S operations by means of two heuristics.

(Figure: Union links the root of x's tree to the root of y's tree; example tree with representative a, set {a, b, c, d})

38

39

Heuristics to improve the running time of the rooted tree representation
• Weighted-union heuristic
  – union by size: make the root of the smaller tree point to the root of the larger, breaking ties arbitrarily
  – union by rank: make the root of the shallower tree point to the root of the other, breaking ties arbitrarily

40

Heuristics to improve the running time of the rooted tree representation
• Find-path compaction
  – path compression: make every encountered node point to the root node
  – path splitting: make every encountered node (except the last and the next-to-last) point to its grandparent
  – path halving: make every other encountered node (except the last and the next-to-last) point to its grandparent

41

Two heuristics used in the textbook
1) Union by Rank
  • Store the rank of the tree in the representative.
  • Rank is an upper bound on height.
  • Make the root with smaller rank point to the root with larger rank.

2) Path Compression
  • During Find-Set, "flatten" the tree.

(Figure: F-S(a) on the chain a - b - c - d makes a, b, and c all point directly to d)

42

Operations

Make-Set(x)
  p[x] := x
  rank[x] := 0

Union(x, y)
  Link(Find-Set(x), Find-Set(y))

Link(x, y)
  if rank[x] > rank[y] then
    p[y] := x
  else
    p[x] := y
    if rank[x] = rank[y] then
      rank[y] := rank[y] + 1

Find-Set(x)
  if x ≠ p[x] then
    p[x] := Find-Set(p[x])
  return p[x]
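The pseudocode above translates almost line for line into Python; this sketch combines union by rank with path compression, with dicts standing in for the p and rank arrays:

```python
p, rank = {}, {}

def make_set(x):
    p[x] = x
    rank[x] = 0

def find_set(x):
    if x != p[x]:
        p[x] = find_set(p[x])   # path compression
    return p[x]

def link(x, y):
    if rank[x] > rank[y]:
        p[y] = x
    else:
        p[x] = y
        if rank[x] == rank[y]:
            rank[y] += 1        # only equal-rank links raise a rank

def union(x, y):
    link(find_set(x), find_set(y))

# Replay the example from the following slides:
for v in "abcdefghij":
    make_set(v)
for u, v in [("a","b"), ("c","d"), ("e","f"), ("g","h"), ("i","j"),
             ("a","d"), ("f","h"), ("d","h"), ("e","j")]:
    union(u, v)
print(rank[find_set("a")])   # → 3, matching the example's final root h/3
```

Running the example sequence reproduces the ranks shown on the example slides: the equal-rank unions U(a,d), U(f,h), U(d,h) each raise a root's rank, ending with a single rank-3 root.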

43

Find-Set

Find-Set on the chain a → b → c, with path compression:

F-S(a)
  p[a] := F-S(b)
    p[b] := F-S(c)  { returns c }
  returns c
returns c

Afterwards a and b both point directly to c.

44

Example
MS(a) ; MS(b) ; ... ; MS(i) ; MS(j)

(Figure: ten singleton trees a/0 … j/0, drawn as node/rank with parent pointers)

U(a,b) ; U(c,d) ; U(e,f) ; U(g,h) ; U(i,j)

(Figure: five rank-1 trees: b/1 over a/0, d/1 over c/0, f/1 over e/0, h/1 over g/0, j/1 over i/0)

45

Example (Continued)

U(a,d)

(Figure: equal ranks, so b/1 is linked under d, whose rank rises to d/2; the trees f/1, h/1, j/1 are unchanged)

46

Example (Continued)

U(f,h)

(Figure: equal ranks, so f/1 is linked under h, whose rank rises to h/2; the trees d/2 and j/1 are unchanged)

47

Example (Continued)

U(d,h)

(Figure: equal ranks again, so d/2 is linked under h, whose rank rises to h/3; j/1 is unchanged)

48

Example (Continued)

U(e,j)

(Figure: Find-Set(e) compresses the path so e points directly to the root h; then the smaller-rank root j/1 is linked under h/3)

49

Example (Continued)

FS(i)

(Figure: Find-Set(i) compresses the path so i and j point directly to h. The annotations BC and PC mark block charges and path charges used in the aggregate analysis; the ranks on the find path fall into Block 0, Block 0, Block 1, Block 2)

50

Example (Continued)

FS(a)

(Figure: Find-Set(a) compresses the path so a and b now point directly to the root h)

51

Time Complexity
• We will cover the complexity analysis found in CLR rather than CLRS.
  – Note: This was Chapter 22 in CLR, which is why the remaining lemmas etc. are numbered the way they are.
  – CLRS uses the potential method (pp. 509-517, 2nd edition; pp. 573-581, 3rd edition); CLR uses the aggregate method.

52

Potential function

53

Potential function

A_4(1) ≫ 10^80

54

Time Complexity

• Tight upper bound on time complexity: O(m·α(n)).
  – α(n) is almost a constant.

• A slightly easier bound of O(m log* n) is established in CLR.

55

Bound we will establish

• We establish O(m log* n) as an upper bound.

• log* n = min{i >= 0 : log^(i) n <= 1}.

  – In particular, if T(k) is a tower of k 2's (T(1) = 2, T(k) = 2^T(k-1)), then log* T(k) = k.
  – And hence: log* 2^65536 = 5, since 2^65536 = T(5).
  – Thus, log* n <= 5 for all practical purposes.
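The iterated logarithm can be computed directly; this small sketch checks the values quoted above:

```python
import math

def log_star(n):
    """log* n: how many times log2 must be applied before the value is <= 1."""
    i = 0
    while n > 1:
        n = math.log2(n)
        i += 1
    return i

print([log_star(x) for x in (2, 4, 16, 65536)])   # → [1, 2, 3, 4]
print(log_star(2**65536))                          # → 5
```

Note that 2^65536 already dwarfs the number of atoms in the observable universe, yet its log* is only 5.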

56

Properties of Ranks
Lemma 22.2:
(i) (∀x :: rank[x] <= rank[p[x]]).
(ii) (∀x : x ≠ p[x] : rank[x] < rank[p[x]]).
(iii) rank[x] is initially 0.
(iv) rank[x] does not decrease.
(v) Once x ≠ p[x] holds, rank[x] does not change.
(vi) rank[p[x]] is a monotonically increasing function of time.

Proof:
By induction on the number of operations (see the example on Slides 44-50).

57

Lemma 22.3
Lemma 22.3: For all tree roots x, size(x) >= 2^rank[x].

(size(x) = no. of nodes in the tree rooted at x)

Proof:
Induction on the number of Link operations.

Basis: Before the first Link, all ranks are 0 and each tree contains one node.

Inductive step: Consider Link(x, y). Assume the lemma holds before this operation; we show it holds after. Two cases.

58

Case 1: rank[x] ≠ rank[y]. Assume rank[x] < rank[y].

Link(x, y) makes x's root a child of y's root. Writing rank' and size' for the values after the operation:

  rank'(x) = rank(x), rank'(y) = rank(y)

  size'(y) = size(x) + size(y)
           >= 2^rank(x) + 2^rank(y)
           >= 2^rank(y)
           = 2^rank'(y)

No ranks or sizes change for any nodes other than y.

59

Case 2: rank[x] = rank[y].

Link(x, y) makes x's root a child of y's root and increments rank[y]:

  rank'(x) = rank(x), rank'(y) = rank(y) + 1

  size'(y) = size(x) + size(y)
           >= 2^rank(x) + 2^rank(y)
           = 2^(rank(y) + 1)
           = 2^rank'(y)

60

Lemma 22.4
Lemma 22.4: For any integer r >= 0, there are at most n/2^r nodes of rank r.

Corollary 22.5: Every node has rank at most floor(log n).

61

Proving the Time Bound

Lemma 22.6: Suppose we convert a sequence S' of m' MS, U, and FS operations into a sequence S of m MS, Link, and FS operations by turning each Union into two FS operations followed by a Link. Then, if sequence S runs in O(m log* n) time, sequence S' runs in O(m' log* n) time.

So we only have to consider MS, Link, and FS operations.

62

Theorem 22.7
Theorem 22.7: A sequence of m MS, Link, and FS operations, n of which are MS operations, can be performed in worst-case time O(m log* n).

Proof: On the blackboard.

63

Remarks

64

Remarks (Cont.)

65

Outline

• Review• Data structures for disjoint sets• Elementary graph algorithms

66

Outline of the course (1/1)

• Introduction (1-4)• Data structures (10-14)• Dynamic programming (15)• Greedy methods (16)• Amortized analysis (17)• Advanced data structures (6, 19-21)• Graph algorithms (22-25)• NP-completeness (34-35)• Other topics (5, 31)

67

Graph algorithms• Topics:

– Elementary graph algorithms– Minimum spanning trees– Shortest paths

• Reading:– Chapters 22, 23, 24, 25

68

Graphs

• A graph G = (V, E) consists of a set V of vertices (nodes) and a set E of directed or undirected edges.
  – In analysis, we write V for |V| and E for |E| inside asymptotic notation.

• Any binary relation is a graph.
  – Networks of roads and cities, circuit representations, etc.

69

Directed graphs

• A directed graph (or digraph) G is a pair (V, E), where V is a finite set and E is a binary relation on V.
  – The set V is called the vertex set of G, and its elements are called vertices (singular: vertex).
  – The set E is called the edge set of G, and its elements are called edges.

• Vertices are represented by circles in the figure, and edges by arrows. Note that self-loops (edges from a vertex to itself) are possible.

70

Undirected graphs

• In an undirected graph G = (V, E), the edge set E consists of unordered pairs of vertices, rather than ordered pairs. That is, an edge is a set {u, v}, where u, v ∈ V and u ≠ v.

• By convention, we use the notation (u, v) for an edge, rather than the set notation {u, v}, and (u, v) and (v, u) are considered to be the same edge.

• In an undirected graph, self-loops are forbidden, and so every edge consists of exactly two distinct vertices.

71

Figure B.2 Directed and undirected graphs.

(a) A directed graph G = (V, E), where V = {1,2,3,4,5,6} and E = {(1,2), (2,2), (2,4), (2,5), (4,1), (4,5), (5,4), (6,3)}. The edge (2,2) is a self-loop.

(b) An undirected graph G = (V,E), where V = {1,2,3,4,5,6} and

E = {(1,2), (1,5), (2,5), (3,6)}. The vertex 4 is isolated.

(c) The subgraph of the graph in part (a) induced by the vertex set {1,2,3,6}.

72

Definitions

• Many definitions for directed and undirected graphs are the same, although certain terms have slightly different meanings in the two contexts.

• If (u, v) is an edge in a directed graph G = (V, E), we say that (u, v) is incident from or leaves vertex u and is incident to or enters vertex v.

• If (u, v) is an edge in an undirected graph G = (V, E), we say that (u, v) is incident on vertices u and v.

73

Examples

• The edges leaving vertex 2 in Figure B.2(a) are (2, 2), (2, 4), and (2, 5). The edges entering vertex 2 are (1, 2) and (2, 2).

• In Figure B.2(b), the edges incident on vertex 2 are (1, 2) and (2, 5).

74

Definitions (Cont.)

• If (u, v) is an edge in a graph G = (V, E), we say that vertex v is adjacent to vertex u.
  – When the graph is undirected, the adjacency relation is symmetric.
  – When the graph is directed, the adjacency relation is not necessarily symmetric.

75

Example (Cont.)

• In parts (a) and (b) of Figure B.2, vertex 2 is adjacent to vertex 1, since the edge (1, 2) belongs to both graphs. Vertex 1 is not adjacent to vertex 2 in Figure B.2(a), since the edge (2, 1) does not belong to the graph.

76

Definitions (Cont.)

• The degree of a vertex in an undirected graph is the number of edges incident on it.

• In a directed graph,
  – the out-degree of a vertex is the number of edges leaving it, and
  – the in-degree of a vertex is the number of edges entering it.

• The degree of a vertex in a directed graph is its in-degree plus its out-degree.

77

Example (Cont.)

• Vertex 2 in Figure B.2(a) has in-degree 2, out-degree 3, and degree 5.

78

Definitions (Cont.)

• A path of length k from a vertex u to a vertex u' in a graph G = (V, E) is a sequence of vertices <v0, v1, …, vk> such that u = v0, u' = vk, and (v(i-1), vi) ∈ E for i = 1, 2, …, k. The length of the path is the number of edges in the path. The path contains the vertices v0, v1, …, vk and the edges (v0, v1), (v1, v2), …, (v(k-1), vk).

• If there is a path p from u to u', we say that u' is reachable from u via p, which we sometimes write as u ⇝ u' if G is directed. A path is simple if all vertices in the path are distinct.

79

Example (Cont.)

• In Figure B.2(a), the path 1, 2, 5, 4 is a simple path of length 3. The path 2, 5, 4, 5 is not simple.

80

Definitions (Cont.)

• A subpath of a path p = <v0, v1, …, vk> is a contiguous subsequence of its vertices.
  – That is, for any 0 <= i <= j <= k, the subsequence <vi, v(i+1), …, vj> is a subpath of p.

81

Definitions (Cont.)
• In a directed graph, a path <v0, v1, …, vk> forms a cycle if v0 = vk and the path contains at least one edge.
  – The cycle is simple if, in addition, v1, v2, …, vk are distinct.
  – A self-loop is a cycle of length 1.
  – Two paths <v0, v1, …, v(k-1), v0> and <v'0, v'1, …, v'(k-1), v'0> form the same cycle if there exists an integer j such that v'i = v((i+j) mod k) for i = 0, 1, …, k-1.

• A directed graph with no self-loops is simple. In an undirected graph, a path <v0, v1, …, vk> forms a cycle if k >= 3, v0 = vk, and v1, v2, …, vk are distinct.

• A graph with no cycles is acyclic.

82

Example (Cont.)

• In Figure B.2(a), the path <1, 2, 4, 1> forms the same cycle as the paths <2, 4, 1, 2> and <4, 1, 2, 4>. This cycle is simple, but the cycle <1, 2, 4, 5, 4, 1> is not. The cycle <2, 2> formed by the edge (2, 2) is a self-loop.

• In Figure B.2(b), the path <1, 2, 5, 1> is a cycle.

83

Definitions (Cont.)

• An undirected graph is connected if every pair of vertices is connected by a path.
  – The connected components of a graph are the equivalence classes of vertices under the "is reachable from" relation.
  – An undirected graph is connected if it has exactly one connected component, that is, if every vertex is reachable from every other vertex.

84

Example (Cont.)

• The graph in Figure B.2(b) has three connected components: {1, 2, 5}, {3, 6}, and {4}. Every vertex in {1,2,5} is reachable from every other vertex in {1, 2, 5}.

85

Definitions (Cont.)

• A directed graph is strongly connected if every two vertices are reachable from each other.
  – The strongly connected components of a graph are the equivalence classes of vertices under the "are mutually reachable" relation.
  – A directed graph is strongly connected if it has only one strongly connected component.

86

Example (Cont.)

• The graph in Figure B.2(a) has three strongly connected components: {1, 2, 4, 5}, {3}, and {6}. All pairs of vertices in {1, 2, 4, 5} are mutually reachable. The vertices {3, 6} do not form a strongly connected component, since vertex 6 cannot be reached from vertex 3.

87

Definitions (Cont.)

• Two graphs G = (V, E) and G' = (V', E') are isomorphic if there exists a bijection f : V → V' such that (u, v) ∈ E if and only if (f(u), f(v)) ∈ E'.
  – In other words, we can relabel the vertices of G to be vertices of G', maintaining the corresponding edges in G and G'.

88

Example (Cont.)
• Figure B.3(a) shows a pair of isomorphic graphs G and G' with respective vertex sets V = {1, 2, 3, 4, 5, 6} and V' = {u, v, w, x, y, z}. The mapping from V to V' given by f(1) = u, f(2) = v, f(3) = w, f(4) = x, f(5) = y, f(6) = z is the required bijective function.

• The graphs in Figure B.3(b) are not isomorphic. Although both graphs have 5 vertices and 7 edges, the top graph has a vertex of degree 4 and the bottom graph does not.

89

Definitions (Cont.)

• We say that a graph G' = (V', E') is a subgraph of G = (V, E) if V' ⊆ V and E' ⊆ E. Given a set V' ⊆ V, the subgraph of G induced by V' is the graph G' = (V', E'), where E' = {(u, v) ∈ E : u, v ∈ V'}.

90

Example (Cont.)

• The subgraph induced by the vertex set {1, 2, 3, 6} in Figure B.2(a) appears in Figure B.2 (c) and has the edge set {(1, 2), (2, 2), (6, 3)}.
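The induced-subgraph definition is essentially a one-line filter; this sketch (a hypothetical helper, not from the text) reproduces the example above:

```python
def induced_subgraph(edges, vs):
    """Edges of the subgraph induced by vertex set vs:
    keep (u, v) iff both endpoints lie in vs."""
    vs = set(vs)
    return [(u, v) for (u, v) in edges if u in vs and v in vs]

# Figure B.2(a)'s edge set, induced by the vertex set {1, 2, 3, 6}
E = [(1, 2), (2, 2), (2, 4), (2, 5), (4, 1), (4, 5), (5, 4), (6, 3)]
print(induced_subgraph(E, {1, 2, 3, 6}))   # → [(1, 2), (2, 2), (6, 3)]
```

The output matches the edge set stated in the example.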

91

Definitions (Cont.)
• Given an undirected graph G = (V, E), the directed version of G is the directed graph G' = (V, E'), where (u, v) ∈ E' if and only if (u, v) ∈ E.
  – That is, each undirected edge (u, v) in G is replaced in the directed version by the two directed edges (u, v) and (v, u).

• Given a directed graph G = (V, E), the undirected version of G is the undirected graph G' = (V, E'), where (u, v) ∈ E' if and only if u ≠ v and (u, v) ∈ E.
  – That is, the undirected version contains the edges of G "with their directions removed" and with self-loops eliminated.

• In a directed graph G = (V, E), a neighbor of a vertex u is any vertex that is adjacent to u in the undirected version of G. That is, v is a neighbor of u if either (u, v) ∈ E or (v, u) ∈ E. In an undirected graph, u and v are neighbors if they are adjacent.

92

Definitions (Cont.)

• Several kinds of graphs are given special names.
  – A complete graph is an undirected graph in which every pair of vertices is adjacent.
  – A bipartite graph is an undirected graph G = (V, E) in which V can be partitioned into two sets V1 and V2 such that (u, v) ∈ E implies either u ∈ V1 and v ∈ V2, or u ∈ V2 and v ∈ V1. That is, all edges go between the two sets V1 and V2.
  – An acyclic, undirected graph is a forest, and a connected, acyclic, undirected graph is a (free) tree. We often take the first letters of "directed acyclic graph" and call such a graph a dag.
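The bipartite condition can be tested by 2-coloring the graph with a breadth-first search. This is a standard technique, not something from the slides; the helper name and adjacency format are illustrative:

```python
from collections import deque

def is_bipartite(adj):
    """2-color an undirected graph given as {vertex: [neighbors]};
    bipartite iff no edge joins two same-colored vertices."""
    color = {}
    for s in adj:                       # handle every connected component
        if s in color:
            continue
        color[s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]   # opposite side of the partition
                    q.append(v)
                elif color[v] == color[u]:    # same-side edge: not bipartite
                    return False
    return True

print(is_bipartite({1: [2, 4], 2: [1, 3], 3: [2, 4], 4: [1, 3]}))  # → True  (4-cycle)
print(is_bipartite({1: [2, 3], 2: [1, 3], 3: [1, 2]}))             # → False (triangle)
```

Equivalently, a graph is bipartite exactly when it contains no odd-length cycle, which is why the triangle fails.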

93

Definitions (Cont.)

• The contraction of an undirected graph G = (V, E) by an edge e = (u, v) is a graph G' = (V', E'), where
  – V' = (V - {u, v}) ∪ {x}, and x is a new vertex.
  – The set of edges E' is formed from E by deleting the edge (u, v) and, for each vertex w incident on u or v, deleting whichever of (u, w) and (v, w) is in E and adding the new edge (x, w).

94

Representations of Graphs: Adjacency List

• Adjacency list: An array Adj of |V| lists, one for each vertex in V. For each u ∈ V, Adj[u] contains (pointers to) all the vertices adjacent to u.

• Advantage: O(V + E) storage; good for sparse graphs.
• Drawback: Need to traverse a list to find an edge.
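Building the adjacency list for Figure B.2(b)'s undirected graph, as a sketch (a Python dict of lists in place of an array of linked lists):

```python
def adjacency_list(vertices, edges, directed=False):
    """Adj[u] holds the vertices adjacent to u; O(V + E) storage."""
    adj = {v: [] for v in vertices}
    for u, v in edges:
        adj[u].append(v)
        if not directed and u != v:     # undirected: record both directions
            adj[v].append(u)
    return adj

# Figure B.2(b): V = {1..6}, E = {(1,2), (1,5), (2,5), (3,6)}; vertex 4 is isolated
adj = adjacency_list(range(1, 7), [(1, 2), (1, 5), (2, 5), (3, 6)])
print(adj[1], adj[4])   # → [2, 5] []
```

Checking whether (u, v) is an edge requires scanning Adj[u], which is the drawback noted above.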

95

Representations of Graphs: Adjacency Matrix
• Adjacency matrix: A |V| × |V| matrix A = (aij) such that aij = 1 if (i, j) ∈ E, and aij = 0 otherwise.

• Advantage: O(1) time to find an edge.
• Drawback: O(V²) storage; more suitable for dense graphs.
• Q: How to save space if the graph is undirected?
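One standard answer to the space question: for an undirected graph A is symmetric (aij = aji), so it suffices to store only the entries on and above the diagonal, roughly halving the storage. A sketch with illustrative helper names:

```python
def adjacency_matrix(n, edges, directed=False):
    """A[i][j] = 1 iff (i, j) is an edge; vertices are 0..n-1."""
    A = [[0] * n for _ in range(n)]
    for u, v in edges:
        A[u][v] = 1
        if not directed:
            A[v][u] = 1                 # undirected: A is symmetric
    return A

def packed_upper(A):
    """Store only a_ij for i <= j: n(n+1)/2 entries instead of n^2."""
    n = len(A)
    return [A[i][j] for i in range(n) for j in range(i, n)]

A = adjacency_matrix(4, [(0, 1), (1, 2)])
print(A[1])                  # → [1, 0, 1, 0]
print(len(packed_upper(A)))  # → 10  (vs. 16 full entries)
```

Looking up an edge in the packed form just requires ordering the indices first (query (i, j) with i <= j).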

96

Tradeoffs between Adjacency List and Matrix

97

Questions?
