Asymptotes and Algorithms
By Gary Short, Gibraltar Software

Algorithms - Rocksolid Tour 2013





Agenda

• Introduction
• Performance, does it matter?
• How do we measure performance?
• Analysis of Insertion Sort
• Simplifying things with asymptotic notation
• Designing algorithms
• Solving recurrences
• Questions.


Introduction

• Gary Short
• Head of Gibraltar Labs
• C# MVP
• [email protected]
• @garyshort
• http://www.facebook.com/theothergaryshort


Performance – Does it Matter?

Performance is the most important thing in software engineering today...


... Apart from everything else!


So Why Bother About Performance?


How do we Measure Performance?

• What do we care about?
– Memory?
– Bandwidth?
– Computational time?


We Need a Model to Work With

• RAM Model
– Arithmetic – add, subtract, etc.
– Data movement – load, copy, store
– Control – branching, subroutine call, return
– Data – integers, floats

• Instructions are run in series
– And take constant time
• Not really, but shhh! – Ed.


Analysis of Insertion Sort

InsertionSort(A)
  for j = 2 to A.length
    key = A[j]
    i = j - 1
    while i > 0 and A[i] > key
      A[i+1] = A[i]
      i = i - 1
    A[i+1] = key
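The pseudocode above translates almost line for line; a minimal sketch in Python (my translation, not from the deck), using 0-based indexing rather than the slide's 1-based indexing:

```python
def insertion_sort(a):
    """Sort list `a` in place, as in the slide's pseudocode (0-based)."""
    for j in range(1, len(a)):          # for j = 2 to A.length
        key = a[j]                      # key = A[j]
        i = j - 1
        while i >= 0 and a[i] > key:    # shift larger elements right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                  # drop key into its slot
    return a
```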


That Makes no Sense, Show me!


So What’s The Running Time?


Sum Running Time for Each Statement...

T(n) = c1·n + c2(n-1) + c3(n-1)
     + c4·Σ(j=2..n) tj
     + c5·Σ(j=2..n) (tj - 1)
     + c6·Σ(j=2..n) (tj - 1)
     + c7(n-1)


Best Case Running Time

If the input (A) is already sorted then...
A[i] <= key when i has its initial value of j-1, thus tj = 1.
And so...
T(n) = c1·n + c2(n-1) + c3(n-1) + c4(n-1) + c7(n-1)
     = (c1+c2+c3+c4+c7)n - (c2+c3+c4+c7)
Which can be expressed as an + b for constants a and b that depend on the ci.
So T(n) is a linear function of n.


Side Note: No One Cares About Best Case


Worst Case Scenario

If the input (A) is in reverse sorted order then...
We have to compare each A[j] with every element in the subarray A[1..j-1].
And so...
T(n) = (c4/2 + c5/2 + c6/2)n^2
     + (c1 + c2 + c3 + c4/2 - c5/2 - c6/2 + c7)n
     - (c2 + c3 + c4 + c7)
Which can be expressed as an^2 + bn + c.
So T(n) is a quadratic function of n.


In Short...

In the worst case, Insertion Sort sucks!


Man That Was a Lot of Maths!


Simplifying Things With Asymptotic Notation

• Asymptotic notation characterises functions by their growth rates

• Functions with the same growth rates have the same Asymptotic notation


How Does That Help Us?

Let’s say we have a function with running time
T(n) = 4n^2 - 2n + 2
If n = 500 then 4n^2 is 1000 times bigger than 2n
So...
We can ignore lower-order terms and coefficients
T(n) = 4n^2 - 2n + 2 can be written as T(n) = O(n^2)
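The "1000 times bigger" claim is plain arithmetic, easily checked:

```python
n = 500
quadratic_term = 4 * n ** 2           # 1,000,000
linear_term = 2 * n                   # 1,000
print(quadratic_term // linear_term)  # the n^2 term dominates by a factor of 1000
```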


A Short Note on The Abuse of “=“

If T(n) = 4n^2 - 2n + 2
Then saying T(n) = O(n^2) is not strictly correct
Rather, T(n) is in the set O(n^2), and the above should be read as "T(n) is O(n^2)" and not "T(n) equals O(n^2)"

But really only Maths geeks care – Ed.


So Back to Insertion Sort

So now we can say of Insertion Sort that...
Best case it’s O(n)
And worst case it’s O(n^2)
And since we only care about worst case...
We say that Insertion Sort is O(n^2)
Which sucks! – Ed.


Designing Algorithms

So can we do better?


Optimizing Algorithms is Child’s Play

• Sit at table
• Foreach item in itemsOnPlate
– Eat item
• Wait(MealComplete)
• Foreach dish in dishesUsed
– WashDish
– DryDish
• Resume Play


Child Will Optimize To…

• Pause Game
• Set Speed = MaxInt
• Run to table
• Take sliceBread(1)
• Foreach item on Plate
– Place item on bread
• Take sliceBread(2)
• Run Outside
• Resume Game


Divide And Conquer

• Divide
– Divide the problem into sub-problems
• Conquer
– Solve the sub-problems recursively
• Combine
– Combine the solutions to the sub-problems into the solution for the original problem.


Merge Sort

• Divide
– Divide the n elements into two n/2-element arrays
• Conquer
– Sort the two arrays recursively
• Combine
– Merge the two sorted arrays to produce the answer.


Analysis of Merge Sort

MergeSort(A, p, r)
  if p < r
    q = floor((p + r) / 2)
    MergeSort(A, p, q)
    MergeSort(A, q+1, r)
    Merge(A, p, q, r)

Initial call: MergeSort(A, 1, A.length)
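A runnable sketch of the same procedure in Python (my translation, 0-based, with the Merge step written out since the deck only names it):

```python
def merge_sort(a):
    """Return a sorted copy of `a` using divide and conquer."""
    if len(a) <= 1:                   # base case: already sorted
        return list(a)
    q = len(a) // 2                   # divide: split at the midpoint
    left = merge_sort(a[:q])          # conquer: sort each half recursively
    right = merge_sort(a[q:])
    return merge(left, right)         # combine

def merge(left, right):
    """Merge two sorted lists into one sorted list in O(n) time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])              # at most one of these is non-empty
    out.extend(right[j:])
    return out
```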


Dancers, or it Never Happened!!


So What’s The Running Time?

In the general case...
If the divide step yields ‘a’ sub-problems
Each 1/b the size of the original
It takes T(n/b) time to solve one sub-problem of size n/b
So it takes aT(n/b) to solve ‘a’ of them
Then, if it takes D(n) time to divide the problem
And C(n) time to combine the results
Then we get the recurrence...
T(n) = aT(n/b) + D(n) + C(n)


Apply That to Merge Sort...

• Divide
– Computes the middle of the subarray, taking constant time, so D(n) = O(1)
• Conquer
– Recursively solve two sub-problems, each of size n/2, contributing 2T(n/2) to the running time
• Combine
– The Merge procedure takes O(n), so C(n) = O(n)

• Giving us a recurrence of T(n) = 2T(n/2) + O(n)


Solve The Recurrence Using The Master Method

For a recurrence of the form T(n) = aT(n/b) + f(n):
If f(n) = O(n^(log_b a - k)) for some constant k > 0, then T(n) = O(n^(log_b a))
If f(n) = O(n^(log_b a)), then T(n) = O(n^(log_b a) · log n)
If f(n) = Omega(n^(log_b a + k)) for some constant k > 0, and if a·f(n/b) <= c·f(n) for some constant c < 1, then T(n) = O(f(n))
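For the common case where f(n) is a polynomial n^d, the three cases reduce to comparing d against log_b a; a small sketch of that simplified form (my simplification, not from the deck; the case-3 regularity condition is assumed to hold):

```python
import math

def master_method(a, b, d):
    """Asymptotic solution of T(n) = a*T(n/b) + Theta(n^d), as a string."""
    critical = math.log(a, b)             # log_b a, the watershed exponent
    if abs(d - critical) < 1e-9:          # case 2: f(n) matches n^(log_b a)
        return f"Theta(n^{d:g} log n)"
    if d < critical:                      # case 1: the leaves dominate
        return f"Theta(n^{critical:g})"
    return f"Theta(n^{d:g})"              # case 3: the root dominates

print(master_method(2, 2, 1))  # merge sort: a=2, b=2, f(n)=Theta(n)
```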


What?!

• More simply, we are comparing f(n) with the function n^(log_b a) and intuitively understanding that the bigger of the two determines the solution to the recurrence.


And So...

• With Merge Sort we are in the second case of the Master Method, since f(n) = O(n) matches n^(log_2 2) = n, thus...
• T(n) = O(n log n)
• Which is much better than the O(n^2) of Insertion Sort


What We Learned

• Performance is important
• Therefore algorithmic optimization is too
• We have a model to benchmark with
• And a syntax
• Divide and conquer
• Master Method
• Other resources.


Questions?