1. Analysis of Algorithms & Orders of Growth
Rosen 6th ed., 3.1-3.3

2. Analysis of Algorithms
An algorithm is a finite set of precise instructions for performing a computation or for solving a problem.
What is the goal of analysis of algorithms? To compare algorithms, mainly in terms of running time but also in terms of other factors (e.g., memory requirements, programmer's effort, etc.).
What do we mean by running-time analysis? Determining how the running time increases as the size of the problem increases.

3. Example: Searching
Problem of searching an ordered list: given a list L of n elements that are sorted into a definite order (e.g., numeric, alphabetical), and given a particular element x, determine whether x appears in the list, and if so, return its index (position) in the list.
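The search task just described can be sketched in runnable form. The Python version below is my own illustration, not code from the slides: it uses 0-based indexing and returns -1 for "not found", where the lecture's pseudocode is 1-based and returns 0.

```python
# Minimal sketch of linear search on a list (my illustration; the lecture's
# pseudocode is 1-based and returns 0 for "not found"; this version is
# 0-based and returns -1).
def linear_search(x, a):
    """Return the 0-based index of x in list a, or -1 if x is absent."""
    i = 0
    while i < len(a) and x != a[i]:
        i += 1
    return i if i < len(a) else -1

print(linear_search(5, [1, 3, 5, 7]))   # index 2
print(linear_search(4, [1, 3, 5, 7]))   # not present: -1
```

Note the loop condition mirrors the pseudocode's "i ≤ n and x ≠ ai": it stops either at a match or at the end of the list.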
4. Search alg. #1: Linear Search
procedure linear search(x: integer; a1, a2, ..., an: distinct integers)
i := 1
while (i ≤ n ∧ x ≠ ai)
   i := i + 1
if i ≤ n then location := i
else location := 0
return location {index, or 0 if not found}

5. Search alg. #2: Binary Search
Basic idea: On each step, look at the middle element of the remaining list to eliminate half of it, and quickly zero in on the desired element.
[figure: the interval around x is halved on each step]

6. Search alg. #2: Binary Search
procedure binary search(x: integer; a1, a2, ..., an: distinct integers)
i := 1 {left endpoint of search interval}
j := n {right endpoint of search interval}
while i < j {while interval has more than 1 item}
   m := ⌊(i+j)/2⌋ {midpoint}
   if x > am then i := m + 1 else j := m
end
if x = ai then location := i else location := 0
return location

7. Is Binary Search more efficient?
Number of iterations: For a list of n elements, Binary Search can execute at most log2 n times! Linear Search, on the other hand, can execute up to n times!

8. Is Binary Search more efficient?
Number of computations per iteration: Binary Search does more computations than Linear Search per iteration.
Overall: If the number of components is small (say, less than 20), then Linear Search is faster. If the number of components is large, then Binary Search is faster.

9. How do we analyze algorithms?
We need to define a number of objective measures.
(1) Compare execution times? Not good: times are specific to a particular computer!
(2) Count the number of statements executed? Not good: the number of statements varies with the programming language as well as with the style of the individual programmer.

10. Example (# of statements)
Algorithm 1:
arr[0] = 0;
arr[1] = 0;
arr[2] = 0;
...
arr[N-1] = 0;
Algorithm 2:
for (i = 0; i < N; i++)
   arr[i] = 0;

11. How do we analyze algorithms?
(3) Express running time as a function of the input size n (i.e., f(n)). To compare two algorithms with running times f(n) and g(n), we need a rough measure of how fast a function grows. Such an analysis is independent of machine time, programming style, etc.

12. Computing running time
Associate a "cost" with each statement and find the "total cost" by finding the total number of times each statement is executed. Express running time in terms of the size of the problem.
Algorithm 1:
arr[0] = 0;     cost c1
arr[1] = 0;     cost c1
...
arr[N-1] = 0;   cost c1
Total: c1 + c1 + ... + c1 = c1 × N
Algorithm 2:
for (i = 0; i < N; i++)   cost c2
   arr[i] = 0;            cost c1
Total: (N+1) × c2 + N × c1 = (c2 + c1) × N + c2

13. Computing running time (cont.)
sum = 0;                     cost c1
for (i = 0; i < N; i++)      cost c2
   for (j = 0; j < N; j++)   cost c2
      sum += arr[i][j];      cost c3
Total: c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N × N

14. Comparing Functions Using Rate of Growth
Consider the example of buying elephants and goldfish:
Cost: cost_of_elephants + cost_of_goldfish
Cost ≈ cost_of_elephants (approximation)
The low-order terms in a function are relatively insignificant for large n:
n⁴ + 100n² + 10n + 50 ≈ n⁴
i.e., n⁴ + 100n² + 10n + 50 and n⁴ have the same rate of growth.

15. Rate of Growth ≡ Asymptotic Analysis
Using rate of growth as a measure to compare different functions implies comparing them asymptotically. If f(x) is faster growing than g(x), then f(x) always eventually becomes larger than g(x) in the limit (for large enough values of x).
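The iteration-count claims from the search comparison can be checked by instrumenting both searches. In this Python sketch (function names and the n = 1024 test size are my own, not from the slides), each search reports how many loop iterations its worst case uses:

```python
def linear_search_steps(x, a):
    """Linear search; returns (0-based index or -1, loop iterations used)."""
    steps = 0
    for i, v in enumerate(a):
        steps += 1
        if v == x:
            return i, steps
    return -1, steps

def binary_search_steps(x, a):
    """Binary search in the slides' style; returns (index or -1, iterations)."""
    i, j, steps = 0, len(a) - 1, 0
    while i < j:                      # while the interval has more than 1 item
        steps += 1
        m = (i + j) // 2              # midpoint
        if x > a[m]:
            i = m + 1
        else:
            j = m
    if a and a[i] == x:
        return i, steps
    return -1, steps

a = list(range(1024))                      # sorted list, n = 1024
_, lsteps = linear_search_steps(1023, a)   # worst case for linear search
_, bsteps = binary_search_steps(1023, a)
print(lsteps, bsteps)                      # 1024 vs 10 = log2(1024)
```

The gap between n and log2 n iterations is exactly the rate-of-growth difference the surrounding slides are about.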
16. Example
Suppose you are designing a web site to process user data (e.g., financial records). Suppose program A takes fA(n) = 30n + 8 microseconds to process any n records, while program B takes fB(n) = n² + 1 microseconds to process the n records. Which program would you choose, knowing you'll want to support millions of users? Answer: A.

17. Visualizing Orders of Growth
On a graph, as you go to the right, a faster-growing function eventually becomes larger.
[figure: value of function vs. increasing n; fB(n) = n² + 1 eventually exceeds fA(n) = 30n + 8]

18. Big-O Notation
We say fA(n) = 30n + 8 is order n, or O(n). It is, at most, roughly proportional to n.
fB(n) = n² + 1 is order n², or O(n²). It is, at most, roughly proportional to n².
In general, an O(n²) algorithm will be slower than an O(n) algorithm.
Warning: an O(n²) function will grow faster than an O(n) function.

19. More Examples
We say that n⁴ + 100n² + 10n + 50 is of the order of n⁴, or O(n⁴).
We say that 10n³ + 2n² is O(n³).
We say that n³ - n² is O(n³).
We say that 10 is O(1). We say that 1273 is O(1).

20. Big-O Visualization
[figure]

21. Computing running time
Algorithm 1: total cost c1 + c1 + ... + c1 = c1 × N.
Algorithm 2: total cost (N+1) × c2 + N × c1 = (c2 + c1) × N + c2.
Either way, the running time is O(n).
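The cost bookkeeping for the loop version can be simulated directly. In this Python sketch (an illustration of the counting argument, not code from the slides), the loop test runs N+1 times and the body runs N times, matching the (c2 + c1) × N + c2 total:

```python
def init_array_counts(N):
    """Simulate `for (i = 0; i < N; i++) arr[i] = 0;` and tally statement runs.

    Counting as the slides do: the loop test executes N+1 times (cost c2
    each) and the body assignment executes N times (cost c1 each), for a
    total of (c2 + c1)*N + c2.
    """
    arr = [None] * N
    tests = 0        # executions of the loop test i < N  (cost c2)
    assigns = 0      # executions of arr[i] = 0           (cost c1)
    i = 0
    while True:
        tests += 1
        if not (i < N):
            break
        arr[i] = 0
        assigns += 1
        i += 1
    return tests, assigns

print(init_array_counts(100))   # (101, 100): linear in N, i.e. O(n)
```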
22. Computing running time (cont.)
The nested-loop example: total cost c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N × N, which is O(n²).

23. Running time of various statements
[figure: cost rules for while-loops and for-loops]

24. Examples
i = 0;
while (i < N) {
   X = X + Y;            // O(1)
   result = mystery(X);  // O(N), just an example
   i++;                  // O(1)
}
The body of the while loop: O(N). The loop is executed N times.
N × O(N) = O(N²)

25. Examples (cont'd)
if (i < j)
   for (i = 0; i < N; i++)   // O(N)
      X = X + i;
else
   X = 0;                    // O(1)
Max(O(N), O(1)) = O(N)

26. Asymptotic Notation
O notation: asymptotic "less than": f(n) = O(g(n)) implies f(n) "≤" g(n)
Ω notation: asymptotic "greater than": f(n) = Ω(g(n)) implies f(n) "≥" g(n)
Θ notation: asymptotic "equality": f(n) = Θ(g(n)) implies f(n) "=" g(n)

27. Definition: O(g), at most order g
Let f, g be functions R → R. We say that "f is at most order g" if:
∃c,k: f(x) ≤ cg(x), ∀x > k
"Beyond some point k, function f is at most a constant c times g (i.e., proportional to g)."
"f is at most order g", or "f is O(g)", or "f = O(g)" all just mean that f ∈ O(g). Sometimes the phrase "at most" is omitted.

28. Big-O Visualization
[figure]

29. Points about the definition
Note that f is O(g) as long as any values of c and k exist that satisfy the definition. But: the particular c, k values that make the statement true are not unique: any larger value of c and/or k will also work. You are not required to find the smallest c and k values that work. (Indeed, in some cases, there may be no smallest values!) However, you should prove that the values you choose do work.

30. "Big-O" Proof Examples
Show that 30n + 8 is O(n): show ∃c,k: 30n + 8 ≤ cn, ∀n > k.
Let c = 31, k = 8. Assume n > k = 8. Then cn = 31n = 30n + n > 30n + 8, so 30n + 8 < cn.
Show that n² + 1 is O(n²): show ∃c,k: n² + 1 ≤ cn², ∀n > k.
Let c = 2, k = 1. Assume n > 1. Then cn² = 2n² = n² + n² > n² + 1, so n² + 1 < cn².
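The chosen witnesses c and k can be spot-checked numerically. A finite scan is of course not a proof, but it is a quick sanity check on the constants (the sampled ranges below are my own choice):

```python
# Spot-check the witnesses used in the proofs. A finite scan is not a proof,
# but it confirms the constants behave as claimed on a sampled range.
c, k = 31, 8
assert all(30*n + 8 <= c*n for n in range(k + 1, 10_000))   # 30n+8 <= 31n beyond k=8

c, k = 2, 1
assert all(n*n + 1 <= c*n*n for n in range(k + 1, 10_000))  # n²+1 <= 2n² beyond k=1

# The first bound genuinely needs its k: at n = 7, 30n+8 = 218 > 31n = 217.
assert 30*7 + 8 > 31*7
print("witnesses check out on the sampled range")
```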
31. Big-O example, graphically
Note 30n + 8 isn't less than n anywhere (n > 0). It isn't even less than 31n everywhere. But it is less than 31n everywhere to the right of n = 8.
[figure: value of function vs. increasing n; 30n + 8 ∈ O(n)]

32-33. Common orders of magnitude
[figures]

34. Order-of-Growth in Expressions
"O(f)" can be used as a term in an arithmetic expression.
E.g.: we can write "x² + x + 1" as "x² + O(x)", meaning "x² plus some function that is O(x)".
Formally, you can think of any such expression as denoting a set of functions:
"x² + O(x)" ≡ {g | ∃f ∈ O(x): g(x) = x² + f(x)}

35. Useful Facts about Big O
Constants: ∀c > 0, O(cf) = O(f + c) = O(fc) = O(f)
Sums:
- If g ∈ O(f) and h ∈ O(f), then g + h ∈ O(f).
- If g ∈ O(f1) and h ∈ O(f2), then g + h ∈ O(f1 + f2) = O(max(f1, f2)). (Very useful!)

36. More Big-O facts
Products: If g ∈ O(f1) and h ∈ O(f2), then gh ∈ O(f1·f2).
Big O, as a relation, is transitive: f ∈ O(g) ∧ g ∈ O(h) → f ∈ O(h).
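The sum rule can be illustrated with a concrete pair of functions. In this sketch (the functions and witness constants are my own), g ∈ O(n²) and h ∈ O(n log n), so their sum should stay within a constant multiple of max(n², n log n) = n²:

```python
import math

# Sum rule illustration: g ∈ O(n²) and h ∈ O(n log n), so g + h should lie
# within a constant multiple of max(n², n log n) = n² for large n.
def g(n): return 3 * n * n             # g ∈ O(n²)
def h(n): return 5 * n * math.log2(n)  # h ∈ O(n log n)

c, k = 8, 2   # one valid choice of witnesses for the combined bound
assert all(g(n) + h(n) <= c * n * n for n in range(k + 1, 5000))
print("g + h stays within", c, "times n^2 on the sampled range")
```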
37. More Big-O facts
∀ f, g and constants a, b ∈ R, with b ≥ 0:
- af = O(f)  (e.g., 3x² = O(x²))
- f + O(f) = O(f)  (e.g., x² + x = O(x²))
- |f|^(1-b) = O(f)  (e.g., x⁻¹ = O(x))
- (log_b |f|)^a = O(f)  (e.g., log x = O(x))
- g = O(fg)  (e.g., x = O(x log x))
- fg ≠ O(g)  (e.g., x log x ≠ O(x))
- a = O(f)  (e.g., 3 = O(x))

38. Definition: Ω(g), at least order g
Let f, g be any functions R → R. We say that "f is at least order g", written Ω(g), if:
∃c,k: f(x) ≥ cg(x), ∀x > k
"Beyond some point k, function f is at least a constant c times g (i.e., proportional to g)."
Often, one deals only with positive functions and can ignore absolute value symbols.
"f is at least order g", or "f is Ω(g)", or "f = Ω(g)" all just mean that f ∈ Ω(g).

39. Big-Ω Visualization
[figure]

40. Definition: Θ(g), exactly order g
If f ∈ O(g) and g ∈ O(f), then we say "g and f are of the same order" or "f is (exactly) order g", and write f ∈ Θ(g).
Another equivalent definition: ∃c1,c2,k: c1·g(x) ≤ f(x) ≤ c2·g(x), ∀x > k
"Everywhere beyond some point k, f(x) lies in between two multiples of g(x)."
Θ(g) ⊆ O(g) ∩ Ω(g) (i.e., f ∈ O(g) and f ∈ Ω(g))
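The sandwich definition can be exercised on a concrete pair. Here f(n) = 3n² + 10n and g(n) = n², with witnesses c1 = 3, c2 = 4, k = 10 (my choice of constants; any larger c2 or k also works):

```python
# Sandwich check: f(n) = 3n² + 10n is Θ(n²), witnessed by c1=3, c2=4, k=10.
# The lower bound 3n² <= f(n) holds everywhere; the upper bound f(n) <= 4n²
# needs 10n <= n², i.e. n >= 10, which is why k = 10.
def f(n): return 3*n*n + 10*n
def g(n): return n*n

c1, c2, k = 3, 4, 10
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(k + 1, 5000))
print("c1*g <= f <= c2*g holds on the sampled range beyond k =", k)
```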
41. Big-Θ Visualization
[figure]

42. Rules for Θ
Mostly like rules for O(), except:
- ∀ f, g > 0 and constants a, b ∈ R, with b > 0: af ∈ Θ(f). (Same as with O.)
- f ∉ Θ(fg) unless g = Θ(1). (Unlike O.)
- |f|^(1-b) ∉ Θ(f). (Unlike with O.)
- (log_b |f|)^c ∉ Θ(f). (Unlike with O.)
The functions in the latter two cases we say are strictly of lower order than Θ(f).

43. Θ example
Determine whether: [the worked example's formula and quick solution did not survive extraction]

44. Other Order-of-Growth Relations
o(g) ≡ {f | ∀c > 0, ∃k: f(x) < cg(x), ∀x > k}
"The functions that are strictly lower order than g." o(g) ⊆ O(g) - Θ(g).
ω(g) ≡ {f | ∀c > 0, ∃k: cg(x) < f(x), ∀x > k}
"The functions that are strictly higher order than g." ω(g) ⊆ Ω(g) - Θ(g).

45. Relations Between the Relations
[figure: subset relations between the order-of-growth sets within the functions R → R; Θ(f) lies inside both O(f) and Ω(f), with o(f) ⊆ O(f) and ω(f) ⊆ Ω(f)]

46. Strict Ordering of Functions
Temporarily, let's write f ≪ g to mean f ∈ o(g), and f ≈ g to mean f ∈ Θ(g).
Let k > 1. Then the following are true:
1 ≪ log log n ≪ log n ≈ log_k n ≪ (log n)^k ≪ n^(1/k) ≪ n ≪ n log n ≪ n^k ≪ k^n ≪ n! ≪ n^n ≪ ...

47. Common orders of magnitude
[figure]

48. Review: Orders of Growth
Definitions of order-of-growth sets, g: R → R:
O(g) ≡ {f | ∃c,k: f(x) ≤ cg(x), ∀x > k}
o(g) ≡ {f | ∀c > 0, ∃k: f(x) < cg(x), ∀x > k}
Ω(g) ≡ {f | ∃c,k: f(x) ≥ cg(x), ∀x > k}
ω(g) ≡ {f | ∀c > 0, ∃k: f(x) > cg(x), ∀x > k}
Θ(g) ≡ {f | ∃c1,c2,k: c1·g(x) ≤ f(x) ≤ c2·g(x), ∀x > k}
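The strict ordering can be sampled at a single large n. One sample point illustrates, but does not prove, the asymptotic chain; the constant stand-in for k^n is my own, since the true value overflows a float at this n:

```python
import math

n = 2 ** 20   # one large sample point
k = 2         # the constant k > 1 from the ordering

chain = [
    1,
    math.log2(math.log2(n)),  # log log n
    math.log2(n),             # log n
    math.log2(n) ** k,        # (log n)^k
    n ** (1 / k),             # n^(1/k)
    n,                        # n
    n * math.log2(n),         # n log n
    n ** k,                   # n^k
    2.0 ** 64,                # stand-in for k^n, which overflows a float here
]
assert all(a < b for a, b in zip(chain, chain[1:]))  # strictly increasing
print("chain is strictly increasing at n = 2**20")
```

The larger n grows, the wider these gaps become, which is what the o(·) relations assert.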
49. Algorithmic and Problem Complexity
Rosen 6th ed., 3.3

50. Algorithmic Complexity
The algorithmic complexity of a computation is some measure of how difficult it is to perform the computation. It measures some aspect of the cost of computation (in a general sense of cost).

51. Problem Complexity
The complexity of a computational problem or task is the complexity of the algorithm with the lowest order of growth of complexity for solving that problem or performing that task. E.g., the problem of searching an ordered list has at most logarithmic time complexity. (Complexity is O(log n).)

52. Tractable vs. Intractable Problems
A problem or algorithm with at most polynomial time complexity is considered tractable (or feasible). P is the set of all tractable problems. A problem or algorithm that has more than polynomial complexity is considered intractable (or infeasible).
Note: n^1,000,000 is technically tractable, but really impossible; n^(log log log n) is technically intractable, but easy. Such cases are rare, though.

53. Dealing with Intractable Problems
Many times, a problem is intractable only for a small number of input cases that do not arise in practice very often. Average running time is a better measure of problem complexity in this case. Alternatively, find approximate solutions instead of exact solutions.

54. Unsolvable problems
It can be shown that there exist problems that no algorithm can solve. Turing discovered in the 1930s that there are problems unsolvable by any algorithm. Example: the halting problem (see page 176): given an arbitrary algorithm and its input, will that algorithm eventually halt, or will it continue forever in an "infinite loop"?

55. NP and NP-complete
NP is the set of problems for which there exists a tractable algorithm for checking solutions to see if they are correct.
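The phrase "checking solutions" can be made concrete with satisfiability: verifying that a candidate truth assignment satisfies a formula in conjunctive normal form takes time linear in the formula's size, even though finding such an assignment may be hard. A sketch (the list-of-clauses encoding is my own):

```python
# A CNF formula as a list of clauses; literal +i means variable i, -i its
# negation. Checking a candidate assignment is linear in the formula size,
# which is the "easy to check" property that places satisfiability in NP.
def check_assignment(cnf, assignment):
    """Return True iff assignment (dict var -> bool) satisfies every clause."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in cnf
    )

# (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3)
cnf = [[1, -2], [2, 3], [-1, -3]]
print(check_assignment(cnf, {1: True, 2: True, 3: False}))   # True
print(check_assignment(cnf, {1: True, 2: False, 3: True}))   # False
```

Finding a satisfying assignment, by contrast, has no known polynomial-time algorithm; the obvious approach tries all 2^n assignments.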
NP-complete is a class of problems with the property that if any one of them can be solved by a polynomial worst-case algorithm, then all of them can be solved by polynomial worst-case algorithms. Example: the satisfiability problem: find an assignment of truth values that makes a compound proposition true.

56. P vs. NP
We know P ⊆ NP, but the most famous unproven conjecture in computer science is that this inclusion is proper (i.e., that P ≠ NP rather than P = NP). It is generally accepted that no NP-complete problem can be solved in polynomial time. Whoever first proves it will be famous!

57. Questions
Find the best big-O notation to describe the complexity of the following algorithms:
- A binary search of n elements
- A linear search to find the smallest number in a list of n numbers
- An algorithm that lists all ways to put the numbers 1, 2, 3, ..., n in a row

58. Questions (cont'd)
- An algorithm that prints all bit strings of length n
- An iterative algorithm to compute n!
- An algorithm that finds the average of n numbers by adding them and dividing by n
- An algorithm that prints all subsets of size three of the set {1, 2, 3, ..., n}
- The best-case analysis of a linear search of a list of size n

59. Questions (cont'd)
- The worst-case analysis of a linear search of a list of size n
- The number of print statements in the following:
  while (n > 1) {
     print "hello";
     n = n / 2;
  }

60. Questions (cont'd)
- The number of print statements in the following:
  for (i = 1; i <= n; i++)
     for (j = 1; j <= n; j++)
        print "hello";
- The number of print statements in the following:
  for (i = 1; i <= n; i++)
     for (j = 1; j <= i; j++)
        print "hello";
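The print-statement counts asked about in these last questions can be verified by simulation. This sketch (helper names are my own) returns the counts instead of actually printing:

```python
def halving_count(n):
    """Prints executed by: while (n > 1) { print; n = n / 2; }"""
    count = 0
    while n > 1:
        count += 1
        n //= 2
    return count

def square_count(n):
    """Prints executed by the full n-by-n double loop."""
    return sum(1 for i in range(1, n + 1) for j in range(1, n + 1))

def triangle_count(n):
    """Prints executed by the triangular loop (inner j runs up to i)."""
    return sum(1 for i in range(1, n + 1) for j in range(1, i + 1))

print(halving_count(64))   # 6 = log2(64): the halving loop is O(log n)
print(square_count(8))     # 64 = 8^2: the full double loop is O(n^2)
print(triangle_count(8))   # 36 = 8*9/2: still O(n^2)
```

The triangular loop does about half the work of the square one, but 1 + 2 + ... + n = n(n+1)/2 still grows as n², so both are O(n²).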