Q.4: What do you mean by recursion? Write its properties.
Ans: Recursion: A recursive procedure contains either a call statement to itself or a call statement to a second procedure that may eventually result in a call statement back to the original procedure.
It has the following properties:
There must be certain criteria, called base criteria, for which the procedure does not call itself.
Each time the procedure does call itself, it must be closer to the base criteria.
Example:
Tower of Hanoi:
Algorithm TowerOfHanoi(n, x, y, z)
// Move the top n disks from tower x to tower y, using tower z as auxiliary.
{
    if (n ≥ 1) then
    {
        TowerOfHanoi(n - 1, x, z, y);   // clear n - 1 disks onto the auxiliary tower
        write("Move top disk from tower", x, "to top of tower", y);
        TowerOfHanoi(n - 1, z, y, x);   // move them onto the destination tower
    }
}
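To make the recursion concrete, here is a minimal runnable sketch of the same algorithm in Python (the function name and the sample call are mine, not part of the original answer):

def tower_of_hanoi(n, x, y, z):
    # move the top n disks from tower x to tower y, using tower z as auxiliary
    if n >= 1:
        tower_of_hanoi(n - 1, x, z, y)   # clear n - 1 disks onto the auxiliary tower
        print("Move top disk from tower", x, "to top of tower", y)
        tower_of_hanoi(n - 1, z, y, x)   # move them onto the destination tower

tower_of_hanoi(3, 'A', 'B', 'C')   # prints the 2^3 - 1 = 7 required moves

Note how both properties hold: n = 0 is the base criterion (no call is made), and every recursive call uses n - 1, moving closer to it.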
Complexity of the Tower of Hanoi problem:
Let T(n) = the number of moves needed to transfer n disks from the source peg to the destination peg.
T(n) = 1                       if n = 1
T(n) = T(n-1) + 1 + T(n-1)     if n > 1

T(n) = T(n-1) + 1 + T(n-1)
     = 2T(n-1) + 1
     = 2(2T(n-2) + 1) + 1
     = 2^2 T(n-2) + 2 + 1
     = 2^3 T(n-3) + 2^2 + 2 + 1
     ...
     = 2^(n-1) T(n-(n-1)) + 2^(n-2) + ... + 2^2 + 2 + 1
     = 2^(n-1) T(1) + 2^(n-2) + ... + 2^2 + 2 + 1
     = 1 + 2 + 2^2 + ... + 2^(n-2) + 2^(n-1)
[We know that S = a + ar + ar^2 + ... + ar^(n-1) = a(r^n - 1)/(r - 1), for r > 1.]
So, T(n) = 1 · (2^n - 1)/(2 - 1) = 2^n - 1.
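As a quick sanity check (my addition), the recurrence can be evaluated directly in Python and compared against the closed form:

def hanoi_moves(n):
    # T(1) = 1, T(n) = 2T(n-1) + 1
    return 1 if n == 1 else 2 * hanoi_moves(n - 1) + 1

for n in range(1, 11):
    assert hanoi_moves(n) == 2 ** n - 1   # closed form derived above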
#Problem:
T(n) = 2              if n = 0
T(n) = 2 + T(n-1)     if n > 0

T(n) = 2 + T(n-1)
     = 2 + (2 + T(n-2))
     = 2(2) + T(n-2)
     = 3(2) + T(n-3)
     ...
     = n(2) + T(n-n)
     = 2n + T(0)
     = 2n + 2
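The same kind of check (again my addition) confirms the closed form 2n + 2:

def t(n):
    # T(0) = 2, T(n) = 2 + T(n-1)
    return 2 if n == 0 else 2 + t(n - 1)

assert all(t(n) == 2 * n + 2 for n in range(100))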
Q.5: Describe the different complexities of an algorithm.
Ans: Space complexity: The space complexity of an algorithm is the amount of memory it needs to run to completion. The space needed by an algorithm is seen to be the sum of the following components:
A fixed part that is independent of the characteristics (such as number and size) of the inputs and outputs.
A variable part that consists of the space needed by component variables whose size depends on the particular problem instance being solved, plus the space needed by referenced variables.
For example, consider the following algorithm, Sum:
Algorithm Sum(a, n)
{
    s := 0;
    for i := 1 to n do
        s := s + a[i];
    return s;
}
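A direct Python transcription of the pseudocode (my sketch; Python lists are 0-indexed, so a[i] becomes a[i - 1]):

def algorithm_sum(a, n):
    # variable part: the array a needs n words; s, i and n need one word each,
    # so the space needed is at least n + 3 words
    s = 0
    for i in range(1, n + 1):
        s = s + a[i - 1]
    return s

print(algorithm_sum([10, 20, 30], 3))   # 60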
The space requirement S(P) of any algorithm P may therefore be written as S(P) = c + S_P(instance characteristics), where c is a constant and S_P is the variable part that depends on the instance characteristics.
Time complexity: The time complexity of an algorithm is the amount of computer time it needs to run to completion. The time T(P) taken by a program P is the sum of the compile time and the run (or execution) time. The compile time does not depend on the instance characteristics, so we concern ourselves with just the run time of a program.
The run time is denoted by t_P(instance characteristics).
t_P(n) = c_a ADD(n) + c_s SUB(n) + c_m MUL(n) + c_d DIV(n) + ...
where n denotes the instance characteristics; c_a, c_s, c_m, c_d, and so on denote the time needed for an addition, subtraction, multiplication, division, and so on; and ADD, SUB, MUL, DIV, and so on are functions whose values are the numbers of additions, subtractions, multiplications, divisions, and so on performed when the code for P is run on an instance with characteristic n.
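Applied to the Sum algorithm above, each loop iteration performs one addition into s, so ADD(n) = n (ignoring loop control). A small instrumented sketch (the counter is my illustration, not from the original):

def sum_with_count(a, n):
    adds = 0
    s = 0
    for i in range(1, n + 1):
        s = s + a[i - 1]
        adds += 1          # one addition per iteration
    return s, adds

s, adds = sum_with_count([1, 2, 3, 4], 4)
assert adds == 4           # ADD(n) = n for Algorithm Sum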
Q.6: Describe asymptotic notation.
Ans: Asymptotic notation: The notations used to describe the asymptotic running time of an algorithm are defined in terms of functions whose domains are the set of natural numbers, N = {0, 1, 2, ...}.
Big-O notation: The function f(n) = O(g(n)) if and only if there exist positive constants C and n_0 such that 0 ≤ f(n) ≤ C g(n) for all n ≥ n_0.
We use O-notation to give an upper bound on a function to within a constant factor. For all values of n to the right of n_0, the value of the function f(n) is on or below C g(n).
Example: 3n + 2 = O(n), as 3n + 2 ≤ 4n for all n ≥ 2.
Omega (Ω) notation: The function f(n) = Ω(g(n)) if and only if there exist positive constants C and n_0 such that 0 ≤ C g(n) ≤ f(n) for all n ≥ n_0.
We use Ω-notation to give a lower bound on a function to within a constant factor. For all values of n to the right of n_0, the value of the function f(n) is on or above C g(n).
Example: 3n + 2 = Ω(n), as 3n + 2 ≥ 3n for all n ≥ 1.
Theta (θ) notation: The function f(n) = θ(g(n)) if and only if there exist positive constants C_1, C_2 and n_0 such that 0 ≤ C_1 g(n) ≤ f(n) ≤ C_2 g(n) for all n ≥ n_0.
We use θ-notation to give a tight bound on a function to within constant factors. For all values of n to the right of n_0, the value of f(n) lies at or above C_1 g(n) and at or below C_2 g(n). We say g(n) is an asymptotically tight bound for f(n).
Example: 3n + 2 = θ(n), as 3n + 2 ≥ 3n and 3n + 2 ≤ 4n for all n ≥ 2. So C_1 = 3, C_2 = 4 and n_0 = 2.
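A quick numeric check of these constants (my addition, not part of the original answer):

# verify C1 = 3, C2 = 4, n0 = 2 for f(n) = 3n + 2 and g(n) = n
for n in range(2, 1000):
    f = 3 * n + 2
    assert 3 * n <= f <= 4 * n   # C1*g(n) <= f(n) <= C2*g(n)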