
DAA Unit 1 Part 2 MCQs

Unit 1

Father of Algorithm, What is Algorithm, Need of an algorithm, Algorithm Vs Program, Good Algorithm, Analysis of Algorithm, Comparison of running time, Asymptotic notation, Recurrence relation, Substitution method, Iterative method, Recurrence Tree Method, Master Theorem, Extended Master Theorem.

Q21 – The Master theorem (ISRO Scientist 2020)
  1. assumes the subproblems are unequal sizes
  2. can be used if the subproblems are of equal size
  3. cannot be used for divide and conquer algorithms
  4. cannot be used for asymptotic complexity analysis

Ans – (2)

Explanation

The correct statement is (2).

The master theorem can be used if the subproblems are of equal size.

The master theorem is a mathematical tool for determining the asymptotic complexity of divide-and-conquer algorithms whose running time can be written as a recurrence of the form T(n) = aT(n/b) + f(n), i.e. a subproblems, each of the same size n/b. It does not apply when the subproblems have unequal sizes (for example T(n) = T(n/3) + T(2n/3) + n); such recurrences need the recursion-tree or substitution method instead.
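For example (merge sort, not part of the original question), a recurrence with two equal-size subproblems fits the theorem's standard form directly:

```latex
% Form required by the master theorem: a equal-size subproblems of size n/b
T(n) = a\,T\!\left(\tfrac{n}{b}\right) + f(n), \qquad a \ge 1,\; b > 1
% Merge sort: a = 2, b = 2, f(n) = \Theta(n), and n^{\log_b a} = n^{\log_2 2} = n
T(n) = 2\,T\!\left(\tfrac{n}{2}\right) + \Theta(n) = \Theta(n \log n) \quad \text{(case 2)}
```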

 

Q22 – The running time of an algorithm is given by
T(n) = T(n-1) + T(n-2) – T(n-3), if n > 3
        = n, otherwise
Then what should be the relation between T(1), T(2) and T(3), so that the order of the algorithm is constant? (ISRO Scientist 2018)
  1. T(1) = T(2) = T(3)
  2. T(1) + T(3) = 2*T(2)
  3. T(1) – T(3) = T(2)
  4. T(1) + T(2) = T(3)

Ans – (1)

Explanation – The question asks what relation between T(1), T(2) and T(3) makes the order of the algorithm constant.

For a constant order, T(n) must be O(1), i.e. independent of the input size n.

If T(1) = T(2) = T(3) = c, then T(4) = T(3) + T(2) - T(1) = c, and by induction T(n) = c for every n > 3, so the running time is O(1).

No other relation works: for example, if T(1), T(2), T(3) form an arithmetic progression with common difference d ≠ 0 (option 2), every later term keeps growing by d, giving T(n) = O(n).

Q23 – The time complexity of computing the transitive closure of binary relation on a set of n elements is known to be (ISRO Scientist 2018)
  1. O(n)
  2. O(n log n)
  3. O(n^(3/2))
  4. O(n^3)

Ans – (4)

Explanation – The time complexity of computing the transitive closure of a binary relation on a set of n elements is O(n^3).

To compute the transitive closure we typically use Warshall's algorithm, which runs a triple nested loop over the elements of the set: the outer loop iterates n times and each of the two inner loops iterates n times, for n * n * n = n^3 iterations in total.

Each iteration of the loop involves a constant amount of work, so the total time complexity is O(n^3).

Q24 – Consider the recurrence equation
T(n) = 2T(n-1), for n > 1
T(1) = 1
Then T(n) is (in big O order) (ISRO Scientist 2017)
  1. O(n)
  2. O(2^n)
  3. O(1)
  4. O(log n)

Ans – (2)

Explanation –

[T(n) = 2T(n-1)] × 2^0

[T(n-1) = 2T(n-2)] × 2^1

[T(n-2) = 2T(n-3)] × 2^2

[T(n-3) = 2T(n-4)] × 2^3

…

[T(2) = 2T(1)] × 2^(n-2)

[T(1) = 1] × 2^(n-1)

Multiplying all the equations and cancelling the terms that appear on both sides, we get

T(n) = 2^(n-1) = O(2^n)

Q25 – Consider the program
void function(int n) {
         int i, j, count=0;
         for(i = n/2; i <= n; i++)
                for(j = 1; j <= n; j = j*2)
                      count++;
}
The complexity of the program is (ISRO Scientist 2017)
  1. O(log n)
  2. O(n^2)
  3. O(n^2 log n)
  4. O(n log n)

Ans – (4)

Explanation –

for(i = n/2; i <= n; i++) /* runs about n/2 times */

     for(j = 1; j <= n; j = j*2) /* runs about log₂ n times */

The loops are nested, so the time complexity is the product of the two counts: (n/2) · log n = O(n log n).

Q26 – The recurrence relation that arises in relation with the complexity of binary search is (GATE 1994, ISRO Scientist 2017)
(a) T(n) = 2T(n/ 2) + k , where k is constant
(b) T(n) = T(n / 2) + k , where k is constant
(c) T(n) = T(n / 2) + log n
(d) T(n) = T(n / 2) + n             

Ans – (b)

Explanation – Binary search compares the key with the middle element in constant time k, then continues in only one half of the array; this gives the term T(n/2), so T(n) = T(n/2) + k.

Q27 – The time complexity of computing the transitive closure of a binary relation on a set of
n elements is known to be
(a) O(n log n)
(b) O(n^(3/2))
(c) O(n^3)
(d) O(n)

Ans – (c)

Explanation –

The time complexity of computing the transitive closure of a binary relation on a set of n elements is O(n^3), using the standard Floyd-Warshall algorithm.

Note that there are other algorithms for computing the transitive closure of a binary relation that have better time complexity for certain types of relations (e.g., sparse relations), but the Floyd–Warshall algorithm is a general-purpose algorithm that works well for all types of relations.

Q28 – Consider the following segment of C code
int j, n;
j = 1;
while(j <= n)
       j = j*2;
The number of comparisons made in the execution of the loop for any n>0 is (ISRO Scientist 2016)
  1. Floor(log₂ n) * n
  2. n
  3. Floor(log₂ n)
  4. Floor(log₂ n) + 1

Ans – (4)

Explanation –

The loop body executes for j = 1, 2, 4, …, 2^Floor(log₂ n), so the condition j <= n evaluates to true Floor(log₂ n) + 1 times; this is the count of comparisons taken by the answer key. (One further evaluation, which is false, terminates the loop.)

Q29 – Consider the following recurrence
T(n) = 2T(√n) + 1
T(1) = 1
Which of the following is true? (ISRO Scientist 2016)
  1. T(n) = O(loglogn)
  2. T(n) = O(logn)
  3. T(n) = O(√n)
  4. T(n) = O(n)

Ans – (2)
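No explanation accompanies this one; the standard derivation substitutes n = 2^m to turn the square root into a halving:

```latex
% Substitute n = 2^m, i.e. m = \log_2 n, and set S(m) = T(2^m):
T(2^m) = 2\,T(2^{m/2}) + 1 \;\Longrightarrow\; S(m) = 2\,S(m/2) + 1
% Master theorem, case 1: a = 2, b = 2, f(m) = \Theta(1), m^{\log_2 2} = m
S(m) = \Theta(m) \;\Longrightarrow\; T(n) = \Theta(\log n) = O(\log n)
```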

Q30 – What is the time complexity for the following C module? Assume that n>0;
int module(int n)
{
       if (n == 1)
           return 1;
       else
           return (n + module(n-1));
} (ISRO Scientist 2014)
  1. O(n)
  2. O(log n)
  3. O(n2)
  4. O(n!)

Ans – (1)

Explanation –

Some students decode return (n + module(n-1)) as T(n) = T(n-1) + n. But it is wrong. Why?

Because the addition n + module(n-1) is a single arithmetic operation: it takes constant time regardless of the value of n, so each call contributes only constant work besides the recursive call.

So the correct recurrence is T(n) = T(n-1) + constant, and this leads to T(n) = O(n).

Q31 – Let T(n) be defined by T(1) = 10 and T(n+1) = 2n + T(n) for all integers n ≥ 1. Which of the following represents the order of growth of T(n) as a function of n? (ISRO Scientist 2011)
  1. O(n)
  2. O(n log n)
  3. O(n^2)
  4. O(n^3)

Ans – (3)

Explanation –

T(1) = 10 and T(n+1) = T(n) + 2n, which can be rewritten as T(n) = T(n-1) + 2(n-1).

Unfolding by the iterative method:

T(n) = T(n-1) + 2(n-1)
     = T(n-2) + 2(n-2) + 2(n-1)
     = …
     = T(1) + 2(1 + 2 + … + (n-1))
     = 10 + n(n-1)
     = O(n^2)

The iterative method is explained at https://compgeek.co.in/solving-recurrence-relation/

Q32 – Find the time complexity within the terms of n.
for(i=1; i<=n; i++)
    for(j=1; j<=n; j++)
    {
       Statement;
    }
  1. O(n)
  2. O(log n)
  3. O(n^2)
  4. O(n log n)

Ans – (3)

Explanation –

Outer loop runs n times.

Inner loop runs n times.

The loops are nested, so the time complexity is n * n, i.e. O(n^2).

Q33 – Find the time complexity within the terms of n.
for(i=1; i<=n; i++)
    for(j=1; j<=i; j++)
    {
       Statement;
    }
  1. O(n)
  2. O(log n)
  3. O(n^2)
  4. O(n log n)

Ans – (3)

Explanation – When i = 1, the inner loop executes 1 time.

When i = 2, it executes 2 times.

When i = 3, it executes 3 times.

…

When i = n, it executes n times.

=> 1 + 2 + 3 + … + n = n(n+1)/2 = O(n^2).

Q34 – Find the time complexity within the terms of n.
for(i=1; i<=n; i=i*2)
    {
       Statement;
    }
  1. O(n)
  2. O(log n)
  3. O(n^2)
  4. O(n log n)

Ans – (2)

Explanation – i takes the values 1, 2, 4, 8, 16, 32, …, up to n, which is about log₂ n values, so the loop executes O(log n) times.

Q35 – Find the time complexity within the terms of n.
for(i=1; i<=n; i++)
    for(j=1; j<=n; j=j*2)
    {
       Statement;
    }
  1. O(n)
  2. O(log n)
  3. O(n^2)
  4. O(n log n)

Ans – (4)

Explanation –

Outer loop runs n times.

Inner loop runs about log₂ n times.

The loops are nested, so the counts multiply: the time complexity of this code is O(n log n).

Q36 – Find the time complexity within the terms of n.
for(i=1; i<=n; i=i*2)
    for(j=1; j<=n; j=j*2)
    {
       Statement;
    }
  1. O(n)
  2. O(log n)
  3. O(n^2)
  4. O((log n)^2)

Ans – (4)

Explanation –

Outer loop runs about log₂ n times.

Inner loop runs about log₂ n times.

The loops are nested, so the counts multiply: the time complexity of this code is O((log n)^2).

 

Q37 – Find the time complexity within the terms of n.
for(i=1; i<=n; i=i*2)
    for(j=1; j<=i; j++)
    {
       Statement;
    }
  1. O(n)
  2. O(log n)
  3. O(n^2)
  4. O(n log n)

Ans – (1)

Explanation –

When i = 1 = 2^0, the inner loop executes 2^0 times.

When i = 2^1, it executes 2^1 times.

When i = 2^2, it executes 2^2 times.

When i = 2^3, it executes 2^3 times.

When i = 2^m, it executes 2^m times. [2^m ≤ n]

=> 2^0 + 2^1 + 2^2 + … + 2^m = (2^(m+1) – 1)/(2 – 1) = 2^(m+1) – 1

=> O(2^m) = O(n)

Q38 – Find the time complexity within the terms of n.
for(i=1; i<=n; i=i*2)
    for(j=i; j<=n; j++)
    {
       Statement;
    }
  1. O(n)
  2. O(log n)
  3. O(n^2)
  4. O(n log n)

Ans – (4)

Explanation – 

When i = 1 = 2^0, j runs over 1, 2, 3, …, n.

When i = 2^1, j runs over 2, 3, …, n.

When i = 2^2, j runs over 4, 5, …, n.

When i = 2^3, j runs over 8, 9, 10, …, n.

When i = 2^m, j runs over 2^m, 2^m + 1, 2^m + 2, …, n. [2^m ≤ n]

Sum of the executions of j = O(n log n).

Some of you may think that j's total execution comes to O(n^2), but it does not: the inner loop starts at j = i, so each pass runs n - i + 1 times, and summing over i = 2^0, 2^1, …, 2^m gives about n log n rather than n^2.

[Graph: the n log n region (green) is smaller than the n^2 region (yellow); j's execution corresponds to the green region.]

Q39 – Find recurrence equation from source code
F(int n)
{
  if (n <= 1)
      return;
  F(n-1);
  F(n-1);
}
  1. T(n) = T(n-1) + Θ(1), T(1) = 1
  2. T(n) = 2T(n-1) + Θ(1), T(1) = 1
  3. T(n) = T(n-1) + Θ(n), T(1) = 1
  4. T(n) = T((n-1)/2) + Θ(1), T(1) = 1

Ans – (2)

Explanation – If n <= 1 the function returns immediately, so the base case costs constant time: T(1) = Θ(1).

Also, the body makes two recursive calls F(n-1) plus constant work, so T(n) = 2T(n-1) + Θ(1).

Q40 – Find recurrence equation from source code
F(int n)
{
  if (n <= 1)
      return;
  F(n/2);
  F(n/2);
}
  1. T(n) = T(n/2) + Θ(1), T(1) = 1
  2. T(n) = 2T(n-1) + Θ(1), T(1) = 1
  3. T(n) = 2T(n/2) + Θ(n), T(1) = 1
  4. T(n) = T((n-1)/2) + Θ(1), T(1) = 1

Ans – (3)

Explanation – If n <= 1 the function returns immediately, so the base case costs constant time: T(1) = Θ(1).

Also, the body makes two recursive calls F(n/2), giving T(n) = 2T(n/2) plus the non-recursive work. Note that the code as shown does only constant extra work per call, so strictly T(n) = 2T(n/2) + Θ(1); option (3), T(n) = 2T(n/2) + Θ(n), is the closest of the given choices and the one intended by the key.

 

BOOKS

Algorithm – T. H. Cormen
Algorithm – Udit Agarwal
Algorithm – Ellis Horowitz
Data Structure & Algorithm Made Easy – Narasimha
