Given sequence: 20, 24, 30, 35, 50

Ascending order: 20, 24, 30, 35, 50

Merge(20, 24) → new size (44)

Number of comparisons = 20 + 24 – 1 = 43

Sequence: 30, 35, 44, 50

Merge(30, 35) → new size (65)

Number of comparisons = 30 + 35 – 1 = 64

Sequence: 44, 50, 65

Merge(44, 50) → new size (94)

Number of comparisons = 44 + 50 – 1 = 93

Sequence: 65, 94

Merge(65, 94) → new size (150)

Number of comparisons = 65 + 94 – 1 = 158

Therefore, the total number of comparisons = 43 + 64 + 93 + 158 = 358.

Assume that a mergesort algorithm in the worst case takes 30 seconds for an input of size 64.
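The pattern above (always merge the two smallest lists first) can be sketched with a min-heap; `total_merge_comparisons` is a hypothetical helper name written for this illustration:

```python
import heapq

def total_merge_comparisons(sizes):
    """Repeatedly merge the two smallest lists; merging lists of
    sizes a and b takes a + b - 1 comparisons in the worst case."""
    heap = list(sizes)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        total += a + b - 1           # cost of this merge
        heapq.heappush(heap, a + b)  # merged list re-enters the pool
    return total

print(total_merge_comparisons([20, 24, 30, 35, 50]))  # → 358
```

Each pop-pop-push round reproduces one Merge(...) line of the worked solution above.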

Which of the following most closely approximates the maximum input size of a problem that can be solved in 6 minutes?

Option 2 : 512

**Concept:**

Merge sort's worst-case time complexity is O(n log n).

**Data:**

T_{1} (n) =30 seconds, n_{1} = 64

T_{2} (n) = 6 minutes

**Formula:**

T(n) = c × n log_{2}n

**Calculation:**

30 = c × 64 log_{2}(64)

30 = c × 64 log_{2}(2^{6})

30 = c × 64 × 6

\(c = \frac{5}{{64}}\)

6 × 60 = c × n_{2} log_{2}n_{2}

\(6 \times 60 = \frac{5}{{64}} \times {n_2}{\log _2}{n_2}\)

**∴ n_{2} = 512**
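The result can be double-checked numerically, a small sketch assuming the T(n) = c × n log₂n model above:

```python
import math

# From the first data point: 30 = c * 64 * log2(64)
c = 30 / (64 * math.log2(64))   # c = 5/64

n = 512
print(c * n * math.log2(n))     # → 360.0, i.e. exactly 6 minutes
```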

Option 3 : Merge sort

**Heapsort**:

A heap is an array object that can be viewed as a nearly complete binary tree. A heap can be a max-heap or a min-heap: in a max-heap the maximum element is the root of the tree, and in a min-heap the minimum element is the root. Heap sort works by repeatedly placing the correct element at the root and extracting it. It does not use recursion. It is based on the concept of a priority queue.

**Bubble sort: **

It works by repeatedly moving the largest element to the highest index position of the array, which requires swapping of elements. It compares adjacent elements and swaps their positions if they are not in order; the order can be ascending or descending.

**Insertion sort:**

In this, array elements are compared sequentially and arranged in order. It repeatedly inserts each element into its correct position within the sorted subarray to its left.

**Merge sort:**

It is a divide-and-conquer based algorithm. It uses recursion to sort the elements of the array: the array is repeatedly divided into two halves until single-element (trivially sorted) subarrays remain, and these subarrays are then merged back together in sorted order until the final result is the fully sorted array.

Therefore, the merge sort algorithm uses recursion.

Option 4 : m + n - 1

**Concept: **

Merge sort algorithm uses Divide and Conquer. Hence, we divide both our lists into m and n single-element lists respectively.

Now, we know that first m lists are sorted and after these m lists, the n lists are also sorted.

Let’s call these single element lists as m1, m2 and so on, and n1, n2 and so on.

To merge these lists into one single-sorted list, we compare

- m1 with n1.
- Then either m2 with n1 or m1 with n2, depending on which element was smaller. Placing the first 2 elements into the merged list takes 2 comparisons.
- Each comparison places exactly one element into the merged list. We never compare the m-lists with each other, because they are already sorted, and the same goes for the n-lists.
- We repeat the process until one list is exhausted; the remaining elements are copied without further comparisons. Each comparison outputs one element and the last element needs no comparison, so at most m + n – 1 comparisons are made.
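A minimal sketch of this counting argument, with a hypothetical `merge_count` helper that merges two sorted lists while tallying comparisons:

```python
def merge_count(a, b):
    """Merge two sorted lists, counting element comparisons."""
    out, i, j, comparisons = [], 0, 0, 0
    while i < len(a) and j < len(b):
        comparisons += 1
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out += a[i:] + b[j:]  # leftovers copied, no comparisons needed
    return out, comparisons

# Worst case: the two lists interleave, so neither is exhausted early.
merged, comps = merge_count([1, 3, 5], [2, 4, 6])
print(comps)  # → 5, i.e. m + n - 1
```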

Option 2 : 8, 15, 20, 47, 4, 9, 30, 40, 12, 17

The correct answer is **option 2.**

__Key Points__

- Apply the two-way merge sort algorithm to sort the following elements in ascending order: 20, 47, 15, 8, 9, 4, 40, 30, 12, 17

**Hence, after the 2nd pass the list would be** *8, 15, 20, 47, 4, 9, 30, 40, 12, 17*

__Additional Information__

- Merge Sort is a recursive algorithm with the following recurrence relation for time complexity:
- T(n) = 2T(n/2) + Θ(n)
- The time complexity of two-way merge sort is O(N log N).
- (Figure: the complete merge sort process for an example array {38, 27, 43, 3, 9, 82, 10}.)
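The pass-by-pass behaviour can be sketched with a bottom-up (two-way) merge sort; `two_way_merge_sort_passes` is a name invented for this illustration, and `sorted` stands in for merging two already-sorted runs:

```python
def two_way_merge_sort_passes(a):
    """Bottom-up (two-way) merge sort; yields the list after each pass."""
    width = 1
    a = a[:]
    while width < len(a):
        for lo in range(0, len(a), 2 * width):
            mid = min(lo + width, len(a))
            hi = min(lo + 2 * width, len(a))
            # merge the two sorted runs a[lo:mid] and a[mid:hi]
            a[lo:hi] = sorted(a[lo:mid] + a[mid:hi])
        yield a[:]
        width *= 2

for p in two_way_merge_sort_passes([20, 47, 15, 8, 9, 4, 40, 30, 12, 17]):
    print(p)
```

The second printed pass is `[8, 15, 20, 47, 4, 9, 30, 40, 12, 17]`, matching option 2.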

Option 2 : O(n log n)

**Merge sort:**

Merge sort is based on the divide and conquer approach.

Recurrence relation for merge sort will become:

T(n) = 2T (n/2) + Θ (n)

Using Master’s theorem

T (n) = n × log_{2}n

Therefore, the time complexity of Merge Sort is θ(nlogn).
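As a worked check of the Master theorem step: with a = 2, b = 2 and f(n) = Θ(n),

\(n^{\log_b a} = n^{\log_2 2} = n = \Theta(f(n))\)

which is case 2 of the Master theorem, so

\(T(n) = \Theta(n^{\log_b a} \log_2 n) = \Theta(n \log_2 n)\)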

Option 3 : Merge sort

__Option_1:__ Insertion Sort

In Insertion sort, if the array is already sorted, then it takes O(n) time, and if it is sorted in decreasing order, then it takes O(n^{2}) time to sort the array.

__Option_2:__ Quick Sort

In Quick sort, if the array is already sorted whether in decreasing or in non-decreasing order, then it takes O(n^{2}) time.

__Option_3:__ Merge Sort

Merge sort gives time complexity of O(nlogn) in every case, be it best, average or worst. In merge sort, performance is affected least by the order of input sequence.

__Option_4:__ Selection Sort

Selection sort takes Θ(n^{2}) comparisons in every case, regardless of the input order.

Option 4 : Θ (n^{2}), Θ (n log n), and Θ (n^{2})

**Insertion sort:**

In Insertion sort, the worst-case takes Θ (n^{2}) time, the worst case of insertion sort is when elements are sorted in reverse order. In that case the number of comparisons will be like:

\(\mathop \sum \limits_{{\rm{p}} = 1}^{{\rm{N}} - 1} {\rm{p}} = 1 + 2 + 3 + \ldots + {\rm{N}} - 1 = \frac{{{\rm{N}}\left( {{\rm{N}} - 1} \right)}}{2}\)

This will give Θ (n^{2}) time complexity.
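This count can be confirmed with a small sketch; the function name `insertion_sort_comparisons` is invented for this illustration:

```python
def insertion_sort_comparisons(a):
    """Insertion sort; returns the number of key comparisons made."""
    a = a[:]
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]  # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

# Reverse-sorted input of size N = 10: N(N-1)/2 = 45 comparisons.
print(insertion_sort_comparisons(list(range(10, 0, -1))))  # → 45
```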

**Merge sort:**

In Merge sort, the worst-case takes Θ (n log n) time. Merge sort is based on the divide and conquer approach. Recurrence relation for merge sort will become:

T(n) = 2T (n/2) + Θ (n)

Expanding the recurrence over the log_{2}n levels, each costing Θ(n):

T (n) = n × log_{2}n

**Quicksort:**

In Quicksort, the worst-case takes Θ (n^{2}) time. The worst case of quicksort is when the first or the last element is chosen as the pivot element.

In the worst case the number of comparisons is:

\(\mathop \sum \limits_{{\rm{p}} = 1}^{{\rm{N}} - 1} {\rm{p}} = 1 + 2 + 3 + \ldots + {\rm{N}} - 1 = \frac{{{\rm{N}}\left( {{\rm{N}} - 1} \right)}}{2}\)

This will give Θ (n^{2}) time complexity.

Recurrence relation for quick sort algorithm will be,

T (n) = T (n-1) + Θ (n)

This will give the worst-case time complexity as Θ (n^{2}).

Consider the following sorting algorithms.

I. Quicksort

II. Heapsort

III. Mergesort

Which of them perform in least time in the worst case?

Option 2 : II and III only

** Answer**: Option 2

** Concept**:

**Quick Sort**:

Quicksort is an efficient sorting algorithm. It is also called partition-exchange sort and follows the divide and conquer technique.

In the quicksort worst case, the first or the last element is selected as the pivot element.

For quicksort, the worst-case recurrence relation becomes T(n) = T(n - 1) + T(1) + n.

This recurrence gives T(n) = O(n^{2}).

Therefore, the worst-case time complexity of Quicksort is O(n^{2}); since quicksort is the slowest of the three in the worst case, option 2 (II and III only) is correct.

**Merge sort**:

It is based on the divide and conquers approach.

Recurrence relation for merge sort will become:

T(n) = 2T (n/2) + Θ (n)

Using Master’s theorem

T (n) = n × log_{2}n

Therefore, the time complexity of Merge Sort is θ(n log n).

**Heap sort**:

Pseudocode for heap sort:

build_heap(a, n)

for j = n down to 2 do

1. swap(a[1], a[j])

2. heapify(a, 1, j - 1)

The time complexity of heapify is O(log N). The time complexity of BuildHeap() is O(N) and the overall time complexity of Heap Sort is O(N log N).
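A compact sketch of the same idea using Python's heapq module (a min-heap, rather than the in-place max-heap of the pseudocode above):

```python
import heapq

def heap_sort(a):
    """Heap sort: build a heap in O(n), then pop the minimum n times
    at O(log n) each, for O(n log n) overall."""
    heap = a[:]
    heapq.heapify(heap)  # build_heap: O(n)
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heap_sort([5, 1, 4, 2, 3]))  # → [1, 2, 3, 4, 5]
```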

Option 2 : O(n^{2} log n)

The correct answer is **"option 2".**

__CONCEPT:__

The **Recurrence relation** for the **number of comparisons** needed to **sort** an array of** n integers **is:

**T(n) = 2T(n/2) + n**

**= O(nlog_{2}n)**

**EXPLANATION:**

Consider **n strings** of **length n** in place of each integer. Each **integer comparison takes O(1) time**, but **comparing two strings of length n takes O(n) time.**

**Cost of one comparison: O(n)**

**Complexity to sort n strings of size n: O(n × nlog_{2}n) = O(n^{2}log_{2}n)**

**Hence, the worst-case running time is O(n^{2}log_{2}n).**

Option 2 : O(n log_{2} n) and O(log_{2} n)

**Merge sort**

It is based on the divide and conquers approach.

Recurrence relation for merge sort will become:

T(n) = 2T (n/2) + Θ (n)

Using Master’s theorem

T (n) = n × log_{2}n

Therefore, the time complexity of Merge Sort is θ(nlogn).

**Binary Search**

Search a sorted array by repeatedly dividing the search interval in half.

Recurrence for binary search is T(n) = T(n/2) + θ(1)

Using Master’s theorem

T (n) = log_{2}n

At each step we consider only half of the input list and discard the other half. Hence the time complexity is O(log n).
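A minimal iterative sketch of binary search, halving the search interval at each step:

```python
def binary_search(a, target):
    """Binary search on a sorted list: each step halves the interval,
    giving T(n) = T(n/2) + Θ(1) = O(log n)."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1

print(binary_search([4, 8, 9, 12, 15, 17, 20], 15))  # → 4
```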

Option 1 : O(1), O(n)

The total of n elements in the entire array.

- The divide step takes constant time, regardless of the subarray size; we indicate constant time by Θ(1).
- The conquer step, where we recursively sort two subarrays of approximately n/2 elements each, takes some amount of time, but we'll account for that time when we consider the subproblems.
- The combine step merges a total of n elements, taking Θ(n) time.

Therefore Answer is Option 1

Option 1 : Merge Sort

__Merge sort:__

Merge sort complexity is independent of the distribution of data. Merge sort is based on the divide and conquer approach. Recurrence relation for merge sort will become:

T(n) = 2T (n/2) + Θ (n)

Expanding the recurrence over the log_{2}n levels, each costing Θ(n):

T (n) = n × log_{2}n

**Of these sorting algorithms, merge sort's running time is least dependent on the initial ordering of the input: it is O(n × log_{2}n) in every case.**

__Insertion sort:__

In Insertion sort, the worst-case takes Θ (n^{2}) time; the worst case of insertion sort is when elements are sorted in reverse order. In that case the number of comparisons will be like:

\(\mathop \sum \limits_{{\rm{p}} = 1}^{{\rm{N}} - 1} {\rm{p}} = 1 + 2 + 3 + \ldots + {\rm{N}} - 1 = \frac{{{\rm{N}}\left( {{\rm{N}} - 1} \right)}}{2}\)

This will give Θ (n^{2}) time complexity.

__Quicksort:__

In Quicksort, the worst-case takes Θ (n^{2}) time. The worst case of quicksort is when the first or the last element is chosen as the pivot element.

Recurrence relation for the quick sort algorithm will be:

T (n) = T (n-1) + Θ (n)

This will give the worst-case time complexity as Θ (n^{2}).

**It is clear that quick sort and insertion sort time complexity depend on the input sequence**

__Important Point__

| Algorithm | Best-case (Ω) | Average-case (Θ) | Worst-case (O) |
|---|---|---|---|
| Merge sort | n × log_{2}n | n × log_{2}n | n × log_{2}n |
| Quicksort | n × log_{2}n | n × log_{2}n | n^{2} |
| Insertion sort | n | n^{2} | n^{2} |
| Selection sort | n^{2} | n^{2} | n^{2} |

Option 1 : Microsoft Word

**MAIL MERGE:**

- Mail Merge is used to create and send bulk mail, labels, and envelopes.
- The feature is usually employed in a word processing document, e.g. an MS Word file, which contains fixed text (which is the same in each output document) and variables (which act as placeholders that are replaced by text from the data source).
- Mail Merge enables us to send the same letter to different persons in MS Word.
- It imports data from another source such as a spreadsheet and then uses that to replace placeholders throughout the message with the relevant information for each individual that is being messaged.

**Advantages of mail merge:**

1) It saves time and effort as only one document needs to be checked for errors, so there are also fewer chances of mistakes being done.

2) The standard letter/template can be saved and reused.

**Disadvantages of mail merge:**

1) Mail merge letters can lack a personal touch, because the only individual part is the data merged from the database.

2) The database that provides the information, must be kept up to date.

__Steps to use the Select Recipient option in the Mail Merge.__

1. Open an existing Word document, or create a new one.
2. From the Mailings tab, click the Start Mail Merge command and select Step-by-Step Mail Merge Wizard from the drop-down menu.
3. From the Mail Merge task pane on the right side of the Word window, choose the type of document we want to create. In our example, we'll select Letters. Then click Next: Starting document.
4. Now we will need an address list so Word can automatically place each address into the document. The list can be in an existing file, such as an Excel workbook, or we can type a new address list from within the Mail Merge Wizard.

Option 2 : A sorting algorithm is stable if it preserves the order of duplicate keys

__Concept __

The stability of a sorting algorithm is concerned with how the algorithm treats equal (or repeated) elements.

A sorting algorithm is said to be stable if two objects with equal keys appear in the same order in sorted output as they appear in the input array to be sorted.

Some sorting algorithms are stable by nature like Insertion sort, Merge Sort, Bubble Sort, etc. And some sorting algorithms are not, like Heap Sort, Quick Sort, etc.
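A quick sketch of stability using Python's built-in sort (Timsort, which is stable); the records are made up for this illustration:

```python
# Sort records by grade only; equal grades keep their input order.
records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]
by_grade = sorted(records, key=lambda r: r[1])  # Python's sort is stable
print(by_grade)
# → [('bob', 1), ('dave', 1), ('alice', 2), ('carol', 2)]
```

Note that alice still precedes carol, and bob still precedes dave, exactly as in the input: that is what stability guarantees.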