1. Define Recursive Function
A recursive function is a function that calls itself to solve a smaller sub-problem of the original problem. This process continues
until it reaches a base case, which stops further recursion.
Example:
def factorial(n):
    if n == 0:
        return 1
    return n * factorial(n - 1)
2. What is Tail Recursion? Give Example.
Tail recursion is a form of recursion in which the recursive call is the last operation the function performs. Languages that optimize tail calls can reuse the current stack frame instead of storing previous states, which improves memory usage; note that CPython does not perform this optimization, so the example below is tail-recursive in form only.
Example:
def tail_factorial(n, acc=1):
    if n == 0:
        return acc
    return tail_factorial(n - 1, acc * n)
3. Differentiate Iteration and Recursion
Aspect         Iteration                                 Recursion
Definition     Repeated execution of a block of code.    A function calling itself.
State          Uses a loop variable.                     Uses the function call stack.
Memory Usage   More efficient in terms of memory.        May consume more memory due to the stack.
Termination    Controlled by loop conditions.            Requires a base case to terminate.
Complexity     Easier to understand and debug.           Can be harder to debug.
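To illustrate the contrast, here is one possible way to compute the sum 1..n in both styles (a minimal sketch; the function names are only illustrative):

def sum_iterative(n):
    total = 0
    for i in range(1, n + 1):   # the loop variable carries the state
        total += i
    return total

def sum_recursive(n):
    if n == 0:                  # base case stops the recursion
        return 0
    return n + sum_recursive(n - 1)   # the call stack carries the state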
4. Algorithm for Calculating GCD of Two Numbers
GCD(a, b):
1. If b == 0, return a.
2. Else, return GCD(b, a % b).
Example (Python):
def gcd(a, b):
    if b == 0:
        return a
    return gcd(b, a % b)
5. Recursive Algorithm for Binary Search
BinarySearch(arr, low, high, key):
1. If low > high, return -1.
2. Set mid = (low + high) // 2.
3. If arr[mid] == key, return mid.
4. If arr[mid] > key, return BinarySearch(arr, low, mid - 1, key).
5. Else, return BinarySearch(arr, mid + 1, high, key).
Example (Python):
def binary_search(arr, low, high, key):
    if low > high:
        return -1
    mid = (low + high) // 2
    if arr[mid] == key:
        return mid
    elif arr[mid] > key:
        return binary_search(arr, low, mid - 1, key)
    else:
        return binary_search(arr, mid + 1, high, key)
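Example call (the sorted sample data is only illustrative):

data = [3, 7, 12, 25, 31, 48]
print(binary_search(data, 0, len(data) - 1, 25))   # prints 3 (index of 25)
print(binary_search(data, 0, len(data) - 1, 10))   # prints -1 (not found)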
6. Recursive Solution for Tower of Hanoi
TowerOfHanoi(n, source, target, auxiliary):
1. If n == 1:
a. Move disk from source to target.
b. Return.
2. Call TowerOfHanoi(n-1, source, auxiliary, target).
3. Move disk from source to target.
4. Call TowerOfHanoi(n-1, auxiliary, target, source).
Example (Python):
def tower_of_hanoi(n, source, target, auxiliary):
    if n == 1:
        print(f"Move disk 1 from {source} to {target}")
        return
    tower_of_hanoi(n - 1, source, auxiliary, target)
    print(f"Move disk {n} from {source} to {target}")
    tower_of_hanoi(n - 1, auxiliary, target, source)
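Example call (moving n disks always takes 2^n - 1 moves, so 3 disks produce 7 move lines; the peg names are only illustrative):

tower_of_hanoi(3, 'A', 'C', 'B')   # prints the 7 moves for 3 disks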
7. Recursion Tree for 3 Nodes
For a function f(n) = f(n-1) + f(n-2), the recursion tree would look like:
          f(3)
         /    \
      f(2)    f(1)
      /  \
   f(1)  f(0)
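A small sketch of the function behind this tree, assuming the usual Fibonacci-style base cases f(0) = 0 and f(1) = 1:

def f(n):
    if n <= 1:                       # leaves of the tree: f(1) and f(0)
        return n
    return f(n - 1) + f(n - 2)       # each call branches into two subtrees

print(f(3))   # f(3) = f(2) + f(1) = (f(1) + f(0)) + f(1) = 2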
8. Advantages of Using Doubly Linked List
Can traverse in both directions.
Allows efficient insertion and deletion from both ends.
More flexible than singly linked lists.
9. Differences Between Array List and Linked List
Aspect              Array List                            Linked List
Storage             Stored in contiguous memory blocks.   Stored in non-contiguous memory.
Access Time         O(1) for random access.               O(n) for access.
Insertion/Deletion  O(n) (shifting required).             O(1) (if a pointer to the node is provided).
Memory Usage        Requires more memory for resizing.    Requires extra memory for pointers.
10. Differences Between Single and Double Linked List
Aspect              Singly Linked List                                Doubly Linked List
Direction           Traversal in one direction only.                  Traversal in both directions.
Memory Usage        Less memory (one pointer per node).               More memory (two pointers per node).
Insertion/Deletion  Operations at the tail require a full traversal.  Easier at both ends.
11. Insert a Node at Front of a Doubly Linked List
class Node:
    def __init__(self, data):
        self.data = data
        self.prev = None
        self.next = None

def insert_at_front(head, data):
    new_node = Node(data)
    new_node.next = head
    if head:
        head.prev = new_node
    return new_node  # New head
12. Insert a Node at End of a Doubly Linked List
def insert_at_end(head, data):
    new_node = Node(data)
    if not head:
        return new_node
    temp = head
    while temp.next:
        temp = temp.next
    temp.next = new_node
    new_node.prev = temp
    return head
13. Delete a Node at Front of a Doubly Linked List
def delete_at_front(head):
    if not head:
        return None
    new_head = head.next
    if new_head:
        new_head.prev = None
    return new_head
14. Delete a Node at End of a Doubly Linked List
def delete_at_end(head):
    if not head:
        return None
    if not head.next:
        return None  # The list had a single node and is now empty
    temp = head
    while temp.next:
        temp = temp.next
    temp.prev.next = None
    return head
15. Insert a Node at Front of a Singly Linked List
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

def insert_at_front(head, data):
    new_node = Node(data)
    new_node.next = head
    return new_node  # New head
16. Insert a Node at End of a Singly Linked List
def insert_at_end(head, data):
    new_node = Node(data)
    if not head:
        return new_node
    temp = head
    while temp.next:
        temp = temp.next
    temp.next = new_node
    return head
17. Delete a Node at Front of a Singly Linked List
def delete_at_front(head):
    if not head:
        return None
    return head.next  # New head
18. Delete a Node at End of a Singly Linked List
def delete_at_end(head):
    if not head or not head.next:
        return None
    temp = head
    while temp.next and temp.next.next:
        temp = temp.next
    temp.next = None
    return head
19. Define BST
A Binary Search Tree (BST) is a binary tree where each node satisfies the following properties:
1. The left subtree contains only nodes with values less than the node's value.
2. The right subtree contains only nodes with values greater than the node's value.
3. Both left and right subtrees are also BSTs.
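A minimal sketch of insertion and search that follows these properties, assuming integer keys and ignoring duplicates (BSTNode, insert, and search are illustrative names, not taken from the question):

class BSTNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    # Place key according to the BST ordering property
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    # Return True if key is present in the BST
    if root is None:
        return False
    if key == root.key:
        return True
    if key < root.key:
        return search(root.left, key)
    return search(root.right, key)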
20. Construct a Binary Tree from Pre-order and In-order Traversals
In-order: 8, 1, 25, 12, 7, 13, 9
Pre-order: 1, 8, 12, 25, 13, 7, 9
class TreeNode:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

def build_tree(preorder, inorder):
    if not preorder or not inorder:
        return None
    root_val = preorder.pop(0)
    root = TreeNode(root_val)
    inorder_index = inorder.index(root_val)
    root.left = build_tree(preorder, inorder[:inorder_index])
    root.right = build_tree(preorder, inorder[inorder_index + 1:])
    return root
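A possible check using the traversals given above (inorder_print is an illustrative helper; printing the in-order sequence of the built tree should reproduce the given in-order list):

def inorder_print(node):
    # Left subtree, node, right subtree
    if node:
        inorder_print(node.left)
        print(node.data, end=" ")
        inorder_print(node.right)

root = build_tree([1, 8, 12, 25, 13, 7, 9], [8, 1, 25, 12, 7, 13, 9])
inorder_print(root)   # expected output: 8 1 25 12 7 13 9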
21. Prove Maximum and Minimum Height of a Binary Tree
Maximum Height: when the tree is skewed (every node has a single child, like a linked list), the height equals n, the number of nodes.
Minimum Height: when the tree is as balanced as possible, the height is about log₂(n); more precisely, ⌈log₂(n + 1)⌉ levels are enough to hold n nodes.
22. Compare Full Binary Tree and Complete Binary Tree
Aspect Full Binary Tree Complete Binary Tree
Definition All nodes have 0 or 2 children.All levels are filled except possibly the last.
Leaf NodesFound at the same level. Found from left to right at the last level.
Example Perfectly balanced tree. Binary heap is an example.
23. DFS and BFS Algorithm Problem Solving
DFS (Depth First Search)
def dfs(graph, start, visited=None):
    if visited is None:
        visited = set()
    visited.add(start)
    print(start, end=" ")
    for neighbor in graph[start]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited)
BFS (Breadth First Search)
from collections import deque

def bfs(graph, start):
    visited = set()
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node not in visited:
            print(node, end=" ")
            visited.add(node)
            # Works whether the adjacency values are lists or sets
            queue.extend(n for n in graph[node] if n not in visited)
24. C Function to Delete a Node After a Specified Node in a Singly Linked List
#include <stdio.h>
#include <stdlib.h>

struct Node {
    int data;
    struct Node* next;
};

void deleteAfter(struct Node* prevNode) {
    if (prevNode == NULL || prevNode->next == NULL) {
        printf("No node to delete.\n");
        return;
    }
    struct Node* temp = prevNode->next;
    prevNode->next = temp->next;
    free(temp);
}
25. Recursive Function for Binary Search
int binarySearch(int arr[], int low, int high, int key) {
    if (low > high)
        return -1;
    int mid = low + (high - low) / 2;   /* avoids overflow of low + high */
    if (arr[mid] == key)
        return mid;
    else if (arr[mid] > key)
        return binarySearch(arr, low, mid - 1, key);
    else
        return binarySearch(arr, mid + 1, high, key);
}
26. Differences Between Complete and Full Binary Tree
Aspect      Complete Binary Tree                                                Full Binary Tree
Definition  All levels filled except possibly the last, filled from the left.   Every node has 0 or 2 children.
Structure   The last level may be partially filled.                             Leaves may appear on different levels.
Example     A binary heap.                                                      A tree where all internal nodes have exactly two children.
Example:
Complete Binary Tree:
        1
      /   \
     2     3
    / \   /
   4   5 6

Full Binary Tree:
        1
      /   \
     2     3
    / \   / \
   4   5 6   7
28. Find Maximum and Minimum in a Binary Search Tree
Algorithm:
1. Traverse the leftmost path to find the minimum value.
2. Traverse the rightmost path to find the maximum value.
C Function:
struct Node* findMin(struct Node* root) {
    if (root == NULL)
        return NULL;              /* empty tree */
    while (root->left != NULL)
        root = root->left;        /* the minimum is the leftmost node */
    return root;
}

struct Node* findMax(struct Node* root) {
    if (root == NULL)
        return NULL;
    while (root->right != NULL)
        root = root->right;       /* the maximum is the rightmost node */
    return root;
}
29. Prove Minimum and Maximum Height of a Binary Tree
Maximum Height: A skewed tree (like a linked list) has n levels, so height = n.
Minimum Height: A perfectly balanced tree has height = log₂(n) (base-2 logarithm).
Proof:
In a perfectly balanced binary tree with h levels, the maximum number of nodes is n = 2^h − 1.
Rearranging 2^h − 1 = n gives h = log₂(n + 1), so the minimum height grows as log₂(n).
30. Finding the Distance Between Two Nodes in a Binary Tree
Steps:
1. Find the Lowest Common Ancestor (LCA) of the two nodes.
2. Compute the distance from the LCA to each node.
3. Add the distances to get the total distance.
Formula:
Distance = Dist(LCA, Node1) + Dist(LCA, Node2)
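A minimal sketch of this approach, assuming node values are unique and both values exist in the tree (TNode, lca, depth, and distance are illustrative names):

class TNode:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

def lca(root, a, b):
    # Lowest common ancestor of the nodes holding values a and b
    if root is None or root.data in (a, b):
        return root
    left = lca(root.left, a, b)
    right = lca(root.right, a, b)
    if left and right:
        return root
    return left or right

def depth(root, value, d=0):
    # Number of edges from root down to the node holding value (-1 if absent)
    if root is None:
        return -1
    if root.data == value:
        return d
    found = depth(root.left, value, d + 1)
    return found if found != -1 else depth(root.right, value, d + 1)

def distance(root, a, b):
    ancestor = lca(root, a, b)
    return depth(ancestor, a) + depth(ancestor, b)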
31. Parent, Left Child, and Right Child of Node ‘C’
Given array: A B C D E (level-order storage, 0-based indices 0 to 4)
Index of C: i = 2
Parent = floor((i − 1) / 2) = floor(1 / 2) = 0, so Parent(C) = A.
Left child = 2i + 1 = 5, which is outside the array, so C has no left child.
Right child = 2i + 2 = 6, which is outside the array, so C has no right child.
32. Postorder Traversal of the Binary Tree
Inorder: D B E A F C G
Preorder: A B D E C F G
Steps to Find Postorder:
1. Root = A.
2. Left subtree (Inorder): D B E → Preorder: B D E → Postorder: D E B.
3. Right subtree (Inorder): F C G → Preorder: C F G → Postorder: F G C.
Postorder Traversal: D E B F G C A.
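A sketch that derives the postorder sequence directly from the two traversals, using the same splitting idea as question 20 (postorder_from is an illustrative name):

def postorder_from(preorder, inorder):
    if not inorder:
        return []
    root = preorder.pop(0)                  # first preorder element is the root
    i = inorder.index(root)                 # split the inorder list around it
    left = postorder_from(preorder, inorder[:i])
    right = postorder_from(preorder, inorder[i + 1:])
    return left + right + [root]            # postorder: left, right, root

print(postorder_from(list("ABDECFG"), list("DBEAFCG")))
# ['D', 'E', 'B', 'F', 'G', 'C', 'A']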
33. Differences Between Tree Traversal and Graph Traversal
Aspect      Tree Traversal                          Graph Traversal
Structure   A tree has a hierarchical structure.    A graph may have cycles.
Techniques  Inorder, Preorder, Postorder.           BFS, DFS.
Cycles      Trees have no cycles.                   Graphs may contain cycles.
Root/Start  Starts from the root.                   Can start from any node.
34. DFS Algorithm on a Graph
Steps for DFS:
1. Start from any node.
2. Mark the node as visited.
3. Recursively visit all unvisited neighbors.
Example Graph:
A -- B -- C
 \       /
  D --- E
DFS Traversal (Starting from A):
A → B → C → E → D.
Algorithm (Python):
def dfs(graph, node, visited):
    if node not in visited:
        print(node, end=" ")
        visited.add(node)
        for neighbor in graph[node]:
            dfs(graph, neighbor, visited)

graph = {
    'A': ['B', 'D'],
    'B': ['A', 'C'],
    'C': ['B', 'E'],
    'D': ['A', 'E'],
    'E': ['C', 'D']
}
visited = set()
dfs(graph, 'A', visited)
35. Types of Algorithms
Algorithms are categorized based on their approach or application. Here are the main types:
Greedy Algorithm
Dynamic Programming
Divide and Conquer
Backtracking
Brute Force
Recursive Algorithms
Sorting and Searching Algorithms
Graph Algorithms (like Dijkstra's, BFS, DFS)
Machine Learning Algorithms
Hashing Algorithms
Cryptographic Algorithms
36. Greedy Algorithm
A greedy algorithm makes decisions step-by-step, choosing the option that seems most optimal at each step without revisiting earlier choices.
Example:
Problem: Coin Change Problem
Find the minimum number of coins to make a certain amount.
Solution:
1. Pick the largest coin that is less than or equal to the remaining amount.
2. Subtract the coin's value from the total.
3. Repeat until the amount becomes zero.
Example Input: Amount = 49, Coins = {1, 5, 10, 25}
Steps: Choose 25 → 10 → 10 → 1 → 1 → 1 → 1 → Total Coins = 7.
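A minimal greedy sketch for this example (greedy_coin_change is an illustrative name; the greedy choice happens to be optimal for canonical coin systems like this one, but not for every coin set):

def greedy_coin_change(amount, coins):
    picked = []
    for coin in sorted(coins, reverse=True):   # always try the largest coin first
        while amount >= coin:
            picked.append(coin)
            amount -= coin
    return picked

print(greedy_coin_change(49, [1, 5, 10, 25]))  # [25, 10, 10, 1, 1, 1, 1] -> 7 coins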
37. Dynamic Programming
Dynamic programming (DP) solves problems by breaking them into smaller overlapping subproblems, solving each subproblem once, and storing the results to avoid recomputation.
Example:
Problem: Fibonacci Sequence
Compute the n-th Fibonacci number.
Solution (Tabulation):
F[0] = 0, F[1] = 1, and for i ≥ 2, F[i] = F[i − 1] + F[i − 2].
Example Input: n = 5
Steps: F[0] = 0, F[1] = 1, F[2] = 1, F[3] = 2, F[4] = 3, F[5] = 5.
Output: 5.
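A minimal tabulation sketch for this example (fib is an illustrative name):

def fib(n):
    # Bottom-up table: table[i] holds F[i]
    if n == 0:
        return 0
    table = [0] * (n + 1)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib(5))   # 5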
38. Divide and Conquer Algorithm
A divide and conquer algorithm splits a problem into smaller, independent subproblems, solves them recursively, and combines their results to solve the original problem.
Example:
Problem: Merge Sort
Sort an array of elements.
Solution:
1. Divide the array into two halves.
2. Sort each half recursively using merge sort.
3. Merge the two sorted halves into a single sorted array.
Example Input: [38, 27, 43, 3, 9]
Steps:
Divide: [38, 27, 43, 3, 9] → [38, 27, 43] and [3, 9].
Conquer: Sort [38, 27, 43] → [27, 38, 43]; Sort [3, 9] → [3, 9].
Combine: Merge [27, 38, 43] and [3, 9] → [3, 9, 27, 38, 43].
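A minimal merge sort sketch following the steps above (merge_sort is an illustrative name; the split point gives the extra element to the left half so the halves match the worked example):

def merge_sort(arr):
    if len(arr) <= 1:                      # base case: already sorted
        return arr
    mid = (len(arr) + 1) // 2              # left half gets the extra element
    left = merge_sort(arr[:mid])           # divide and sort each half
    right = merge_sort(arr[mid:])
    merged, i, j = [], 0, 0                # combine: merge the two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9]))      # [3, 9, 27, 38, 43]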