15 Plus DSA Pattern

The document outlines various algorithmic techniques for optimizing problem-solving in programming, including Sliding Window, Two Pointers, and Dynamic Programming, among others. Each technique is accompanied by its use cases, advantages, and example problems to illustrate its application. The final takeaway emphasizes the importance of recognizing patterns and practicing to master these optimization strategies.


By: @allu Arvind

1. Sliding Window (for Subarrays)

When to Use?
When you need to find the maximum/minimum/sum in a subarray of fixed or variable size.

Why?
Instead of brute force O(N²), you move the window efficiently in O(N) time.

Example Problems:

o Fixed Window: Maximum sum subarray of size K.

o Variable Window: Longest substring without repeating characters.
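The fixed-window case can be sketched in Python as follows (the function name and sample values are illustrative, not from the document):

```python
def max_sum_subarray(nums, k):
    """Maximum sum of any contiguous subarray of size k (sliding window)."""
    window = sum(nums[:k])  # sum of the first window
    best = window
    for i in range(k, len(nums)):
        # Slide the window: add the entering element, drop the leaving one.
        window += nums[i] - nums[i - k]
        best = max(best, window)
    return best
```

Each element enters and leaves the window exactly once, which is where the O(N) bound comes from.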

2. Two Pointers (for Subarrays, Sorting, Searching)

When to Use?
When working with sorted arrays or finding pairs/triplets efficiently.

Why?
Avoids O(N²) nested loops by using an O(N) two-pointer pass.

Example Problems:

o Two Sum (Sorted Array)

o Container With Most Water
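Two Sum on a sorted array can be sketched like this (names are illustrative):

```python
def two_sum_sorted(nums, target):
    """Return indices of two values summing to target; nums must be sorted."""
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        s = nums[lo] + nums[hi]
        if s == target:
            return lo, hi
        if s < target:
            lo += 1   # need a larger sum: advance the left pointer
        else:
            hi -= 1   # need a smaller sum: retreat the right pointer
    return None
```

Each step discards one element for good, so the pass is O(N) instead of trying all O(N²) pairs.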

3. Prefix Sum / Difference Array (for Subarrays)

When to Use?
When you need to compute range sums efficiently.

Why?
Avoids recomputing sums, reducing O(N²) → O(N).

Example Problems:

o Range Sum Query

o Subarray Sum Equals K
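A minimal range-sum sketch (function names are illustrative): build the prefix array once in O(N), then answer each query in O(1).

```python
def build_prefix(nums):
    """prefix[i] = sum of nums[:i]; prefix[0] = 0."""
    prefix = [0]
    for x in nums:
        prefix.append(prefix[-1] + x)
    return prefix

def range_sum(prefix, left, right):
    """Sum of nums[left..right] inclusive, in O(1)."""
    return prefix[right + 1] - prefix[left]
```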

4. Hashing (for Subarrays and Subsequences)

When to Use?
When checking for duplicates or storing frequency counts.

Why?
Avoids O(N²) brute force by using O(N) HashMap lookups.

Example Problems:

o Longest Consecutive Sequence

o Subarray Sum Equals K
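Subarray Sum Equals K combines hashing with prefix sums; a sketch (names are illustrative):

```python
from collections import defaultdict

def count_subarrays_with_sum(nums, k):
    """Count subarrays summing to k in O(N) using a prefix-sum hash map."""
    seen = defaultdict(int)  # prefix sum -> how many times it has occurred
    seen[0] = 1              # empty prefix
    running = count = 0
    for x in nums:
        running += x
        # A subarray ending here sums to k exactly when (running - k)
        # appeared earlier as a prefix sum.
        count += seen[running - k]
        seen[running] += 1
    return count
```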


5. Backtracking (for Subsets and Combinations)

When to Use?
When solving problems involving all possible combinations.

Why?
Optimizes recursion by pruning branches that cannot lead to a valid answer.

Example Problems:

o Subsets

o Combination Sum
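The Subsets problem shows the choose/explore/un-choose shape of backtracking; a sketch (names are illustrative):

```python
def subsets(nums):
    """Return all subsets of nums via backtracking."""
    result = []

    def backtrack(start, path):
        result.append(path[:])          # record the current subset
        for i in range(start, len(nums)):
            path.append(nums[i])        # choose
            backtrack(i + 1, path)      # explore
            path.pop()                  # un-choose
    backtrack(0, [])
    return result
```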

6. Bit Manipulation (for Subsets)

When to Use?
When generating all subsets efficiently.

Why?
Uses the binary representation of numbers instead of recursion.

Example Problems:

o Generate All Subsets (Power Set)
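The same power set can be generated iteratively with bitmasks; a sketch (names are illustrative):

```python
def power_set(nums):
    """Each mask 0..2^n-1 encodes one subset: bit i set => include nums[i]."""
    n = len(nums)
    return [[nums[i] for i in range(n) if mask >> i & 1]
            for mask in range(1 << n)]
```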

7. Binary Search (for Optimization)

When to Use?
When looking for a value in a sorted array or applying search space reduction.

Why?
Converts O(N) → O(log N).

Example Problems:

o Find Peak Element

o Median of Two Sorted Arrays
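Find Peak Element illustrates search-space reduction on an unsorted array; a sketch (names are illustrative):

```python
def find_peak(nums):
    """Index of some element greater than its neighbours, in O(log N)."""
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if nums[mid] < nums[mid + 1]:
            lo = mid + 1   # slope rises to the right: a peak lies there
        else:
            hi = mid       # mid itself could be the peak
    return lo
```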

8. Dynamic Programming (for Subsequences)

When to Use?
When solving overlapping subproblems.

Why?
Reduces O(2ⁿ) brute force to O(N²) or O(N log N).

Example Problems:

o Longest Increasing Subsequence

o Knapsack Problem
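The O(N log N) bound for Longest Increasing Subsequence comes from the patience-sorting variant; a sketch (names are illustrative):

```python
import bisect

def lis_length(nums):
    """Length of the longest strictly increasing subsequence, O(N log N)."""
    # tails[i] = smallest possible tail of an increasing subsequence
    # of length i + 1.
    tails = []
    for x in nums:
        i = bisect.bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)   # x extends the longest subsequence so far
        else:
            tails[i] = x      # x is a smaller tail for length i + 1
    return len(tails)
```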
9. Stack (for Monotonic Stack Problems)

When to Use?
When solving next greater/next smaller problems.

Why?
Avoids nested loops and reduces O(N²) → O(N).

Example Problems:

o Next Greater Element

o Largest Rectangle in Histogram
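Next Greater Element in one O(N) pass with a monotonic stack; a sketch (names are illustrative):

```python
def next_greater(nums):
    """result[i] = first value to the right of nums[i] that is larger, else -1."""
    result = [-1] * len(nums)
    stack = []  # indices of elements still waiting for a greater value
    for i, x in enumerate(nums):
        while stack and nums[stack[-1]] < x:
            result[stack.pop()] = x   # x resolves everything smaller on the stack
        stack.append(i)
    return result
```

Each index is pushed and popped at most once, so the total work is O(N) despite the inner loop.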

10. Heap / Priority Queue (for Optimization Problems)

When to Use?
When needing to efficiently find the Kth smallest/largest.

Why?
Maintains the relevant elements with O(log N) insertion and removal.

Example Problems:

o Kth Largest Element

o Merge K Sorted Lists
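Kth Largest Element with a size-K min-heap; a sketch (names are illustrative):

```python
import heapq

def kth_largest(nums, k):
    """Kth largest value using a min-heap that keeps only the k largest."""
    heap = []
    for x in nums:
        heapq.heappush(heap, x)
        if len(heap) > k:
            heapq.heappop(heap)   # evict the smallest of the k + 1 kept values
    return heap[0]                # root = smallest of the k largest = answer
```

This runs in O(N log K), which beats full sorting when K is small.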

11. Graph Traversal (BFS & DFS)

When to Use?
When working with trees, networks, or finding shortest paths.

Why?
Optimizes searching to O(V + E).

Example Problems:

o Shortest Path in a Grid

o Number of Islands
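Number of Islands with BFS flood fill; a sketch (names and grid encoding — 1 for land, 0 for water — are illustrative):

```python
from collections import deque

def num_islands(grid):
    """Count 4-connected components of 1-cells via BFS, O(rows * cols)."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and (r, c) not in seen:
                count += 1
                queue = deque([(r, c)])   # BFS floods one whole island
                seen.add((r, c))
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] == 1 and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
    return count
```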

12. Topological Sorting (for Dependency Problems)

When to Use?
When scheduling tasks based on dependencies.

Why?
Uses Kahn's Algorithm (BFS) or DFS to order tasks.

Example Problems:

o Course Schedule

o Alien Dictionary
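Kahn's algorithm (the BFS variant) can be sketched like this (names are illustrative):

```python
from collections import deque

def topo_order(n, edges):
    """Topological order of nodes 0..n-1; edges are (u, v): u before v.
    Returns None if the graph contains a cycle."""
    adj = [[] for _ in range(n)]
    indegree = [0] * n
    for u, v in edges:
        adj[u].append(v)
        indegree[v] += 1
    # Start from all tasks with no prerequisites.
    queue = deque(i for i in range(n) if indegree[i] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj[u]:
            indegree[v] -= 1
            if indegree[v] == 0:   # all prerequisites of v are done
                queue.append(v)
    return order if len(order) == n else None
```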


13. Union-Find (for Connected Components)

When to Use?
When solving problems related to connectivity.

Why?
Avoids repeated DFS/BFS traversals; union and find run in near-constant amortized time.

Example Problems:

o Number of Connected Components

o Kruskal's Algorithm for MST
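Counting connected components with a minimal union-find (names are illustrative; this version uses path halving without union by rank):

```python
def count_components(n, edges):
    """Number of connected components among nodes 0..n-1."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving keeps trees shallow
            x = parent[x]
        return x

    components = n
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv   # merging two components reduces the count
            components -= 1
    return components
```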

14. Trie (for Fast String Search)

When to Use?
When storing or searching words efficiently.

Why?
Avoids unnecessary string comparisons.

Example Problems:

o Word Search

o AutoComplete System
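A minimal trie with an autocomplete-style prefix query; class and method names are illustrative:

```python
class Trie:
    def __init__(self):
        self.children = {}   # char -> child Trie node
        self.is_word = False

    def insert(self, word):
        node = self
        for ch in word:
            node = node.children.setdefault(ch, Trie())
        node.is_word = True

    def starts_with(self, prefix):
        """All stored words beginning with prefix (autocomplete-style)."""
        node = self
        for ch in prefix:
            if ch not in node.children:
                return []            # no word has this prefix
            node = node.children[ch]
        # Collect every word in the subtrie below the prefix node.
        words, stack = [], [(node, prefix)]
        while stack:
            cur, path = stack.pop()
            if cur.is_word:
                words.append(path)
            for ch, child in cur.children.items():
                stack.append((child, path + ch))
        return words
```

Lookups cost O(length of the word), independent of how many words are stored.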

15. Meet in the Middle (for Hard Problems)

When to Use?
When brute force O(2ⁿ) is too slow.

Why?
Splits the problem into two halves, reducing O(2ⁿ) to O(2^(N/2)).

Example Problems:

o Subset Sum

o Knapsack Variations
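Subset Sum via meet in the middle; a sketch (names are illustrative): enumerate each half's subset sums, then match one half's sums against the other's complements.

```python
from itertools import combinations

def subset_sum_exists(nums, target):
    """True if some subset of nums sums to target, in O(2^(N/2)) time."""
    half = len(nums) // 2
    left, right = nums[:half], nums[half:]

    def all_sums(arr):
        # All 2^len(arr) subset sums of one half.
        return {sum(c) for r in range(len(arr) + 1)
                for c in combinations(arr, r)}

    right_sums = all_sums(right)
    # A full-array subset hits target iff some left sum s has a
    # complement (target - s) among the right sums.
    return any(target - s in right_sums for s in all_sums(left))
```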

How to Move from Brute Force to Optimization?

1. Recognize the pattern → Identify if it fits one of the 15+ patterns above.

2. Understand the brute force complexity → Check if it's O(N²), O(2ⁿ), etc.

3. Look for redundant calculations → Use Sliding Window, DP, Hashing, etc.

4. Optimize using the right pattern → Apply BFS, Two Pointers, Binary Search, etc.

5. Check constraints → If N can be as large as 10⁵, an O(N²) brute force (~10¹⁰ operations) will be too slow.


Final Takeaway

Start recognizing which pattern applies to a problem.

Train yourself to optimize brute force solutions step by step.

Practice 5-10 problems per pattern to master it.
