Sure!
Below is a comprehensive set of tables covering common data structures and their
time complexities for access, search, insertion, and deletion. These complexities
assume each structure is implemented efficiently.
### 1. **Array (Dynamic/Resizing Array)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Access by index | O(1) |
| Search (unsorted) | O(n) |
| Search (sorted) | O(log n) |
| Insertion (end) | O(1) amortized |
| Insertion (middle) | O(n) |
| Deletion (end) | O(1) |
| Deletion (middle) | O(n) |
| Space complexity | O(n) |
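The O(log n) search on a sorted array is binary search. A minimal sketch in Python, using the standard `bisect` module (the helper name `binary_search` is just illustrative):

```python
from bisect import bisect_left

def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if absent: O(log n)."""
    i = bisect_left(sorted_list, target)          # halves the range each step
    if i < len(sorted_list) and sorted_list[i] == target:
        return i
    return -1

data = [2, 5, 8, 13, 21]
print(binary_search(data, 13))  # 3
print(binary_search(data, 7))   # -1
```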
### 2. **Linked List (Singly Linked List)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Access by index | O(n) |
| Search (unsorted) | O(n) |
| Search (sorted) | O(n) |
| Insertion (front) | O(1) |
| Insertion (end) | O(n) (O(1) with tail pointer) |
| Deletion (front) | O(1) |
| Deletion (end) | O(n) |
| Space complexity | O(n) |
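A minimal singly linked list sketch showing why front insertion is O(1) while search must walk every node (class names here are illustrative, not a standard API):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class SinglyLinkedList:
    """Minimal singly linked list: O(1) push_front, O(n) search."""
    def __init__(self):
        self.head = None

    def push_front(self, value):   # O(1): only the head pointer changes
        node = Node(value)
        node.next = self.head
        self.head = node

    def search(self, value):       # O(n): must traverse node by node
        node = self.head
        while node is not None:
            if node.value == value:
                return True
            node = node.next
        return False

lst = SinglyLinkedList()
lst.push_front(3)
lst.push_front(1)
print(lst.search(3))  # True
print(lst.search(9))  # False
```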
### 3. **Doubly Linked List**
| Operation | Time Complexity |
|------------------------|-------------------|
| Access by index | O(n) |
| Search (unsorted) | O(n) |
| Search (sorted) | O(n) |
| Insertion (front) | O(1) |
| Insertion (end) | O(1) |
| Deletion (front) | O(1) |
| Deletion (end) | O(1) |
| Space complexity | O(n) |
### 4. **Stack (Using Array or Linked List)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Push | O(1) |
| Pop | O(1) |
| Peek (top) | O(1) |
| Space complexity | O(n) |
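In Python, a plain list already behaves as an array-backed stack, so all three operations hit the complexities above:

```python
stack = []            # a Python list used as a stack
stack.append(1)       # push: O(1) amortized
stack.append(2)
top = stack[-1]       # peek: O(1)
popped = stack.pop()  # pop: O(1)
print(top, popped, stack)  # 2 2 [1]
```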
### 5. **Queue (Using Array or Linked List)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Enqueue | O(1) |
| Dequeue | O(1) |
| Front (peek) | O(1) |
| Space complexity | O(n) |
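A sketch of an O(1) queue using `collections.deque`. Note that `list.pop(0)` would be O(n), since every remaining element shifts; `deque.popleft()` avoids that:

```python
from collections import deque

queue = deque()
queue.append("a")         # enqueue: O(1)
queue.append("b")
front = queue[0]          # peek front: O(1)
served = queue.popleft()  # dequeue: O(1)
print(front, served, list(queue))  # a a ['b']
```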
### 6. **Deque (Double-ended Queue)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Add to front | O(1) |
| Add to end | O(1) |
| Remove from front | O(1) |
| Remove from end | O(1) |
| Access by index | O(n) |
| Space complexity | O(n) |
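`collections.deque` is also Python's built-in double-ended queue, giving O(1) operations at both ends:

```python
from collections import deque

d = deque([2, 3])
d.appendleft(1)   # add to front: O(1)
d.append(4)       # add to end: O(1)
d.popleft()       # remove from front: O(1)
d.pop()           # remove from end: O(1)
print(list(d))    # [2, 3]
```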
### 7. **Hash Table (Hash Map)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Access/Insert/Search | O(1) average, O(n) worst case (due to collisions) |
| Deletion | O(1) average, O(n) worst case |
| Space complexity | O(n) |
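Python's `dict` is a hash table, so the average-case O(1) operations look like this (keys and values below are arbitrary examples):

```python
ages = {}
ages["alice"] = 30     # insert: O(1) average
ages["bob"] = 25
print(ages["alice"])   # access: O(1) average -> 30
print("bob" in ages)   # search: O(1) average -> True
del ages["bob"]        # delete: O(1) average
print("bob" in ages)   # False
```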
### 8. **Binary Search Tree (BST)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Access/Search/Insert | O(log n) average, O(n) worst case (unbalanced) |
| Deletion | O(log n) average, O(n) worst case |
| Space complexity | O(n) |
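A minimal, unbalanced BST sketch. Inserting sorted input would degenerate it into a linked list, which is exactly the O(n) worst case in the table:

```python
class BSTNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Insert key; O(log n) average, O(n) worst (degenerate tree)."""
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    """Walk down one branch, discarding half the tree on average per step."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None

root = None
for k in [8, 3, 10, 1, 6]:
    root = insert(root, k)
print(search(root, 6), search(root, 7))  # True False
```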
### 9. **Balanced Binary Search Tree (e.g., AVL, Red-Black Tree)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Access/Search/Insert | O(log n) |
| Deletion | O(log n) |
| Space complexity | O(n) |
### 10. **Heap (Binary Heap)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Access (top) | O(1) |
| Insert | O(log n) |
| Extract (top) | O(log n) |
| Search | O(n) |
| Space complexity | O(n) |
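Python's standard `heapq` module implements a binary min-heap on top of a list:

```python
import heapq

heap = []
for x in [5, 1, 8, 3]:
    heapq.heappush(heap, x)    # insert: O(log n)
print(heap[0])                 # access top (minimum): O(1) -> 1
print(heapq.heappop(heap))     # extract top: O(log n) -> 1
print(heapq.heappop(heap))     # -> 3
```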
### 11. **Trie (Prefix Tree)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Insert | O(k), where k is the length of the word |
| Search | O(k) |
| Delete | O(k) |
| Space complexity | O(k * n), where k is the average length of strings and n is the number of strings |
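A compact trie sketch using nested dicts, where each operation touches one node per character, hence O(k). The `"$"` end-of-word marker is an arbitrary convention chosen for this example:

```python
class Trie:
    """Minimal trie: insert/search in O(k), where k is the word length."""
    def __init__(self):
        self.root = {}

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.setdefault(ch, {})  # one step per character
        node["$"] = True                    # end-of-word marker

    def search(self, word):
        node = self.root
        for ch in word:
            if ch not in node:
                return False
            node = node[ch]
        return "$" in node                  # reject bare prefixes

t = Trie()
t.insert("cat")
print(t.search("cat"), t.search("ca"))  # True False
```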
### 12. **Disjoint Set (Union-Find)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Find | O(α(n)), where α is the inverse Ackermann function (almost constant) |
| Union | O(α(n)) |
| Space complexity | O(n) |
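The near-constant α(n) bound comes from combining two optimizations, both shown in this sketch: path compression in `find` and union by size in `union`:

```python
class DisjointSet:
    """Union-find with path compression and union by size."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path compression
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:   # attach smaller tree under larger
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]

ds = DisjointSet(5)
ds.union(0, 1)
ds.union(1, 2)
print(ds.find(0) == ds.find(2))  # True
print(ds.find(0) == ds.find(4))  # False
```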
### 13. **Graph (Adjacency List)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Add Vertex | O(1) |
| Add Edge | O(1) |
| Remove Vertex | O(V + E) |
| Remove Edge            | O(deg(v)), at most O(V) |
| Search (BFS/DFS) | O(V + E) |
| Space complexity | O(V + E) |
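An adjacency list is commonly a dict mapping each vertex to its neighbors; BFS then visits every vertex and edge once, giving O(V + E). The graph below is an arbitrary example:

```python
from collections import deque

graph = {              # adjacency list: vertex -> list of neighbors
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D"],
    "D": [],
}

def bfs(graph, start):
    """Breadth-first search: each vertex and edge is processed once, O(V + E)."""
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph[v]:
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order

print(bfs(graph, "A"))  # ['A', 'B', 'C', 'D']
```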
### 14. **Graph (Adjacency Matrix)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Add Vertex | O(V^2) (due to matrix resizing) |
| Add Edge | O(1) |
| Remove Vertex | O(V^2) |
| Remove Edge | O(1) |
| Search (BFS/DFS) | O(V^2) |
| Space complexity | O(V^2) |
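With a fixed vertex count, an adjacency matrix makes edge updates O(1) at the cost of O(V^2) space:

```python
V = 4
matrix = [[0] * V for _ in range(V)]   # V x V adjacency matrix, O(V^2) space

def add_edge(u, v):
    """O(1): flip two cells (undirected edge)."""
    matrix[u][v] = 1
    matrix[v][u] = 1

add_edge(0, 1)
add_edge(1, 3)
print(matrix[0][1], matrix[2][3])  # 1 0
```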
### 15. **Bloom Filter**
| Operation | Time Complexity |
|------------------------|-------------------|
| Insert | O(k), where k is the number of hash functions |
| Search | O(k) |
| Space complexity | O(m), where m is the number of bits in the bit array |
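A toy Bloom filter sketch: the k hash functions are simulated here by salting a single SHA-256 hash, an assumption made for brevity rather than a standard construction. A Bloom filter can report false positives but never false negatives:

```python
import hashlib

class BloomFilter:
    """Toy Bloom filter: k salted hashes over an m-bit array."""
    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = [False] * m

    def _positions(self, item):
        for i in range(self.k):   # k "hash functions" via salting
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):               # O(k)
        for p in self._positions(item):
            self.bits[p] = True

    def might_contain(self, item):     # O(k); True may be a false positive
        return all(self.bits[p] for p in self._positions(item))

bf = BloomFilter()
bf.add("apple")
print(bf.might_contain("apple"))   # True
print(bf.might_contain("banana"))  # almost certainly False
```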
### 16. **Skip List**
| Operation | Time Complexity |
|------------------------|-------------------|
| Access/Search/Insert | O(log n) |
| Deletion | O(log n) |
| Space complexity | O(n) |
### 17. **Sparse Matrix (using Linked List or Hash Map)**
| Operation | Time Complexity |
|------------------------|-------------------|
| Access (non-zero element) | O(1) average |
| Insert (non-zero element) | O(1) average |
| Search (by value)      | O(k), where k is the number of non-zero entries |
| Space complexity       | O(k), proportional to the non-zero entries |
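A hash-map-backed sparse matrix stores only non-zero entries, keyed by `(row, col)` (the class below is an illustrative sketch, not a library API):

```python
class SparseMatrix:
    """Sparse matrix as a dict keyed by (row, col): O(1) average get/set."""
    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self.data = {}                   # only non-zero entries are stored

    def set(self, r, c, value):
        if value == 0:
            self.data.pop((r, c), None)  # never store zeros
        else:
            self.data[(r, c)] = value

    def get(self, r, c):
        return self.data.get((r, c), 0)  # absent keys read as zero

m = SparseMatrix(1000, 1000)
m.set(3, 7, 42)
print(m.get(3, 7), m.get(0, 0))  # 42 0
```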
### Summary of Key Points:
- **Arrays** provide fast access but can be inefficient for insertions and
deletions, especially in the middle.
- **Linked lists** allow fast insertions and deletions, but accessing an element by
index is slower.
- **Hash tables** offer average O(1) time for most operations, but performance can
degrade with many collisions.
- **BSTs** and **balanced trees** provide efficient search and update times, but
can degrade to O(n) if not balanced.
- **Heaps** are great for priority queues, with logarithmic time complexity for
insertions and deletions.
- **Graphs** can be represented either as adjacency matrices or lists, each having
its own space and time complexity trade-offs.
This summary covers the most common data structures, but there are many others with
varying trade-offs depending on the specific use case. Let me know if you'd like
more details on any specific data structure!