CS50 Week 3: Algorithm Insights

Aug 2, 2024

CS50 Week 3 Lecture Notes

Overview

  • Focus on algorithms and implementation rather than new syntax.
  • Emphasis on thinking algorithmically and applying learned vocabulary in practical coding scenarios.

Recap of Previous Weeks

  • Understanding of C and Scratch programming.
  • Importance of algorithms for solving real-world problems.

Algorithm Efficiency

  • Chart from Week 0:
    • X-axis: Size of the problem (e.g., number of phone book pages).
    • Y-axis: Time to solve (measured in various units).
  • Algorithms Discussed:
    • Searching one page at a time (straight line, slope 1).
    • Searching two pages at a time (straight line, half the slope: twice as fast, but still linear).
    • Divide and conquer (logarithmic curve: a fundamentally better rate of growth, not just a constant-factor speedup).

Attendance Example

  • Demonstration of a faster attendance algorithm using pairing and summing.
  • The theoretical speedup did not fully materialize in practice, showing how bugs creep into the execution of an otherwise sound algorithm.
  • Connection to algorithmic thinking: Divide and conquer.

Data Structures

  • Introduction to arrays as a collection of values in contiguous memory.
  • Characteristics of arrays in C:
    • All elements must share the same data type.
    • Elements are accessed by index, starting at 0 (see the C sketch below).
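  • A minimal C sketch of an array in contiguous memory (the scores values are illustrative):
    #include <stdio.h>

    int main(void)
    {
        // Three values of the same type, stored contiguously in memory
        int scores[3] = {72, 73, 33};

        // Access is 0-indexed: scores[0] is the first element
        for (int i = 0; i < 3; i++)
        {
            printf("%i\n", scores[i]);
        }
    }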

Searching Algorithms

Linear Search

  • Definition: Check each item in sequence until the target is found or the end of the array is reached (C sketch after this list).
  • Pseudocode:
    for i from 0 to n-1:
        if array[i] == target:
            return true
    return false
    
  • Complexity:
    • Worst Case: O(n)
    • Best Case: Ω(1)
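  • A minimal C sketch of the same idea (the function name and signature are illustrative, not from the lecture):
    #include <stdbool.h>

    // Return true if target appears anywhere in an array of length n
    bool linear_search(const int array[], int n, int target)
    {
        for (int i = 0; i < n; i++)
        {
            if (array[i] == target)
            {
                return true;
            }
        }
        return false;
    }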

Binary Search

  • Definition: Search sorted data by repeatedly halving the search interval (iterative C sketch after this list).
  • Pseudocode:
    if no elements remain:
        return false
    if array[mid] == target:
        return true
    elif array[mid] < target:
        search right half
    else:
        search left half
    
  • Complexity:
    • Worst Case: O(log n)
    • Best Case: Ω(1)
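  • An iterative C sketch of the same idea (names are illustrative; the array must already be sorted):
    #include <stdbool.h>

    // Requires array to be sorted in ascending order
    bool binary_search(const int array[], int n, int target)
    {
        int low = 0;
        int high = n - 1;
        while (low <= high)
        {
            // Midpoint of the remaining search interval
            int mid = low + (high - low) / 2;
            if (array[mid] == target)
            {
                return true;
            }
            else if (array[mid] < target)
            {
                low = mid + 1;   // target can only be in the right half
            }
            else
            {
                high = mid - 1;  // target can only be in the left half
            }
        }
        return false;            // no elements left to check
    }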

Running Time Analysis

  • Big O notation gives an upper bound on how an algorithm's running time grows with the input size n (worst case); Ω gives a lower bound (best case), as used above.
  • Big O Notation Cheat Sheet (ordered from slowest- to fastest-growing):
    • O(1): Constant Time
    • O(log n): Logarithmic Time
    • O(n): Linear Time
    • O(n log n): Linearithmic Time
    • O(n^2): Quadratic Time

Sorting Algorithms

Selection Sort

  • Definition: Repeatedly select the smallest element from the unsorted portion and swap it to the front of that portion (C sketch after this list).
  • Pseudocode:
    for i from 0 to n-1:
        find minimum in array[i..n-1]
        swap minimum with array[i]
    
  • Complexity:
    • Worst Case: O(n^2)
    • Best Case: Ω(n^2)
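  • A minimal C sketch (the function name is illustrative):
    // Sort an array of length n in place, ascending
    void selection_sort(int array[], int n)
    {
        for (int i = 0; i < n - 1; i++)
        {
            // Find the index of the smallest element in array[i..n-1]
            int min = i;
            for (int j = i + 1; j < n; j++)
            {
                if (array[j] < array[min])
                {
                    min = j;
                }
            }
            // Swap that element into position i
            int tmp = array[i];
            array[i] = array[min];
            array[min] = tmp;
        }
    }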

Bubble Sort

  • Definition: Repeatedly swap adjacent elements that are out of order, so the largest remaining value bubbles to the end on each pass (C sketch after this list).
  • Pseudocode:
    repeat n-1 times:
        for i from 0 to n-2:
            if array[i] > array[i+1]:
                swap(array[i], array[i+1])
        if no swaps occurred:
            quit
    
  • Complexity:
    • Worst Case: O(n^2)
    • Best Case: Ω(n)
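  • A minimal C sketch with the early exit that gives the Ω(n) best case (the function name is illustrative):
    #include <stdbool.h>

    // Sort an array of length n in place, ascending
    void bubble_sort(int array[], int n)
    {
        for (int pass = 0; pass < n - 1; pass++)
        {
            bool swapped = false;
            for (int i = 0; i < n - 1; i++)
            {
                if (array[i] > array[i + 1])
                {
                    int tmp = array[i];
                    array[i] = array[i + 1];
                    array[i + 1] = tmp;
                    swapped = true;
                }
            }
            // If a full pass made no swaps, the array is already sorted
            if (!swapped)
            {
                break;
            }
        }
    }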

Merge Sort

  • Definition: Recursively sort each half of the array, then merge the two sorted halves (C sketch after this list).
  • Pseudocode:
    if array has only one element (or none):
        return
    mid = length of array / 2
    sort left half (array[0..mid-1])
    sort right half (array[mid..n-1])
    merge the sorted halves
    
  • Complexity:
    • Worst Case: O(n log n)
    • Best Case: Ω(n log n)
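  • A C sketch using an auxiliary buffer (the names and the buffer-passing style are illustrative, not the lecture's exact code):
    #include <string.h>

    // Merge the two sorted halves array[lo..mid] and array[mid+1..hi]
    static void merge(int array[], int lo, int mid, int hi, int tmp[])
    {
        int i = lo, j = mid + 1, k = lo;
        while (i <= mid && j <= hi)
        {
            tmp[k++] = (array[i] <= array[j]) ? array[i++] : array[j++];
        }
        while (i <= mid)
        {
            tmp[k++] = array[i++];
        }
        while (j <= hi)
        {
            tmp[k++] = array[j++];
        }
        memcpy(&array[lo], &tmp[lo], (hi - lo + 1) * sizeof(int));
    }

    // Recursively sort array[lo..hi]; tmp must be at least as long as array
    void merge_sort(int array[], int lo, int hi, int tmp[])
    {
        if (lo >= hi)
        {
            return;                            // one element (or none): already sorted
        }
        int mid = lo + (hi - lo) / 2;
        merge_sort(array, lo, mid, tmp);       // sort left half
        merge_sort(array, mid + 1, hi, tmp);   // sort right half
        merge(array, lo, mid, hi, tmp);        // merge the sorted halves
    }

    // Usage (assuming int array[n]): int tmp[n]; merge_sort(array, 0, n - 1, tmp);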

Recursion

  • Recursion is when a function calls itself with different arguments to solve smaller instances of the same problem.
  • Illustrated through examples such as merge sort and drawing a pyramid of blocks recursively (C sketch below).
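  • A minimal C sketch in the spirit of the lecture's pyramid example (the exact code shown in lecture may differ):
    #include <stdio.h>

    // Print a left-aligned pyramid of the given height, one row per call
    void draw(int n)
    {
        // Base case: a pyramid of height 0 (or less) is nothing
        if (n <= 0)
        {
            return;
        }

        // Recursive case: draw the pyramid of height n - 1 above...
        draw(n - 1);

        // ...then print this row of n hashes beneath it
        for (int i = 0; i < n; i++)
        {
            printf("#");
        }
        printf("\n");
    }

    int main(void)
    {
        draw(4);
    }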

Conclusion

  • Merge sort is efficient and exemplifies the use of recursion and dividing problems into manageable parts.
  • Algorithms discussed are widely applicable in programming and real-world data handling.