Grokking Algorithms: An illustrated guide for programmers and other curious people
Kindle Notes & Highlights
8%
It’s hard to overstate hash tables’ usefulness. When I want to solve a problem, the two plans of attack I start with are “Can I use a hash table?” and “Can I model this as a graph?”
11%
In general, for any list of n elements, binary search will take log2 n steps to run in the worst case, whereas simple search will take n steps.
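The log2 n bound comes from halving the search range at every step. A minimal sketch of binary search (the function name and sample list are my own, for illustration):

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or None if absent."""
    low, high = 0, len(sorted_list) - 1
    while low <= high:
        mid = (low + high) // 2       # check the middle element
        guess = sorted_list[mid]
        if guess == target:
            return mid
        if guess < target:
            low = mid + 1             # discard the lower half
        else:
            high = mid - 1            # discard the upper half
    return None

# A 100-element list needs at most about log2(100) ≈ 7 comparisons,
# where simple search could need all 100.
print(binary_search([1, 3, 5, 7, 9, 11], 9))  # → 4
```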
12%
You may not remember what logarithms are, but you probably know what exponentials are. log10 100 is like asking, “How many 10s do we multiply together to get 100?” The answer is 2: 10 × 10. So log10 100 = 2. Logs are the flip of exponentials.
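Python's math module makes the "flip" concrete:

```python
import math

# log10(100) asks: how many 10s do we multiply together to get 100?
print(math.log10(100))   # → 2.0
# Logs are the flip of exponentials: 10 ** 2 gets us back to 100.
print(10 ** 2)           # → 100
```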
12%
Logs are the flip of exponentials.
14%
Big O doesn’t tell you the speed in seconds. Big O notation lets you compare the number of operations. It tells you how fast the algorithm grows.
15%
Here are five Big O run times that you’ll encounter a lot, sorted from fastest to slowest:
O(log n), also known as log time. Example: Binary search.
O(n), also known as linear time. Example: Simple search.
O(n * log n). Example: A fast sorting algorithm, like quicksort (coming up in chapter 4).
O(n²). Example: A slow sorting algorithm, like selection sort (coming up in chapter 2).
O(n!). Example: A really slow algorithm, like the traveling salesperson (coming up next!).
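A quick way to feel these growth rates is to print rough operation counts side by side (an illustrative sketch, not a benchmark; O(n!) is omitted because it overflows the table almost immediately):

```python
import math

# Approximate operation counts for the first four running times.
for n in (10, 100, 1000):
    print(f"n={n}: log n ≈ {round(math.log2(n))}, n = {n}, "
          f"n log n ≈ {round(n * math.log2(n))}, n² = {n ** 2}")
```

At n = 1000, O(log n) is about 10 operations while O(n²) is a million — the gap between rows widens as n grows.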
16%
Algorithm speed isn’t measured in seconds, but in the growth of the number of operations: we talk about how quickly the run time of an algorithm increases as the size of the input increases. Run time of algorithms is expressed in Big O notation. O(log n) is faster than O(n), and its advantage grows as the list of items you’re searching grows.
17%
Binary search is a lot faster than simple search. O(log n) is faster than O(n), and its advantage grows as the list of items you’re searching through grows. Algorithm speed isn’t measured in seconds; algorithm times are measured in terms of the growth in the number of operations, and they are written in Big O notation.
20%
There are two different types of access: random access and sequential access. Sequential access means reading the elements one by one, starting at the first element. Linked lists can only do sequential access. If you want to read the 10th element of a linked list, you have to read the first 9 elements and follow the links to the 10th element. Random access means you can jump directly to the 10th element.
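The difference is easy to see in code. Here is a minimal linked-list sketch (the Node class and nth helper are my own names): reading element 2 means following two links, whereas a Python list jumps there directly.

```python
class Node:
    """One cell of a singly linked list."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def nth(head, index):
    """Sequential access: follow `index` links from the head."""
    node = head
    for _ in range(index):
        node = node.next
    return node.value

head = Node("a", Node("b", Node("c")))
print(nth(head, 2))        # sequential: walk a → b → c, prints "c"

items = ["a", "b", "c"]
print(items[2])            # random access: jump straight to "c"
```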
27%
To solve a problem using D&C, there are two steps:
1. Figure out the base case. This should be the simplest possible case.
2. Divide or decrease your problem until it becomes the base case.
29%
When you’re writing a recursive function involving an array, the base case is often an empty array or an array with one element. If you’re stuck, try that first.
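Both points — the D&C steps and the empty-array base case — fit in a few lines. A sketch of a recursive sum (the function name is my own):

```python
def total(nums):
    # Base case: the empty array sums to 0 — the simplest possible case.
    if not nums:
        return 0
    # Divide/decrease: shrink the problem by one element each call.
    return nums[0] + total(nums[1:])

print(total([2, 4, 6]))  # → 12
```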
29%
If you like recursion, or you’re interested in learning a new language, check out Haskell.
50%
To calculate the shortest path in an unweighted graph, use breadth-first search. To calculate the shortest path in a weighted graph, use Dijkstra’s algorithm.
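For the unweighted case, breadth-first search explores the graph level by level, so the first time it reaches the goal it has used the fewest possible edges. A sketch with a toy adjacency-list graph (the node names are made up for illustration):

```python
from collections import deque

def shortest_path_length(graph, start, goal):
    """BFS: number of edges on a shortest path, or None if unreachable."""
    queue = deque([(start, 0)])
    visited = {start}
    while queue:
        node, dist = queue.popleft()   # FIFO order ⇒ level-by-level search
        if node == goal:
            return dist
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append((neighbor, dist + 1))
    return None

graph = {"you": ["alice", "bob"], "bob": ["anuj"], "alice": ["peggy"]}
print(shortest_path_length(graph, "you", "anuj"))  # → 2
```

Swapping the queue for a priority queue keyed on accumulated edge weight is the step toward Dijkstra's algorithm for the weighted case.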
56%
In technical terms: at each step you pick the locally optimal solution, and in the end you hope to be left with the globally optimal solution. (Greedy algorithms don’t always achieve the global optimum, but they often get close.)
Note (Sugan): Greedy algorithms
57%
This is called an approximation algorithm. When calculating the exact solution will take too much time, an approximation algorithm will work. Approximation algorithms are judged by:
How fast they are
How close they are to the optimal solution
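A classic example is set cover: finding the exact minimum set of radio stations covering every state is O(2^n), but the greedy approximation — always pick the station that covers the most still-uncovered states — runs in O(n²) and stays close to optimal. A sketch (the station names and state sets are made up for illustration):

```python
def greedy_cover(needed, sets):
    """Greedy set-cover approximation: repeatedly pick the set that
    covers the most still-uncovered items."""
    chosen = []
    needed = set(needed)
    while needed:
        best = max(sets, key=lambda name: len(sets[name] & needed))
        if not sets[best] & needed:
            break  # no remaining set helps; needed items are uncoverable
        chosen.append(best)
        needed -= sets[best]
    return chosen

stations = {
    "kone":   {"id", "nv", "ut"},
    "ktwo":   {"wa", "id", "mt"},
    "kthree": {"or", "nv", "ca"},
    "kfive":  {"ca", "az"},
}
states = {"mt", "wa", "or", "id", "nv", "ut", "ca", "az"}
print(greedy_cover(states, stations))
```

This trades a guaranteed-optimal answer for speed — exactly the "how fast" versus "how close" trade-off above.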
68%
Sometimes algorithms aren’t an exact recipe. They’re a framework that you build your idea on top of.
69%
Levenshtein distance measures how similar two strings are, and it uses dynamic programming. Levenshtein distance is used for everything from spell-check to figuring out whether a user is uploading copyrighted data.
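The dynamic-programming idea is a grid where cell (i, j) holds the edit distance between the first i characters of one string and the first j of the other, built up from smaller subproblems. A sketch (function name my own):

```python
def levenshtein(a, b):
    """Minimum single-character insertions, deletions, and
    substitutions needed to turn string a into string b."""
    # dp[i][j] = distance between a[:i] and b[:j]
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        dp[i][0] = i                  # delete all of a[:i]
    for j in range(len(b) + 1):
        dp[0][j] = j                  # insert all of b[:j]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[-1][-1]

print(levenshtein("kitten", "sitting"))  # → 3
```

A spell-checker can use this to suggest dictionary words within a small distance of a misspelled word.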