Hi everyone,
Readers who’ve worked through the material in Algorithmic Thinking might be interested to know what’s out there to read next. In the book, I offer some pointers to more traditional algorithms textbooks that you might consider if you’d like to dig deeper into the theory of algorithms.
But if I’ve managed to whet your appetite for competitive programming, then you might consider wrestling with one of the advanced competitive programming books that are available.
In this post, I review the book Guide to Competitive Programming by Antti Laaksonen. (There’s a free version that contains much of what’s in the paid version, if you want a taste before deciding whether to buy.) I think it pairs well with Algorithmic Thinking. If you get stuck in Dr. Laaksonen’s book, you might consider reading through Algorithmic Thinking again, and I’ll point out below where Algorithmic Thinking might be particularly helpful. Make no mistake, though: there’s a lot here. The book can be terse, and some sections are particularly hardcore. I need to work through some of it again myself! But if you have the patience and desire to up your competitive programming game, this book can bring it.
The book has 15 chapters. I’ll note here what might be of most interest to those who’ve read my Algorithmic Thinking book as background.
Chapter 1, Introduction. This is a brief intro to competitive programming; most of the material will be familiar to my readers. The book uses C++ as its programming language, and this chapter explains why. (I used C in Algorithmic Thinking, not C++, because my goals were different: peering under the hood rather than getting you ready for fast-paced programming competitions.)
Chapter 2, Programming Techniques. This chapter starts with a very brief intro to C++, including how to read input and write output. If you prefer scanf and printf as I taught them in Algorithmic Thinking, you can continue to use those, or you can jump to the C++ streams approach introduced here. There’s some material here on reducing the amount of code you need to type. It makes the code less transparent, but these are the kinds of tricks you might use yourself in competitions (or might see when reading editorials by other competitors). There’s also a lightning-fast review of recursion, and you should be ready for that if you’ve worked through Chapters 2 and 3 of Algorithmic Thinking. The chapter ends with a summary of bit-manipulation techniques: low-level programming techniques that can be a little opaque but are extremely efficient. One of my favourite tricks here is using bit manipulation to generate all subsets of a set of values. Neat!
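To give a flavor of that subset trick, here's a minimal sketch of the idea (my own illustration, not code from either book): treat each integer from 0 to 2^n - 1 as a bitmask that says which of the n elements belongs to the subset.

```cpp
#include <iostream>
#include <vector>

int main() {
    std::vector<int> values = {2, 3, 5};   // example set; any values work
    int n = values.size();

    // Each mask from 0 to 2^n - 1 encodes one subset:
    // bit i of mask is 1 exactly when values[i] is included.
    for (int mask = 0; mask < (1 << n); mask++) {
        std::cout << "{ ";
        for (int i = 0; i < n; i++) {
            if (mask & (1 << i)) {
                std::cout << values[i] << ' ';
            }
        }
        std::cout << "}\n";
    }
    return 0;
}
```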
Chapter 3, Efficiency. This chapter starts with a rundown of big O, along the lines of what we do in Appendix A of Algorithmic Thinking. There’s a pretty cool table here that helps you use the input size of a problem to reverse-engineer the big O efficiency that the problem author had in mind. That could help narrow down the kind of algorithm that you design. There’s also a formal definition of big O – the sort of thing you’d find in a typical algorithms textbook. But I think the most useful part of this chapter is the algorithm design section, where you’ll walk through solving two problems (Maximum Subarray Sum, and counting the ways to place two queens on a chessboard) with more and more efficient algorithms.
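As a taste of that progression, here's a sketch of the linear-time approach to Maximum Subarray Sum, the kind of algorithm the chapter builds toward; the code itself is my own illustration, not the book's.

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

// Maximum Subarray Sum in O(n): at each position, the best subarray
// ending there either extends the previous best or starts fresh.
// Assumes a non-empty array.
long long maxSubarraySum(const std::vector<int>& a) {
    long long best = a[0], ending_here = a[0];
    for (size_t i = 1; i < a.size(); i++) {
        ending_here = std::max<long long>(a[i], ending_here + a[i]);
        best = std::max(best, ending_here);
    }
    return best;
}

int main() {
    std::vector<int> a = {-1, 2, 4, -3, 5, 2, -5, 2};   // example input
    std::cout << maxSubarraySum(a) << '\n';             // 10 (subarray 2 4 -3 5 2)
    return 0;
}
```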
Chapter 4, Sorting and Searching. In Algorithmic Thinking, we use the C qsort function whenever we need to sort. This chapter starts with a classical treatment of how to implement three sorting algorithms: bubble sort, merge sort, and counting sort. You’re unlikely to need this material for competitive programming, but it’s interesting nonetheless. Remember how I spent an entire chapter, Chapter 6, on binary search? How binary search often offers super-fast solutions to tough problems? Sorting can be like that, too, and this chapter shows how through several examples. Speaking of binary search: it’s here, too, but covered much more briskly than in Algorithmic Thinking.
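If you'd like a quick refresher before reading that section, here's a generic binary search sketch in C++ (my own code, not the book's); in a contest you'd often reach for std::binary_search or std::lower_bound instead.

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

// Classic binary search on a sorted vector: returns true if x is present.
bool contains(const std::vector<int>& a, int x) {
    int lo = 0, hi = (int)a.size() - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == x) return true;
        if (a[mid] < x) lo = mid + 1;   // x can only be in the right half
        else hi = mid - 1;              // x can only be in the left half
    }
    return false;
}

int main() {
    std::vector<int> a = {7, 2, 9, 4, 4, 1};
    std::sort(a.begin(), a.end());      // sort first, as the chapter does
    std::cout << contains(a, 9) << ' ' << contains(a, 3) << '\n';  // 1 0
    return 0;
}
```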
Chapter 5, Data Structures. Here’s where we avail ourselves of the built-in C++ data structures; during a timed competition, we should use these whenever we can! Here we have dynamic arrays (vectors), iterators and ranges, double-ended queues, stacks, queues, sets, multisets, maps (i.e. dictionaries), priority queues, and more. Yeah, there are a lot of them. How do we choose among them? Sometimes it comes down to efficiency; the chapter does a good job of exploring their efficiency in practice.
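To see how little code these containers demand, here's a toy sketch using a few of them (my own example, not from the book):

```cpp
#include <iostream>
#include <map>
#include <queue>
#include <set>
#include <string>
#include <vector>

int main() {
    std::vector<int> v = {3, 1, 4, 1, 5};             // dynamic array
    std::set<int> s(v.begin(), v.end());              // ordered set: {1, 3, 4, 5}
    std::map<std::string, int> freq;                  // map (i.e. dictionary)
    for (int x : v) freq[std::to_string(x)]++;        // count occurrences by key

    std::priority_queue<int> pq(v.begin(), v.end());  // max-heap
    std::cout << "largest: " << pq.top() << '\n';     // 5
    std::cout << "distinct: " << s.size() << '\n';    // 4
    std::cout << "ones: " << freq["1"] << '\n';       // 2
    return 0;
}
```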
Chapter 6, Dynamic Programming. This is an introduction to dynamic programming using the classical examples of coin change, longest increasing subsequence, finding an optimal path through a grid, and knapsack. You’ll be ready for this material once you’ve worked through Chapter 3 of Algorithmic Thinking.
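If you'd like to warm up first, here's a minimal bottom-up sketch of the "fewest coins" variant of coin change (my own code; I'm not claiming it matches the book's presentation):

```cpp
#include <iostream>
#include <vector>

// Minimum number of coins needed to form each sum from 0 to target,
// computed bottom-up. A value of INF means the sum can't be formed.
int minCoins(const std::vector<int>& coins, int target) {
    const int INF = 1e9;
    std::vector<int> best(target + 1, INF);
    best[0] = 0;                          // zero coins form sum 0
    for (int sum = 1; sum <= target; sum++) {
        for (int c : coins) {
            if (c <= sum && best[sum - c] + 1 < best[sum]) {
                best[sum] = best[sum - c] + 1;
            }
        }
    }
    return best[target];
}

int main() {
    std::cout << minCoins({1, 3, 4}, 6) << '\n';  // 2 (3 + 3)
    return 0;
}
```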
I’ve already noted that this book can be tough going. Still, there’s a further difficulty bump in the next example of this chapter: using dynamic programming for optimal elevator utilization. I like the presentation, though, and I think this is the kind of thing that can influence your perception of the strengths and limitations of dynamic programming. (Before you tackle this elevator problem, you might consider reading up on how you can use dynamic programming to solve the Traveling Salesperson Problem, whose presentation might be a little easier to absorb.)
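If you want to try that warm-up, here's a compact sketch of the bitmask dynamic programming approach to TSP (often called Held-Karp); the code and the sample distance matrix are my own illustration, and the book's presentation may differ.

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

// Held-Karp: dp[mask][i] = shortest route that starts at city 0, visits
// exactly the cities in mask, and ends at city i. O(2^n * n^2) time.
int tsp(const std::vector<std::vector<int>>& dist) {
    int n = dist.size();
    const int INF = 1e9;
    std::vector<std::vector<int>> dp(1 << n, std::vector<int>(n, INF));
    dp[1][0] = 0;  // start at city 0, having visited only city 0
    for (int mask = 1; mask < (1 << n); mask++) {
        for (int i = 0; i < n; i++) {
            if (dp[mask][i] == INF || !(mask & (1 << i))) continue;
            for (int j = 0; j < n; j++) {
                if (mask & (1 << j)) continue;   // j already visited
                int next = mask | (1 << j);
                dp[next][j] = std::min(dp[next][j], dp[mask][i] + dist[i][j]);
            }
        }
    }
    int best = INF;
    for (int i = 1; i < n; i++)          // close the tour back to city 0
        best = std::min(best, dp[(1 << n) - 1][i] + dist[i][0]);
    return best;
}

int main() {
    std::vector<std::vector<int>> dist = {
        {0, 1, 15, 6},
        {2, 0, 7, 3},
        {9, 6, 0, 12},
        {10, 4, 8, 0}};
    std::cout << tsp(dist) << '\n';  // 21 (route 0 -> 1 -> 3 -> 2 -> 0)
    return 0;
}
```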
Chapter 7, Graph Algorithms. You’ll be familiar with some of this material from chapters 4, 5, and 8 of Algorithmic Thinking: graph terminology, adjacency lists, adjacency matrices, breadth-first search, determining whether two nodes are connected, Dijkstra’s Algorithm, etc. You’ll also find a brief discussion of depth-first search here, along with more advanced algorithms that I don’t cover, including those for spanning trees. There’s also nice material on how dynamic programming can be used to solve problems on acyclic graphs.
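For a concrete anchor, here's a bare-bones breadth-first search over an adjacency list (a generic sketch, not the book's code), computing distances from a start node in an unweighted graph:

```cpp
#include <iostream>
#include <queue>
#include <vector>

// Breadth-first search from 'start' over an adjacency list.
// Returns the distance (number of edges) to every node, or -1 if unreachable.
std::vector<int> bfs(const std::vector<std::vector<int>>& adj, int start) {
    std::vector<int> dist(adj.size(), -1);
    std::queue<int> q;
    dist[start] = 0;
    q.push(start);
    while (!q.empty()) {
        int node = q.front();
        q.pop();
        for (int next : adj[node]) {
            if (dist[next] == -1) {          // not yet visited
                dist[next] = dist[node] + 1;
                q.push(next);
            }
        }
    }
    return dist;
}

int main() {
    // 5 nodes, undirected edges: 0-1, 0-2, 1-3; node 4 is isolated.
    std::vector<std::vector<int>> adj = {{1, 2}, {0, 3}, {0}, {1}, {}};
    for (int d : bfs(adj, 0)) std::cout << d << ' ';   // 0 1 1 2 -1
    std::cout << '\n';
    return 0;
}
```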
Chapter 8, Algorithm Design Topics. This is a grab bag of tricks that can help increase the efficiency of your algorithms, or show that your algorithms are more efficient than they may seem. We begin with more material on bit manipulation, which shows how to replace loops with more efficient bit operations.
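Here's a tiny example of that loop-replacement idea (my own, not from the book): counting 1 bits with a compiler builtin rather than a loop over the bits. Note that __builtin_popcount is a GCC/Clang extension, not standard C++.

```cpp
#include <iostream>

int main() {
    unsigned int x = 0b101101;   // 45: four bits set

    // Loop version: examine each bit one at a time.
    int count = 0;
    for (unsigned int y = x; y > 0; y >>= 1) {
        count += y & 1;
    }

    // Bit-trick version: a single builtin (GCC/Clang) that typically
    // maps to one hardware instruction.
    int fast = __builtin_popcount(x);

    std::cout << count << ' ' << fast << '\n';  // 4 4
    return 0;
}
```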
Suppose you design an algorithm that performs $n$ operations. Once in a while, an operation takes $n$ steps; but, usually, each operation takes only $1$ step. So we have $n$ operations, each of which takes at most $n$ steps. Is this an $O(n^2)$ algorithm, then? Well, not necessarily, and the way to analyze this kind of algorithm is using amortized analysis. That technique is introduced, through well-chosen examples, in this chapter; this is one of my favourite sections in the book.
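A standard illustration of that pattern (not necessarily the one the book uses): scan an array while maintaining a stack. A single iteration may pop many elements, but each element is pushed and popped at most once, so the total work over all $n$ iterations is $O(n)$.

```cpp
#include <iostream>
#include <stack>
#include <vector>

// For each element, find the nearest smaller value to its left (or -1).
// One iteration may pop many elements, but each element is pushed and
// popped at most once overall, so the total running time is O(n):
// that's the amortized argument.
std::vector<int> nearestSmallerToLeft(const std::vector<int>& a) {
    std::vector<int> result(a.size());
    std::stack<int> st;   // values that could still be "nearest smaller"
    for (size_t i = 0; i < a.size(); i++) {
        while (!st.empty() && st.top() >= a[i]) st.pop();
        result[i] = st.empty() ? -1 : st.top();
        st.push(a[i]);
    }
    return result;
}

int main() {
    std::vector<int> a = {1, 3, 4, 2, 5, 3, 4, 2};
    for (int x : nearestSmallerToLeft(a)) std::cout << x << ' ';
    // -1 1 3 1 2 2 3 1
    std::cout << '\n';
    return 0;
}
```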
Chapter 9, Range Queries. My readers will be ready for this: I covered Range Sum Queries and Range Minimum Queries in Algorithmic Thinking (chapters 6 and 7). Remember what you learned about Segment Trees? This chapter shows another kind of tree, the Binary Indexed Tree, which I personally find more confusing than Segment Trees but which may nevertheless be worth learning for its compact code.
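To show what "compact code" means here, this is roughly what a Binary Indexed (Fenwick) Tree for prefix sums looks like; it's a generic sketch, 1-indexed, with point updates and prefix-sum queries in O(log n).

```cpp
#include <iostream>
#include <vector>

// Binary Indexed (Fenwick) Tree: the whole structure is one array.
struct FenwickTree {
    std::vector<long long> tree;
    FenwickTree(int n) : tree(n + 1, 0) {}

    void add(int i, long long delta) {            // a[i] += delta
        for (; i < (int)tree.size(); i += i & -i) tree[i] += delta;
    }
    long long prefixSum(int i) const {            // a[1] + ... + a[i]
        long long s = 0;
        for (; i > 0; i -= i & -i) s += tree[i];
        return s;
    }
    long long rangeSum(int l, int r) const {      // a[l] + ... + a[r]
        return prefixSum(r) - prefixSum(l - 1);
    }
};

int main() {
    FenwickTree ft(8);
    int a[] = {0, 1, 3, 4, 8, 6, 1, 4, 2};   // a[1..8]; a[0] unused
    for (int i = 1; i <= 8; i++) ft.add(i, a[i]);
    std::cout << ft.rangeSum(3, 6) << '\n';  // 4 + 8 + 6 + 1 = 19
    return 0;
}
```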
There’s a lot in those first nine chapters. Still, here we reach what I think of as a natural split point in the book, with the subsequent material still more advanced than what came before.
Chapter 10, Tree Algorithms. This is a deep dive into algorithms on trees. Chapters 2 and 3 of Algorithmic Thinking are good rampup material for this. Among other techniques, you’ll learn how to use dynamic programming to answer queries on trees.
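Here's a small taste of that style (my own example, not the book's): computing subtree sizes with a depth-first search, where each node's answer is assembled from its children's answers, which is the simplest shape of dynamic programming on a tree.

```cpp
#include <iostream>
#include <vector>

// subtreeSize[node] = number of nodes in the subtree rooted at 'node'.
// Each node's value is computed from its children's values.
void dfs(int node, int parent,
         const std::vector<std::vector<int>>& adj,
         std::vector<int>& subtreeSize) {
    subtreeSize[node] = 1;
    for (int child : adj[node]) {
        if (child == parent) continue;
        dfs(child, node, adj, subtreeSize);
        subtreeSize[node] += subtreeSize[child];
    }
}

int main() {
    // Tree with 5 nodes rooted at 0: edges 0-1, 0-2, 1-3, 1-4.
    std::vector<std::vector<int>> adj = {{1, 2}, {0, 3, 4}, {0}, {1}, {1}};
    std::vector<int> subtreeSize(adj.size());
    dfs(0, -1, adj, subtreeSize);
    for (int s : subtreeSize) std::cout << s << ' ';  // 5 3 1 1 1
    std::cout << '\n';
    return 0;
}
```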
Chapter 11, Mathematics, and Chapter 13, Geometry. There are competitive programming problems that require the use of mathematics (prime numbers, techniques for counting, matrices, probability) or geometry. These topics are not present in Algorithmic Thinking because I didn’t want to exclude anyone who didn’t have the required background, but they’re necessary for a serious competitor.
Chapter 12, Advanced Graph Algorithms. In Algorithmic Thinking, you learned about the importance of modeling the problem you’re trying to solve. In particular, if you can model the problem as a graph, then you’re well on your way to solving it. I showed how shortest paths can often be used to solve problems, but there are other algorithms that work for different classes of graph problems, too. The most famous of those is Maximum Flow, and that (along with other advanced graph material) is covered here.
Chapter 14, String Algorithms. Problems on strings (in C: arrays of chars) can sometimes be solved with techniques that we already know. Other times, we need algorithms and data structures specialized for string processing. This chapter surveys those algorithms and data structures. For example, you’ll learn about Tries, which are trees well-suited to storing strings.
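To make that concrete, a bare-bones trie for lowercase words might look like the following sketch (my own code; the book's version may differ):

```cpp
#include <iostream>
#include <string>

// A minimal trie: each node has up to 26 children, one per letter,
// and a flag marking the end of a stored word.
struct TrieNode {
    TrieNode* children[26] = {};
    bool isWord = false;
};

void insert(TrieNode* root, const std::string& word) {
    TrieNode* node = root;
    for (char c : word) {
        int i = c - 'a';
        if (!node->children[i]) node->children[i] = new TrieNode();
        node = node->children[i];
    }
    node->isWord = true;
}

bool contains(TrieNode* root, const std::string& word) {
    TrieNode* node = root;
    for (char c : word) {
        int i = c - 'a';
        if (!node->children[i]) return false;
        node = node->children[i];
    }
    return node->isWord;
}

int main() {
    TrieNode root;
    insert(&root, "cat");
    insert(&root, "car");
    std::cout << contains(&root, "car") << ' ' << contains(&root, "ca") << '\n';  // 1 0
    return 0;
}
```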
Chapter 15, Additional Topics. This chapter reprises earlier material to add advanced variants and tricks. For example, you’ll learn how to efficiently support range updates in segment trees, and how to optimize special classes of dynamic programming algorithms. A really neat bit of this chapter shows how designing several (seemingly independent) algorithms for a problem can actually help you design a faster overall solution.
As I hope I’ve made clear, you’ll need considerable background, time, and persistence to work through the material in this book. If you’re ready to commit, though, your time with the book will be well spent. Dr. Laaksonen has expertly condensed a great deal of competitive programming material into a concise package. The progression is logical, tradeoffs are illustrated, and sample code fragments are provided.
Written on November 2nd, 2020 by Daniel Zingaro