Guide by Mumit Khan
• A Tiny STL Primer by David Harvey
• A Very Modest STL Tutorial by Jak Kirman
• The Dinkum Standard Template Library Reference by P. J. Plauger.
• Overview of the STL by G. Bowden Wise
• The Binder Library. A versatile function argument binding mechanism that interoperates with STL algorithms.
• The STL Resource List by Warren Young
• The Standard Template Library Overview by Rob Kremer
• Alexander Stepanov's Byte Magazine article
• Dr. Dobb's Journal interview with Alexander Stepanov
• STL for C++ Programmers, by Leen Ammeraal. John Wiley, 1996. ISBN 0-471-97181-2.
• Generic Programming and the STL, by Matthew Austern. Addison-Wesley, 1998. ISBN 0-201-30956-4.
• Designing Components with the C++ STL, by Ulrich Breymann. Addison Wesley Longman, 1998. ISBN 0-201-17816-8.
• Data Structures in C++ Using the Standard Template Library, by Timothy Budd. Addison-Wesley, 1997.
• The STL Primer, by Graham Glass and Brett Schuchert. Prentice-Hall, 1996. ISBN 0-13-454976-7.
• The C++ Standard Library - A Tutorial and Reference, by Nicolai Josuttis. Addison Wesley Longman, 1999. ISBN 0-201-37926-0. (English translation of Die C++-Standardbibliothek.)
• STL Tutorial and Reference Guide, by David Musser and Atul Saini. Addison-Wesley, 1996. ISBN 0-201-63398-1.
• C++ Programmer's Guide to the Standard Template Library, by Mark Nelson. IDG Books, 1995. ISBN 1-56884-314-3. (No longer in print.)
• The Standard Template Library, by P. J. Plauger, Alexander Stepanov, Meng Lee, and David Musser. Prentice-Hall. ISBN 0-13-437633-1. (forthcoming)
• Using the STL, by Robert Robson. Springer-Verlag, 1998.
• STL Programming from the Ground Up, by Herbert Schildt. Osborne McGraw-Hill, 1999.
We welcome additions to this list. Please send suggestions to [email protected].
Al Stevens Interviews Alex Stepanov
This interview appeared in the March 1995 issue of Dr. Dobb's Journal, and is reprinted with permission.
Tell us something about your long-term interest in generic programming.
I started thinking about generic programming in the late 70s when I observed that some algorithms depended not on some particular implementation of a data structure but only on a few fundamental semantic properties of the structure. I started going through many different algorithms, and I found that most algorithms can be abstracted away from a particular implementation in such a way that efficiency is not lost. Efficiency is a fundamental concern of mine. It is silly to abstract an algorithm in such a way that when you instantiate it back it becomes inefficient.
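To make this concrete, here is a minimal sketch of the idea in modern C++ (illustrative code, not anything from the interview): an algorithm that depends only on a few semantic properties of a position in a structure (it can be compared, dereferenced, and advanced), and so works unchanged over different data structures, while instantiating to the same loop one would write by hand.

    #include <cassert>
    #include <list>
    #include <vector>

    // Works for any iterator supporting ==, *, and ++. Instantiated for
    // a pointer into an array, it compiles to the hand-written loop, so
    // the abstraction costs nothing.
    template <class Iterator, class T>
    Iterator find(Iterator first, Iterator last, const T& value) {
        while (first != last && *first != value)
            ++first;
        return first;
    }

    int main() {
        std::vector<int> v = {3, 1, 4, 1, 5};
        std::list<int> l = {2, 7, 1, 8};

        // The same algorithm, unchanged, over two different structures.
        assert(*::find(v.begin(), v.end(), 4) == 4);
        assert(*::find(l.begin(), l.end(), 8) == 8);
        return 0;
    }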
At that time I thought that the right way of doing this kind of research was to develop a programming language, which is what I started doing with two of my friends, Deepak Kapur, who at present is a professor at the State University of New York, Albany, and David Musser, professor at Rensselaer Polytechnic Institute. At that time the three of us worked at the General Electric Research Center at Schenectady, NY. We started working on a language called Tecton, which would allow people to describe algorithms associated with what we called generic structures: just a collection of formal types and properties of those types. Sort of mathematical stuff. We realized that one can define an algebra of operations on these structures: you can refine them, you can enrich them, and do all sorts of things.
There were some interesting ideas, but the research didn't lead to practical results because Tecton was functional. We believed Backus's idea that we should liberate programming from the von Neumann style, and we didn't want to have side effects. That limited our ability to handle very many algorithms that require the notion of state and side effects.
The interesting thing about Tecton, which I realized sometime in the late 70s, was that there was a fundamental limitation in the accepted notion of an abstract data type. People usually viewed abstract data types as something which tells you only about the behavior of an object, while the implementation is totally hidden. It was commonly assumed that the complexity of an operation is part of the implementation and that abstraction ignores complexity. One of the things that is central to generic programming, as I understand it now, is that complexity, or at least some general notion of complexity, has to be associated with an operation.
Let's take an example. Consider an abstract data type stack. It's not enough to have Push and Pop connected with the axiom wherein you push something onto the stack and, after you pop the stack, you get the same thing back. It is of paramount importance that pushing the stack is a constant-time operation regardless of the size of the stack. If I implement the stack so that every time I push it becomes slower and slower, no one will want to use this stack.
We need to separate the implementation from the interface but not at the cost of totally ignoring complexity. Complexity has to be and is a part of the unwritten contract between the module and its user. The reason for introducing the notion of abstract data types was to allow interchangeable software modules. You cannot have interchangeable modules unless these modules share similar complexity behavior. If I replace one module with another module with the same functional behavior but with different complexity tradeoffs, the user of this code will be unpleasantly surprised. I could tell him anything I like about data abstraction, and he still would not want to use the code. Complexity assertions have to be part of the interface.
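As a minimal sketch of how such a contract might read in code (the class below is illustrative, not from the interview), the complexity guarantees sit in the interface right next to the signatures:

    #include <cassert>
    #include <vector>

    template <class T>
    class Stack {
        std::vector<T> data_;
    public:
        // Amortized constant time, regardless of the current size.
        // This guarantee is part of the contract, not a hidden detail.
        void push(const T& x) { data_.push_back(x); }

        // Constant time. Precondition: the stack is not empty.
        T pop() {
            T top = data_.back();
            data_.pop_back();
            return top;
        }

        bool empty() const { return data_.empty(); }
    };

    int main() {
        Stack<int> s;
        s.push(42);
        // The push/pop axiom: popping returns what was just pushed.
        assert(s.pop() == 42);
        return 0;
    }

A module that satisfied the same push/pop axioms but made push linear in the size of the stack would meet the functional specification and still be useless as a replacement; that is the sense in which the complexity assertion belongs to the interface.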
Around 1983 I moved from GE Research to the faculty of the Polytechnic University, formerly known as Brooklyn Polytechnic, in NY. I started working on graph algorithms. My principal collaborator was Aaron Kershenbaum, now at IBM Yorktown Heights. He was an expert in graph and network algorithms, and I convinced him that some of the ideas of high-order and generic programming were applicable to graph algorithms. He had some grants and provided me with support to start working with him to apply these ideas to real network algorithms. He was interested in building a toolbox of high-order generic components so that some of these algorithms could be implemented, because some of the network algorithms are so complex that, while they have been theoretically analyzed, they have never been implemented. I decided to use a dialect of Lisp called Scheme to build such a toolbox. Aaron and I developed a large library of components in Scheme demonstrating all kinds of programming techniques. Network algorithms were the primary target. Later Dave Musser, who was still at GE Research, joined us, and we developed even more components, a fairly large library. The library was used at the university by graduate students, but was never used commercially. I realized during this activity that side effects are important, because you cannot really do graph operations without side effects. You cannot replicate a graph every time you