Cons is a Perl-based software construction tool (i.e., substitute for make). It offers a number of features not found in make or other build tools, including integrated dependency analysis (no more "make depend" to generate static lists of .h files), complete dependency analysis across multiple directories, multiple side-by-side variant builds, compilation from code repositories, MD5 signatures instead of time stamps for determining whether a file is up-to-date, and extensibility via Perl.
Test::Cmd provides a framework for portable, automated testing of executable commands and scripts (in any language, not just Perl), especially those that interact with the file system. In addition to running tests and evaluating conditions, the Test::Cmd module manages and cleans up temporary workspace directories, and provides methods for creating files and directories from in-line data (here-documents).
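The Test::Cmd pattern (a managed temporary workspace, files created from in-line data, a command run against them, automatic cleanup) is easy to sketch. Here is a minimal, hypothetical Python analog of that pattern using only the standard library; it is not Test::Cmd's actual API, just an illustration of the workflow it automates:

```python
import subprocess
import sys
import tempfile
from pathlib import Path

# Sketch of the Test::Cmd workflow: create a throwaway workspace,
# populate it from in-line data, run a command in it, check the result.
# The workspace is cleaned up automatically when the block exits.
with tempfile.TemporaryDirectory() as workdir:
    script = Path(workdir) / "hello.py"
    script.write_text('print("hello from the workspace")\n')

    result = subprocess.run(
        [sys.executable, str(script)],
        capture_output=True,
        text=True,
    )

assert result.returncode == 0
assert result.stdout.strip() == "hello from the workspace"
```

Test::Cmd packages exactly these chores (plus condition evaluation and pass/fail reporting) behind one object, so each test script doesn't have to reimplement them.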
SCons is a software construction tool (build tool substitute for Make) implemented in Python. It features a modular build engine that can be embedded in other software. SCons configuration files are Python scripts that call the build engine API. It can scan files to detect dependencies automatically and maintain a global view of all dependencies in a build tree, and uses MD5 signatures to detect changed file contents reliably. Timestamps can be used if you prefer. SCons also supports parallel builds and is easily extensible through user-defined builder and scanner objects.
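The content-signature idea is simple enough to sketch in a few lines of Python. The function names below are illustrative, not the SCons build engine API; the point is that the decision depends only on file contents, never on timestamps:

```python
import hashlib

def signature(content: bytes) -> str:
    # A content-based signature: identical bytes always hash the same,
    # so touching a file without changing it never forces a rebuild.
    return hashlib.md5(content).hexdigest()

def needs_rebuild(stored_sig: str, content: bytes) -> bool:
    # Rebuild only when the stored signature no longer matches
    # the current contents of the file.
    return stored_sig != signature(content)

sig = signature(b"int main() { return 0; }")
assert not needs_rebuild(sig, b"int main() { return 0; }")  # unchanged: skip
assert needs_rebuild(sig, b"int main() { return 1; }")      # changed: rebuild
```

This is why restoring an old version of a file, or checking out a tree with fresh timestamps, doesn't trigger spurious rebuilds under a signature scheme the way it does under timestamp comparison.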
> % And for Automake: it should make the turn to a non-directory-recursive approach. There is some work being done in that area, but it seems to me more like a "we support that as an alternative" idea. That would result in a several-times-faster build, but better yet, a consistent one; read: no more problems from using recursive makefiles.
> I have found this article to be confusing and misleading. It discusses several common problems with makefiles, most of which are not related to using recursive makefiles. Putting everything into one giant makefile means that make has to parse and process a huge DAG for every operation; this can make things slower, not faster. I have never seen convincing evidence that monolithic makefiles are inherently "better", although you may be able to construct *some* cases where they run faster (trivial examples are likely to do
Real-world testimonial about the benefits of a global DAG:
Having a global view of the dependencies is really cool. Compare the couple of seconds of waiting with the need to "make clean; make" or "make depend; make." I wasted so much time before because I forgot about certain dependencies, resulting in inconsistent builds and strange bugs.
(This is from the project leader for the Computational Crystallography Toolbox, and he happened to mention it while talking about SCons, but it's really generic to any single-DAG build tool.)
The point is: yes, a global DAG may make any individual build take a little longer, which is what's sticking in your craw. Point taken. The benefit, though, comes from all of the problems you avoid by letting the build system take care of these things for you. And it's taking care of them using CPU cycles and disk accesses that are going to get faster with new generations of systems, so this part of your build will only get faster in the future.
What doesn't scale in the future, though, is the time you'll have to spend tracking down an introduced dependency problem and then reconfiguring your build to deal with it. Your counter-example to Miller's paper--just reorder the "bee" and "ant" projects--is itself contrived, because Miller is talking about the general problem of representing dependencies using order (or multiple passes). If order in your build system is necessary to represent all of the dependencies, then your build is fragile and will break as soon as someone checks in a change that breaks the order.
You could say, "Don't break the order," but it's just not necessarily that simple. Not every developer is sufficiently careful, and in a very large, complicated software system, it's likely that no one understands all of the order dependencies. So you end up with huge strings of -l options, or multiple passes through different targets, etc., that no one dares touch because no one really understands what will or won't break the build.
Now you could let the computer do the work of optimizing the build for you (a novel idea!), but if it's going to do that, it needs an accurate representation of all of the dependencies--a global DAG. As you point out, it doesn't come free, but for lots of projects, especially large ones, the benefits far outweigh the cost.
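The "let the computer do the work" point can be made concrete: given a global DAG, a correct build (or link) order falls out of a topological sort, so nobody maintains it by hand. A minimal sketch in Python, reusing the "ant"/"bee" project names from the counter-example above (the graph itself is made up):

```python
from graphlib import TopologicalSorter

# Hypothetical global dependency DAG: each project maps to the
# projects it depends on.
deps = {
    "app": {"bee", "ant"},
    "bee": {"ant"},
    "ant": set(),
}

# With the full graph in hand, the order is derived, not stated:
# reordering the entries above cannot break the build.
order = list(TopologicalSorter(deps).static_order())
assert order.index("ant") < order.index("bee") < order.index("app")
```

If someone checks in a change that adds a new edge, the computed order simply changes with it; there is no hand-maintained sequence of targets or -l options to fall out of date.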
Don't get me wrong--Miller's paper isn't flawless. For example, the Makefile-macro technique he uses for knitting together subsidiary Makefiles into a global DAG works fine (I've implemented it on a project), but it's not very extensible. You end up creating a lot of special-purpose macro conventions that have to get rewritten as your build assumptions change, new variants get added, etc.
But "Recursive Make Considered Harmful" is a seminal piece of work. Miller went back to first principles to investigate why increases in CPU and disk speeds over the last 20 years or so haven't sped up builds all that much, and identified a number of contributing factors that most everyone had been overlooking, including (but not limited to) incomplete dependency graphs. In my book, "RMCH" is required reading for anyone who needs to understand why building software correctly can be such a thorny problem.
SCons stability, autoconf functionality
Two such solutions are "SCons" and "Cons". Both try to replace Make with something far more flexible. However, they both depend on tools which are a bit less standard than sh and m4: Perl and Python. Still, both languages are quickly becoming standard on the major Unixes. SCons is a bit less mature, but is preferred by the author of this paper. Using either allows a much clearer build process for complicated software. There is no requirement to generate a set of build instructions from a template.
A few quick items of note about SCons:
SCons is actually a lot more stable than its alpha status would suggest. We're still tweaking edge cases in the user interface, but the base functionality has been in production use on a lot of projects for the last year and a half. And we have a very large suite of regression tests, implemented incrementally alongside the code, so once a bug has been fixed, the functionality tends to stay fixed. Plus, we're going to go to beta very soon, anyway.
Also, the functionality in SCons at this point is almost a superset of that in Cons classic, modulo a few smaller features whose absence doesn't seem to get in too many users' way.
Last, the current version of SCons added integrated support for autoconf-like functionality, starting with the ability to search for header files and libraries. It also provides a framework for adding your own tests, so you're not restricted only to what's available in the tool itself. The next version will add the ability to check for specific functions and types, and may add config.h-like header file generation, too.
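As a sketch, an SConstruct using that autoconf-like support might look like the following. This assumes the Configure() framework with its header and library checks; the exact method names and behavior may differ in your SCons version:

```python
# SConstruct -- sketch of SCons's autoconf-like configuration support.
env = Environment()
conf = Configure(env)

# Search for a header file and a library, as described above.
if not conf.CheckCHeader('zlib.h'):
    print('zlib.h not found; building without compression')
if not conf.CheckLib('z'):
    print('libz not found; building without compression')

env = conf.Finish()
env.Program('myapp', ['main.c'])
```

Because the checks run inside the same Python configuration file that describes the build, there is no separate configure step to keep in sync with the build description.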