Actually, I used to generate rules using gcc -MM, which does pretty much this: track dependencies in source files using #include directives.
Nevertheless, I'm happy to see that the idea of automatically handling dependencies is still being worked on. I'd be so happy to trash all these SCons config files we have to maintain on a near-daily basis.
djb/apenwarr's "redo" is a good make alternative, infinitely simpler and about as capable.
tup is a good make alternative, that does away with dependency definitions, recursive makes, etc. It just works and works quickly.
http://savannah.gnu.org/bugs/?712
I'm pretty sure that trying to replace them at this point is sacrilege.
The hard part is figuring out a migration path away from Autotools. Not replacing them as such; that's relatively easy. Integrating a new tool into a world dominated by Autotools, that's difficult.
Why doesn't this work in practice? Non-idiomatic usage? Custom autoconf scripts?
Pretty sure pkg-config can help with that. Don't see why you need SO.
It won't rebuild stuff that already exists. Once it decides that it needs to make foo.cc and foo.h into foo.o, it won't recompile foo.o unless foo.cc or foo.h changes... or one of its source dependencies changes.
It's something like this: take all of the timestamps for all of the inputs and outputs for any given target. The oldest output has to still be newer than the newest input. If any input is newer than any output, then we need to build.
It sounds weird, but if you write it out like a number line it makes sense.
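That number-line rule can be sketched in shell (file names invented for the demo; test's -nt operator compares modification times):

```shell
#!/bin/sh
# Sketch of the rule above: rebuild iff any input is newer than any
# output, i.e. the newest input passes the oldest output on the line.
cd "$(mktemp -d)"
touch -d '2024-01-01' foo.cc foo.h   # inputs
touch -d '2024-01-02' foo.o          # output: newer than every input
needs_build=no
for in_f in foo.cc foo.h; do
  # Missing output, or input newer than output => must build.
  if [ ! -e foo.o ] || [ "$in_f" -nt foo.o ]; then needs_build=yes; fi
done
first=$needs_build
echo "before edit: needs_build=$first"        # no: everything up to date
touch foo.h                          # an input just became the newest file
if [ foo.h -nt foo.o ]; then needs_build=yes; fi
echo "after edit:  needs_build=$needs_build"  # yes: input passed the output
```

With several inputs and outputs the same check generalizes to comparing the newest input timestamp against the oldest output timestamp, exactly as described.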
A comment for the ROOT users (http://root.cern.ch/) amongst you HN readers: this appears to be very similar to 'ckon', which emerged from my PhD work in 2011 and takes the humongous headache out of building C++ software modules within the ROOT analysis framework: http://tschaume.github.io/ckon/
@rachelbythebay: Since 'ckon' uses the same principles as your depot build tool, I thought you might be interested to take a look: https://github.com/tschaume/ckon :-)
Does anyone know if the "everything for your project must be contained within a single directory root" constraint can be worked around using symbolic links?
Having just tried it... sure, it'll work. Starting in my "depot" dir...
---
$ mkdir /tmp/hn
$ echo 'int main() { return 0; }' > /tmp/hn/hn.cc
$ ln -s /tmp/hn src/hn
$ bb hn/hn
I1106 144900 4720 build/dep.cc:591] Compiling: hn/hn
I1106 144902 4720 build/deptracker.cc:184] Linking: hn/hn
-rwxr-xr-x 1 u g 7364 Nov 6 14:49 bin/hn/hn
$ bin/hn/hn
$ echo $?
0

system_header { name: "microhttpd.h" ldflag: "-lmicrohttpd" }
system_header { name: "mysql/mysql.h" ldflag: "-L/usr/lib64/mysql" ldflag: "-lmysqlclient" }
Let's see... hard-coded absolute paths, compiler-specific flags... Yeah, I'll stick with QBS and wait for proper modules.
It should be more like this:
pkg-config --identify-file=microhttpd.h --> "libmicrohttpd"
The problem now is that you can't go from the .h name to the package name. Once you have the package name, you can use pkg-config to give you --libs and --cflags, but there's a (big) piece of the puzzle missing at the moment.
Changing pkg-config and its users to add that mapping would be amazing.
Once we had that, tools like this could see #include <foo.h>, look it up to a package name, use that to get the cflags and ldflags, and that would be it. No config needed.
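To make the gap concrete: the second half of that workflow already works today; only the header-to-package lookup is missing (the --identify-file flag above is a wish, not a real flag). A sketch, assuming libmicrohttpd as the package name; the block degrades gracefully if it isn't installed:

```shell
#!/bin/sh
# Step 1 (header -> package name) does not exist in pkg-config today.
# Step 2 (package name -> flags) is exactly what pkg-config provides:
pkg=libmicrohttpd
if pkg-config --exists "$pkg" 2>/dev/null; then
  echo "cflags:  $(pkg-config --cflags "$pkg")"
  echo "ldflags: $(pkg-config --libs "$pkg")"
else
  echo "$pkg not installed (pkg-config --exists returned nonzero)"
fi
```

If the .pc files ever grew a header list, a tool could grep them to invert the mapping, but nothing standard does that yet.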
The page ("deps.html") that introduces the .build.conf file says "You can use pkg-config to help find those flags if your system has that installed", but then the actual configuration shown uses only absolute paths.
That threw me off, since it's not at all obvious how or where you can insert calls to external tools (like pkg-config) in the static-looking configuration. I think it'd be a good idea to edit in an example showing pkg-config being used.