[darcs-users] darcs record and huge patches
ketil at ii.uib.no
Wed Jan 28 08:30:22 UTC 2004
Aggelos Economopoulos <aoiko at cc.ece.ntua.gr> writes:
>> So, has anyone tried running record on a large tree with many local
>> changes? If it isn't supposed to work I'll just kill the process, but
>> if is, how long should it take?
> Well, it took about five hours, but seems to have worked (produced a
> 27M patch (6M compressed)). Still, it doesn't seem normal that it
> should take that long or consume so much memory - can't you force
> garbage collection at some point? Would disabling use of mmap() help
> in such extreme cases?
If you have a self-compiled darcs, you could try recompiling it with
profiling enabled (add "-prof" to the compiler switches¹). Then run the
same command with the additional parameters "+RTS -p -RTS", which will
give you a breakdown of time usage per function.
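As a rough sketch of what that looks like (the exact build flags and file
names depend on your GHC version and how darcs is built, so treat these
commands as illustrative, not exact):

```
$ ghc -prof -auto-all --make darcs    # -auto-all marks every top-level
                                      # function as a cost centre
$ ./darcs record +RTS -p -RTS         # run the slow command with RTS
                                      # time-profiling enabled
$ less darcs.prof                     # per-cost-centre breakdown of
                                      # time and allocation
```

Adding "-h" next to "-p" (and viewing the resulting .hp file) would also
show heap usage over time, which could help pin down the memory problem.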
But perhaps we should first wait to see if this is a known (or perhaps
even already solved) problem? :-)
¹ Assuming GHC as the compiler
If I haven't seen further, it is by standing in the footprints of giants