[darcs-devel] Support for darcshive repositories?

David Roundy droundy at abridgegame.org
Fri Dec 3 04:41:43 PST 2004


On Fri, Dec 03, 2004 at 12:17:47PM +1000, Anthony Towns wrote:
> David Roundy wrote:
> >On Wed, Dec 01, 2004 at 03:11:02AM +1000, Anthony Towns wrote:
> >>You could have "DARCS_FOO_GET" take a URL, and expect it to dump the 
> >>file on stdout; and have DARCS_FOO_MGET or DARCS_FOO_APPLY optionally 
> >>available for special darcs-specific wrappers/scripts -- if _MGET isn't 
> >>set it can be faked with multiple _GETs after all, and if _APPLY's not 
> >>set, well you just can't push.
> >I'm thinking we could perhaps make the MGET accept a list of URLs and a
> >target directory?
> 
> Why not just have the target directory be "." by the time we run the 
> command? That's what we do for wget already...

Oh sure, that would make sense.
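
(As an aside on the earlier point about faking _MGET with multiple _GETs
when only _GET is configured: here's a minimal Haskell sketch of what that
fallback could look like.  The fakeMget name and the choice of the URL's
basename as the output filename are just illustrative assumptions, not
anything darcs does today; error handling is omitted.)

    import System.FilePath (takeFileName)
    import System.IO (IOMode(WriteMode), withFile)
    import System.Process (StdStream(UseHandle), createProcess, proc,
                           std_out, waitForProcess)

    -- Fallback when only a plain GET command is configured: call it once
    -- per URL and write its stdout to the URL's basename in the current
    -- directory (the GET command is expected to dump the file on stdout).
    fakeMget :: String -> [String] -> IO ()
    fakeMget getCmd urls = mapM_ fetchOne urls
      where
        fetchOne url =
          withFile (takeFileName url) WriteMode $ \h -> do
            (_, _, _, ph) <- createProcess
                               (proc getCmd [url]) { std_out = UseHandle h }
            _ <- waitForProcess ph
            return ()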

> >I think that may work with non-wrapper commands, although of course it
> >requires that the "filename" portion of each URL be unique, but that's
> >not an onerous restriction.
> 
> The only question is whether to pass them on stdin or as command line
> arguments. Extra arguments work for more programs (curl, fetch), but then
> we'd probably have to do our own "xargs" so we don't trip over programs
> that can't deal with 2000+ arguments... I guess it's okay to invoke the
> MGET command, say, once for every 200 files to download. :-/

Hmmmm.  How about a MGET_MAXFILES environment variable, defaulting to some
reasonable value? I'm not sure how we'd determine what counts as reasonable
here...
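
Roughly what I have in mind, as a minimal sketch in Haskell (that being
what darcs is written in).  The MGET_MAXFILES name is the variable proposed
above, the 200-file default is taken from your suggestion, and the
mgetInBatches/chunksOf helpers are purely illustrative:

    import Data.Maybe (fromMaybe)
    import System.Environment (lookupEnv)
    import System.Exit (ExitCode(..))
    import System.Process (rawSystem)

    -- Split a list into chunks of at most n elements.
    chunksOf :: Int -> [a] -> [[a]]
    chunksOf _ [] = []
    chunksOf n xs = take n xs : chunksOf n (drop n xs)

    -- Invoke the external MGET command once per batch of URLs, so we never
    -- hit the OS limit on the number (or total length) of arguments.  The
    -- command is expected to drop the fetched files in the current
    -- directory.
    mgetInBatches :: String -> [String] -> IO ()
    mgetInBatches cmd urls = do
        maxFiles <- fromMaybe 200 . fmap read <$> lookupEnv "MGET_MAXFILES"
        mapM_ runBatch (chunksOf maxFiles urls)
      where
        runBatch batch = do
            ec <- rawSystem cmd batch
            case ec of
              ExitSuccess -> return ()
              _           -> ioError (userError (cmd ++ " failed"))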

I'd rather pass the files as command line arguments.  That way the
interface can easily cope with filenames containing spaces, newlines, etc,
since we're just passing a bunch of C strings to MGET.  Otherwise we'd have
to either restrict what characters filenames may contain or invent some
sort of quoting scheme.
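
To make the quoting point concrete: with the sketch above, a call like the
following (command name and URLs made up) hands MGET each URL as a single
argv entry, because rawSystem execs the program directly rather than going
through a shell, so a name containing a space or a newline needs no
escaping at all.

    main :: IO ()
    main = mgetInBatches "darcs-mget"    -- hypothetical wrapper command
             [ "http://example.org/repo/_darcs/patches/foo bar.gz"
             , "http://example.org/repo/_darcs/patches/baz.gz" ]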
-- 
David Roundy
http://www.darcs.net



