Message-id: <9303121401.AA10183@xdm039>
To: www-talk@nxoc01.cern.ch
Cc: ccprl@xdm001.ccc.cranfield.ac.uk
Subject: WWW processing of tar files
Date: Fri, 12 Mar 93 14:00:58 GMT
From: "Peter Lister, Cranfield Computer Centre" <ccprl@xdm001.ccc.cranfield.ac.uk>
Have WWW (or Gopher for that matter) developers considered processing
tar and tar.Z files as if they were directories? One could browse the
contents of a remote tar file, or treat it as a binary file to copy
back to the local host.
This has obvious applications for shareware source at FTP archives, and
could save disk space and bandwidth: it removes the need to copy an
entire tar file back when only a few files are wanted; it economises on
server disk space by not having to keep an unpacked directory tree as
well as a tar.Z; and it encourages bulk transfers to be compressed while
still letting the user see "just files". Browsers would allow the user
to save the tar/tar.Z file, or just the list of contents, or to unpack
it transparently. If added to libwww, this could be built easily into
both the client and server ends.
Comments? Is this sensible?
Peter Lister                                 p.lister@cranfield.ac.uk
Computer Centre,                             Voice: +44 234 754200 ext 2828
Cranfield Institute of Technology,           Fax:   +44 234 750875
Cranfield, Bedfordshire MK43 0AL, England