Re: [Module::Build] codebase? paths in .tar.gz?
From: Glenn L. <pe...@ne...> - 2003-12-22 01:31:11
On approximately 12/21/2003 4:29 AM, came the following characters from
the keyboard of Jos I. Boumans:
>
> On Sun, 21 Dec 2003, Andrew Savige wrote:
>
>>At the cost of making the A::T code somewhat ugly, I think an imperfect
>>heuristic solution is possible, so that if all PATHs in the tarball were
>>less than 100 characters in length, A::T could write the whole path at
>>the front of the chunk and not use the 'prefix' field -- with a different
>>code path, using the 'prefix' field, followed if any path in the tarball
>>exceeds 100 characters in length.
>
> i'm very much opposed to this solution, since this still doesn't actually
> /fix/ the problem; it's merely coding around a problem in /other/ tools,
> and it will introduce a very fun /new/ 'bug' that distributions that keep
> their longest path under 100 chars will do the right thing, and any
> distribution with a path > 100 chars will now break... id much rather
> document this shortcoming in other tools and add a dependency on a
> /proper/ tar extractor, that follows the specs for the tar format... A::T,
> gnu tar and quite a few others do... old bsd tars and some )"(#/"#)
> windows tools dont.....
>
> just my $0.02
>
> -jos

Interesting point of view. That's sort of like outlawing all the old
cars that don't have some new safety feature and making it illegal to
drive them on the roads.

It sure seems to me that a high percentage of the tarballs in the
world contain only paths shorter than 100 characters.
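
Just to make that point concrete, here is a rough, untested sketch
(the distribution directory name is only a placeholder) that scans a
build tree and reports whether any path would even need the 'prefix'
field:

  use strict;
  use warnings;
  use File::Find;

  my $dist = 'My-Module-0.01';   # placeholder distribution directory
  my @long;                      # paths that break "all under 100 chars"

  find(sub {
      push @long, $File::Find::name
          if length($File::Find::name) >= 100;
  }, $dist);

  if (@long) {
      print "These paths need the 'prefix' field (or GNU extensions):\n";
      print "  $_\n" for @long;
  } else {
      print "Every path fits in the plain 100-byte name field.\n";
  }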

It is too bad that the old programs look at the new tarball and
silently show only a subset of the data, instead of reporting the real
condition, at least as a warning if not an error: "This tarball was
produced using a version of tar that implements a newer specification
than this tar program. To see the tarball in its full detail, you
should use a tar program that implements specification GHI. This tar
program only implements specification ABC." That would explain the
situation to the user.
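
That check would not be hard to add, either. As a rough sketch (plain
Perl, only the first 512-byte header is examined, and the wording of
the message is just illustrative), an extractor could do something
like:

  use strict;
  use warnings;

  my $tarfile = shift or die "usage: $0 file.tar\n";
  open my $fh, '<', $tarfile or die "open $tarfile: $!\n";
  binmode $fh;
  read($fh, my $header, 512) == 512 or die "short read on $tarfile\n";

  # POSIX ustar header layout: name[100] ... magic[6] ... prefix[155]
  my $magic  = substr($header, 257, 6);
  my $prefix = substr($header, 345, 155);
  $prefix =~ s/\0.*//s;    # strip NUL padding

  if ($magic =~ /^ustar/ && length $prefix) {
      warn "This tarball uses the POSIX ustar 'prefix' field.\n",
           "A tar program that only implements the old format will\n",
           "show truncated paths; use a ustar-aware extractor to see\n",
           "the tarball in its full detail.\n";
  }

A real check would of course walk every 512-byte header in the
archive, not just the first one, but the idea is the same.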

On the other hand, when a tarball can be created correctly according
to the older specification without any loss of functionality, doing so
is an extremely friendly feature, even for a program that is capable
of creating tarballs that use the new specification and its new
functionality. This sort of thing is called "backward compatibility",
and it encourages people to use the new program, since it interoperates
with the old, instead of encouraging them to throw the new program away
and stick with the old, tried-and-true, interoperable programs.
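
Being backward compatible here really only means preferring the old
layout and falling back to the name/prefix split when a path forces
it. A minimal sketch of that decision (the field sizes come from the
ustar spec; the helper name is mine, not anything in Archive::Tar):

  use strict;
  use warnings;

  # Decide how to store a path in a ustar header: plain name[100] when
  # it fits, otherwise split at a '/' so that the tail fits in the
  # 100-byte name field and the head fits in the 155-byte prefix field.
  sub split_ustar_path {
      my ($path) = @_;
      return ('', $path) if length($path) <= 100;   # old tools stay happy

      my $max = length($path) - 1;
      $max = 155 if $max > 155;
      for (my $i = $max; $i > 0; $i--) {
          next unless substr($path, $i, 1) eq '/';
          my $name = substr($path, $i + 1);
          return (substr($path, 0, $i), $name) if length($name) <= 100;
      }
      die "'$path' cannot be represented in ustar format\n";
  }

  # illustration only: a path longer than 100 characters
  my $long_path = join '/', 'My-Module-0.01', ('directory') x 12, 'file.pm';
  my ($prefix, $name) = split_ustar_path($long_path);
  print "prefix: $prefix\nname:   $name\n";

Old extractors that ignore the prefix field will still see the file,
just under a shortened path, which is exactly the reduced-but-warned
behavior I would like to see.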

I have to admit that if I were under time pressure today to release my
new module (I still have a few weeks to go), I would be extremely
tempted to discard both
  (1) Archive::Tar, and
  (2) Module::Build,
because
  (A) the module that I built won't install properly using PPM (even
      with A::T v1.07),
  (B) I can't see the paths to the files using PowerArchiver 2001,
  (C) I don't get warnings for each directory that is supposed to be
      in the tarball, and
  (D) I have other sample modules, built with MakeMaker and ordinary
      tar programs, that do not suffer from (A), (B), or (C), so that
      is clearly a functional path to take.

Since I have some lead time, I'll wait and see how this plays out over
the next few weeks. People like backward compatibility, because it
makes things work. And when things don't work because of incremental
versionitis, it is nice if there is a built-in method for reporting
why, instead of simply providing reduced functionality without
warning. Providing reduced functionality is OK, with the warning. So
if an old tar program can extract the data but not put it in the right
place, that is OK, as long as it tells me that the tarball uses a
newer format that it might not be able to fully decode.

The fact that there are bugs and deficiencies in old programs,
however, is no excuse for a new program not to be as backward
compatible as it can be.

Personally, I'd rather see archives stored as .zip files, which at
least were designed to do both archiving and compression in one step,
rather than play the two-step jig of .tar.gz. But that's a whole
'nother story, and an idea which probably won't get much momentum or
attention, because .tar.gz is there and works. Well, given my above
woes, sort of.
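
(For what it's worth, with the usual CPAN modules the one-step versus
two-step difference looks roughly like this; the file names below are
placeholders, and the snippet is a sketch, not something I have timed
or benchmarked.)

  use strict;
  use warnings;
  use Archive::Zip qw(:ERROR_CODES);
  use Archive::Tar;

  my @files = ('lib/My/Module.pm', 'Makefile.PL', 'README');  # placeholders

  # .zip: archiving and compression happen in one step, in one format
  my $zip = Archive::Zip->new;
  $zip->addFile($_) for @files;
  $zip->writeToFileNamed('My-Module.zip') == AZ_OK
      or die "zip write failed\n";

  # .tar.gz: archive, then compress; Archive::Tar hides the gzip step
  # behind the second argument, but it is still two formats stacked
  Archive::Tar->create_archive('My-Module.tar.gz', 9, @files)
      or die "tar write failed\n";
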
--
Glenn -- http://nevcal.com/
===========================
Like almost everyone, I receive a lot of spam every day, much of it
offering to help me get out of debt or get rich quick. It's ridiculous.
-- Bill Gates
And here is why it is ridiculous:
The division that includes Windows posted an operating profit of $2.26
billion on revenue of $2.81 billion.
--from Reuters via
http://biz.yahoo.com/rc/031113/tech_microsoft_msn_1.html
So that's profit of over 400% of investment... with a bit more
investment in Windows technology, particularly in the area of
reliability, the profit percentage might go down, but so might the bugs
and security problems? Seems like it would be a reasonable tradeoff.
WalMart earnings are 3.4% of investment.