James,
On 18 Oct 2005, at 07:07, James Ring wrote:
> In the case of my final year university group project, the lecturers
> are explicitly basing some of our assessment on the number of lines
> contributed to CVS. I think that this is a sorry situation, given how
> poorly a simple line count can represent somebody's input to a
> software engineering project.
True. In the words of Bill Gates: "Measuring programming progress by
lines of code is like measuring aircraft building progress by weight."
(I love this quote. Your teachers would probably hate it ;-)
>> Link and some commentary on my weblog:
>> http://dowhatimean.net/2005/10/cvs-and-performance
>
> You mention that the lack of correlation between code metrics and
> access patterns does not bode well for StatCVS. I don't think that
> it's necessarily a bad thing... Finding correlations between student
> grades and CVS activity is only one possible application for a good
> tool such as StatCVS.
>
> For example, in an industry project, the project manager may use a
> whole bunch of statistics from various sources (including, but not
> limited to, StatCVS) to evaluate the health and progress of their
> project.
>
> I guess what I'm trying to say is that by not providing more metrics,
> we'll never see if any interesting correlations come up in unexpected
> places.
>
> I'd certainly like to see some sort of Java code metrics as an
> "add-on" for StatCVS, like the ability to measure (for example)
> method fan-in and fan-out in a Java project over time.
That would certainly be interesting, but there are two issues:
1) It's programming-language dependent, which limits the target
audience. 2) I can't see how to present this in an end-user
application. If we had a graph of method fan-in over time, I can see
everybody going: "OK, so method fan-in has risen 30% over the last
half year. Now is this good or bad?"
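That said, the counting itself would be the easy part. Here is a rough
sketch (plain Python, just to illustrate the idea; the function name
and the call-graph input are made up, and a real add-on would have to
extract the call graph from the Java sources at each revision):

```python
# Sketch: compute per-method fan-in/fan-out from a list of
# (caller, callee) pairs. The call graph below is invented for
# illustration; extracting it from real code is the hard part.
from collections import defaultdict

def fan_metrics(calls):
    """Return {method: (fan_in, fan_out)} for every method seen."""
    fan_in = defaultdict(int)
    fan_out = defaultdict(int)
    for caller, callee in calls:
        fan_out[caller] += 1  # caller makes one more outgoing call
        fan_in[callee] += 1   # callee gains one more incoming call
    methods = set(fan_in) | set(fan_out)
    return {m: (fan_in[m], fan_out[m]) for m in methods}

calls = [("main", "parse"), ("main", "report"),
         ("parse", "readLine"), ("report", "readLine")]
metrics = fan_metrics(calls)
# readLine is called from two places and calls nothing:
# metrics["readLine"] == (2, 0)
```

Run that over every revision and you get your time series -- but the
interpretation problem above remains.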
Richard
>
> The more stats, the merrier!
>
>
>> Best,
>> Richard
>>
>>
>
> Thanks,
> James
> --
> James Ring