senseclusters-developers Mailing List for SenseClusters
Status: Beta
Brought to you by:
tpederse
From: Ted P. <dul...@gm...> - 2015-10-03 22:35:02
We are pleased to announce a new release of SenseClusters. This is a very minor bug fix release, but it might be something you want to adopt, since it will eliminate some annoying warning messages that appear as of Perl v5.15. In that version of Perl the use of defined(@array) has been deprecated, and we had a few spots where we were using this and where it now causes warnings. That has been resolved, so as of version 1.05 of SenseClusters you should not see these warnings.

More about defined(@array) can be found here, if you are interested: http://www.perlmonks.org/index.pl?node_id=1077762

You can download the most current version of SenseClusters from CPAN or Sourceforge by following the links here: http://senseclusters.sourceforge.net

Please let us know if any questions arise.

Cordially,
Ted

--
Ted Pedersen
http://www.d.umn.edu/~tpederse
From: Ted P. <dul...@gm...> - 2015-05-25 14:55:38
Greetings all,

I just wanted to mention that SenseClusters participated in Task 15 of SemEval 2015, and I'll be presenting a poster about that on Thursday, June 4, from 2:45 - 4:00 pm (as part of the SemEval workshop).

Here's a bit more about the task: http://alt.qcri.org/semeval2015/task15/

And here's the schedule for SemEval: http://alt.qcri.org/semeval2015/cdrom/program.html

Finally, here's the paper that describes the SenseClusters system: http://alt.qcri.org/semeval2015/cdrom/pdf/SemEval076.pdf

So, if you will be in Denver it would be nice to see you at the poster session or some other time!

Cordially,
Ted
From: Ted P. <tpederse@d.umn.edu> - 2013-06-30 03:11:35
We are pleased to announce the release of version 1.03 of SenseClusters. This is the first new release in 5 years, and should be the first of several upcoming releases. There has been a little bit of cleanup in the test scripts and other places, but the main new functionality is some additional ways of labeling the discovered clusters. Before this version clusters were labeled with significant bigrams; as of version 1.03 it is now possible to label clusters with trigrams or 4-grams. Additional functionality related to cluster labeling is expected to be released in the coming months, so please give this a try and let us know of any suggestions or observations you might have.

The changes in this version are enumerated below. You can download from CPAN or Sourceforge via the links provided here: http://senseclusters.sourceforge.net

1.03 Released June 29, 2013 (changes by TDP and AMJ)

Modified install.sh to default to Linux-x86_64 for Cluto installation (TDP)

Removed various instances of if (defined %hash) in preprocess/sval2 in favor of if (%hash), since defined %hash is now deprecated. However, this was left in keyconvert.pl, as removing it caused a syntax issue that should be checked out further (TDP)

Fixed Testing/ALL-TESTS.sh to run all test cases by enumerating them in a for loop; the previous method of using a wild card did not seem to be running all cases (TDP)

Fixed some test cases for clusterstopping in Testing. Note that we still have Sun test cases included although we have no Sun platform to test on; we should keep those, though, as Cluto still comes with a Sun version (TDP)

Added the flag --ngram for clusterlabeling.pl. It allows the user to provide the value for ngram; feature selection while creating the cluster labels will be based on this parameter (AMJ)

Added --label_ngram option to discriminate.pl to support the new --ngram option in clusterlabeling.pl (AMJ)

Added test cases testA6 and testA7 to test the changes in clusterlabeling (AMJ)

Updated INSTALL to mention dependencies on csh and using bash as the system shell (TDP)

Please let us know of any questions, problems, or suggestions!

Enjoy,
Ted and Anand

--
Ted Pedersen
http://www.d.umn.edu/~tpederse
From: Ted P. <tpederse@d.umn.edu> - 2013-06-29 20:42:34
Here's the testing log for the candidate 1.03 release. There are two unresolved issues: one is an error in wordvec, and the other is some deprecated behavior in keyconvert.pl. I think I'm content to let these slide until a later release, just because I'm not sure how to resolve either (not because they are horrible problems, I don't think, just a matter of having time to look them over). It also seems like clusterstopping and SVD results are sometimes hardware dependent, which is something we've encountered previously. This test log is based on a Linux x86_64 system. Results could differ on other systems, but I don't think they would reveal any fundamental problem.

So, I think I'm relatively close to releasing 1.03...let me know if there is any reason we should not go ahead with that...

Thanks!
Ted

----------------------------------------------------
------------- clusterlabeling.pl -------------------
----------------------------------------------------

Test A1 - Testing clusterlabeling.pl for a no label cluster.
Running clusterlabeling.pl testA1.clusters_context --prefix testA1 --rank 5 --window 4 --stop stoplist.new --token token.regex --stat ll --remove 2 > testA1.output
STATUS : OK Test Results Match.
STATUS : OK Cluster file testA1.cluster.-1 created.
STATUS : OK Cluster file testA1.cluster.0 created.
STATUS : OK Cluster file testA1.cluster.1 created.

Test A2 - Testing clusterlabeling.pl
Running clusterlabeling.pl --token token.regex --stop stoplist.new --rank 5 --stat ll --prefix testA2 testA2.clusters_context > testA2.output
STATUS : OK Test Results Match.
STATUS : OK Cluster file testA2.cluster.0 created.
STATUS : OK Cluster file testA2.cluster.1 created.

Test A3 - Testing clusterlabeling.pl
Running clusterlabeling.pl --token token.regex --stop stoplist.new --rank 5 --stat ll --prefix testA3 testA3.clusters_context > testA3.output
STATUS : OK Test Results Match.
STATUS : OK Cluster file testA3.cluster.0 created.
STATUS : OK Cluster file testA3.cluster.1 created.
Test A4 - Testing clusterlabeling.pl with Pointwise Mutual Information (pmi) as the test of association.
Running clusterlabeling.pl --token token.regex --rank 5 --remove 6 --stop stoplist.new --stat pmi --prefix testA4 testA4.clusters_context > testA4.reqd
STATUS : OK Test Results Match.
STATUS : OK Cluster file testA4.cluster.0 created.
STATUS : OK Cluster file testA4.cluster.1 created.

Test A5 - Testing clusterlabeling.pl without stoplist.
Running clusterlabeling.pl --token token.regex --rank 5 --stat ll --prefix testA5 testA5.clusters_context > testA5.output
STATUS : OK Test Results Match.
STATUS : OK Cluster file testA5.cluster.0 created.
STATUS : OK Cluster file testA5.cluster.1 created.

Test A6 - Testing clusterlabeling.pl without stoplist.
Running clusterlabeling.pl --token token.regex --window 4 --remove 2 --rank 5 --stat ll --prefix testA6 testA6.clusters_context --ngram 3 > testA6.output
STATUS : OK Test Results Match.
STATUS : OK Cluster file testA6.cluster.0 created.
STATUS : OK Cluster file testA6.cluster.1 created.

Test A7 - Testing clusterlabeling.pl with stoplist.
Running clusterlabeling.pl --token token.regex --window 4 --remove 2 --rank 5 --stat ll --prefix testA7 testA7.clusters_context --ngram 3 --stop stoplist.new > testA7.output
STATUS : OK Test Results Match.
STATUS : OK Cluster file testA7.cluster.0 created.
STATUS : OK Cluster file testA7.cluster.1 created.

----------------------------------------------------
------------- clusterstopping.pl -------------------
----------------------------------------------------

Test A1 - Testing clusterstopping.pl with default settings.
Running clusterstopping.pl --prefix testA1 testA1.vectors
STATUS : OK Test Results Match.
STATUS : OK File testA1.cr.dat created.
STATUS : OK File testA1.pk3 created.
STATUS : OK File testA1.pk3.dat created.

Test A2 - Testing clusterstopping.pl when using all measures.
Running clusterstopping.pl --prefix testA2 --measure all --clmethod rbr --crfun e1 --precision 6 --seed 3 testA2.vectors
STATUS : OK Test Results Match.
STATUS : OK File testA2.cr.dat created.
STATUS : OK File testA2.pk1 created.
STATUS : OK File testA2.pk1.dat created.
STATUS : OK File testA2.pk2 created.
STATUS : OK File testA2.pk2.dat created.
STATUS : OK File testA2.pk3 created.
STATUS : OK File testA2.pk3.dat created.
STATUS : OK File testA2.exp.dat created.
STATUS : OK File testA2.gap created.
STATUS : OK File testA2.gap.dat created.
STATUS : OK File testA2.gap.log created.

Test A3 - Testing clusterstopping.pl in similarity space.
Running clusterstopping.pl --prefix testA3 --space similarity --measure pk testA3.vectors
STATUS : OK Test Results Match.
STATUS : OK File testA3.cr.dat created.
STATUS : OK File testA3.pk1 created.
STATUS : OK File testA3.pk1.dat created.
STATUS : OK File testA3.pk2 created.
STATUS : OK File testA3.pk2.dat created.
STATUS : OK File testA3.pk3 created.
STATUS : OK File testA3.pk3.dat created.

Test A4 - Testing clusterstopping.pl in similarity space with all options.
Running clusterstopping.pl --prefix testA4 --measure all --space similarity --delta 2 --clmethod rbr --crfun i2 --threspk1 -0.6 --seed 5 testA4.simat
STATUS : OK Test Results Match.
STATUS : OK File testA4.cr.dat created.
STATUS : OK File testA4.pk1 created.
STATUS : OK File testA4.pk1.dat created.
STATUS : OK File testA4.pk2 created.
STATUS : OK File testA4.pk2.dat created.
STATUS : OK File testA4.pk3 created.
STATUS : OK File testA4.pk3.dat created.
STATUS : OK File testA4.gap created.
STATUS : OK File testA4.exp.dat created.
STATUS : OK File testA4.gap.dat created.
STATUS : OK File testA4.gap.log created.

Test A5 - Testing clusterstopping.pl in vector space using all options.
Running clusterstopping.pl --prefix testA5 --measure all --space vector --delta 2 --clmethod bagglo --crfun h1 --sim corr --rowmodel log --colmodel idf testA5.vectors
STATUS : OK Test Results Match.
STATUS : OK File testA5.cr.dat created.
STATUS : OK File testA5.pk1 created.
STATUS : OK File testA5.pk1.dat created.
STATUS : OK File testA5.pk2 created.
STATUS : OK File testA5.pk2.dat created.
STATUS : OK File testA5.pk3 created.
STATUS : OK File testA5.pk3.dat created.

Test A6 - Testing clusterstopping.pl in vector space using all options.
Running clusterstopping.pl --clmethod direct --crfun i1 --measure all testA6.vectors
STATUS : OK Test Results Match.
STATUS : OK File testA6.cr.dat created.
STATUS : OK File testA6.pk1 created.
STATUS : OK File testA6.pk1.dat created.
STATUS : OK File testA6.pk2 created.
STATUS : OK File testA6.pk2.dat created.
STATUS : OK File testA6.pk3 created.
STATUS : OK File testA6.pk3.dat created.
STATUS : OK File testA6.gap created.
STATUS : OK File testA6.exp.dat created.
STATUS : OK File testA6.gap.dat created.
STATUS : OK File testA6.gap.log created.

Test A7 - Testing clusterstopping.pl in vector space with delta = 0 and all options.
clusterstopping.pl --prefix testA7 --measure all --delta 0 --clmethod rbr --crfun i2 --seed 5 testA7.vectors > testA7.output
STATUS : OK Test Results Match.
STATUS : OK File testA7.cr.dat created.
STATUS : OK File testA7.pk1 created.
STATUS : OK File testA7.pk1.dat created.
STATUS : OK File testA7.pk2 created.
STATUS : OK File testA7.pk2.dat created.
STATUS : OK File testA7.pk3 created.
STATUS : OK File testA7.pk3.dat created.
STATUS : OK File testA7.gap created.
STATUS : OK File testA7.exp.dat created.
STATUS : OK File testA7.gap.dat created.
STATUS : OK File testA7.gap.log created.

Test A8 - Testing clusterstopping.pl in vector space with delta = 0 on contrived data to test the prediction consistency across platforms.
clusterstopping.pl --prefix testA8 --measure all --delta 0 --clmethod rbr --crfun i2 --seed 5 testA8.vectors > testA8.output
STATUS : OK Test Results Match.
STATUS : OK File testA8.cr.dat created.
STATUS : OK File testA8.pk1 created.
STATUS : OK File testA8.pk1.dat created.
STATUS : OK File testA8.pk2 created.
STATUS : OK File testA8.pk2.dat created.
STATUS : OK File testA8.pk3 created.
STATUS : OK File testA8.pk3.dat created.
STATUS : OK File testA8.gap created.
STATUS : OK File testA8.exp.dat created.
STATUS : OK File testA8.gap.dat created.
STATUS : OK File testA8.gap.log created.

----------------------------------------------------
---------------- reduce-count.pl -------------------
----------------------------------------------------

Test A1 for reduce-count.pl
Running reduce-count.pl test-A1.bi test-A1.uni
Test Ok

Test A2 for reduce-count.pl
Running reduce-count.pl test-A2.bi test-A2.uni
Test Ok

Test A3 for reduce-count.pl
Running reduce-count.pl test-A3.bi test-A3.uni
Test Ok

Test A4 for reduce-count.pl
Running reduce-count.pl test-A4.bi test-A4.uni
Test Ok

Test B1 for reduce-count.pl
Running reduce-count.pl test-B1.bi test-B1.uni
Test Ok

----------------------------------------------------
---------------- cluto2label.pl ---------------------
----------------------------------------------------

Test A1 Testing cluto2label.pl
Running cluto2label.pl test-A1.cluto test-A1.key
Test A1 OK

Test A2 Testing cluto2label.pl
Running cluto2label.pl test-A2.cluto test-A2.key
Test A2 OK

Test A3 Testing cluto2label.pl
Running cluto2label.pl test-A3.cluto test-A3.key
Test A3 OK

Test A4 Testing cluto2label.pl
Running cluto2label.pl test-A4.cluto test-A4.key
Test A4 OK

Test A5 Testing cluto2label.pl
Running cluto2label.pl test-A5.cluto test-A5.key
Test A5 OK

Test A6 Testing cluto2label.pl
Running cluto2label.pl --numthrow 2 test-A6.cluto test-A6.key
Test A6 OK

Test A7 Testing cluto2label.pl
Running cluto2label.pl --perthrow 10 test-A7.cluto test-A7.key
Test A7 OK

----------------------------------------------------
------------- format_clusters.pl -------------------
----------------------------------------------------

Test A1 - Testing format_clusters.pl without any options, i.e. the default output format.
Running format_clusters.pl testA1.clusol testA1.rlabel
STATUS : OK Test Results Match.

Test A2 - Testing format_clusters.pl with --context options.
Running format_clusters.pl --context testA2.sval2 testA2.clusol testA2.rlabel
STATUS : OK Test Results Match.

Test A3 - Testing format_clusters.pl with --senseval2 options.
Running format_clusters.pl --senseval2 testA3.sval2 testA3.clusol testA3.rlabel
STATUS : OK Test Results Match.

Test B1 - Testing format_clusters.pl with --context and --senseval2 options, both.
Running format_clusters.pl --context testB1.sval2 --senseval2 testB1.sval2 testB1.clusol testB1.rlabel
STATUS : OK Test Results Match.

Test B2 - Testing format_clusters.pl without cluster_solution file.
Running format_clusters.pl testB2.rlabel
STATUS : OK Test Results Match.

Test B3 - Testing format_clusters.pl without rlabel file.
Running format_clusters.pl testB3.clusol
STATUS : OK Test Results Match.

Test B4 - Testing format_clusters.pl without cluster_solution and rlabel file.
Running format_clusters.pl
STATUS : OK Test Results Match.

----------------------------------------------------
--------------------- label.pl ---------------------
----------------------------------------------------

Test A1 - Testing label.pl for condition #Clusters = #Labels
Running label.pl testA1a.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA1b.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA1c.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA1d.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA1e.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA1f.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA1g.prelabel
STATUS : OK Test Results Match.....

Test A2 - Testing label.pl for condition #Clusters < #Labels
Running label.pl testA2a.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA2b.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA2c.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA2d.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA2e.prelabel
STATUS : OK Test Results Match.....

Test A3 - Testing label.pl for condition #Clusters > #Labels
Running label.pl testA3a.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA3b.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA3c.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA3d.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA3e.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA3f.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA3g.prelabel
STATUS : OK Test Results Match.....
Running label.pl testA3h.prelabel
STATUS : OK Test Results Match.....

Test A4 - Testing label.pl for condition #Clusters=25 and #Labels=25
Running label.pl test-A4.prelabel
STATUS : OK Test Results Match.....

Test B1 - test-B1.prelabel doesn't start with #unclustered instances
Running label.pl test-B1.prelabel
STATUS : OK Test Results Match.....

Test B2 - test-B2.prelabel doesn't contain Sense List starting with //
Running label.pl test-B2.prelabel
STATUS : OK Test Results Match.....

----------------------------------------------------
---------------------- report.pl -------------------
----------------------------------------------------

TEST A10
Running report.pl test-A10.map test-A10.matrix
STATUS : OK Test Results Match.....
TEST A11
Running report.pl test-A11.map test-A11.matrix
STATUS : OK Test Results Match.....
TEST A12
Running report.pl test-A12.map test-A12.matrix
STATUS : OK Test Results Match.....
TEST A13
Running report.pl test-A13.map test-A13.matrix
STATUS : OK Test Results Match.....
TEST A14
Running report.pl test-A14.map test-A14.matrix
STATUS : OK Test Results Match.....
TEST A15
Running report.pl test-A15.map test-A15.matrix
STATUS : OK Test Results Match.....
TEST A16
Running report.pl test-A16.map test-A16.matrix
STATUS : OK Test Results Match.....
TEST A17
Running report.pl test-A17.map test-A17.matrix
STATUS : OK Test Results Match.....
TEST A1
Running report.pl test-A1.map test-A1.matrix
STATUS : OK Test Results Match.....
TEST A2
Running report.pl test-A2.map test-A2.matrix
STATUS : OK Test Results Match.....
TEST A3
Running report.pl test-A3.map test-A3.matrix
STATUS : OK Test Results Match.....
TEST A4
Running report.pl test-A4.map test-A4.matrix
STATUS : OK Test Results Match.....
TEST A5
Running report.pl test-A5.map test-A5.matrix
STATUS : OK Test Results Match.....
TEST A6
Running report.pl test-A6.map test-A6.matrix
STATUS : OK Test Results Match.....
TEST A7
Running report.pl test-A7.map test-A7.matrix
STATUS : OK Test Results Match.....
TEST A8
Running report.pl test-A8.map test-A8.matrix
STATUS : OK Test Results Match.....
TEST A9
Running report.pl test-A9.map test-A9.matrix
STATUS : OK Test Results Match.....

----------------------------------------------------
------------------- bitsimat.pl --------------------
----------------------------------------------------

Test A1a for bitsimat.pl
Running bitsimat.pl --dense --format f8.3 test-A1a1.vec
Test Ok
Running bitsimat.pl --format f8.3 test-A1a2.vec
Test Ok

Test A1b for bitsimat.pl
Running bitsimat.pl --dense --format f8.3 test-A1b1.vec
Test Ok
Running bitsimat.pl --format f8.3 test-A1b2.vec
Test Ok

Test A1c for bitsimat.pl
Running bitsimat.pl --dense --format f8.3 test-A1c1.vec
Test Ok
Running bitsimat.pl --format f8.3 test-A1c2.vec
Test Ok

Test A2 for bitsimat.pl
Running bitsimat.pl --dense --format i2 --measure match test-A21.vec
Test Ok
Running bitsimat.pl --format i2 --measure match test-A22.vec
Test Ok

Test A3 for bitsimat.pl
Running bitsimat.pl --dense --measure dice --format f8.3 test-A31.vec
Test Ok
Running bitsimat.pl --measure dice --format f8.3 test-A32.vec
Test Ok

Test A4 for bitsimat.pl
Running bitsimat.pl --dense --measure overlap --format f8.3 test-A41.vec
Test Ok
Running bitsimat.pl --measure overlap --format f8.3 test-A42.vec
Test Ok

Test A5 for bitsimat.pl
Running bitsimat.pl --dense --measure jaccard --format f8.3 test-A51.vec
Test Ok
Running bitsimat.pl --measure jaccard --format f8.3 test-A52.vec
Test Ok

Test A6 for bitsimat.pl
Running bitsimat.pl --dense --measure cosine --format f8.4 test-A61.vec
Test Ok
Running bitsimat.pl --measure cosine --format f8.4 test-A62.vec
Test Ok

Test A7 for bitsimat.pl
Running bitsimat.pl --dense --measure cosine --format f8.4 test-A71.vec
Test Ok
Running bitsimat.pl --measure cosine --format f8.4 test-A72.vec
Test Ok

Test A8 for bitsimat.pl
Running bitsimat.pl --dense --format f6.3 test-A81.vec
Test Ok
Running bitsimat.pl --format f6.3 test-A82.vec
Test Ok

Test B1a for bitsimat.pl
Running bitsimat.pl --dense --measure cosine --format f8.4 test-B1a.vec
Test Ok

Test B1b for bitsimat.pl
Running bitsimat.pl --dense --measure cosine --format f8.4 test-B1b.vec
Test Ok
Running bitsimat.pl --measure cosine --format f8.4 test-B1c.vec
Test Ok

Test B2 for bitsimat.pl
Running bitsimat.pl --dense --measure cosine --format f8.4 test-B21.vec
Test Ok
Running bitsimat.pl --measure cosine --format f8.4 test-B22.vec
Test Ok

----------------------------------------------------
------------------ simat.pl ------------------------
----------------------------------------------------

Test A10 for simat.pl
Running simat.pl --format f6.3 --dense test-A10a.vec
Test Ok
Running simat.pl --format f6.3 test-A10b.vec
Test Ok

Test A11 for simat.pl
Running simat.pl --format f6.3 --dense test-A11a.vec
Test Ok
Running simat.pl --format f6.3 test-A11b.vec
Test Ok

Test A1 for simat.pl
Running simat.pl --dense --format f7.3 test-A1.vec
Test Ok

Test A2 for simat.pl
Running simat.pl --dense --format f2.0 test-A2.vec
Test Ok

Test A3 for simat.pl
Running simat.pl --dense --format f2.0 test-A3.vec
Test Ok

Test A4 for simat.pl
Running simat.pl --dense --format f2.0 test-A4.vec
Test Ok

Test A5 for simat.pl
Running simat.pl --dense --format f7.3 test-A5.vec
Test Ok

Test A6 for simat.pl
Running simat.pl --dense --format f7.3 test-A6.vec
Test Ok

Test A7 for simat.pl
Running simat.pl --dense --format f9.5 test-A7.vec
Test Ok

Test A8 for simat.pl
Running simat.pl --dense --format f6.3 test-A8.vec
Test Ok

Test A9 for simat.pl
Running simat.pl --format f8.3 --dense test-A9.vec
Test Ok

Test B1 for simat.pl
Running simat.pl --dense --format f7.3 test-B1a.vec
Test Ok
Running simat.pl --format f7.3 test-B1b.vec
Test Ok

Test B2 for simat.pl
Running simat.pl --dense --format f7.4 test-B2a.vec
Test Ok
Running simat.pl --format f7.4 test-B2b.vec
Test Ok

Test B3 for simat.pl
Running simat.pl --dense --format f7.4 test-B3a.vec
Test Ok
Running simat.pl --format f7.4 test-B3b.vec
Test Ok

----------------------------------------------------
------------------ mat2harbo.pl --------------------
----------------------------------------------------

Test A10 for mat2harbo.pl
Running mat2harbo.pl --param --k 4 --numform 8f10.6 test-A10.mat
Test Ok
Running mat2harbo.pl --param --rf 2 --k 3 --numform 8f10.6 test-A10.mat
Test Ok
Running mat2harbo.pl --param --rf 3 --k 7 --numform 8f10.6 test-A10.mat
Test Ok
Running mat2harbo.pl --param --iter 3 --rf 1 --k 4 --numform 8f10.6 test-A10.mat
Test Ok
Running mat2harbo.pl --param test-A10.mat --rf 2 --numform 8f10.6
Test Ok

Test A11 for mat2harbo.pl
Running mat2harbo.pl --param --numform 8f10.6 --iter 8 --k 5 --rf 3 test-A11.mat
Test Ok

Test A1 for mat2harbo.pl
Running mat2harbo.pl --numform 20i4 test-A1.mat
Test Ok

Test A2 for mat2harbo.pl
Running mat2harbo.pl --numform 20i4 test-A2.mat
Test Ok

Test A3 for mat2harbo.pl
Running mat2harbo.pl --numform 10f8.3 test-A3.mat
Test Ok

Test A4 for mat2harbo.pl
Running mat2harbo.pl --numform 40i2 test-A4.mat
Test Ok

Test A5 for mat2harbo.pl
Running mat2harbo.pl --numform 5f16.8 test-A5.mat
Test Ok

Test A6 for mat2harbo.pl
Running mat2harbo.pl --title linedata --id bigraph --cpform 8i10 --rpform 8i10 --numform 8f10.5 test-A6.mat
Test Ok

Test A7 for mat2harbo.pl
Running mat2harbo.pl --numform 8f10.3 --title Bellcore ADI Linguistics Data --id belladit test-A7.mat
Test Ok

Test A8 for mat2harbo.pl
Running mat2harbo.pl --numform 8f10.3 --title Title: Document by Term Matrix for CISI (1460 by 5609) --id bellcist test-A8a.mat
Test Ok
Running mat2harbo.pl --numform 8f10.3 --title Title: Document by Term Matrix for CRAN (1398 by 4612) --id bellcrat test-A8b.mat
Test Ok
Running mat2harbo.pl --numform 8f10.3 --title Title: Document by Term Matrix for MED (1033 by 5831) --id bellmedT test-A8c.mat
Test Ok

Test A9 for mat2harbo.pl
Running mat2harbo.pl --numform 20i4 --param test-A91.mat
Test Ok
Running mat2harbo.pl --param test-A92.mat
Test Ok
Running mat2harbo.pl --param test-A93.mat
Test Ok
Running mat2harbo.pl --param test-A94.mat
Test Ok
Running mat2harbo.pl --param test-A95.mat
Test Ok

Test B1 for mat2harbo.pl
Running mat2harbo.pl --cpform 10f8.3 test-B1.mat
Test Ok
Running mat2harbo.pl --rpform 10f8 test-B1.mat
Test Ok
Running mat2harbo.pl --numform 10i8.3 test-B1.mat
Test Ok

Test B2 for mat2harbo.pl
Running mat2harbo.pl test-B2.mat
Test Ok

Test B3 for mat2harbo.pl
Running mat2harbo.pl --numform 8f10.6 test-B31.mat
Test Ok
Running mat2harbo.pl --cpform 16i5 test-B32.mat
Test Ok
Running mat2harbo.pl --rpform 20i4 test-B32.mat
Test Ok
Running mat2harbo.pl --numform 10f8.3 --cpform 20i4 --rpform 40i2 test-B34.mat
Test Ok

----------------------------------------------------
------------------ svdpackout.pl -------------------
----------------------------------------------------

note that some variation in SVD output is expected due to differences in architectures and precision

Test A1a for svdpackout.pl
Running las2
Running svdpackout.pl --format i4 lav2 lao2 > test-A1a.output
Test Ok

Test A1b for svdpackout.pl
Running las2
Running svdpackout.pl lav2 lao2 > test-A1b.output
Test Ok

Test A1c for svdpackout.pl
Running las2
Running svdpackout.pl --format i5 lav2 lao2 > test-A1c.output
Test Ok

Test A1d for svdpackout.pl
Running las2
Running svdpackout.pl lav2 lao2 > test-A1d.output
Test Ok

Test A1e for svdpackout.pl
Running las2
Running svdpackout.pl --format i4 lav2 lao2 > test-A1e.output
Test Ok

Test A1f for svdpackout.pl
Running las2
Running svdpackout.pl lav2 lao2 > test-A1f.output
Test Ok

Test A1g for svdpackout.pl
Running las2
Running svdpackout.pl --format i5 lav2 lao2 > test-A1g.output
Test Ok

Test A1h for svdpackout.pl
Running las2
Running svdpackout.pl lav2 lao2 > test-A1h.output
Test Ok

Test A2 for svdpackout.pl
Running las2
Running svdpackout.pl --format f8.3 lav2 lao2 > test-A2.output
Test Ok

Test A3a for svdpackout.pl
Running las2
Running svdpackout.pl --format i4 lav2 lao2 > test-A3a.output
Test Ok

Test A3b for svdpackout.pl
Running las2
Running svdpackout.pl lav2 lao2 > test-A3b.output
Test Ok

Test A3c for svdpackout.pl
Running las2
Running svdpackout.pl --format i5 lav2 lao2 > test-A3c.output
Test Ok

Test A3d for svdpackout.pl
Running las2
Running svdpackout.pl lav2 lao2 > test-A3d.output
Test Ok

Test A3e for svdpackout.pl
Running las2
Running svdpackout.pl --format i5 lav2 lao2 > test-A3e.output
Test Ok

Test A3f for svdpackout.pl
Running las2
Running svdpackout.pl lav2 lao2 > test-A3f.output
Test Ok

Test A4 for svdpackout.pl
Running las2
Running svdpackout.pl --format f8.3 lav2 lao2 > test-A4.output
Test Ok

Test A5 for svdpackout.pl
Running las2
Running svdpackout.pl --rowonly --format f9.5 lav2 lao2 > test-A5.output
Test Ok

Test B1 for svdpackout.pl
Running svdpackout.pl test-B1.lav2 test-B1.lao2
Test Ok

Test B2 for svdpackout.pl
Running svdpackout.pl test-B2.lav2 test-B2.lao2
Test Ok

----------------------------------------------------
---------------- order1vec.pl ----------------------
----------------------------------------------------

Test A10 for order1vec.pl
Running order1vec.pl --dense test-A10.sval2 test-A10.regex
Test Ok
Running order1vec.pl test-A10.sval2 test-A10.regex
Test Ok

Test A11 for order1vec.pl
Running order1vec.pl --extarget --target test-A11.target test-A11.sval2 test-A11.regex
Test Ok

Test A12 for order1vec.pl
Running order1vec.pl --transpose --testregex test-A12a.testregex --rlabel test-A12a.rlabel --clabel test-A12a.clabel --dense test-A12.sval2 test-A12.regex
Test Ok
Running order1vec.pl --transpose --testregex test-A12b.testregex --rlabel test-A12b.rlabel --clabel test-A12b.clabel test-A12.sval2 test-A12.regex
Test Ok

Test A1 for order1vec.pl
Running order1vec.pl --dense --binary test-A1.sval2 test-A1.regex
Test Ok
Running order1vec.pl --binary test-A1.sval2 test-A1.regex
Test Ok

Test A2 for order1vec.pl
Running order1vec.pl --dense --binary test-A2.sval2 test-A2.regex
Test Ok
Running order1vec.pl --binary test-A2.sval2 test-A2.regex
Test Ok

Test A3 for order1vec.pl
Running order1vec.pl --dense test-A3.sval2 test-A3.regex
Test Ok
Running order1vec.pl test-A3.sval2 test-A3.regex
Test Ok

Test A4 for order1vec.pl
Running order1vec.pl --dense test-A4.sval2 test-A4.regex
Test Ok
Running order1vec.pl test-A4.sval2 test-A4.regex
Test Ok

Test A5 for order1vec.pl
Running order1vec.pl --dense test-A5.sval2 test-A5.regex
Test Ok
Running order1vec.pl test-A5.sval2 test-A5.regex
Test Ok

Test A6 for order1vec.pl
Running order1vec.pl --dense test-A6.sval2 test-A6.regex
Test Ok
Running order1vec.pl test-A6.sval2 test-A6.regex
Test Ok

Test A7 for order1vec.pl
Running order1vec.pl --dense test-A7.sval2 test-A7.regex
Test Ok
Running order1vec.pl test-A7.sval2 test-A7.regex
Test Ok

Test A8 for order1vec.pl
Running order1vec.pl --dense --rlabel test-A8.rlabel --clabel test-A8.clabel --rclass test-A8.rclass test-A8.sval2 test-A8.regex
Test Ok
Running order1vec.pl --rlabel test-A8.rlabel --clabel test-A8.clabel --rclass test-A8.rclass test-A8.sval2 test-A8.regex
Test Ok

Test A9 for order1vec.pl
Running order1vec.pl --dense --extarget --binary test-A91.sval2 test-A91.regex
Test Ok
Running order1vec.pl --extarget --binary test-A91.sval2 test-A91.regex
Test Ok
Running order1vec.pl --dense --extarget --target test-A92.target test-A92.sval2 test-A92.regex
Test Ok
Running order1vec.pl --extarget --target test-A92.target test-A92.sval2 test-A92.regex
Test Ok
Running order1vec.pl --dense --extarget --target test-A93.target test-A93.sval2 test-A93.regex
Test Ok
Running order1vec.pl --extarget --target test-A93.target test-A93.sval2 test-A93.regex
Test Ok
Running order1vec.pl --dense --extarget --target test-A94.target test-A94.sval2 test-A94.regex
Test Ok
Running order1vec.pl --extarget --target test-A94.target test-A94.sval2 test-A94.regex
Test Ok

----------------------------------------------------
---------------- order2vec.pl ----------------------
----------------------------------------------------

Test A10 for order2vec.pl
Running order2vec.pl --format f16.06 test-A10.sval2 test-A10.wordvec test-A10.regex
Test Ok
Running order2vec.pl --format f16.06 --binary test-A10.sval2 test-A10.wordvec test-A10.regex
Test Ok

Test A11 for order2vec.pl
Running order2vec.pl --format f16.06 test-A11.sval2 test-A11.wordvec test-A11.regex
Test Ok
Running order2vec.pl --format f16.06 --binary test-A11.sval2 test-A11.wordvec test-A11.regex
Test Ok

Test A12 for order2vec.pl
Running order2vec.pl --format f16.06 test-A12.sval2 test-A12.wordvec test-A12.regex
Test Ok
Running order2vec.pl --format f16.06 --binary test-A12.sval2 test-A12.wordvec test-A12.regex
Test Ok

Test A13 for order2vec.pl
Running order2vec.pl --format f16.06 test-A13.sval2 test-A13.wordvec test-A13.regex
Test Ok
Running order2vec.pl --format f16.06 --binary test-A13.sval2 test-A13.wordvec test-A13.regex
Test Ok

Test A1 for order2vec.pl
Running order2vec.pl --dense --format f6.3 test-A1.sval2 test-A11.wordvec test-A1.regex
Test Ok
Running order2vec.pl --format f6.3 test-A1.sval2 test-A12.wordvec test-A1.regex
Test Ok

Test A2 for order2vec.pl
Running order2vec.pl --dense --format f7.3 test-A2.sval2 test-A21.wordvec test-A2.regex
Test Ok
Running order2vec.pl --format f7.3 test-A2.sval2 test-A22.wordvec test-A2.regex
Test Ok

Test A3 for order2vec.pl
Running order2vec.pl --dense --format f7.4 test-A3.sval2 test-A31.wordvec test-A3.regex
Test Ok
Running order2vec.pl --format f7.4 test-A3.sval2 test-A32.wordvec test-A3.regex
Test Ok

Test A4 for order2vec.pl
Running order2vec.pl --dense --format f6.3 test-A4.sval2 test-A41.wordvec test-A4.regex
Test Ok
Running order2vec.pl --format f6.3 test-A4.sval2 test-A42.wordvec test-A4.regex
Test Ok

Test A5 for order2vec.pl
Running order2vec.pl --dense --format i2 test-A5.sval2 test-A51.wordvec test-A5.regex
Test Ok
Running order2vec.pl --format i2 test-A5.sval2 test-A52.wordvec test-A5.regex
Test Ok

Test A6 for order2vec.pl
Running order2vec.pl --dense --format f8.4 test-A6.sval2 test-A61.wordvec test-A6.regex
Test Ok
Running order2vec.pl --format f8.4 test-A6.sval2 test-A62.wordvec test-A6.regex
Test Ok

Test A7 for order2vec.pl
Running order2vec.pl --dense --format f7.3 --rlabel test-A7.rlabel --rclass test-A7.rclass test-A7.sval2 test-A71.wordvec test-A7.regex
Test Ok
Running order2vec.pl --format f7.3 --rlabel test-A7.rlabel --rclass test-A7.rclass test-A7.sval2 test-A72.wordvec test-A7.regex
Test Ok

Test A8a for order2vec.pl
Running order2vec.pl --dense --binary test-A8.sval2 test-A8a1.wordvec test-A8.regex
Test Ok
Running order2vec.pl --binary test-A8.sval2 test-A8a2.wordvec test-A8.regex
Test Ok

Test A8b for order2vec.pl
Running order2vec.pl --dense --binary --format i5 test-A8.sval2 test-A8b1.wordvec test-A8.regex
Test Ok
Running order2vec.pl --binary --format i5 test-A8.sval2 test-A8b2.wordvec test-A8.regex
Test Ok

Test A8c for order2vec.pl
Running order2vec.pl --dense --binary --format f6.3 test-A8.sval2 test-A8c1.wordvec test-A8.regex
Test Ok
Running order2vec.pl --binary --format f6.3 test-A8.sval2 test-A8c2.wordvec test-A8.regex
Test Ok

Test A9 for order2vec.pl
Running order2vec.pl --dense --format f6.3 test-A9.sval2 test-A91.wordvec test-A9.regex
Test Ok
Running order2vec.pl --format
f6.3 test-A9.sval2 test-A92.wordvec test-A9.regex Test Ok Test B1 for order2vec.pl Running order2vec.pl --dense test-B1.sval2 test-B1.wordvec test-B1.regex Test Ok Test B2 for order2vec.pl Running order2vec.pl --dense --format i2 test-B2.sval2 test-B21.wordvec test-B2.regex Test Ok Running order2vec.pl test-B2.sval2 test-B22.wordvec test-B2.regex Test Ok Test B3 for order2vec.pl Running order2vec.pl test-B3.sval2 test-B3.wordvec test-B3.regex Test Ok Test B4 for order2vec.pl Running order2vec.pl test-B4.sval2 test-B4.wordvec test-B4.regex Test Ok Test B5 for order2vec.pl Running order2vec.pl test-B5.sval2 test-B5.wordvec test-B5.regex Test Ok ---------------------------------------------------- ----------------- wordvec.pl ----------------------- ---------------------------------------------------- Test A10 for wordvec.pl Running wordvec.pl --dense --extarget --format i3 --feats test-A10a.feats --dims test-A10a.dims test-A10.bi Test Ok Running wordvec.pl --extarget --format i3 --feats test-A10a.feats --dims test-A10a.dims test-A10.bi Test Ok Running wordvec.pl --wordorder precede --dense --extarget --format i3 --feats test-A10b.feats --dims test-A10b.dims test-A10.bi Test Ok Running wordvec.pl --format i3 --extarget --wordorder precede --feats test-A10b.feats --dims test-A10b.dims test-A10.bi Test Ok Running wordvec.pl --dense --format i3 --extarget --wordorder nocare --feats test-A10c.feats --dims test-A10c.dims test-A10.bi Test Ok Running wordvec.pl --format i3 --extarget --wordorder nocare --feats test-A10c.feats --dims test-A10c.dims test-A10.bi Test Ok Test A11 for wordvec.pl Running wordvec.pl --dense --format i2 --feats test-A11a.feats --dims test-A11a.dims test-A11.bi Test Ok Running wordvec.pl --format i2 --feats test-A11a.feats --dims test-A11a.dims test-A11.bi Test Ok Running wordvec.pl --dense --format i2 --wordorder precede --feats test-A11b.feats --dims test-A11b.dims test-A11.bi Test Ok Running wordvec.pl --format i2 --wordorder precede --feats 
test-A11b.feats --dims test-A11b.dims test-A11.bi Test Ok Running wordvec.pl --dense --format i2 --wordorder nocare --feats test-A11c.feats --dims test-A11c.dims test-A11.bi Test Ok Running wordvec.pl --format i2 --wordorder nocare --feats test-A11c.feats --dims test-A11c.dims test-A11.bi Test Ok Test A12 for wordvec.pl Running wordvec.pl --dense --format i3 --feats test-A12a.feats --dims test-A12a.dims test-A12.bi Test Ok Running wordvec.pl --format i3 --feats test-A12a.feats --dims test-A12a.dims test-A12.bi Test Ok Running wordvec.pl --dense --format i3 --wordorder precede --feats test-A12b.feats --dims test-A12b.dims test-A12.bi Test Ok Running wordvec.pl --format i3 --wordorder precede --feats test-A12b.feats --dims test-A12b.dims test-A12.bi Test Ok Running wordvec.pl --dense --format i3 --wordorder nocare --feats test-A12c.feats --dims test-A12c.dims test-A12.bi Test Ok Running wordvec.pl --format i3 --wordorder nocare --feats test-A12c.feats --dims test-A12c.dims test-A12.bi Test Ok Test A13 for wordvec.pl Running wordvec.pl --dense --format f6.3 --feats test-A13a.feats --dims test-A13a.dims test-A13.bi Test Ok Running wordvec.pl --format f6.3 --feats test-A13a.feats --dims test-A13a.dims test-A13.bi Test Ok Running wordvec.pl --dense --format f6.3 --wordorder precede --feats test-A13b.feats --dims test-A13b.dims test-A13.bi Test Ok Running wordvec.pl --format f6.3 --wordorder precede --feats test-A13b.feats --dims test-A13b.dims test-A13.bi Test Ok Running wordvec.pl --dense --format f6.3 --wordorder nocare --feats test-A13c.feats --dims test-A13c.dims test-A13.bi Test Ok Running wordvec.pl --format f6.3 --wordorder nocare --feats test-A13c.feats --dims test-A13c.dims test-A13.bi Test Ok Test A14 for wordvec.pl Running wordvec.pl --dense --format i3 --feats test-A14a.feats --dims test-A14a.dims test-A14.bi Test Ok Running wordvec.pl --format i3 --feats test-A14a.feats --dims test-A14a.dims test-A14.bi Test Ok Running wordvec.pl --dense --format i3 
--wordorder precede --feats test-A14b.feats --dims test-A14b.dims test-A14.bi Test Ok Running wordvec.pl --format i3 --wordorder precede --feats test-A14b.feats --dims test-A14b.dims test-A14.bi Test Ok Running wordvec.pl --dense --format i3 --wordorder nocare --feats test-A14c.feats --dims test-A14c.dims test-A14.bi Test Ok Running wordvec.pl --format i3 --wordorder nocare --feats test-A14c.feats --dims test-A14c.dims test-A14.bi Test Ok Test A15 for wordvec.pl Running wordvec.pl --dense --format f5.2 --feats test-A15a.feats --dims test-A15a.dims test-A15.bi Test Ok Running wordvec.pl --format f5.2 --feats test-A15a.feats --dims test-A15a.dims test-A15.bi Test Ok Running wordvec.pl --dense --format f5.2 --wordorder precede --feats test-A15b.feats --dims test-A15b.dims test-A15.bi Test Ok Running wordvec.pl --format f5.2 --wordorder precede --feats test-A15b.feats --dims test-A15b.dims test-A15.bi Test Ok Running wordvec.pl --dense --format f5.2 --wordorder nocare --feats test-A15c.feats --dims test-A15c.dims test-A15.bi Test Ok Running wordvec.pl --format f5.2 --wordorder nocare --feats test-A15c.feats --dims test-A15c.dims test-A15.bi Test Ok Test A16 for wordvec.pl Running wordvec.pl --dense --format f6.3 --feats test-A16a.feats --dims test-A16a.dims test-A16.bi Test Ok Running wordvec.pl --format f6.3 --feats test-A16a.feats --dims test-A16a.dims test-A16.bi Test Ok Running wordvec.pl --dense --format f6.3 --wordorder precede --feats test-A16b.feats --dims test-A16b.dims test-A16.bi Test Ok Running wordvec.pl --format f6.3 --wordorder precede --feats test-A16b.feats --dims test-A16b.dims test-A16.bi Test Ok Running wordvec.pl --dense --format f6.3 --wordorder nocare --feats test-A16c.feats --dims test-A16c.dims test-A16.bi Test Ok Running wordvec.pl --format f6.3 --wordorder nocare --feats test-A16c.feats --dims test-A16c.dims test-A16.bi Test Ok Test A17 for wordvec.pl Running wordvec.pl --dense --format f4.1 --feats test-A17a.feats --dims test-A17a.dims 
test-A17.bi Test Ok Running wordvec.pl --format f4.1 --feats test-A17a.feats --dims test-A17a.dims test-A17.bi Test Ok Running wordvec.pl --dense --format f4.1 --wordorder precede --feats test-A17b.feats --dims test-A17b.dims test-A17.bi Test Ok Running wordvec.pl --format f4.1 --wordorder precede --feats test-A17b.feats --dims test-A17b.dims test-A17.bi Test Ok Running wordvec.pl --dense --format f4.1 --wordorder nocare --feats test-A17c.feats --dims test-A17c.dims test-A17.bi Test Ok Running wordvec.pl --format f4.1 --wordorder nocare --feats test-A17c.feats --dims test-A17c.dims test-A17.bi Test Ok Test A18 for wordvec.pl Running wordvec.pl --dense --binary --feats test-A18a.feats --dims test-A18a.dims test-A18.bi Test Ok Running wordvec.pl --binary --feats test-A18a.feats --dims test-A18a.dims test-A18.bi Test Ok Running wordvec.pl --dense --binary --wordorder precede --feats test-A18b.feats --dims test-A18b.dims test-A18.bi Test Ok Running wordvec.pl --binary --wordorder precede --feats test-A18b.feats --dims test-A18b.dims test-A18.bi Test Ok Running wordvec.pl --dense --binary --wordorder nocare --feats test-A18c.feats --dims test-A18c.dims test-A18.bi Test Ok Running wordvec.pl --binary --wordorder nocare --feats test-A18c.feats --dims test-A18c.dims test-A18.bi Test Ok Test A19 for wordvec.pl Running wordvec.pl --dense --extarget --format i5 --feats test-A19a.feats --dims test-A19a.dims test-A19.bi Test Ok Running wordvec.pl --extarget --format i5 --feats test-A19a.feats --dims test-A19a.dims test-A19.bi Test Ok Running wordvec.pl --dense --format i5 --extarget --wordorder nocare --feats test-A19b.feats --dims test-A19b.dims test-A19.bi Test Ok Running wordvec.pl --format i5 --extarget --wordorder nocare --feats test-A19b.feats --dims test-A19b.dims test-A19.bi Test Ok Test A1 for wordvec.pl Running wordvec.pl --dense --format i2 --feats test-A1a.feats --dims test-A1a.dims test-A1.bi Test Ok Running wordvec.pl --format i2 --feats test-A1a.feats --dims 
test-A1a.dims test-A1.bi Test Ok Running wordvec.pl --dense --format i2 --wordorder precede --feats test-A1b.feats --dims test-A1b.dims test-A1.bi Test Ok Running wordvec.pl --format i2 --wordorder precede --feats test-A1b.feats --dims test-A1b.dims test-A1.bi Test Ok Running wordvec.pl --dense --format i2 --wordorder nocare --feats test-A1c.feats --dims test-A1c.dims test-A1.bi Test Ok Running wordvec.pl --format i2 --wordorder nocare --feats test-A1c.feats --dims test-A1c.dims test-A1.bi Test Ok Test A20 for wordvec.pl Running wordvec.pl --dense --extarget --format i3 --feats test-A20a.feats --dims test-A20a.dims test-A20.bi Test Ok Running wordvec.pl --extarget --format i3 --feats test-A20a.feats --dims test-A20a.dims test-A20.bi Test Ok Running wordvec.pl --wordorder precede --dense --extarget --format i3 --feats test-A20b.feats --dims test-A20b.dims test-A20.bi Test Ok Running wordvec.pl --format i3 --extarget --wordorder precede --feats test-A20b.feats --dims test-A20b.dims test-A20.bi Test Ok Running wordvec.pl --dense --format i3 --extarget --wordorder nocare --feats test-A20c.feats --dims test-A20c.dims test-A20.bi Test Ok Running wordvec.pl --format i3 --extarget --wordorder nocare --feats test-A20c.feats --dims test-A20c.dims test-A20.bi Test Ok Test A2 for wordvec.pl Running wordvec.pl --dense --format i3 --feats test-A2a.feats --dims test-A2a.dims test-A2.bi Test Ok Running wordvec.pl --format i3 --feats test-A2a.feats --dims test-A2a.dims test-A2.bi Test Ok Running wordvec.pl --dense --format i3 --wordorder precede --feats test-A2b.feats --dims test-A2b.dims test-A2.bi Test Ok Running wordvec.pl --format i3 --wordorder precede --feats test-A2b.feats --dims test-A2b.dims test-A2.bi Test Ok Running wordvec.pl --dense --format i3 --wordorder nocare --feats test-A2c.feats --dims test-A2c.dims test-A2.bi Test Ok Running wordvec.pl --format i3 --wordorder nocare --feats test-A2c.feats --dims test-A2c.dims test-A2.bi Test Ok Test A3 for wordvec.pl Running 
wordvec.pl --dense --format f6.3 --feats test-A3a.feats --dims test-A3a.dims test-A3.bi Test Ok Running wordvec.pl --format f6.3 --feats test-A3a.feats --dims test-A3a.dims test-A3.bi Test Ok Running wordvec.pl --dense --format f6.3 --wordorder precede --feats test-A3b.feats --dims test-A3b.dims test-A3.bi Test Ok Running wordvec.pl --format f6.3 --wordorder precede --feats test-A3b.feats --dims test-A3b.dims test-A3.bi Test Ok Running wordvec.pl --dense --format f6.3 --wordorder nocare --feats test-A3c.feats --dims test-A3c.dims test-A3.bi Test Ok Running wordvec.pl --format f6.3 --wordorder nocare --feats test-A3c.feats --dims test-A3c.dims test-A3.bi Test Ok Test A4 for wordvec.pl Running wordvec.pl --dense --format i3 --feats test-A4a.feats --dims test-A4a.dims test-A4.bi Test Ok Running wordvec.pl --format i3 --feats test-A4a.feats --dims test-A4a.dims test-A4.bi Test Ok Running wordvec.pl --dense --format i3 --wordorder precede --feats test-A4b.feats --dims test-A4b.dims test-A4.bi Test Ok Running wordvec.pl --format i3 --wordorder precede --feats test-A4b.feats --dims test-A4b.dims test-A4.bi Test Ok Running wordvec.pl --dense --format i3 --wordorder nocare --feats test-A4c.feats --dims test-A4c.dims test-A4.bi Test Ok Running wordvec.pl --format i3 --wordorder nocare --feats test-A4c.feats --dims test-A4c.dims test-A4.bi Test Ok Test A5 for wordvec.pl Running wordvec.pl --dense --format f5.2 --feats test-A5a.feats --dims test-A5a.dims test-A5.bi Test Ok Running wordvec.pl --format f5.2 --feats test-A5a.feats --dims test-A5a.dims test-A5.bi Test Ok Running wordvec.pl --dense --format f5.2 --wordorder precede --feats test-A5b.feats --dims test-A5b.dims test-A5.bi Test Ok Running wordvec.pl --format f5.2 --wordorder precede --feats test-A5b.feats --dims test-A5b.dims test-A5.bi Test Ok Running wordvec.pl --dense --format f5.2 --wordorder nocare --feats test-A5c.feats --dims test-A5c.dims test-A5.bi Test Ok Running wordvec.pl --format f5.2 --wordorder nocare 
--feats test-A5c.feats --dims test-A5c.dims test-A5.bi Test Ok Test A6 for wordvec.pl Running wordvec.pl --dense --format f6.3 --feats test-A6a.feats --dims test-A6a.dims test-A6.bi Test Ok Running wordvec.pl --format f6.3 --feats test-A6a.feats --dims test-A6a.dims test-A6.bi Test Ok Running wordvec.pl --dense --format f6.3 --wordorder precede --feats test-A6b.feats --dims test-A6b.dims test-A6.bi Test Ok Running wordvec.pl --format f6.3 --wordorder precede --feats test-A6b.feats --dims test-A6b.dims test-A6.bi Test Ok Running wordvec.pl --dense --format f6.3 --wordorder nocare --feats test-A6c.feats --dims test-A6c.dims test-A6.bi Test Ok Running wordvec.pl --format f6.3 --wordorder nocare --feats test-A6c.feats --dims test-A6c.dims test-A6.bi Test Ok Test A7 for wordvec.pl Running wordvec.pl --dense --format f4.1 --feats test-A7a.feats --dims test-A7a.dims test-A7.bi Test Ok Running wordvec.pl --format f4.1 --feats test-A7a.feats --dims test-A7a.dims test-A7.bi Test Ok Running wordvec.pl --dense --format f4.1 --wordorder precede --feats test-A7b.feats --dims test-A7b.dims test-A7.bi Test Ok Running wordvec.pl --format f4.1 --wordorder precede --feats test-A7b.feats --dims test-A7b.dims test-A7.bi Test Ok Running wordvec.pl --dense --format f4.1 --wordorder nocare --feats test-A7c.feats --dims test-A7c.dims test-A7.bi Test Ok Running wordvec.pl --format f4.1 --wordorder nocare --feats test-A7c.feats --dims test-A7c.dims test-A7.bi Test Ok Test A8 for wordvec.pl Running wordvec.pl --dense --binary --feats test-A8a.feats --dims test-A8a.dims test-A8.bi Test Ok Running wordvec.pl --binary --feats test-A8a.feats --dims test-A8a.dims test-A8.bi Test Ok Running wordvec.pl --dense --binary --wordorder precede --feats test-A8b.feats --dims test-A8b.dims test-A8.bi Test Ok Running wordvec.pl --binary --wordorder precede --feats test-A8b.feats --dims test-A8b.dims test-A8.bi Test Ok Running wordvec.pl --dense --binary --wordorder nocare --feats test-A8c.feats --dims 
test-A8c.dims test-A8.bi Test Ok Running wordvec.pl --binary --wordorder nocare --feats test-A8c.feats --dims test-A8c.dims test-A8.bi Test Ok Test A9 for wordvec.pl Running wordvec.pl --dense --extarget --format i5 --feats test-A9a.feats --dims test-A9a.dims test-A9.bi Test Ok Running wordvec.pl --extarget --format i5 --feats test-A9a.feats --dims test-A9a.dims test-A9.bi Test Ok Running wordvec.pl --dense --format i5 --extarget --wordorder nocare --feats test-A9b.feats --dims test-A9b.dims test-A9.bi Test Ok Running wordvec.pl --format i5 --extarget --wordorder nocare --feats test-A9b.feats --dims test-A9b.dims test-A9.bi Test Ok Test B1 for wordvec.pl Running wordvec.pl --dense --format i4 test-B1a.bi Test Error When tested against test-B1a.reqd 0a1,2 > 351 0 0 0 0 0 0 0 > 9 11 3c5 < Value <101693> can't be represented with format %4d. --- > Value <9070> can't be represented with format %4d. Running wordvec.pl --dense --format f20.10 test-B1b.bi Test Error When tested against test-B1b.reqd 0a1,5 > 0.0000000000 0.0000000000 0.0000000000 > 0.0000000000 0.0000000000 1277048.5903157510 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 > 0.0000000000 1599000.5396772602 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 > 10 11 > 3372654.3806728791 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 Running wordvec.pl --dense --format f20.10 test-B1c.bi Test Error When tested against test-B1c.reqd 0a1,6 > 0.0000000000 0.0000000000 0.0000000000 0.0000000000 > 0.0000000000 0.0000000000 0.0000000000 11144463.9928190522 0.0000000000 0.0000000000 0.0000000000 0.0000000000 842955.8152687779 0.0000000000 0.0000000000 > 0.0000000000 0.0000000000 1277048.5903157510 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 > 0.0000000000 
1599000.5396772602 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 > 10 11 > 3372654.3806728791 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 0.0000000000 Test B2 for wordvec.pl Running wordvec.pl --wordorder nocare test-B2.bi Test Ok Running wordvec.pl --wordorder nocare --feats test-B2.feats test-B2.bi Test Ok ---------------------------------------------------- ----------------- text2sval.pl --------------------- ---------------------------------------------------- Test A1 for text2sval.pl Running text2sval.pl --key test-A1.key --lexelt line-n test-A1.text Test Ok Test A2 for text2sval.pl Running text2sval.pl test-A2.text Test Ok Test A3 for text2sval.pl Running text2sval.pl --key test-A3.key --lexelt line-n test-A3.text Test Ok Test A4 for text2sval.pl Running text2sval.pl --key test-A4.key --lexelt serve-v test-A4.text Test Ok Test B1 for text2sval.pl Running text2sval.pl --key test-B1.key --lexelt line-n test-B1.text Test Ok Test B2 for text2sval.pl Running text2sval.pl --key test-B2.key --lexelt line-n test-B2.text Test Ok Test B3 for text2sval.pl Running text2sval.pl --key test-B3.key --lexelt line-n test-B3.text Test Ok ---------------------------------------------------- ------------------- balance.pl --------------------- ---------------------------------------------------- Test 1 for balance.pl Running balance.pl test-A1.xml 1 Test Ok Test 2 for balance.pl Running balance.pl test-A2.xml 2 Test Ok Test 3 for balance.pl Running balance.pl test-A3.xml 2 Test Ok Test 4 for balance.pl Running balance.pl test-A4.xml 1 Test Ok Test 5 for balance.pl Running balance.pl test-A5.xml 3 Test Ok Test 6 for balance.pl Running balance.pl test-A6.xml 2 Test Ok Test A7 for balance.pl Running balance.pl test-A7.xml 2 Test Ok Test A8 for balance.pl Running balance.pl --count test-A8.count test-A8.xml 2 Test Ok 
---------------------------------------------------- ------------------ filter.pl ------------------------ ---------------------------------------------------- UNIT Test A10 - For Sense Filter Program filter.pl Data - Source file from test-A10.data Frequency Report - test-A10.report Count - test-A10.count Output - Filtered Data file from test-A10.reqd Filtered Count - test-A10.count.reqd Test - Checks filter.pl when corresponding count file is given for filtering and nomulti is chosen. STATUS : OK Test Results Match..... UNIT Test A11 - For Sense Filter Program filter.pl Data - Source file from test-A11.data Frequency Report - test-A11.report Output - Filtered Data file from test-A11.reqd Test - Checks filter.pl's default filter (percent=1% to remove senses occurring below 1%) when no Filters are selected but --nomulti is chosen. STATUS : OK Test Results Match..... UNIT Test A12 - For Sense Filter Program filter.pl Data - Source file from test-A12.data Frequency Report - test-A12.report Output - Filtered Data file from test-A12.reqd Test - Checks filter.pl when percent is set to 0 and --nomulti is selected. STATUS : OK Test Results Match..... UNIT Test A13 - For Sense Filter Program filter.pl Data - Source file from test-A13.data Count - test-A13.count Frequency Report - test-A13.report Output - Filtered Data file from test-A13.reqd Test - Checks filter.pl when nomulti and count options are provided and percent is set to 0 STATUS : OK Test Results Match..... UNIT Test A1 - For Sense Filter Program filter.pl Data - Source file from test-A1.data Frequency Report - test-A1.report Output - Filtered Data file from test-A1.reqd Test - Checks filter.pl's --rank R filter to select Top R most frequent senses. STATUS : OK Test Results Match..... 
UNIT Test A2 - For Sense Filter Program filter.pl Data - Source file from test-A2.data Frequency Report - test-A2.report Output - Filtered Data file from test-A2.reqd Test - Checks filter.pl's default filter (frequency=1% to remove senses occurring below 1%) when no Filters are selected. STATUS : OK Test Results Match..... UNIT Test A3 - For Sense Filter Program filter.pl Data - Source file from test-A3.data Frequency Report - test-A3.report Output - Filtered Data file from test-A3.reqd Test - Checks filter.pl's --percent P filter to select senses with percent P% or more. STATUS : OK Test Results Match..... UNIT Test A4 - For Sense Filter Program filter.pl Data - Source file from test-A4.data Frequency Report - test-A4.report Output - Filtered Data file from test-A4.reqd Test - Checks the boundary condition on --percent option STATUS : OK Test Results Match..... UNIT Test A5 - For Sense Filter Program filter.pl Data - Source file from test-A5.data Frequency Report - test-A5.report Output - Filtered Data file from test-A5.reqd Test - Checks filter.pl's --rank R filter when there are ties on ranks at or below R. STATUS : OK Test Results Match..... UNIT Test A6 - For Sense Filter Program filter.pl Data - Source file from test-A6.data Frequency Report - test-A6.report Output - Filtered Data file from test-A6.reqd Test - Checks filter.pl when extra tags appear in data. STATUS : OK Test Results Match..... UNIT Test A7 - For Sense Filter Program filter.pl Data - Source file from test-A7.data Frequency Report - test-A7.report Count - test-A7.count Output - Filtered Data file from test-A7.reqd Filtered Count - test-A7.count.reqd Test - Checks filter.pl when corresponding count file is given for filtering. STATUS : OK Test Results Match..... 
UNIT Test A8 - For Sense Filter Program filter.pl Data- Source File test-A8.data Report - Frequency Report file test-A8.report Output - File test-A8.reqd Checks the condition in filter.pl when a sense tag in the data file is not listed in the Frequency Report STATUS : OK Test Results Match..... UNIT Test A9 - For Sense Filter Program filter.pl Data - Source file from test-A9.data Frequency Report - test-A9.report Output - Filtered Data file from test-A9.reqd Test - Checks filter.pl's --nomulti option STATUS : OK Test Results Match..... UNIT Test B1 - For Sense Filter Program filter.pl Data- Source File test-B1.data Report - Frequency Report file test-B1.report Output - Error message in file test-B1.reqd Checks the error condition in filter.pl when both --percent and --rank are selected. STATUS : OK Test Results Match..... UNIT Test B2 - For Sense Filter Program filter.pl Data- Source File test-B2.data Report - Frequency Report file test-B2.report Output - Error message in file test-B2.reqd Checks the error condition in filter.pl when Frequency Report doesn't follow the required format. STATUS : OK Test Results Match..... ---------------------------------------------------- ------------------ frequency.pl -------------------- ---------------------------------------------------- Test A1 - Testing frequency.pl when Source has balanced distribution. Running frequency.pl test-A1.source Test A1 OK Test A2 - Testing frequency.pl when Source has only one sense(100%). Running frequency.pl test-A2.source Test A2 OK Test A3 - Testing frequency.pl when Source has 2 senses in ratio(66:33). Running frequency.pl test-A3.source Test A3 OK Test A4 - Testing frequency.pl when Source is a part of actual Senseval-2. Running frequency.pl test-A4.source Test A4 OK Test A5 - Testing frequency.pl when Source is a part of actual Senseval-2. And single sense tag occurs. 
Running frequency.pl test-A5.source Test A5 OK ---------------------------------------------------- ---------------- keyconvert.pl --------------------- ---------------------------------------------------- UNIT Test A1 - For Key Convertor keyconvert.pl Input - Senseval2 Key file from test-A1.keyin Output - Equivalent SenseClusters Key file from test-A1.keyout Test - Tests program keyconvert.pl on sample keyfiles defined(%hash) is deprecated at /usr/local/bin/keyconvert.pl line 195. (Maybe you should just omit the defined()?) STATUS : OK Test Results Match..... UNIT Test A2 - For Key Convertor keyconvert.pl Input - Senseval2 Key file from fine.key Output - Equivalent SenseClusters Key file from SenseCluster.key Test - Tests program keyconvert.pl on actual Senseval keyfile. defined(%hash) is deprecated at /usr/local/bin/keyconvert.pl line 195. (Maybe you should just omit the defined()?) STATUS : OK Test Results Match..... UNIT Test A3 - For Key Convertor keyconvert.pl Input - Senseval2 Key file from test-A3.keyin Output - Equivalent SenseClusters Key file from test-A3.keyout Test - Tests program keyconvert.pl when --attach_P option is selected defined(%hash) is deprecated at /usr/local/bin/keyconvert.pl line 195. (Maybe you should just omit the defined()?) STATUS : OK Test Results Match..... 
---------------------------------------------------- ---------------- maketarget.pl --------------------- ---------------------------------------------------- Test A1 for maketarget.pl Running maketarget.pl --head test-A1.sval2 Test Ok Test A1a for maketarget.pl Running maketarget.pl test-A1.sval2 Test Ok Test A2 for maketarget.pl Running maketarget.pl --head test-A2.sval2 Test Ok Test A2a for maketarget.pl Running maketarget.pl test-A2.sval2 Test Ok Test A3 for maketarget.pl Running maketarget.pl --head test-A3.sval2 Test Ok Test A3a for maketarget.pl Running maketarget.pl test-A3.sval2 Test Ok Test A4 for maketarget.pl Running maketarget.pl --head test-A4.sval2 Test Ok Test A4a for maketarget.pl Running maketarget.pl test-A4.sval2 Test Ok Test B1 for maketarget.pl Running maketarget.pl test-B1.sval2 Test Ok Test B1a for maketarget.pl Running maketarget.pl --head test-B1.sval2 Test Ok ---------------------------------------------------- --------------- prepare_sval2.pl ------------------- ---------------------------------------------------- Test A1 - Testing if P tags are getting removed. Running prepare_sval2.pl test-A1.data Test A1 OK Test A2 - Testing if attach_P is working. Running prepare_sval2.pl --attachP test-A2.data Test A2 OK Test A3 - Testing if prepare_sval2 attaches NOTAGs when Input is untagged. Running prepare_sval2.pl test-A3.data Test A3 OK Test A4 - Testing if prepare_sval2 attaches tags from KEY file. Running prepare_sval2.pl --key test-A4.key test-A4.data Test A4 OK Test A5 - Testing when some instances do not have tags in KEY file. Running prepare_sval2.pl --key test-A5.key test-A5.data Test A5 OK Test A6 - Testing when KEY file has tags for already tagged data. Running prepare_sval2.pl --key test-A6.key test-A6.data Test A6 OK Test A7 - Testing some instances are not tagged. Running prepare_sval2.pl test-A7.data Test A7 OK Test A8 - Testing when instances are tagged with single tag=P. 
Running prepare_sval2.pl test-A8.data Test A8 OK Test B1 - Testing an error condition when some instances are attached tags in untagged data Running prepare_sval2.pl test-B1.data Test B1 OK ---------------------------------------------------- ----------------- sval2plain.pl -------------------- ---------------------------------------------------- Test A1 for sval2plain.pl Running sval2plain.pl test-A1.sval2 Test Ok Test A2 for sval2plain.pl Running sval2plain.pl test-A2.sval2 Test Ok Test A3 for sval2plain.pl Running sval2plain.pl test-A3.sval2 Test Ok Test B1 for sval2plain.pl Running sval2plain.pl test-B1.sval2 Test Ok ---------------------------------------------------- ----------------- windower.pl ---------------------- ---------------------------------------------------- Test A1 for windower.pl Running windower.pl test-A1.input 5 Test Ok Test A2 for windower.pl Running windower.pl --token test-A2.token --target test-A2.target test-A2.input 5 Test Ok Test A3 for windower.pl Running windower.pl --token test-A3.token --target test-A3.target test-A3.input 5 Test Ok Test A4 for windower.pl Running windower.pl --plain test-A4.input 5 Test Ok Test A5 for windower.pl Running windower.pl --token test-A5.token --target test-A5.target --plain test-A5.input 5 Test Ok Test B1 for windower.pl Running windower.pl test-B1.input 5 Test Ok Test B2 for windower.pl Running windower.pl test-B2.input 5 Test Ok ---------------------------------------------------- ----------------- nsp2regex.pl --------------------- ---------------------------------------------------- Subtest 1 : Testing nsp2regex thusly: nsp2regex.pl sub-1.source > out Test OK Testing nsp2regex thusly: nsp2regex.pl sub-1.source --token token.txt > out Test OK Subtest 2 : Testing nsp2regex thusly: nsp2regex.pl sub-2.source > out Test OK Testing nsp2regex thusly: nsp2regex.pl sub-2.source --token token.txt > out Test OK Subtest 3 : Testing nsp2regex thusly: nsp2regex.pl sub-3.source > out Test OK Testing nsp2regex 
thusly: nsp2regex.pl sub-3.source --token token.txt > out Test OK Subtest 4 : Testing nsp2regex thusly: nsp2regex.pl sub-4.source > out Test OK Testing nsp2regex thusly: nsp2regex.pl sub-4.source --token token.txt > out Test OK Subtest 5 : Testing nsp2regex thusly: nsp2regex.pl sub-5.source > out Test OK Testing nsp2regex thusly: nsp2regex.pl sub-5.source --token token.txt > out Test OK Subtest 6 : Testing nsp2regex thusly: nsp2regex.pl sub-6.source > out Test OK Testing nsp2regex thusly: nsp2regex.pl sub-6.source --token token.txt > out Test OK Subtest 7 : Testing nsp2regex thusly: nsp2regex.pl sub-7.source > out Test OK Testing nsp2regex thusly: nsp2regex.pl sub-7.source --token token.txt > out Test OK ---------------------------------------------------- ----------------- preprocess.pl -------------------- ---------------------------------------------------- Subtest 1: Testing without options. Testing preprocess thusly: preprocess.pl test-1.xml Test OK Test OK Test OK Test OK Test OK Test OK Subtest 2: Testing preprocess.pl with a token file Testing preprocess thusly: preprocess.pl test-1.xml --token test-1.sub-2.token.txt Test OK Test OK Test OK Test OK Test OK Test OK Subtest 3: Testing preprocess.pl with --useLexelt option Testing preprocess thusly: preprocess.pl test-1.xml --useLexelt --token test-1.sub-3.token.txt Test OK Test OK Test OK Test OK Test OK Test OK Subtest 4: Testing preprocess.pl with --useSenseid option Testing preprocess thusly: preprocess.pl test-1.xml --useSenseid --token test-1.sub-3.token.txt Test OK Test OK Test OK Test OK Test OK Test OK Subtest 5: Testing preprocess.pl with a token file a... [truncated message content] |
From: Ted P. <tpederse@d.umn.edu> - 2013-06-27 22:43:32
|
Normally I run this as cd External csh install.sh /usr/local/bin the argument specifies the install directory for las2, vcluster and scluster. If you don't have /usr/local/bin on your system, you could also put them in /usr/bin, in which case you would run csh install.sh /usr/bin Let me know if that works for you. Thanks, Ted On Thu, Jun 27, 2013 at 3:20 PM, Anand Jha <ana...@gm...> wrote: > Dr Pedersen, > > I have tried multiple times, but I am getting some error while installing > the external\install.sh. It is expecting an argument and I have passed the > argument as SVDPACKC. > Is this right way to run it or do I need to pass different argument and > create the directory of the same name? > > And I am getting the same errors while running test-all.sh > > Please help. > > Regards, > Anand > > > On Thu, Jun 27, 2013 at 9:27 AM, Ted Pedersen <tpederse@d.umn.edu> wrote: >> >> BTW, all Ubuntu distributions less than 12.04 are no longer supported, >> so you may run into funny issues with repositories and so forth. I'd >> suggest using 12.04 as a part of your testing, since that's a current >> and long term support release. I don't need to do anything special to >> get csh with that (other than the apt-get install command). >> >> https://en.wikipedia.org/wiki/List_of_Ubuntu_releases >> >> Thanks, >> Ted >> >> On Wed, Jun 26, 2013 at 7:51 PM, Anand Jha <ana...@gm...> wrote: >> > Dr. Pedersen, >> > >> > Sorry, I completely forget that I can simply install csh instead of >> > trying >> > other workaround. I installed the csh on the system and I need to enable >> > the >> > 'Universe' option of software repository of Ubuntu to get through it. >> > The >> > linux distributions that I have tried are Ubuntu 11.10, 10.10 and >> > OpenSuse >> > 12.2. Currently, I am testing on Ubuntu 11.10 >> > >> > After installing csh, I am able to run the ALL-TESTS.sh; however some of >> > the >> > test cases which are not relevant to clusterlabeling are failing. 
I am >> > not >> > sure whether it is because I am still missing some packages or something >> > else. >> > >> > I am attaching the two files which contains the output of ALL-TESTS.sh >> > >> > Regards, >> > Anand >> > >> > >> > >> > On Wed, Jun 26, 2013 at 5:02 PM, Ted Pedersen <tpederse@d.umn.edu> >> > wrote: >> >> >> >> Hi Anand, >> >> >> >> You should still be able to install software on a live cd - so you >> >> could install csh and then test like that. On an unbuntu or debian >> >> system you can just do the following, for example. >> >> >> >> apt-get install csh >> >> >> >> You might also need to change the default system shell if using ubuntu >> >> : >> >> >> >> https://wiki.ubuntu.com/DashAsBinSh >> >> >> >> I normally need to do both of these to get things running on Ubuntu, >> >> so perhaps that's the issue? >> >> >> >> If this seems relevant can you try the above and see if things work? >> >> Which distributions have you been using, btw? >> >> >> >> Thanks, >> >> Ted >> >> >> >> On Wed, Jun 26, 2013 at 3:05 PM, Anand Jha <ana...@gm...> >> >> wrote: >> >> > Dr. Pedersen, >> >> > >> >> > I have modified the following two files and also done the checkin for >> >> > these: >> >> > >> >> > 1. clusterlabeling.pl : Added the ngram option in perldoc segment >> >> > 2. MANIFEST: Added the new files related to new >> >> > test-cases >> >> > of >> >> > clusterlabeling.pl >> >> > >> >> > >> >> > I did try the testing of SenseClusters on live-cds, however I am not >> >> > confident on it. In most of the system, it was hanging while >> >> > installing >> >> > PDL >> >> > or Bundle::Text::SenseClusters, (I tried on 3 laptops and 2 new linux >> >> > systems of hh314). >> >> > >> >> > In the systems where I was able to install the dependent packages, I >> >> > tried >> >> > to do "make test" which was just running only one testcase. So, I >> >> > tried >> >> > running test-cases through ALL-TESTS.sh. 
>> >> > However, this file needed csh to run which was not available on >> >> > live-cd. >> >> > So, >> >> > I changed that to bash, but it was giving some other issues. Finally, >> >> > I >> >> > have >> >> > run the individual test-cases of clusterlabeling and it seems to be >> >> > working >> >> > fine. >> >> > >> >> > >> >> > Please let me know, if I am doing something wrong or my approach are >> >> > wrong >> >> > (I am attaching file of the steps that I was following). >> >> > >> >> > Regards, >> >> > Anand >> >> > >> >> > >> >> > On Wed, Jun 26, 2013 at 11:09 AM, Ted Pedersen <tpederse@d.umn.edu> >> >> > wrote: >> >> >> >> >> >> Also, make sure to update the perldoc for clusterlabeling.pl - see >> >> >> that content via : >> >> >> >> >> >> perldoc clusterlabeling.pl >> >> >> >> >> >> Thanks! >> >> >> Ted >> >> >> >> >> >> On Tue, Jun 25, 2013 at 11:13 PM, Ted Pedersen <tpederse@d.umn.edu> >> >> >> wrote: >> >> >> > Sounds like a good plan - if you continue to have trouble please >> >> >> > don't >> >> >> > hesitate to send me more info - I can look into this further as >> >> >> > well. >> >> >> > >> >> >> > Good luck, >> >> >> > Ted >> >> >> > >> >> >> > On Tue, Jun 25, 2013 at 8:18 PM, Anand Jha >> >> >> > <ana...@gm...> >> >> >> > wrote: >> >> >> >> Dr Pedersen, >> >> >> >> >> >> >> >> I have fixed the bug related to testA7 mentioned in your mail and >> >> >> >> also >> >> >> >> issue >> >> >> >> related to print messages of testA6, and testA7. >> >> >> >> I have also modified the help messages in discriminate.pl and >> >> >> >> clusterlabeling,pl for supported value of n in ngram. >> >> >> >> >> >> >> >> Finally I have also added the checks in discriminate.pl and >> >> >> >> clusterlabeling,pl for supported value of n as 2, 3 and 4. 
>> >> >> >> >> >> >> >> >> >> >> >> However, I am struggling to test the package on live CD as every >> >> >> >> time >> >> >> >> I am >> >> >> >> trying to install the bundle::text::senseclusters package it is >> >> >> >> getting >> >> >> >> stuck after doing some installation and system is getting >> >> >> >> freezed. >> >> >> >> >> >> >> >> I am trying to get hold of a different system and also I will try >> >> >> >> to >> >> >> >> change >> >> >> >> OS to see if the testing can pass. >> >> >> >> >> >> >> >> Regards, >> >> >> >> Anand >> >> >> >> >> >> >> >> >> >> >> >> On Tue, Jun 25, 2013 at 2:10 PM, Anand Jha >> >> >> >> <ana...@gm...> >> >> >> >> wrote: >> >> >> >>> >> >> >> >>> Dr Pedersen, >> >> >> >>> >> >> >> >>> I will look into all these issues and will reply to you by >> >> >> >>> evening. >> >> >> >>> >> >> >> >>> Regards, >> >> >> >>> Anand >> >> >> >>> >> >> >> >>> >> >> >> >>> On Tue, Jun 25, 2013 at 1:10 PM, Ted Pedersen >> >> >> >>> <tpederse@d.umn.edu> >> >> >> >>> wrote: >> >> >> >>>> >> >> >> >>>> BTW, one thing I've noticed is that when we use an ngram value >> >> >> >>>> of >> >> >> >>>> something other than 2, 3, 4 we get an error message - that's >> >> >> >>>> fine, >> >> >> >>>> because I realize we don't have measures of association for >> >> >> >>>> larger >> >> >> >>>> ngrams. But, it seems like clusterlabeling does a fair bit of >> >> >> >>>> work >> >> >> >>>> before issuing the error, and it leaves behind some garbage >> >> >> >>>> files >> >> >> >>>> to >> >> >> >>>> I >> >> >> >>>> think. So, what I'm wondering is if clusterlabeing.pl could >> >> >> >>>> check >> >> >> >>>> the >> >> >> >>>> value of --ngram and give an immediate error if the value is >> >> >> >>>> something >> >> >> >>>> other than 2, 3, 4? Is that possible, and does that seem like a >> >> >> >>>> good >> >> >> >>>> idea? 
>> >> >> >>>> >> >> >> >>>> Thanks, >> >> >> >>>> Ted >> >> >> >>>> >> >> >> >>>> Below is an example of the sort of error message we get now... >> >> >> >>>> >> >> >> >>>> clusterlabeling.pl --token ./token.regex >> >> >> >>>> testA6.clusters_context >> >> >> >>>> --ngram >> >> >> >>>> 6 >> >> >> >>>> Error: This measure is only defined for bigrams, trigrams and >> >> >> >>>> 4-gramsError while opening the file >> >> >> >>>> >> >> >> >>>> >> >> >> >>>> >> >> >> >>>> >> >> >> >>>> /home/ted/SC/Testing/clusterlabel/clusterlabeling/tmp.1372183751.cluster.0.stat >> >> >> >>>> at /usr/local/bin/clusterlabeling.pl line 461. >> >> >> >>>> >> >> >> >>>> On Tue, Jun 25, 2013 at 9:18 AM, Ted Pedersen >> >> >> >>>> <tpederse@d.umn.edu> >> >> >> >>>> wrote: >> >> >> >>>> > Hi Anand, >> >> >> >>>> > >> >> >> >>>> > I don't think all the test cases are being properly run by >> >> >> >>>> > the >> >> >> >>>> > ALL-TESTS.sh script - right now it does something like >> >> >> >>>> > >> >> >> >>>> > csh ./testA*.sh >> >> >> >>>> > >> >> >> >>>> > ro run, but that only seems to be running the first case. I >> >> >> >>>> > think >> >> >> >>>> > we'll need to enumerate the test cases we want run in that >> >> >> >>>> > script - >> >> >> >>>> > I >> >> >> >>>> > will make that change and see what happens. >> >> >> >>>> > >> >> >> >>>> > In looking at this, I noticed that testA7 seems to fail. >> >> >> >>>> > Also, >> >> >> >>>> > the >> >> >> >>>> > message shown about what command is being tested isn't >> >> >> >>>> > accurate. >> >> >> >>>> > Could >> >> >> >>>> > you check into those. I think the same is true of your other >> >> >> >>>> > case, >> >> >> >>>> > where the message describing what is being run isn't >> >> >> >>>> > accurate. >> >> >> >>>> > >> >> >> >>>> > Here's what i mean about the message not being accurate : >> >> >> >>>> > >> >> >> >>>> > csh ./testA7.sh >> >> >> >>>> > Test A7 - Testing clusterlabeling.pl without stoplist. 
>> >> >> >>>> > Running clusterlabeling.pl --token token.regex --rank 5 >> >> >> >>>> > --stat >> >> >> >>>> > ll >> >> >> >>>> > --prefix testA7 testA7.clusters_context > testA7.output >> >> >> >>>> > >> >> >> >>>> > (That's a previous case) >> >> >> >>>> > >> >> >> >>>> > Here's the error... >> >> >> >>>> > >> >> >> >>>> > STATUS : ERROR Test Results don't Match.... >> >> >> >>>> > When Tested Against testA7.reqd - >> >> >> >>>> > 1c1 >> >> >> >>>> > < Cluster 0 (Descriptive): fifth World Cup, 1998 World Cup, >> >> >> >>>> > record >> >> >> >>>> > World Cup, the World Cup, in World Cup >> >> >> >>>> > --- >> >> >> >>>> >> Cluster 0 (Descriptive): World Cup title, World Cup >> >> >> >>>> >> opener, >> >> >> >>>> >> World >> >> >> >>>> >> Cup finals, World Cup quarter, World Cup final >> >> >> >>>> > 3c3 >> >> >> >>>> > < Cluster 0 (Discriminating): fifth World Cup, 1998 World >> >> >> >>>> > Cup, >> >> >> >>>> > record World Cup, the World Cup, in World Cup >> >> >> >>>> > --- >> >> >> >>>> >> Cluster 0 (Discriminating): World Cup title, World Cup >> >> >> >>>> >> opener, >> >> >> >>>> >> World Cup finals, World Cup quarter, World Cup final >> >> >> >>>> > 5c5 >> >> >> >>>> > < Cluster 1 (Descriptive): Midfielders Manchester United, >> >> >> >>>> > Butt >> >> >> >>>> > Manchester United, Scholes Manchester United, Brown >> >> >> >>>> > Manchester >> >> >> >>>> > United, Wes Manchester United >> >> >> >>>> > --- >> >> >> >>>> >> Cluster 1 (Descriptive): Manchester United Gerrard, Bulent >> >> >> >>>> >> Sol >> >> >> >>>> >> Campbell, Manchester United Hargreaves, Manchester United >> >> >> >>>> >> Steven, >> >> >> >>>> >> Manchester United Keown >> >> >> >>>> > 7c7 >> >> >> >>>> > < Cluster 1 (Discriminating): Midfielders Manchester United, >> >> >> >>>> > Butt >> >> >> >>>> > Manchester United, Scholes Manchester United, Brown >> >> >> >>>> > Manchester >> >> >> >>>> > United, Wes Manchester United >> >> >> >>>> > --- >> >> >> >>>> >> Cluster 1 (Discriminating): 
Manchester United Gerrard, >> >> >> >>>> >> Bulent >> >> >> >>>> >> Sol >> >> >> >>>> >> Campbell, Manchester United Hargreaves, Manchester United >> >> >> >>>> >> Steven, >> >> >> >>>> >> Manchester United Keown >> >> >> >>>> > STATUS : OK Cluster file testA7.cluster.0 created. >> >> >> >>>> > STATUS : OK Cluster file testA7.cluster.1 created. >> >> >> >>>> > >> >> >> >>>> > Could you check these out? >> >> >> >>>> > >> >> >> >>>> > Thanks, >> >> >> >>>> > Ted >> >> >> >>>> > >> >> >> >>>> > --- >> >> >> >>>> > Ted Pedersen >> >> >> >>>> > http://www.d.umn.edu/~tpederse >> >> >> >>>> >> >> >> >>>> >> >> >> >>>> >> >> >> >>>> -- >> >> >> >>>> Ted Pedersen >> >> >> >>>> http://www.d.umn.edu/~tpederse >> >> >> >>> >> >> >> >>> >> >> >> >> >> >> >> > >> >> >> > >> >> >> > >> >> >> > -- >> >> >> > Ted Pedersen >> >> >> > http://www.d.umn.edu/~tpederse >> >> >> >> >> >> >> >> >> >> >> >> -- >> >> >> Ted Pedersen >> >> >> http://www.d.umn.edu/~tpederse >> >> > >> >> > >> >> >> >> >> >> >> >> -- >> >> Ted Pedersen >> >> http://www.d.umn.edu/~tpederse >> > >> > >> >> >> >> -- >> Ted Pedersen >> http://www.d.umn.edu/~tpederse > > -- Ted Pedersen http://www.d.umn.edu/~tpederse |
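[Editor's note] The early --ngram check discussed in this thread (rejecting unsupported values before any work is done or temporary files are created) can be sketched as a simple validation guard. clusterlabeling.pl itself is Perl; the shell function below only mirrors the logic, and its name and error wording are illustrative, not the actual code.

```shell
# Hypothetical sketch of the early --ngram validation discussed above:
# reject anything outside 2..4 before any temporary files are created.
# The function name and message text are illustrative only.
validate_ngram() {
  case "$1" in
    2|3|4) echo "ok" ;;
    *) echo "Error: --ngram must be 2, 3, or 4 (association measures" \
            "are only defined for bigrams, trigrams and 4-grams)" ;;
  esac
}

good=$(validate_ngram 3)   # supported value
bad=$(validate_ngram 6)    # unsupported value, rejected immediately
echo "$good"
echo "$bad"
```

Checking the option up front, as suggested, avoids the partial run and leftover tmp.*.cluster.*.stat files described in the error report above.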
From: Mrs s. j. <suj...@ya...> - 2010-05-31 11:11:13
|
Hi,

I am Sujana, doing my PhD at the National University of Ireland Maynooth. Part of my research is text comparison. I am looking at how one text can be compared to a few other texts, to find whether the other texts are relevant to the first one (content analysis). I was looking at LSA, which measures the similarity between texts (one-to-many). Then I stumbled on Text::SenseClusters, which does take LSA into consideration.

My query is: does SenseClusters help to find whether the 2nd, 3rd, and so on texts (paragraphs) are similar to the first text (a paragraph)? Could you please provide an example, or at least a link, showing how to use this Perl module in Perl code? Does this module already come with a corpus that can compare paragraphs?

I would be grateful for any help in this regard. I don't think anybody else has used this module yet, so I couldn't find help on the internet.

Many Thanks.

Regards,
Sujana. |
From: Ted P. <tpederse@d.umn.edu> - 2010-05-23 14:16:36
|
Greetings all,

I recently participated in the Sense Induction task of Semeval-2, and found it to be a very interesting and worthwhile experience.

http://www.cs.york.ac.uk/semeval2010_WSI/index.html

The final camera-ready version of the paper that describes that experience is now available here:

http://www.d.umn.edu/~tpederse/Pubs/pedersen-semeval2-2010.pdf

Duluth-WSI: SenseClusters Applied to the Sense Induction Task of SemEval-2 (Pedersen) - To Appear in the Proceedings of the SemEval 2010 Workshop : the 5th International Workshop on Semantic Evaluations, July 15-16, 2010, Uppsala, Sweden

In the end, much of this paper is really more about the evaluation methods of the task than about my participating system, although I do give some details of what I attempted in my systems (all of which is available fairly directly from SenseClusters, http://senseclusters.sourceforge.net).

In any case, I do have some concerns about how we do unsupervised evaluations, which I've tried to lay out in this paper, and I continue to think (although it's not explicitly stated in this paper) that the F-score we have been using for evaluation in SenseClusters is pretty reliable. I think it is necessary (but not sufficient) that an evaluation measure for unsupervised sense induction (or discrimination, as we tend to call it) do the following:

1) Not be fooled by random baselines. A random system should get a painfully low score. :)

2) Reward systems that predict the correct number of senses (relative to the gold standard), and penalize those that get the number of clusters wrong with increasing severity as the actual and predicted number of senses differ.

Interestingly enough, some of the evaluation measures in this task did not meet one or both of these conditions, which is part of what prompted the focus of this particular paper. 
The paired F-score that was used in the SemEval-2 task is fairly similar to the SenseClusters F-score, and I think both of these meet the above conditions reasonably well. But, I'll be doing a more formal and comprehensive comparison between them and other possible evaluation methods in the near future to try and establish just how well, and maybe formulate a set of necessary and sufficient conditions that we should try to meet. Any other thoughts and ideas about how to evaluate unsupervised sense induction systems are of course very welcome. Enjoy, Ted -- Ted Pedersen http://www.d.umn.edu/~tpederse |
From: Ted P. <dul...@gm...> - 2009-08-09 18:04:38
|
Greetings all, We've gotten back our main server, and so SenseClusters is now available at : http://marimba.d.umn.edu/cgi-bin/SC-cgi/index.cgi In addition, it will remain at the backup site until further notice : http://talisker.d.umn.edu/cgi-bin/SC-cgi/index.cgi All things being equal you should use marimba (it's a faster machine) but for smaller files you probably won't notice much difference between the two. They are running the exact same version of SenseClusters (v 1.01) Please let us know if you have any questions or concerns. Enjoy! Ted -- Ted Pedersen http://www.d.umn.edu/~tpederse |
From: Ted P. <tpederse@d.umn.edu> - 2009-08-09 18:02:10
|
If you are using an Ubuntu web server (as we do) for the SenseClusters web interface, you will need to be a little careful when setting up latex. The pdf plots that show clustering criterion functions against the number of clusters are generated using gnuplot, ps2pdf, and latex. If you just install latex on Ubuntu, you will not get the fullpage.sty file, which is used in creating these plots. This leads to a "silent failure" in the web interface, where blank pdf files are created for those plots.

To avoid this, make sure to install an extra package that includes fullpage.sty. You need to install the following:

sudo apt-get install texlive-latex-base
sudo apt-get install tetex-extra

In fact, I just noticed that our web server wasn't set up properly early this summer, so this has now been corrected. I'll add a note about this to the install instructions whenever we have a new release.

Please let me know of any questions or concerns!

Cordially,
Ted
--
Ted Pedersen http://www.d.umn.edu/~tpederse |
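[Editor's note] A quick way to check ahead of time whether fullpage.sty is visible to TeX, and so avoid the silent blank-pdf failure described above, is kpsewhich, which ships with any standard TeX installation. This is a hedged sketch: the outer guard makes it safe to run on machines without TeX at all.

```shell
# Check whether TeX can find fullpage.sty before the web interface
# needs it; kpsewhich prints the resolved path if the file is visible.
if command -v kpsewhich >/dev/null 2>&1; then
  if sty=$(kpsewhich fullpage.sty); then
    result="found: $sty"
  else
    result="missing: install the extra latex package"
  fi
else
  result="no TeX installation detected"
fi
echo "$result"
```

Running this from the web server account after installation would have surfaced the misconfiguration immediately instead of producing blank plots.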
From: Ted P. <dul...@gm...> - 2008-12-04 18:25:42
|
Hi Savas, Thanks for the interesting question. There is no annotation tool that does Senseval-2 format, although it's generally such an uncomplicated format it isn't hard to convert into. We have a number of converter programs that convert text in various formats into the Senseval-2 format, you might want to try one of those first and see if they support a format that is convenient for you to work with. Then you could create your data in that format, run the converter on it and then have Senseval-2 format. You can find those converters here... http://www.d.umn.edu/~tpederse/tools.html You will also notice that SenseClusters has a Senseval2 to text converter, which can be useful to get data out of the Senseval-2 format... http://search.cpan.org/~tpederse/Text-SenseClusters-1.01/Toolkit/preprocess/sval2/sval2plain.pl I hope this all helps! Please let us know if there are additional questions, concerns, ideas, etc. Cordially, Ted On Wed, Dec 3, 2008 at 11:28 AM, Savas Yildirim <sa...@gm...> wrote: > Hi, > > Is there any Sense Annotater Tool in accordance with Senseval Format ? > Or do I should annotate my corpus manually... > > Best Regards > > -- > Savas Yildirim > Istanbul Bilgi University & Universitat Tubingen > > Postal Address in Tuebingen: > Seminar für Sprachwissenschaft > Universität Tübingen > Wilhelmstraße 19 > Room 1.07 > D-72074 Tübingen > > Postal Address in Istanbul: > Sisli 34440 Dolapdere Kurtulusdere cad. No:47 > Istanbul / Turkey > Phone: > (0090) (212) 311 50 00 > > ------------------------------------------------------------------------- > This SF.Net email is sponsored by the Moblin Your Move Developer's challenge > Build the coolest Linux based applications with Moblin SDK & win great prizes > Grand prize is a trip for two to an Open Source event anywhere in the world > http://moblin-contest.org/redirect.php?banner_id=100&url=/ > _______________________________________________ > senseclusters-developers mailing list > sen...@li... 
> https://lists.sourceforge.net/lists/listinfo/senseclusters-developers > -- Ted Pedersen http://www.d.umn.edu/~tpederse |
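[Editor's note] For readers wondering what invoking these converters looks like: based on the invocation pattern in the package's own test suite (e.g. sval2plain.pl test-A1.sval2), a typical run would look like the sketch below. The file names are placeholders, and the guard means the converter is only invoked if it is actually installed and an input file exists.

```shell
# Illustrative invocation of the Senseval-2 -> plain text converter.
# input.sval2 / output.txt are placeholder names, not real files here.
if command -v sval2plain.pl >/dev/null 2>&1 && [ -f input.sval2 ]; then
  sval2plain.pl input.sval2 > output.txt
  status="converted"
else
  status="skipped (converter or input not present)"
fi
echo "$status"
```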
From: Savas Y. <sa...@gm...> - 2008-12-03 17:28:40
|
Hi,

Is there any sense annotator tool that works with the Senseval format? Or should I annotate my corpus manually?

Best Regards
--
Savas Yildirim
Istanbul Bilgi University & Universitat Tubingen

Postal Address in Tuebingen: Seminar für Sprachwissenschaft Universität Tübingen Wilhelmstraße 19 Room 1.07 D-72074 Tübingen

Postal Address in Istanbul: Sisli 34440 Dolapdere Kurtulusdere cad. No:47 Istanbul / Turkey Phone: (0090) (212) 311 50 00 |
From: Ted P. <tpederse@d.umn.edu> - 2008-07-11 15:38:12
|
Hi Paula, I very much appreciate these small details, they are quite helpful! You've actually solved a mystery for me - when I updated the web interface to version 1.01, I wasn't sure why a particular line was in our config.txt file - that line was : cgi=cgi-bin I decided to omit that line just because it seemed not to impact anything, but, you discovered otherwise. :) So, to make a long story short, I've put that line back into our config.txt file, and the "Start Over" link is working again. Thanks very much, and please do let us know of anything else you might notice, or questions you might have! Cordially, Ted On Fri, Jul 11, 2008 at 4:47 AM, Paula Cristina Vaz <pau...@gm...> wrote: > Dear Ted, > > I have been using SenseClusters again. The last page (the results page) has > a link that says "Start over". > > When I try to use this link I get this messager. > > Not Found > > The requested URL /SC-cgi/index.cgi was not found on this server. > > ________________________________ > > Is just a detail! > Best regards, > -- > Paula Cristina Vaz > > Faz o que deves e... está no que fazes! -- Ted Pedersen http://www.d.umn.edu/~tpederse |
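[Editor's note] Since the missing cgi=cgi-bin line failed silently (a dead "Start Over" link rather than an error), a small sanity check over config.txt can catch this class of problem. The sample file below mirrors the keys mentioned in this thread and in the marimba config.txt shown elsewhere in this archive; treating exactly these four keys as required is an assumption made for illustration.

```shell
# Write a sample config.txt and verify the keys the web interface is
# known (from this thread) to read, including the easy-to-drop cgi= line.
# The required-key list is an assumption for illustration.
cat > /tmp/sc-config.txt <<'EOF'
PATH=/bin:/sbin:/usr/bin:/usr/sbin:/usr/local/bin
SC-cgi=/usr/lib/cgi-bin/SC-cgi
SC-htdocs=/var/www/SC-htdocs
cgi=cgi-bin
EOF

missing=0
for key in PATH SC-cgi SC-htdocs cgi; do
  if grep -q "^${key}=" /tmp/sc-config.txt; then
    echo "ok: ${key}"
  else
    echo "MISSING: ${key}"
    missing=$((missing + 1))
  fi
done
```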
From: Ted P. <dul...@gm...> - 2008-06-17 22:34:25
|
Greetings all,

I had a small adventure today moving the SenseClusters web interface from its temporary home on talisker (a Fedora Core Linux system) back to its permanent home, marimba, which is our designated "production" web server, newly upgraded with 12 GB of memory and the new Ubuntu Linux distribution (Hardy). Getting the web interface running involved a few changes to the system, which I will integrate into the notes for the next release, in the event you wish to run the web server on Ubuntu. Please note that all that follows is specific to the web interface, so you don't need to worry about it if you don't run that.

So, a few simple things first. Ubuntu prefers that Perl be located at /usr/bin/perl, so I changed all references to /usr/local/bin/perl in the Web/ .pl and .cgi files. It would be possible to create an alias for /usr/local/bin/perl, but I have observed that this sometimes causes problems (for reasons I don't fully understand).

Then, config.txt needs to be changed. Fortunately we've kept the install on Ubuntu fairly simple, so there aren't too many paths involved. In our example config.txt file you see a situation where different programs are installed in many different locations, so it is fairly complicated. That is what you need to do if you aren't able to install everything you need as root - at this point we are able to do that on marimba, so things are pretty simple. Here's the config.txt file that is currently being used on marimba. 
PATH=/bin:/sbin:/usr/bin:/usr/sbin:/usr/local/bin
PERL5LIB=/usr/local/lib/perl/5.8.8:/usr/local/share/perl/5.8.8:/usr/lib/perl5:/usr/share/perl5:/usr/lib/perl/5.8:/usr/share/perl/5.8:/usr/local/lib/site_perl
SC-cgi=/usr/lib/cgi-bin/SC-cgi
SC-htdocs=/var/www/SC-htdocs

If you are paying close attention, you'll note that this config.txt file has one less line than the example - I didn't actually see that the cgi=cgi-bin line was being used, so I omitted it, although it doesn't hurt anything being there. In general, when dealing with Ubuntu one must be aware that the default locations for cgi-bin and htdocs are somewhat different from what one normally gets with apache. You can see above that the cgi directory is /usr/lib/cgi-bin, and the htdocs directory is /var/www. The install instructions are currently somewhat oriented towards the Fedora / Red Hat organization of things, but in future releases I'll make them more neutral between the two.

All of the above was pretty easy. This last one was the painful part. :) I was running the web interface, and I got to the last screen, and I kept getting errors saying that there was a problem and that the logfile couldn't be opened. 
In general the best thing to do when you see those kinds of errors is to check out the apache logs, found in Ubuntu at /var/log/apache2/error.log In that I found the following: sh: Syntax error: Bad fd number This is a strange looking error, and fd made me think of file descriptors, so I was looking around very carefully at file permissions (recall that those need to be set to allow rwx access all the way to the directories where we write output files, and this is among the more common errors in setting up the web interface, along with syntax errors in config.txt) Anyway, after digging around in the Ubuntu mailing lists, I found the following: https://wiki.ubuntu.com/DashAsBinSh So, for whatever reason Ubuntu has decided to make a fairly significant change to /bin/sh, which is our generic command line processor. Apparently this change should not affect you if you are POSIX compliant, but, I'm not even sure what that means - I suspect there might be an issue with the Perl system( ) calls that we use when moving files around in the web interface. But, rather than trying to debug that I decided to restore marimba to using the standard /bin/sh via this command: sudo dpkg-reconfigure dash After I did that, the web interface was working. So, marimba will be the permanent home of SenseClusters - talisker will move on to other uses in the next week or two (although the web interface is still there, all links have been changed to point to marimba). Give the web interface to SenseClusters a try if you haven't lately, and let us know of any problems or concerns you may have. http://marimba.d.umn.edu/cgi-bin/SC-cgi/index.cgi Enjoy, Ted -- Ted Pedersen http://www.d.umn.edu/~tpederse |
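[Editor's note] The "Bad fd number" error above is a classic symptom of a bashism running under dash: for example, the bash shorthand `cmd >& file` is rejected by dash with exactly that message. The POSIX-portable spelling is shown below; assuming (as the error suggests) the system() calls used a bash-only redirection, rewriting them this way would be the alternative to the dpkg-reconfigure dash workaround.

```shell
# Bash-only:  some_command >& logfile     (dash: "Syntax error: Bad fd number")
# POSIX form: redirect stdout to the file, then duplicate stderr onto
# stdout; this works whether /bin/sh is bash or dash.
echo "hello from a POSIX-safe redirect" > /tmp/sc-demo.log 2>&1
msg=$(cat /tmp/sc-demo.log)
echo "$msg"
```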
From: Ted P. <dul...@gm...> - 2008-04-13 20:48:46
|
On Sat, Apr 12, 2008 at 5:19 AM, Teshome Kassie <tk...@ya...> wrote: > Hell all; > > Does SenseClusters support Utf-8 ? > > Teshome > Great question, and I think the answer is no. Unfortunately not. The main issue I think is not so much SenseClusters as it is Text::NSP, which is what we use for a significant portion of our feature extraction needs. There has been considerable discussion regarding how to make Text::NSP better at handling different character sets. If you are interested in the history of that discussion, you can see the most recent version of it here: http://www.mail-archive.com/ng...@ya.../msg00156.html The short version is that I've decided that the right thing to do is to use the Perl module Encode in Text::NSP to provide full unicode support. The only draw back is that this requires a bit of work, and right now it hasn't risen high enough in the queue. But, it's getting there, especially since SenseClusters has such a heavy dependence on Text::NSP. http://search.cpan.org/dist/Encode/ So, that's the long term solution I have planned. Unfortunately that doesn't help much in the shorter term. Sorry I don't have a better answer. Other suggestions are most welcome. Cordially, Ted -- Ted Pedersen http://www.d.umn.edu/~tpederse |
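[Editor's note] As a glimpse of the planned Encode-based approach mentioned above, the one-liner below decodes two UTF-8 bytes into a single character, after which length() counts characters rather than bytes. This is a generic illustration of the Encode module, not code from Text::NSP or SenseClusters, and it is guarded so it only runs where perl is available.

```shell
# Decode the two UTF-8 bytes 0xC3 0xA9 (U+00E9, "e" with acute accent)
# into one character and count characters, not bytes. Generic Encode
# illustration only; not code from Text::NSP or SenseClusters.
if command -v perl >/dev/null 2>&1; then
  chars=$(perl -MEncode -e 'print length(Encode::decode("UTF-8", "\xc3\xa9"))')
else
  chars="skipped"
fi
echo "$chars"
```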
From: Teshome K. <tk...@ya...> - 2008-04-12 10:19:26
|
Hell all; Does SenseClusters support Utf-8 ? Teshome sen...@li... wrote: Send senseclusters-developers mailing list submissions to sen...@li... To subscribe or unsubscribe via the World Wide Web, visit https://lists.sourceforge.net/lists/listinfo/senseclusters-developers or, via email, send a message with subject or body 'help' to sen...@li... You can reach the person managing the list at sen...@li... When replying, please edit your Subject line so it is more specific than "Re: Contents of senseclusters-developers digest..." Today's Topics: 1. Fwd: New module Text::SenseClusters (Ted Pedersen) 2. Re: SVDPACKC install problem (Ted Pedersen) 3. Re: senseclusters (Ted Pedersen) 4. compat-gcc-32 and SVDPACKC (Ted Pedersen) ---------------------------------------------------------------------- Message: 1 Date: Mon, 7 Apr 2008 15:36:25 -0500 From: "Ted Pedersen" Subject: [Senseclusters-developers] Fwd: New module Text::SenseClusters To: sen...@li... Message-ID: Content-Type: text/plain; charset=ISO-8859-1 SenseClusters is now a registered Perl module. :) This means that people will see it in the list of registered modules, which helps some with respect to visibility. I don't know what the criteria for registration turn out to be, but I know not all modules get registered (even if the developer requests). So, it's a small thing but still nice. ---------- Forwarded message ---------- From: Perl Authors Upload Server Date: Mon, Apr 7, 2008 at 2:11 AM Subject: New module Text::SenseClusters To: mo...@pe..., tpe...@cp... 
The next version of the Module List will list the following module: modid: Text::SenseClusters DSLIP: Rdpfg description: Cluster Similar Words and Contexts userid: TPEDERSE (Ted Pedersen) chapterid: 11 (String_Lang_Text_Proc) enteredby: BDFOY (brian d foy) enteredon: Mon Apr 7 07:11:48 2008 GMT The resulting entry will be: Text:: ::SenseClusters Rdpfg Cluster Similar Words and Contexts TPEDERSE Please allow a few days until the entry will appear in the published module list. Parts of the data listed above can be edited interactively on the PAUSE. See https://pause.perl.org/pause/authenquery?ACTION=edit_mod Thanks for registering, -- The PAUSE -- Ted Pedersen http://www.d.umn.edu/~tpederse ------------------------------ Message: 2 Date: Tue, 8 Apr 2008 08:36:58 -0500 From: "Ted Pedersen" Subject: Re: [Senseclusters-developers] SVDPACKC install problem To: "Teshome Kassie" Cc: sen...@li... Message-ID: Content-Type: text/plain; charset=ISO-8859-1 Hi Teshome, It looks like you don't have PDL or Bit::Vector installed. Both of those need to be on your system for SenseClusters to run. You can see that they are missing because of messages like this... Can't locate Bit/Vector.pm in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.8.8 /usr/local/share/perl/5.8.8 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.8 /usr/share/perl/5.8 /usr/local/lib/site_perl .) at /usr/local/bin/bitsimat.pl line 340. BEGIN failed--compilation aborted at /usr/local/bin/bitsimat.pl line 340. Test Error Can't locate PDL.pm in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.8.8 /usr/local/share/perl/5.8.8 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.8 /usr/share/perl/5.8 /usr/local/lib/site_perl .) at /usr/local/bin/simat.pl line 307. BEGIN failed--compilation aborted at /usr/local/bin/simat.pl line 307. Test Error So, Bit-Vector and PDL both have C components to them, so I'm guessing the problem you were having with gcc probably prevented them from installing. 
You could just try and install them using the Bundle again... cpan > install Bundle::Text::SenseClusters and that will locate any missing dependencies that aren't yet installed... Good luck! Ted On Tue, Apr 8, 2008 at 8:25 AM, Teshome Kassie wrote: > Hello Ted, > > I have solved the first problem that is installing external packages. > > But when I run ALL-TESTS.sh I got the results with reporting errors. > > I have attached it for review. > Could you help me once again what to do? > > Teshome > > > > Ted Pedersen wrote: > > Hi Teshome, > > It looks to me like you are having problems with compiling las2.c - > you do appear to have gcc installed, but then it's not finding your > include files (like stdio.h, which are usually provided with your > system....So, without finding those .h files nothing else will work > with the compile. > > The output of your gcc -v command tells me you are running Ubuntu, and > one of the strange things about Ubuntu is that it does not include > "developer" settings by default, that is to say Ubuntu sort of assumes > you won't be compiling C programs, so they don't give you the .h files > by default. You just need to install those.... > > Here's a nice post on this issue from : http://www.spiration.co.uk/post/1291 > > I've cut and pasted that note below, which I think is right on target. > If you run the apt-get command below I think things will be fine. If > you want to run install.sh again you should just delete that cluto > directory that got unpacked in the same directory as install.sh, and > then submit again....(after doing the apt-get command below). > > Let us know how that works out... > > Good luck, > Ted > > ============================================================== > > Somehow I assumed that I would be able to compile a basic C program on > any linux box - I mean unices are useful like that, right? 
So I was a > bit surprised when I decided to compile a bit of C just now (in fact > Christian Wolff's neat little mp3cut tool) and was faced with the > following errors: > > chris@snackerjack-lx:/usr/src/mp3cut-0.8$ make > gcc -o mp3cut mp3cut.c > mp3cut.c:25:19: error: stdio.h: No such file or directory > mp3cut.c:26:20: error: stdlib.h: No such file or directory > mp3cut.c:27:20: error: string.h: No such file or directory > mp3cut.c:28:20: error: unistd.h: No such file or directory > > ..etc .. etc > > > So what kind of unix comes with make and a compiler, but none of the > required dev libraries and headers required to make any normal C > program work? Well a brief google yielded the following solution.. > Yup, you guessed it.. you need to install a dev package: > > sudo apt-get install build-essential > > > Excuse my rant, but if it's so 'essential', then why isn't it > installed as part of the core system? I find that kinda weird. Anyway, > problem fixed and C-sources are now compiling. > > christo > > > > ======================================================= > > % sudo ./install.sh /usr/local/bin > [sudo] password for teshomek: > rm: No match. > ************************************************** > let's install svdpackc... > this involes compiling the las2 program and then > doing a very simple check to make sure that worked > via a diff command of output produced by your > installed version with a key we provide (lao2.key) > > your gcc version is gcc (GCC) 4.1.3 20070929 (prerelease) (Ubuntu > 4.1.2-16ubuntu2) Copyright (C) 2006 Free Software Foundation, Inc. > This is free software; see the source for copying conditions. There is > NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR > PURPOSE. 
> SVDPACKC generally requires 3.2 or 3.3, and usually > has problems with 4.0 or above > > gcc -ansi -O -c las2.c > las2.c:12:19: error: stdio.h: No such file or directory > las2.c:13:20: error: stdlib.h: No such file or directory > las2.c:14:20: error: string.h: No such file or directory > las2.c:15:19: error: errno.h: No such file or directory > las2.c:16:18: error: math.h: No such file or directory > las2.c:17:19: error: fcntl.h: No such file or directory > In file included from las2.c:18: > las2.h:63: error: 'NULL' undeclared here (not in a function) > las2.h:84: error: expected '=', ',', ';', 'asm' or > '__attribute__' before '*' token > las2.c: In function 'main': > las2.c:152: error: 'FILE' undeclared (first use in this function) > > las2.c:152: error: (Each undeclared identifier is reported only once > las2.c:152: error: for each function it appears in.) > las2.c:152: error: 'fp_in1' undeclared (first use in this function) > las2.c:152: error: 'fp_in2' undeclared (first use in this function) > > las2.c:161: warning: incompatible implicit declaration of built-in > function 'printf' > > las2.c:162: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c:165: warning: incompatible implicit declaration of built-in > function 'printf' > > las2.c:166: warning: incompatible implicit declaration of built-in > function 'exit' > las2.c:168: error: 'fp_out1' undeclared (first use in this function) > > las2.c:169: warning: incompatible implicit declaration of built-in > function 'printf' > > las2.c:170: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c:174: warning: incompatible implicit declaration of built-in > function 'fscanf' > > las2.c:191: warning: incompatible implicit declaration of built-in > function 'printf' > > las2.c:192: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c:214: warning: incompatible implicit declaration of built-in > function 'exit' > > 
las2.c:234: warning: incompatible implicit declaration of built-in > function 'malloc' > > las2.c:237: warning: incompatible implicit declaration of built-in > function 'exit' > las2.c:237: error: 'errno' undeclared (first use in this function) > > las2.c:291: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c:301: warning: incompatible implicit declaration of built-in > function 'fprintf' > > las2.c:330: warning: incompatible implicit declaration of built-in > function 'sqrt' > > las2.c:365: warning: incompatible implicit declaration of built-in > function 'fabs' > > las2.c:384: warning: incompatible implicit declaration of built-in > function 'sqrt' > > las2.c:398: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c: At top level: > las2.c:403: error: expected '=', ',', ';', 'asm' or > '__attribute__' before '*' token > las2.c: In function 'check_parameters': > > las2.c:453: warning: incompatible implicit declaration of built-in > function 'fprintf' > las2.c:453: error: 'fp_out1' undeclared (first use in this function) > > las2.c: At top level: > las2.c:457: error: expected '=', ',', ';', 'asm' or > '__attribute__' before '*' token > las2.c: In function 'write_data': > > las2.c:470: warning: incompatible implicit declaration of built-in > function 'fprintf' > las2.c:470: error: 'fp_out1' undeclared (first use in this function) > las2.c: In function 'landr': > > las2.c:594: warning: incompatible implicit declaration of built-in > function 'sqrt' > > las2.c:613: warning: incompatible implicit declaration of built-in > function 'malloc' > > las2.c:615: warning: incompatible implicit declaration of built-in > function 'exit' > las2.c:615: error: 'errno' undeclared (first use in this function) > > las2.c:635: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c:637: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'ritvec': > > 
las2.c:725: warning: incompatible implicit declaration of built-in > function 'malloc' > > las2.c:727: warning: incompatible implicit declaration of built-in > function 'exit' > las2.c:727: error: 'errno' undeclared (first use in this function) > > las2.c:749: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'lanso': > > las2.c:953: warning: incompatible implicit declaration of built-in > function 'printf' > > las2.c:961: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'lanczos_step': > > las2.c:1093: warning: incompatible implicit declaration of built-in > function 'fabs' > > las2.c:1112: warning: incompatible implicit declaration of built-in > function 'sqrt' > las2.c: In function 'purge': > > las2.c:1255: warning: incompatible implicit declaration of built-in > function 'fabs' > > las2.c:1280: warning: incompatible implicit declaration of built-in > function 'sqrt' > las2.c: In function 'stpone': > > las2.c:1367: warning: incompatible implicit declaration of built-in > function 'sqrt' > > las2.c:1368: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'startv': > > las2.c:1467: warning: incompatible implicit declaration of built-in > function 'sqrt' > las2.c: In function 'random': > > las2.c:1514: warning: incompatible implicit declaration of built-in > function 'atan' > > las2.c:1515: warning: incompatible implicit declaration of built-in > function 'sqrt' > las2.c: In function 'pythag': > > las2.c:1564: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'error_bound': > > las2.c:1630: warning: incompatible implicit declaration of built-in > function 'fabs' > > las2.c:1632: warning: incompatible implicit declaration of built-in > function 'sqrt' > > las2.c:1640: warning: incompatible implicit declaration of built-in > function 'sqrt' > las2.c: In function 'imtqlb': > > las2.c:1738: 
warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'imtql2': > > las2.c:1894: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'store': > > las2.c:2159: warning: incompatible implicit declaration of built-in > function 'fprintf' > las2.c:2159: error: 'stderr' undeclared (first use in this function) > > las2.c:2165: warning: incompatible implicit declaration of built-in > function 'fprintf' > las2.c: In function 'idamax': > > las2.c:2381: warning: incompatible implicit declaration of built-in > function 'fabs' > > > make: *** [las2.o] Error 1 > las2: Command not found. > > check your las2 output against our key ... > diff: lao2: No such file or directory > > there *may* be some differences in the output of > your lao2 file compared to the key we provide > these are due to execution time differences and > arithmetic differences on different architectures > however, as long as your lao2 file has some output > in a format similar to lao2.key then it you can > assume it has compiled and is running successfully > > clean up a few output files... > cp: cannot stat `las2': No such file or directory > rm -fr las2.o timersun.o las2 lav2 matrix > > ...now installing las2 in /usr/local/bin > *************************************************** > now let's install cluto .... > we are using wget, if you don't have that installed > or there are some problems accessing the cluto site > this could fail, in which case you would need to > visit http://glaros.dtc.umn.edu/gkhome/fetch/sw/cluto/cluto-2.1.1.tar.gz > and download to install (verify the url is correct) > > --10:39:38-- > http://glaros.dtc.umn.edu/gkhome/fetch/sw/cluto/cluto-2.1.1.tar.gz > => `cluto-2.1.1.tar.gz' > Resolving glaros.dtc.umn.edu... 128.101.191.158 > Connecting to glaros.dtc.umn.edu|128.101.191.158|:80... connected. > HTTP request sent, awaiting response... 
200 OK > Length: 9,364,297 (8.9M) [application/x-gzip] > > 100%[====================================>] 9,364,297 2.88K/s ETA 00:00 > > 12:34:26 (1.33 KB/s) - `cluto-2.1.1.tar.gz' saved [9364297/9364297] > > ...will now unzip cluto-2.1.1.tar.gz > ...will now untar cluto-2.1.1.tar > cluto-2.1.1/ > cluto-2.1.1/CHANGES > cluto-2.1.1/cluto.h > cluto-2.1.1/COPYRIGHT > cluto-2.1.1/Linux/ > cluto-2.1.1/Linux/libcluto.a > cluto-2.1.1/Linux/scluster > cluto-2.1.1/Linux/vcluster > cluto-2.1.1/manual.pdf > cluto-2.1.1/manual.ps > cluto-2.1.1/Matrices/ > cluto-2.1.1/Matrices/genes1.mat > cluto-2.1.1/Matrices/genes1.mat.rlabel > cluto-2.1.1/Matrices/genes2.mat > cluto-2.1.1/Matrices/genes2.mat.clabel > cluto-2.1.1/Matrices/genes2.mat.rlabel > cluto-2.1.1/Matrices/k1b.mat > cluto-2.1.1/Matrices/k1b.mat.clabel > cluto-2.1.1/Matrices/k1b.mat.rclass > cluto-2.1.1/Matrices/README > cluto-2.1.1/Matrices/sports.clabel > cluto-2.1.1/Matrices/sports.mat > cluto-2.1.1/Matrices/sports.rclass > cluto-2.1.1/Matrices/t4.mat > cluto-2.1.1/Matrices/t7.mat > cluto-2.1.1/Matrices/tr23.graph > cluto-2.1.1/Matrices/tr23.graph.rclass > cluto-2.1.1/Matrices/tr23.mat > cluto-2.1.1/Matrices/tr23.mat.clabel > cluto-2.1.1/Matrices/tr23.mat.r class > cluto-2.1.1/paper1.pdf > cluto-2.1.1/paper2.pdf > cluto-2.1.1/README > cluto-2.1.1/Sun/ > cluto-2.1.1/Sun/libcluto.a > cluto-2.1.1/Sun/scluster > cluto-2.1.1/Sun/vcluster > cluto-2.1.1/VERSION > cluto-2.1.1/Win32/ > cluto-2.1.1/Win32/libcluto.lib > cluto-2.1.1/Win32/scluster.exe > cluto-2.1.1/Win32/vcluster.exe > it looks like you are using Linux ... 
> ...installed scluster and vcluster in /usr/local/bin > ...make sure /usr/local/bin is included in your PATH > > if all has gone well, you have installed svdpackc (las2) > and cluto (scluter and vcluster) in /usr/local/bin > let's check...you should see three files: las2 scluster vcluster > > ls: /usr/local/bin/las2: No such file or directory > -rwxr-x--- 1 root 1178264 2008-04-07 12:34 /usr/local/bin/scluster > -rwxr-x--- 1 root 1212576 2008-04-07 12:34 /usr/local/bin/vcluster > > .... end of External Software Installation for SenseClusters .... > > if you have some problem with this script, please save the output > and send it to tpederse at d.umn.edu for further assistance > % > > > > On Mon, Apr 7, 2008 at 8:02 AM, Teshome Kassie wrote: > > Hi Ted, > > > > I have attached the error with external installation. > > > > Teshome > > > > > > > > > > ________________________________ > > You rock. That's why Blockbuster's offering you one month of Blockbuster > > Total Access, No Cost. > > > > -- > Ted Pedersen > http://www.d.umn.edu/~tpederse > > > > > ________________________________ > You rock. That's why Blockbuster's offering you one month of Blockbuster > Total Access, No Cost. -- Ted Pedersen http://www.d.umn.edu/~tpederse ------------------------------ Message: 3 Date: Wed, 9 Apr 2008 08:19:28 -0500 From: "Ted Pedersen" Subject: Re: [Senseclusters-developers] senseclusters To: wg...@co... Cc: sen...@li... Message-ID: Content-Type: text/plain; charset=ISO-8859-1 This still look pretty strange. Could you send me your modified makedata.sh file? Just a few misc thoughts on makedata.sh - Whenever you run makedata.sh you want to make sure that you don't have a LexSample directory present in that directory - you should rename any existing ones (if you want to keep them). Your messages below indicate that there is (perhaps?) already a LexSample directory present... 
My hope is that you can go back to the original makedata.sh file - and then after you have created a LexSample directory with it, before you run it again, delete or move the LexSample directory before you try it again. I should probably modify makedata.sh to move an existing LexSample directory, to avoid any confusion on that point. Anyway, it's good that the demo script is running, I hope you are able to run the others too. Thanks! Ted On Tue, Apr 8, 2008 at 9:16 PM, wrote: > Hi Ted, > > When I run makedata.sh, the error information is: > mv: overwrite `cool.a-test.xml'? y > mv: overwrite `day.n-test.xml'? y > mv: overwrite `facility.n-test.xml'? y > mv: overwrite `fine.a-test.xml'? y > mv: overwrite `free.a-test.xml'? y > mv: overwrite `grip.n-test.xml'? y > mv: overwrite `live.v-test.xml'? y > mv: overwrite `material.n-test.xml'? > mv: overwrite `mouth.n-test.xml'? y > mv: overwrite `natural.a-test.xml'? y > mv: overwrite `post.n-test.xml'? y > mv: overwrite `simple.a-test.xml'? y > ERROR(frequency.pl): > Source file doesn't exist ... > ERROR(filter.pl): > Source file doesn't exist... > mv: overwrite `train.v-test.xml'? y === message truncated === __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com |
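The workaround described in that last message — moving any existing LexSample directory aside before re-running makedata.sh — can be sketched as follows. This is only a sketch; it assumes you run it from the directory that contains LexSample, and the timestamped backup name is just one possible choice:

```shell
# Move any leftover LexSample directory out of the way so makedata.sh
# starts from a clean slate (keeps the old results under a backup name).
if [ -d LexSample ]; then
  mv LexSample "LexSample.bak.$(date +%s)"
  echo "moved existing LexSample aside"
fi
# ./makedata.sh    # then re-run as usual
```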
From: Ted P. <dul...@gm...> - 2008-04-12 06:07:44
|
Greetings all, More on my continuing saga with SVDPACKC and various versions of gcc. I recently finished upgrading a few of our systems here to a relatively current version of Fedora Core Linux, and found that in the process I ended up using a gcc with a version greater than 4.0.0. In general this did continue to cause problems with SVDPACKC, but not always. However, in the process of figuring out how to "back off" to a less recent version of gcc, I found that quite a few software applications seem to have this problem (not compiling or running under a more recent version of gcc). It seems that in Fedora Core and any other Linux that supports the yum package manager (which is most of them) you can get a gcc 3.2 compiler installed as gcc32, which will not disturb your existing gcc and will then allow you to compile things like SVDPACKC so that they are more likely to run. yum install compat-gcc-32 will get that installed for you, and other package managers will probably do the same thing too. This exists as an rpm at various mirrors, including: http://rpmfind.net/linux/rpm2html/search.php?query=compat-gcc-32 What this ends up installing is a 3.2 version of gcc named gcc32, which you can then use when compiling SVDPACKC (just go to External/makefile and change CC so that it uses gcc32). I did this on my system and then SVDPACKC worked as we hoped, and I still have the most current version of gcc available as gcc. While this isn't a perfect solution, it is at least somewhat systematic and safe, and fairly easy to do on a Linux-based system. You might want to check your system and see if you already happen to have gcc32 installed; if so, you can just use that for installing SVDPACKC. I hope this helps. Further observations, questions, and alternatives are certainly welcome! Cordially, Ted -- Ted Pedersen http://www.d.umn.edu/~tpederse |
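The gcc32 recipe above can be sketched as a small shell fragment. This is a hedged sketch, not part of the SenseClusters distribution: it assumes the SVDPACKC makefile in External/ uses the conventional $(CC) variable, so that CC can be overridden on the make command line; if the makefile hard-codes gcc, edit External/makefile by hand as described in the message.

```shell
# Prefer gcc32 if it is already installed; otherwise suggest installing it.
# Assumes External/makefile honors the $(CC) variable (an assumption --
# if it hard-codes gcc, change CC in External/makefile instead).
if command -v gcc32 >/dev/null 2>&1; then
  echo "found gcc32 ($(gcc32 -dumpversion)); building las2 with it"
  make -C External CC=gcc32
else
  echo "gcc32 not found; try: sudo yum install compat-gcc-32"
fi
```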
From: Ted P. <dul...@gm...> - 2008-04-08 13:37:08
|
Hi Teshome, It looks like you don't have PDL or Bit::Vector installed. Both of those need to be on your system for SenseClusters to run. You can see that they are missing because of messages like this... Can't locate Bit/Vector.pm in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.8.8 /usr/local/share/perl/5.8.8 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.8 /usr/share/perl/5.8 /usr/local/lib/site_perl .) at /usr/local/bin/bitsimat.pl line 340. BEGIN failed--compilation aborted at /usr/local/bin/bitsimat.pl line 340. Test Error Can't locate PDL.pm in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.8.8 /usr/local/share/perl/5.8.8 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.8 /usr/share/perl/5.8 /usr/local/lib/site_perl .) at /usr/local/bin/simat.pl line 307. BEGIN failed--compilation aborted at /usr/local/bin/simat.pl line 307. Test Error So, Bit-Vector and PDL both have C components to them, so I'm guessing the problem you were having with gcc probably prevented them from installing. You could just try and install them using the Bundle again... cpan > install Bundle::Text::SenseClusters and that will locate any missing dependencies that aren't yet installed... Good luck! Ted On Tue, Apr 8, 2008 at 8:25 AM, Teshome Kassie <tk...@ya...> wrote: > Hello Ted, > > I have solved the first problem that is installing external packages. > > But when I run ALL-TESTS.sh I got the results with reporting errors. > > I have attached it for review. > Could you help me once again what to do? > > Teshome > > > > Ted Pedersen <dul...@gm...> wrote: > > Hi Teshome, > > It looks to me like you are having problems with compiling las2.c - > you do appear to have gcc installed, but then it's not finding your > include files (like stdio.h, which are usually provided with your > system....So, without finding those .h files nothing else will work > with the compile. 
> > The output of your gcc -v command tells me you are running Ubuntu, and > one of the strange things about Ubuntu is that it does not include > "developer" settings by default, that is to say Ubuntu sort of assumes > you won't be compiling C programs, so they don't give you the .h files > by default. You just need to install those.... > > Here's a nice post on this issue from : http://www.spiration.co.uk/post/1291 > > I've cut and pasted that note below, which I think is right on target. > If you run the apt-get command below I think things will be fine. If > you want to run install.sh again you should just delete that cluto > directory that got unpacked in the same directory as install.sh, and > then submit again....(after doing the apt-get command below). > > Let us know how that works out... > > Good luck, > Ted > > ============================================================== > > Somehow I assumed that I would be able to compile a basic C program on > any linux box - I mean unices are useful like that, right? So I was a > bit surprised when I decided to compile a bit of C just now (in fact > Christian Wolff's neat little mp3cut tool) and was faced with the > following errors: > > chris@snackerjack-lx:/usr/src/mp3cut-0.8$ make > gcc -o mp3cut mp3cut.c > mp3cut.c:25:19: error: stdio.h: No such file or directory > mp3cut.c:26:20: error: stdlib.h: No such file or directory > mp3cut.c:27:20: error: string.h: No such file or directory > mp3cut.c:28:20: error: unistd.h: No such file or directory > > ..etc .. etc > > > So what kind of unix comes with make and a compiler, but none of the > required dev libraries and headers required to make any normal C > program work? Well a brief google yielded the following solution.. > Yup, you guessed it.. you need to install a dev package: > > sudo apt-get install build-essential > > > Excuse my rant, but if it's so 'essential', then why isn't it > installed as part of the core system? I find that kinda weird. 
Anyway, > problem fixed and C-sources are now compiling. > > christo > > > > ======================================================= > > % sudo ./install.sh /usr/local/bin > [sudo] password for teshomek: > rm: No match. > ************************************************** > let's install svdpackc... > this involes compiling the las2 program and then > doing a very simple check to make sure that worked > via a diff command of output produced by your > installed version with a key we provide (lao2.key) > > your gcc version is gcc (GCC) 4.1.3 20070929 (prerelease) (Ubuntu > 4.1.2-16ubuntu2) Copyright (C) 2006 Free Software Foundation, Inc. > This is free software; see the source for copying conditions. There is > NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR > PURPOSE. > SVDPACKC generally requires 3.2 or 3.3, and usually > has problems with 4.0 or above > > gcc -ansi -O -c las2.c > las2.c:12:19: error: stdio.h: No such file or directory > las2.c:13:20: error: stdlib.h: No such file or directory > las2.c:14:20: error: string.h: No such file or directory > las2.c:15:19: error: errno.h: No such file or directory > las2.c:16:18: error: math.h: No such file or directory > las2.c:17:19: error: fcntl.h: No such file or directory > In file included from las2.c:18: > las2.h:63: error: 'NULL' undeclared here (not in a function) > las2.h:84: error: expected '=', ',', ';', 'asm' or > '__attribute__' before '*' token > las2.c: In function 'main': > las2.c:152: error: 'FILE' undeclared (first use in this function) > > las2.c:152: error: (Each undeclared identifier is reported only once > las2.c:152: error: for each function it appears in.) 
> las2.c:152: error: 'fp_in1' undeclared (first use in this function) > las2.c:152: error: 'fp_in2' undeclared (first use in this function) > > las2.c:161: warning: incompatible implicit declaration of built-in > function 'printf' > > las2.c:162: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c:165: warning: incompatible implicit declaration of built-in > function 'printf' > > las2.c:166: warning: incompatible implicit declaration of built-in > function 'exit' > las2.c:168: error: 'fp_out1' undeclared (first use in this function) > > las2.c:169: warning: incompatible implicit declaration of built-in > function 'printf' > > las2.c:170: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c:174: warning: incompatible implicit declaration of built-in > function 'fscanf' > > las2.c:191: warning: incompatible implicit declaration of built-in > function 'printf' > > las2.c:192: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c:214: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c:234: warning: incompatible implicit declaration of built-in > function 'malloc' > > las2.c:237: warning: incompatible implicit declaration of built-in > function 'exit' > las2.c:237: error: 'errno' undeclared (first use in this function) > > las2.c:291: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c:301: warning: incompatible implicit declaration of built-in > function 'fprintf' > > las2.c:330: warning: incompatible implicit declaration of built-in > function 'sqrt' > > las2.c:365: warning: incompatible implicit declaration of built-in > function 'fabs' > > las2.c:384: warning: incompatible implicit declaration of built-in > function 'sqrt' > > las2.c:398: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c: At top level: > las2.c:403: error: expected '=', ',', ';', 'asm' or > '__attribute__' 
before '*' token > las2.c: In function 'check_parameters': > > las2.c:453: warning: incompatible implicit declaration of built-in > function 'fprintf' > las2.c:453: error: 'fp_out1' undeclared (first use in this function) > > las2.c: At top level: > las2.c:457: error: expected '=', ',', ';', 'asm' or > '__attribute__' before '*' token > las2.c: In function 'write_data': > > las2.c:470: warning: incompatible implicit declaration of built-in > function 'fprintf' > las2.c:470: error: 'fp_out1' undeclared (first use in this function) > las2.c: In function 'landr': > > las2.c:594: warning: incompatible implicit declaration of built-in > function 'sqrt' > > las2.c:613: warning: incompatible implicit declaration of built-in > function 'malloc' > > las2.c:615: warning: incompatible implicit declaration of built-in > function 'exit' > las2.c:615: error: 'errno' undeclared (first use in this function) > > las2.c:635: warning: incompatible implicit declaration of built-in > function 'exit' > > las2.c:637: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'ritvec': > > las2.c:725: warning: incompatible implicit declaration of built-in > function 'malloc' > > las2.c:727: warning: incompatible implicit declaration of built-in > function 'exit' > las2.c:727: error: 'errno' undeclared (first use in this function) > > las2.c:749: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'lanso': > > las2.c:953: warning: incompatible implicit declaration of built-in > function 'printf' > > las2.c:961: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'lanczos_step': > > las2.c:1093: warning: incompatible implicit declaration of built-in > function 'fabs' > > las2.c:1112: warning: incompatible implicit declaration of built-in > function 'sqrt' > las2.c: In function 'purge': > > las2.c:1255: warning: incompatible implicit declaration of built-in > function 
'fabs' > > las2.c:1280: warning: incompatible implicit declaration of built-in > function 'sqrt' > las2.c: In function 'stpone': > > las2.c:1367: warning: incompatible implicit declaration of built-in > function 'sqrt' > > las2.c:1368: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'startv': > > las2.c:1467: warning: incompatible implicit declaration of built-in > function 'sqrt' > las2.c: In function 'random': > > las2.c:1514: warning: incompatible implicit declaration of built-in > function 'atan' > > las2.c:1515: warning: incompatible implicit declaration of built-in > function 'sqrt' > las2.c: In function 'pythag': > > las2.c:1564: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'error_bound': > > las2.c:1630: warning: incompatible implicit declaration of built-in > function 'fabs' > > las2.c:1632: warning: incompatible implicit declaration of built-in > function 'sqrt' > > las2.c:1640: warning: incompatible implicit declaration of built-in > function 'sqrt' > las2.c: In function 'imtqlb': > > las2.c:1738: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'imtql2': > > las2.c:1894: warning: incompatible implicit declaration of built-in > function 'fabs' > las2.c: In function 'store': > > las2.c:2159: warning: incompatible implicit declaration of built-in > function 'fprintf' > las2.c:2159: error: 'stderr' undeclared (first use in this function) > > las2.c:2165: warning: incompatible implicit declaration of built-in > function 'fprintf' > las2.c: In function 'idamax': > > las2.c:2381: warning: incompatible implicit declaration of built-in > function 'fabs' > > > make: *** [las2.o] Error 1 > las2: Command not found. > > check your las2 output against our key ... 
> diff: lao2: No such file or directory > > there *may* be some differences in the output of > your lao2 file compared to the key we provide > these are due to execution time differences and > arithmetic differences on different architectures > however, as long as your lao2 file has some output > in a format similar to lao2.key then it you can > assume it has compiled and is running successfully > > clean up a few output files... > cp: cannot stat `las2': No such file or directory > rm -fr las2.o timersun.o las2 lav2 matrix > > ...now installing las2 in /usr/local/bin > *************************************************** > now let's install cluto .... > we are using wget, if you don't have that installed > or there are some problems accessing the cluto site > this could fail, in which case you would need to > visit http://glaros.dtc.umn.edu/gkhome/fetch/sw/cluto/cluto-2.1.1.tar.gz > and download to install (verify the url is correct) > > --10:39:38-- > http://glaros.dtc.umn.edu/gkhome/fetch/sw/cluto/cluto-2.1.1.tar.gz > => `cluto-2.1.1.tar.gz' > Resolving glaros.dtc.umn.edu... 128.101.191.158 > Connecting to glaros.dtc.umn.edu|128.101.191.158|:80... connected. > HTTP request sent, awaiting response... 
200 OK > Length: 9,364,297 (8.9M) [application/x-gzip] > > 100%[====================================>] 9,364,297 2.88K/s ETA 00:00 > > 12:34:26 (1.33 KB/s) - `cluto-2.1.1.tar.gz' saved [9364297/9364297] > > ...will now unzip cluto-2.1.1.tar.gz > ...will now untar cluto-2.1.1.tar > cluto-2.1.1/ > cluto-2.1.1/CHANGES > cluto-2.1.1/cluto.h > cluto-2.1.1/COPYRIGHT > cluto-2.1.1/Linux/ > cluto-2.1.1/Linux/libcluto.a > cluto-2.1.1/Linux/scluster > cluto-2.1.1/Linux/vcluster > cluto-2.1.1/manual.pdf > cluto-2.1.1/manual.ps > cluto-2.1.1/Matrices/ > cluto-2.1.1/Matrices/genes1.mat > cluto-2.1.1/Matrices/genes1.mat.rlabel > cluto-2.1.1/Matrices/genes2.mat > cluto-2.1.1/Matrices/genes2.mat.clabel > cluto-2.1.1/Matrices/genes2.mat.rlabel > cluto-2.1.1/Matrices/k1b.mat > cluto-2.1.1/Matrices/k1b.mat.clabel > cluto-2.1.1/Matrices/k1b.mat.rclass > cluto-2.1.1/Matrices/README > cluto-2.1.1/Matrices/sports.clabel > cluto-2.1.1/Matrices/sports.mat > cluto-2.1.1/Matrices/sports.rclass > cluto-2.1.1/Matrices/t4.mat > cluto-2.1.1/Matrices/t7.mat > cluto-2.1.1/Matrices/tr23.graph > cluto-2.1.1/Matrices/tr23.graph.rclass > cluto-2.1.1/Matrices/tr23.mat > cluto-2.1.1/Matrices/tr23.mat.clabel > cluto-2.1.1/Matrices/tr23.mat.rclass > cluto-2.1.1/paper1.pdf > cluto-2.1.1/paper2.pdf > cluto-2.1.1/README > cluto-2.1.1/Sun/ > cluto-2.1.1/Sun/libcluto.a > cluto-2.1.1/Sun/scluster > cluto-2.1.1/Sun/vcluster > cluto-2.1.1/VERSION > cluto-2.1.1/Win32/ > cluto-2.1.1/Win32/libcluto.lib > cluto-2.1.1/Win32/scluster.exe > cluto-2.1.1/Win32/vcluster.exe > it looks like you are using Linux ... 
> ...installed scluster and vcluster in /usr/local/bin > ...make sure /usr/local/bin is included in your PATH > > if all has gone well, you have installed svdpackc (las2) > and cluto (scluter and vcluster) in /usr/local/bin > let's check...you should see three files: las2 scluster vcluster > > ls: /usr/local/bin/las2: No such file or directory > -rwxr-x--- 1 root 1178264 2008-04-07 12:34 /usr/local/bin/scluster > -rwxr-x--- 1 root 1212576 2008-04-07 12:34 /usr/local/bin/vcluster > > .... end of External Software Installation for SenseClusters .... > > if you have some problem with this script, please save the output > and send it to tpederse at d.umn.edu for further assistance > % > > > > On Mon, Apr 7, 2008 at 8:02 AM, Teshome Kassie wrote: > > Hi Ted, > > > > I have attached the error with external installation. > > > > Teshome > > > > > > > > > > ________________________________ > > You rock. That's why Blockbuster's offering you one month of Blockbuster > > Total Access, No Cost. > > > > -- > Ted Pedersen > http://www.d.umn.edu/~tpederse > > > > > ________________________________ > You rock. That's why Blockbuster's offering you one month of Blockbuster > Total Access, No Cost. -- Ted Pedersen http://www.d.umn.edu/~tpederse |
From: Ted P. <dul...@gm...> - 2008-04-07 20:36:31
|
SenseClusters is now a registered Perl module. :) This means that people will see it in the list of registered modules, which helps some with respect to visibility. I don't know what the criteria for registration turn out to be, but I know not all modules get registered (even if the developer requests). So, it's a small thing but still nice. ---------- Forwarded message ---------- From: Perl Authors Upload Server <up...@pa...> Date: Mon, Apr 7, 2008 at 2:11 AM Subject: New module Text::SenseClusters To: mo...@pe..., tpe...@cp... The next version of the Module List will list the following module: modid: Text::SenseClusters DSLIP: Rdpfg description: Cluster Similar Words and Contexts userid: TPEDERSE (Ted Pedersen) chapterid: 11 (String_Lang_Text_Proc) enteredby: BDFOY (brian d foy) enteredon: Mon Apr 7 07:11:48 2008 GMT The resulting entry will be: Text:: ::SenseClusters Rdpfg Cluster Similar Words and Contexts TPEDERSE Please allow a few days until the entry will appear in the published module list. Parts of the data listed above can be edited interactively on the PAUSE. See https://pause.perl.org/pause/authenquery?ACTION=edit_mod Thanks for registering, -- The PAUSE -- Ted Pedersen http://www.d.umn.edu/~tpederse |
From: Ted P. <dul...@gm...> - 2008-04-07 15:51:58
|
Hi Teshome, It looks to me like you are having problems compiling las2.c - you do appear to have gcc installed, but it's not finding your include files (like stdio.h), which are usually provided with your system. So, without those .h files nothing else in the compile will work. The output of your gcc -v command tells me you are running Ubuntu, and one of the strange things about Ubuntu is that it does not include "developer" settings by default; that is to say, Ubuntu sort of assumes you won't be compiling C programs, so they don't give you the .h files by default. You just need to install those. Here's a nice post on this issue: http://www.spiration.co.uk/post/1291 I've cut and pasted that note below, which I think is right on target. If you run the apt-get command below I think things will be fine. If you want to run install.sh again, you should just delete the cluto directory that got unpacked in the same directory as install.sh, and then run it again (after doing the apt-get command below). Let us know how that works out... Good luck, Ted ============================================================== Somehow I assumed that I would be able to compile a basic C program on any linux box - I mean unices are useful like that, right? So I was a bit surprised when I decided to compile a bit of C just now (in fact Christian Wolff's neat little mp3cut tool) and was faced with the following errors: chris@snackerjack-lx:/usr/src/mp3cut-0.8$ make gcc -o mp3cut mp3cut.c mp3cut.c:25:19: error: stdio.h: No such file or directory mp3cut.c:26:20: error: stdlib.h: No such file or directory mp3cut.c:27:20: error: string.h: No such file or directory mp3cut.c:28:20: error: unistd.h: No such file or directory ..etc .. etc So what kind of unix comes with make and a compiler, but none of the required dev libraries and headers required to make any normal C program work? Well a brief google yielded the following solution.. Yup, you guessed it.. 
you need to install a dev package: sudo apt-get install build-essential Excuse my rant, but if it's so 'essential', then why isn't it installed as part of the core system? I find that kinda weird. Anyway, problem fixed and C-sources are now compiling. christo ======================================================= % sudo ./install.sh /usr/local/bin [sudo] password for teshomek: rm: No match. ************************************************** let's install svdpackc... this involes compiling the las2 program and then doing a very simple check to make sure that worked via a diff command of output produced by your installed version with a key we provide (lao2.key) your gcc version is gcc (GCC) 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2) Copyright (C) 2006 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. SVDPACKC generally requires 3.2 or 3.3, and usually has problems with 4.0 or above gcc -ansi -O -c las2.c las2.c:12:19: error: stdio.h: No such file or directory las2.c:13:20: error: stdlib.h: No such file or directory las2.c:14:20: error: string.h: No such file or directory las2.c:15:19: error: errno.h: No such file or directory las2.c:16:18: error: math.h: No such file or directory las2.c:17:19: error: fcntl.h: No such file or directory In file included from las2.c:18: las2.h:63: error: ‘NULL’ undeclared here (not in a function) las2.h:84: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘*’ token las2.c: In function ‘main’: las2.c:152: error: ‘FILE’ undeclared (first use in this function) las2.c:152: error: (Each undeclared identifier is reported only once las2.c:152: error: for each function it appears in.) 
las2.c:152: error: ‘fp_in1’ undeclared (first use in this function) las2.c:152: error: ‘fp_in2’ undeclared (first use in this function) las2.c:161: warning: incompatible implicit declaration of built-in function ‘printf’ las2.c:162: warning: incompatible implicit declaration of built-in function ‘exit’ las2.c:165: warning: incompatible implicit declaration of built-in function ‘printf’ las2.c:166: warning: incompatible implicit declaration of built-in function ‘exit’ las2.c:168: error: ‘fp_out1’ undeclared (first use in this function) las2.c:169: warning: incompatible implicit declaration of built-in function ‘printf’ las2.c:170: warning: incompatible implicit declaration of built-in function ‘exit’ las2.c:174: warning: incompatible implicit declaration of built-in function ‘fscanf’ las2.c:191: warning: incompatible implicit declaration of built-in function ‘printf’ las2.c:192: warning: incompatible implicit declaration of built-in function ‘exit’ las2.c:214: warning: incompatible implicit declaration of built-in function ‘exit’ las2.c:234: warning: incompatible implicit declaration of built-in function ‘malloc’ las2.c:237: warning: incompatible implicit declaration of built-in function ‘exit’ las2.c:237: error: ‘errno’ undeclared (first use in this function) las2.c:291: warning: incompatible implicit declaration of built-in function ‘exit’ las2.c:301: warning: incompatible implicit declaration of built-in function ‘fprintf’ las2.c:330: warning: incompatible implicit declaration of built-in function ‘sqrt’ las2.c:365: warning: incompatible implicit declaration of built-in function ‘fabs’ las2.c:384: warning: incompatible implicit declaration of built-in function ‘sqrt’ las2.c:398: warning: incompatible implicit declaration of built-in function ‘exit’ las2.c: At top level: las2.c:403: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘*’ token las2.c: In function ‘check_parameters’: las2.c:453: warning: incompatible implicit declaration of built-in 
function ‘fprintf’ las2.c:453: error: ‘fp_out1’ undeclared (first use in this function) las2.c: At top level: las2.c:457: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘*’ token las2.c: In function ‘write_data’: las2.c:470: warning: incompatible implicit declaration of built-in function ‘fprintf’ las2.c:470: error: ‘fp_out1’ undeclared (first use in this function) las2.c: In function ‘landr’: las2.c:594: warning: incompatible implicit declaration of built-in function ‘sqrt’ las2.c:613: warning: incompatible implicit declaration of built-in function ‘malloc’ las2.c:615: warning: incompatible implicit declaration of built-in function ‘exit’ las2.c:615: error: ‘errno’ undeclared (first use in this function) las2.c:635: warning: incompatible implicit declaration of built-in function ‘exit’ las2.c:637: warning: incompatible implicit declaration of built-in function ‘fabs’ las2.c: In function ‘ritvec’: las2.c:725: warning: incompatible implicit declaration of built-in function ‘malloc’ las2.c:727: warning: incompatible implicit declaration of built-in function ‘exit’ las2.c:727: error: ‘errno’ undeclared (first use in this function) las2.c:749: warning: incompatible implicit declaration of built-in function ‘fabs’ las2.c: In function ‘lanso’: las2.c:953: warning: incompatible implicit declaration of built-in function ‘printf’ las2.c:961: warning: incompatible implicit declaration of built-in function ‘fabs’ las2.c: In function ‘lanczos_step’: las2.c:1093: warning: incompatible implicit declaration of built-in function ‘fabs’ las2.c:1112: warning: incompatible implicit declaration of built-in function ‘sqrt’ las2.c: In function ‘purge’: las2.c:1255: warning: incompatible implicit declaration of built-in function ‘fabs’ las2.c:1280: warning: incompatible implicit declaration of built-in function ‘sqrt’ las2.c: In function ‘stpone’: las2.c:1367: warning: incompatible implicit declaration of built-in function ‘sqrt’ las2.c:1368: warning: incompatible 
implicit declaration of built-in function ‘fabs’ las2.c: In function ‘startv’: las2.c:1467: warning: incompatible implicit declaration of built-in function ‘sqrt’ las2.c: In function ‘random’: las2.c:1514: warning: incompatible implicit declaration of built-in function ‘atan’ las2.c:1515: warning: incompatible implicit declaration of built-in function ‘sqrt’ las2.c: In function ‘pythag’: las2.c:1564: warning: incompatible implicit declaration of built-in function ‘fabs’ las2.c: In function ‘error_bound’: las2.c:1630: warning: incompatible implicit declaration of built-in function ‘fabs’ las2.c:1632: warning: incompatible implicit declaration of built-in function ‘sqrt’ las2.c:1640: warning: incompatible implicit declaration of built-in function ‘sqrt’ las2.c: In function ‘imtqlb’: las2.c:1738: warning: incompatible implicit declaration of built-in function ‘fabs’ las2.c: In function ‘imtql2’: las2.c:1894: warning: incompatible implicit declaration of built-in function ‘fabs’ las2.c: In function ‘store’: las2.c:2159: warning: incompatible implicit declaration of built-in function ‘fprintf’ las2.c:2159: error: ‘stderr’ undeclared (first use in this function) las2.c:2165: warning: incompatible implicit declaration of built-in function ‘fprintf’ las2.c: In function ‘idamax’: las2.c:2381: warning: incompatible implicit declaration of built-in function ‘fabs’ make: *** [las2.o] Error 1 las2: Command not found. check your las2 output against our key ... diff: lao2: No such file or directory there *may* be some differences in the output of your lao2 file compared to the key we provide these are due to execution time differences and arithmetic differences on different architectures however, as long as your lao2 file has some output in a format similar to lao2.key then it you can assume it has compiled and is running successfully clean up a few output files... 
cp: cannot stat `las2': No such file or directory rm -fr las2.o timersun.o las2 lav2 matrix ...now installing las2 in /usr/local/bin *************************************************** now let's install cluto .... we are using wget, if you don't have that installed or there are some problems accessing the cluto site this could fail, in which case you would need to visit http://glaros.dtc.umn.edu/gkhome/fetch/sw/cluto/cluto-2.1.1.tar.gz and download to install (verify the url is correct) --10:39:38-- http://glaros.dtc.umn.edu/gkhome/fetch/sw/cluto/cluto-2.1.1.tar.gz => `cluto-2.1.1.tar.gz' Resolving glaros.dtc.umn.edu... 128.101.191.158 Connecting to glaros.dtc.umn.edu|128.101.191.158|:80... connected. HTTP request sent, awaiting response... 200 OK Length: 9,364,297 (8.9M) [application/x-gzip] 100%[====================================>] 9,364,297 2.88K/s ETA 00:00 12:34:26 (1.33 KB/s) - `cluto-2.1.1.tar.gz' saved [9364297/9364297] ...will now unzip cluto-2.1.1.tar.gz ...will now untar cluto-2.1.1.tar cluto-2.1.1/ cluto-2.1.1/CHANGES cluto-2.1.1/cluto.h cluto-2.1.1/COPYRIGHT cluto-2.1.1/Linux/ cluto-2.1.1/Linux/libcluto.a cluto-2.1.1/Linux/scluster cluto-2.1.1/Linux/vcluster cluto-2.1.1/manual.pdf cluto-2.1.1/manual.ps cluto-2.1.1/Matrices/ cluto-2.1.1/Matrices/genes1.mat cluto-2.1.1/Matrices/genes1.mat.rlabel cluto-2.1.1/Matrices/genes2.mat cluto-2.1.1/Matrices/genes2.mat.clabel cluto-2.1.1/Matrices/genes2.mat.rlabel cluto-2.1.1/Matrices/k1b.mat cluto-2.1.1/Matrices/k1b.mat.clabel cluto-2.1.1/Matrices/k1b.mat.rclass cluto-2.1.1/Matrices/README cluto-2.1.1/Matrices/sports.clabel cluto-2.1.1/Matrices/sports.mat cluto-2.1.1/Matrices/sports.rclass cluto-2.1.1/Matrices/t4.mat cluto-2.1.1/Matrices/t7.mat cluto-2.1.1/Matrices/tr23.graph cluto-2.1.1/Matrices/tr23.graph.rclass cluto-2.1.1/Matrices/tr23.mat cluto-2.1.1/Matrices/tr23.mat.clabel cluto-2.1.1/Matrices/tr23.mat.rclass cluto-2.1.1/paper1.pdf cluto-2.1.1/paper2.pdf cluto-2.1.1/README cluto-2.1.1/Sun/ 
cluto-2.1.1/Sun/libcluto.a cluto-2.1.1/Sun/scluster cluto-2.1.1/Sun/vcluster cluto-2.1.1/VERSION cluto-2.1.1/Win32/ cluto-2.1.1/Win32/libcluto.lib cluto-2.1.1/Win32/scluster.exe cluto-2.1.1/Win32/vcluster.exe it looks like you are using Linux ... ...installed scluster and vcluster in /usr/local/bin ...make sure /usr/local/bin is included in your PATH if all has gone well, you have installed svdpackc (las2) and cluto (scluter and vcluster) in /usr/local/bin let's check...you should see three files: las2 scluster vcluster ls: /usr/local/bin/las2: No such file or directory -rwxr-x--- 1 root 1178264 2008-04-07 12:34 /usr/local/bin/scluster -rwxr-x--- 1 root 1212576 2008-04-07 12:34 /usr/local/bin/vcluster .... end of External Software Installation for SenseClusters .... if you have some problem with this script, please save the output and send it to tpederse at d.umn.edu for further assistance % On Mon, Apr 7, 2008 at 8:02 AM, Teshome Kassie <tk...@ya...> wrote: > Hi Ted, > > I have attached the error with external installation. > > Teshome > > > > > ________________________________ > You rock. That's why Blockbuster's offering you one month of Blockbuster > Total Access, No Cost. -- Ted Pedersen http://www.d.umn.edu/~tpederse |
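The failure mode in the transcript above - gcc present but the system C headers missing - can be detected up front, before attempting the full build. Here is a minimal sketch of such a pre-flight check (a hypothetical helper, not part of install.sh), which compiles a trivial program using the same flags the install script passes to gcc:

```python
import os
import shutil
import subprocess
import tempfile

def c_toolchain_ok():
    """True if `gcc -ansi -O -c` can compile a file that uses common headers."""
    if shutil.which("gcc") is None:
        return False
    # The same headers that las2.c includes at the top of the file.
    source = (
        "#include <stdio.h>\n"
        "#include <stdlib.h>\n"
        "#include <string.h>\n"
        "#include <math.h>\n"
        "int main(void) { return 0; }\n"
    )
    with tempfile.TemporaryDirectory() as tmp:
        cfile = os.path.join(tmp, "conftest.c")
        with open(cfile, "w") as handle:
            handle.write(source)
        # Mirror the install script's compile flags: gcc -ansi -O -c
        result = subprocess.run(
            ["gcc", "-ansi", "-O", "-c", cfile,
             "-o", os.path.join(tmp, "conftest.o")],
            capture_output=True,
        )
        return result.returncode == 0

if not c_toolchain_ok():
    print("C headers or gcc missing; on Ubuntu: sudo apt-get install build-essential")
```

If the check fails on Ubuntu, the fix is exactly the one in the quoted post: install the build-essential package.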
From: Ted P. <dul...@gm...> - 2008-04-06 21:35:19
|
I've updated the web interface to 1.01, and I think all is well. All of the following occurred on marimba.d.umn.edu in /usr/local/apache2:

- Renamed the existing SC-cgi and SC-htdocs directories to SC-cgi_0.95 and SC-htdocs_0.95 to save them.
- Copied the new SC-cgi and SC-htdocs from Text-SenseClusters-1.01/Web to /cgi-bin and /htdocs.
- Changed permissions on SC-cgi and SC-htdocs via chmod -R gou+rwx *
- Created a directory /space/SC101 and, within that, tools and packages directories.
- From SC095 I copied gnuplot, cluto-2.1.1, and SVDPACKC to SC101/tools and did not recompile, since those should not have changed and the binaries remain ok. (We only get binaries of Cluto, so recompiling isn't an option.) Need to make sure the directories where these binaries reside are put in the PATH in config.txt.
- Downloaded the most current copies of our dependent Perl modules except PDL, put them in /SC101/packages, and then installed them. All Perl modules are installed at /space/SC101/tools/lib.
- Did not update PDL, as that requires true root access, which I don't have on marimba (only sudo).
- Modified the config.txt file in SC-cgi to reflect the new locations of PATH and PERL5LIB (the SC101 directories).
- Installed Text-SenseClusters-1.01 using PREFIX=/space/SC101/tools LIB=/space/SC101/tools/lib (these were the same PREFIX and LIB settings for all modules).

The changes to the web interface are very minimal - I changed the titles of the pages from SenseClusters Demos to just SenseClusters, as the interface is a bit more than a demo. Otherwise, the purpose of the upgrade was to keep the web interface synced up with the stable released version, which is now 1.01. I think everything is running now. -- Ted Pedersen http://www.d.umn.edu/~tpederse |
From: Ted P. <tpederse@d.umn.edu> - 2008-04-02 15:28:14
|
Greetings all, As you know, in the past we have made rather stern warnings about what version of gcc you should use for compiling SVDPACKC. However, I have found myself working these days with version 4.1.1 of gcc and having absolutely no problems with SVDPACKC (the problem in the past has been rather inexplicable segmentation faults when running las2). So... I might temper our version requirements a bit in light of my recent experience - in the past we have suggested using a 3.2 or 3.3 version of gcc, but I'm not sure that's either realistic or necessary at this point. If you've had any experiences with SVDPACKC, good or bad, that might shed some light on this, I'd be happy to hear about them. I am actually not a high-powered user of gcc, so I wonder if there are some option settings I don't know about that might help resolve some of this inconsistency as we move from version to version of gcc. Right now we just compile like this... gcc -ansi -O -c las2.c Also, I found that I was having a hard time compiling SVDLIBC on a Xeon processor - so my efforts on that front are momentarily stalled, but will surely continue. So, I'll be updating CPAN with a new development release of SenseClusters sometime in the next few days that will hopefully resolve some issues with svdpackout.pl and might have better guidelines about compiling SVDPACKC. Thereafter we'll move on to considering the issue of SVDPACKC versus SVDLIBC. Cordially, Ted -- Ted Pedersen http://www.d.umn.edu/~tpederse |
From: Ted P. <tpederse@d.umn.edu> - 2008-04-01 00:21:09
|
Greetings all, As I've been working on our migration to CPAN, I've also been working on svdpackout.pl, in order to do some comparisons with SVDLIBC, and also just to understand the inner workings of svdpackout.pl again before making any big changes. There are a couple of issues that I've focused attention on one more time; this is a very belated follow-up prompted by some notes from Richard Wicentowski to the developers list in November 2006 that raised a couple of issues, which had also been raised by other users and developers from time to time in the past. Now, before we get to those, a bit of review. svdpackout.pl takes the output from SVDPACKC and essentially "recombines" the decomposed input matrix in order to build a new matrix that represents the k most significant dimensions in the original data. SVD decomposes the input matrix into three matrices; U, S, and V is a common notation for those. svdpackout.pl does this recombination in two different ways. First, using the --rowonly option: this just takes the M x k matrix U and combines it with the k x k matrix S. M is the number of rows in the original matrix, so we get an M x k matrix that represents the original M x N data. Now, as a part of this operation, when we were doing this recombination we would take the square root of the values in S. Despite my best efforts, I really don't know why we did that, and I don't find much evidence to support the use of this technique in the literature, so I believe we'll stop doing that and will simply provide a --sqrt option to allow for backwards compatibility. Now, was it a bad thing to take this square root? I don't know if it was bad, although I think the effect of it would be to minimize the differences between the values of the k singular values that we find in S. 
So if those values were originally (25, 16, 9, 4) (in a 4x4 diagonal matrix), then of course the resulting values after the square root would be (5, 4, 3, 2), which essentially causes these k values to come together, and in the end may make our resulting M x k recombination harder to cluster. Now, it's important to point out that we do use --rowonly as the default in discriminate, which means that it is also the default in the web interface. However, svdpackout.pl defaulted to a full M x N matrix reconstruction, which did not have the square root operation. So, I think that in our next release we'll require that a user specify --sqrt if they want this particular feature turned on (and it would only have an effect on --rowonly). Otherwise, we won't take the square root of S (k x k) but will instead use the original values. If anyone knows what we were thinking, speak now. :) Next item, and this is some rather powerful and perhaps unwise smoothing that we applied to the recombined matrix in both the --rowonly and the full recombination (in other words, this would happen for all runs of svdpackout.pl). If the value of a cell in the recombined M x k or M x N matrix was less than 0, we would smooth it to 0, thereby eliminating any negative values. I don't have a good explanation for why we chose to do that, and I'm inclined to think it was a bug. Negative values are a natural byproduct of SVD, so simply removing them does not have a good justification (or not one that I can think of, at least). The problem with removing them, of course, is that it changes the nature of the result and causes a fairly significant loss of information in that "direction". Richard provided some code to the developers list quite a while ago that includes a --negatives option to turn off that smoothing, and I think we will make that the default behavior, and only have the smoothing of negative values done by request. 
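The two operations being retired can be illustrated with a toy sketch. This is plain Python, not the actual Perl code in svdpackout.pl; the singular values are the ones from the example above, and the row of U is a made-up, hypothetical example:

```python
import math

# k singular values from the diagonal of S, as in the (25, 16, 9, 4) example:
singular = [25.0, 16.0, 9.0, 4.0]

# 1. The historical --rowonly behavior replaced each singular value with its
#    square root, pulling the k values closer together:
sqrt_singular = [math.sqrt(s) for s in singular]

# Scaling a (hypothetical) row of U by S versus sqrt(S): under the square
# root, the leading dimension loses much of its relative weight.
u_row = [0.5, -0.3, 0.2, 0.1]
scaled = [round(u * s, 10) for u, s in zip(u_row, singular)]

# 2. The old smoothing clamped negative cells of the recombined matrix to
#    zero, discarding the direction information those cells carried:
smoothed = [max(0.0, x) for x in scaled]

print(sqrt_singular)   # [5.0, 4.0, 3.0, 2.0]
print(scaled)          # [12.5, -4.8, 1.8, 0.4]
print(smoothed)        # [12.5, 0.0, 1.8, 0.4]
```

Under the proposed defaults neither step would run; the square root and the smoothing would each be opt-in, for backwards compatibility only.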
Thus, you can anticipate some fairly fundamental changes coming to svdpackout.pl - first, no more square root operation being performed on the S matrix, and second, no more smoothing of negative values to zero in the recombined matrix. In the interests of backwards compatibility we'll maintain this functionality via options that a user can request, but I think in general the default behavior should be that we don't take square roots and we don't smooth negative values to 0. So, stay tuned, and the good news is that I think this might actually cause our results from SVD to be a bit more dramatic than they have been thus far. Our experience over the years has been that SVD does not seem to have too much of an effect on overall results, but I think that might have been because we were diluting the resulting information somewhat via these operations. Now, this still does not address the issue of SVDLIBC versus SVDPACKC, but no matter which direction we go there, svdpackout.pl will remain a part of the picture, and in fact my goal is that if we make a change in how we are computing SVD that it would be largely an invisible change to the user. Comments and questions are of course welcome on this. Thanks, Ted -- Ted Pedersen http://www.d.umn.edu/~tpederse |
From: Ted P. <tpederse@d.umn.edu> - 2008-03-30 15:09:57
|
Greetings all... I've released version 1.00 of SenseClusters, now available on CPAN at http://search.cpan.org/dist/Text-SenseClusters This release includes revised INSTALL instructions (see below :) and also some fairly significant cleanup of the Toolkit program documentation. I am in the process of trying to provide more examples in the SYNOPSIS sections, and then just making miscellaneous changes to enforce consistency among the programs. This is still a development release, however (as indicated by the even number of the release). So, please do use with caution. 0.95 remains the most current "stable" release. I anticipate at least one more development release before having a stable release. I plan to experiment with using SVDLIBC rather than SVDPACKC, which really does seem (from my perspective at least) to be incompatible with gcc version 4.0.0 or later, which will become an increasingly difficult problem to deal with. I'm also continuing to work on improving the installation procedures, making them as automatic as possible. I'm also going to try to provide at least a few test cases that use "make test" so that we can expand our testing efforts in that direction, which is generally more compatible with CPAN releases (and easier for the user to do too). My hope is that as of version 1.00, SenseClusters is backwards compatible to Perl 5.6.2, and that SenseClusters and all the dependent CPAN modules can be installed via the Bundle that has also been provided on CPAN (http://search.cpan.org/dist/Bundle-Text-SenseClusters). So, if you have the opportunity, please do check out the new release; any and all comments are most welcome, and are especially timely now. Cordially, Ted On Sat, Mar 29, 2008 at 9:45 AM, Ted Pedersen <tpederse@d.umn.edu> wrote: > Hi Teshome, > > The commands in that version of the INSTALL documentation are out of > order, unfortunately. Sorry about that, I am fixing that today. 
You > should first run the perl -MCPAN command to get SenseClusters and the > CPAN components, and then you can do the External install. > > Here's a preview of the new instructions - note that you'll need to > locate your .cpan directory to find the sources. That's usually in > your home directory as shown below (or the root home). > > NAME > INSTALL Installation instructions for SenseClusters > > SYNOPSIS > If you have su or sudo access, you should be able to install and test > the installation of SenseClusters via automatic download from CPAN as > follows: > > # install SenseClusters and all dependent CPAN modules > perl -MCPAN -e 'install Bundle::Text::SenseClusters'; > > # install cluto and SVDPACKC (included in SenseClusters) > cd ~/.cpan/build/Text-SenseClusters-[insert_version] > cd External > csh ./install.sh /usr/local/bin > cd ~ > > # run SC test cases (note that the location of the cpan build > # directory might vary on your system) > > cd ~/.cpan/build/Text-SenseClusters-[insert_version] > cd Testing > csh ./ALL-TESTS.sh > cd ~ > > This assumes that /usr/local/bin is in your PATH and is your preferred > location for user-installed executable scripts. If it is not, substitute > your preferred directory here. > > Hope this helps, > Ted > > > > On Sat, Mar 29, 2008 at 2:23 AM, Teshome Kassie <tk...@ya...> wrote: > > Hello Ted; > > I couldn't use your SenseClusters in doing my thesis. The reason is the > > problem of installing it on my PC. I have seen from the internet Bundle Text > > SenseClusters from the web. As I understood, the whole requirement is bundled > > with the above to install it. From the installation instructions, it says before > > installing Bundle Text SenseClusters it is necessary to install the external > > packages CLUTO & SVDPACKC, which could be installed by the following script > > which is provided: > > cd External > > csh ./ALL-TESTS.sh INSTALLDIR > > cd .. > > But I couldn't get the script to use for installing the external packages. 
> > So could you help me to get it. > > In Addition, Please instruct me in detail how to install all the required > > components of SenseClusters in order to use it for the language to apply for > > sense discrimination in a corpus of specific domain. > > Teshome. > > > > > > > > Ted Pedersen <tpederse@d.umn.edu> wrote: > > Hi Teshome, > > > > I'm afraid I'm not sure what the problem is here. PDL is supported by > > another group, so perhaps you could contact them and ask about the > > error. You can find their mailing list at : > > > > http://pdl.perl.org/maillists/ > > > > I am currently using PDL 2.4.1 which is a few versions behind 2.4.3, > > but I wouldn't think there would be that much difference between them. > > This is also the version being used for the SenseClusters web > > interface. > > > > Good luck! > > Ted > > > > > > On Dec 5, 2007 10:17 AM, Teshome Kassie wrote: > > > Dear Sir; > > > > > > I tried to install PDL on my machine so many times according to your > > > instruction to use SenseClusters. The commands I used are as follows with > > > csh prompt: > > > > > > perl -MCPAN -e shell > > > cpan> install PDL > > > then I followed with answering to install for dependencies accordingly. > > > finally I end up with the following error. > > > XXXXXXXXXX Processing gl.h > > > Running cpp on /usr/include/GL/gl.h > > > *** CPP command: gcc -E -P -DGL_MESA_program_debug=0 -D_REENTRANT > > > -D_GNU_SOURCE -fno-strict-aliasing -pipe -Wdeclaration-after-statement > > > -I/usr/local/include -I/usr/include/gdbm -D_REENTRANT -D_GNU_SOURCE > > > -fno-strict-aliasing -pipe -Wdeclaration-after-statement > > > -I/usr/local/include -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 > > > -I/usr/include/gdbm -D_LANGUAGE_C -DAPIENTRY='' tmp_gl.h | > > > open_fencestr = '11HdyTbIVg6s'; close_fencestr = '23Cnba1nbf31' > > > rawfile has 1807 lines... > > > SUB CPP: Returning 1503 lines... 
> > > XXXXXXXXXX Processing glx.h
> > > Running cpp on /usr/include/GL/glx.h
> > > *** CPP command: gcc -E -P -DGL_MESA_program_debug=0 -D_REENTRANT
> > > -D_GNU_SOURCE -fno-strict-aliasing -pipe -Wdeclaration-after-statement
> > > -I/usr/local/include -I/usr/include/gdbm -D_REENTRANT -D_GNU_SOURCE
> > > -fno-strict-aliasing -pipe -Wdeclaration-after-statement
> > > -I/usr/local/include -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64
> > > -I/usr/include/gdbm -D_LANGUAGE_C -DAPIENTRY='' tmp_glx.h |
> > > open_fencestr = '11HdyTbIVg6s'; close_fencestr = '23Cnba1nbf31'
> > > rawfile has 5970 lines...
> > > SUB CPP: Returning 225 lines...
> > > XXXXXXXXXX Processing glu.h
> > > Running cpp on /usr/include/GL/glu.h
> > > *** CPP command: gcc -E -P -DGL_MESA_program_debug=0 -D_REENTRANT
> > > -D_GNU_SOURCE -fno-strict-aliasing -pipe -Wdeclaration-after-statement
> > > -I/usr/local/include -I/usr/include/gdbm -D_REENTRANT -D_GNU_SOURCE
> > > -fno-strict-aliasing -pipe -Wdeclaration-after-statement
> > > -I/usr/local/include -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64
> > > -I/usr/include/gdbm -D_LANGUAGE_C -DAPIENTRY='' tmp_glu.h |
> > > open_fencestr = '11HdyTbIVg6s'; close_fencestr = '23Cnba1nbf31'
> > > rawfile has 1874 lines...
> > > SUB CPP: Returning 137 lines...
> > > cp OpenGL.pm ../../../blib/lib/PDL/Graphics/OpenGL.pm
> > > /usr/bin/perl /usr/lib/perl5/5.8.8/ExtUtils/xsubpp -typemap
> > > /usr/lib/perl5/5.8.8/ExtUtils/typemap -typemap
> > > /root/.cpan/build/PDL-2.4.3/Basic/Core/typemap.pdl OpenGL.xs > OpenGL.xsc
> > > && mv OpenGL.xsc OpenGL.c
> > > Error: No OUTPUT definition for type 'GLvoid', typekind 'T_VOID' found in
> > > OpenGL.xs, line 7547
> > > make[3]: *** [OpenGL.c] Error 1
> > > make[3]: Leaving directory
> > > `/root/.cpan/build/PDL-2.4.3/Graphics/TriD/OpenGL'
> > > make[2]: *** [subdirs] Error 2
> > > make[2]: Leaving directory `/root/.cpan/build/PDL-2.4.3/Graphics/TriD'
> > > make[1]: *** [subdirs] Error 2
> > > make[1]: Leaving directory `/root/.cpan/build/PDL-2.4.3/Graphics'
> > > make: *** [subdirs] Error 2
> > > /usr/bin/make -- NOT OK
> > > Running make test
> > > Can't test without successful make
> > > Running make install
> > > make had returned bad status, install seems impossible
> > >
> > > cpan>
> > >
> > > Any help in finding out what the error is?
> > >
> > > Can I install locally by downloading PDL, and if so, which version?
> > >
> > > With regards;
> > >
> > > Teshome
> >
> > --
> > Ted Pedersen
> > http://www.d.umn.edu/~tpederse
>
> --
> Ted Pedersen
> http://www.d.umn.edu/~tpederse

--
Ted Pedersen
http://www.d.umn.edu/~tpederse
|
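[Editor's note] The install preview in the message above assumes /usr/local/bin is in your PATH. A minimal POSIX-shell sketch of that check follows; the sample PATH value is hardcoded so the result is deterministic, and /usr/local/bin is just the suggested default prefix.

```shell
# Check whether the suggested install prefix appears in a PATH-style
# string. SAMPLE_PATH stands in for the real $PATH so this example is
# deterministic; in practice you would match against ":$PATH:" instead.
SAMPLE_PATH="/usr/bin:/bin"
PREFIX=/usr/local/bin
case ":$SAMPLE_PATH:" in
  *":$PREFIX:"*) RESULT="on PATH" ;;
  *)             RESULT="not on PATH" ;;
esac
echo "$PREFIX is $RESULT"   # with this sample value: "not on PATH"
```

If the prefix is missing, csh users can add it with `set path = ($path /usr/local/bin)`, and sh/bash users with `PATH=$PATH:/usr/local/bin`.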
From: Ted P. <tpederse@d.umn.edu> - 2008-03-23 14:07:03
|
Greetings all,

I will be working on a clean up release of SenseClusters - mainly taking
care of small glitches in documentation, organization, testing, and
installation procedures. I don't plan on adding any new functionality for
this version, more or less just cleaning up a few loose ends. If there are
issues of that sort that you've noticed, this would be a great time to
mention them, as it will be very easy to take care of them now. If there
are larger functionality changes that you are interested in, please don't
hesitate to mention those as well - I might get ambitious, and there will
be another release coming along after this.

I am considering using CPAN as a distribution site in addition to
sourceforge, and have actually uploaded SenseClusters there now - it would
be called Text::SenseClusters at CPAN, just because they frown on
introducing new high level names there, and it helps describe the package
for new users somewhat.

The reason I'm thinking of using CPAN is to further automate the
installation procedure - some time ago I created a Bundle for
SenseClusters (Bundle::SenseClusters, although I might rename it
Bundle::Text::SenseClusters) that, if you install using the CPAN.pm
module, will check for and load all the prerequisite CPAN packages, by
simply doing this...

cpan> install Bundle::SenseClusters

I think we can include SenseClusters in the Bundle, or have it call the
Bundle, and reduce the Perl installation part of SenseClusters to a single
command. Right now it's actually surprisingly easy, but I'd like it to be
easier still.

I'm also contemplating including a pre-edited version of SVDPACKC with the
package, since new releases of that are very, very unlikely, and
distributing it appears to be permissible under the terms of its license.
That would just leave cluto as an external install, which isn't too bad
since that comes as a binary.

The other thing I like about CPAN is that it makes the code and
documentation much more "visible".
You can browse around the source code and documentation and really see it,
and that seems like it will lead to easier maintenance. This might also
make it possible to reduce certain directories that would become
redundant, like our HTML tree (which would be created and available, in
effect, via CPAN). It's important to stress that we would still use
sourceforge for distribution and CVS, so a possible CPAN release is in
addition to what we normally do, not instead of it.

Finally, this is the route that we took with NSP, and over time we ended
up with a package that is much more object oriented - I think that
evolution was in some sense encouraged by being on CPAN.

The good news is that none of this is essential, it's just nice to do. I
downloaded SenseClusters on a new machine today, and was actually quite
impressed at how easy it was to install and get running, and the demo
scripts and test scripts that we make available with it remain extensive
and more complete/thorough than we probably have for any other package.
So, it remains a great example of how to put together a really complicated
system and make it easy to use, and the above are all just tiny tweaks on
what is an extremely solid body of work.

So, suggestions would be particularly helpful now, and if you are
interested in seeing SenseClusters on CPAN you can see an experimental
release there now :

http://search.cpan.org/dist/Text-SenseClusters/

This is version 0.96, where the even number implies a development version,
so 0.95 remains our stable release, and what you should use if you are
interested in doing experiments, etc. with the package. I'll continue to
use the even numbers for development releases like this, hopefully
culminating in a 1.01 stable release - SenseClusters really deserves to be
greater than 1 as well.

Thanks!
Ted

--
Ted Pedersen
http://www.d.umn.edu/~tpederse
|
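[Editor's note] The release-numbering convention described in the message above (even second component = development, odd = stable) can be sketched as a quick shell check; the version string and variable names here are illustrative, not part of SenseClusters itself.

```shell
# Classify a release number under the convention described above:
# an even second component (e.g. 0.96) marks a development release,
# an odd one (e.g. 0.95 or 1.01) a stable release.
version=0.96
minor=${version#*.}          # everything after the first dot: "96"
if [ $((minor % 2)) -eq 0 ]; then
  kind="development"
else
  kind="stable"
fi
echo "$version is a $kind release"   # -> "0.96 is a development release"
```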