Archived messages per month:

| Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|------|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| 2009 |     |     | 1   | 14  | 36  | 148 | 33  | 2   | 17  | 42  | 137 | 88  |
| 2010 | 89  | 80  | 217 | 76  | 5   | 39  | 35  | 4   | 7   | 14  | 12  | 9   |
| 2011 | 6   | 4   | 11  | 55  | 90  | 39  | 15  | 15  | 23  | 12  | 17  | 20  |
| 2012 | 22  | 63  |     | 1   | 6   | 3   | 1   | 1   |     |     |     |     |
| 2013 | 3   | 6   |     |     |     | 4   | 1   | 1   |     |     |     |     |
| 2014 |     |     |     | 7   |     |     |     |     |     |     |     |     |
From: SourceForge.net <no...@so...> - 2011-04-11 14:17:23

Bugs item #3284494, was opened at 2011-04-11 10:17
Message generated for change (Tracker Item Submitted) made by sfrgpiel
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3284494&group_id=248804

Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update.

Category: ui
Group: None
Status: Open
Priority: 8
Private: No
Submitted By: William Piel (sfrgpiel)
Assigned to: hshyket (hshyket)
Summary: When too many trees are uploaded they cannot be deleted

Initial Comment:
Although the verbiage on our upload page (http://www.treebase.org/treebase-web/user/uploadFile.html) clearly states that submitters should limit the number of trees that they upload, they continue to ignore this warning. When 1000+ trees are uploaded the problem is twofold: (1) the user experience is swamped by far too many trees for the same basic dataset, and (2) it is impossible to delete the trees, probably because of a database time-out.

Let's implement two solutions: (1) only the first 30 trees from an incoming tree block will be parsed and stored -- the others ignored; (2) if possible, modify the delete-tree-block function (i.e. http://www.treebase.org/treebase-web/user/deleteATreeBlock.html?treeblockid=xyz) so that each time a tree inside this tree block is deleted, the database does a commit. That way, even if there is a time-out, the database won't completely roll back all the trees that were successfully deleted.

----------------------------------------------------------------------

You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3284494&group_id=248804
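
[Editor's note: the ticket above only describes the two proposals; nothing is implemented yet. The sketch below illustrates proposal (2) only, assuming a plain JDBC connection and hypothetical `trees` / `treeblock_id` table and column names -- the real TreeBASE schema and service layer may differ. The point is simply that committing after each single-tree delete means a later time-out no longer rolls back the deletes that already succeeded.]

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

/** Illustrative sketch only: delete the trees of one tree block, committing after each tree. */
public class TreeBlockCleaner {

    /** Table and column names here are hypothetical; the real TreeBASE schema may differ. */
    public static int deleteTreesOneByOne(Connection conn, long treeBlockId) throws SQLException {
        conn.setAutoCommit(false);

        // Collect the tree ids first, so the deletes can proceed independently of the query.
        List<Long> treeIds = new ArrayList<>();
        try (PreparedStatement select = conn.prepareStatement(
                "SELECT tree_id FROM trees WHERE treeblock_id = ?")) {
            select.setLong(1, treeBlockId);
            try (ResultSet rs = select.executeQuery()) {
                while (rs.next()) {
                    treeIds.add(rs.getLong(1));
                }
            }
        }

        int deleted = 0;
        try (PreparedStatement delete = conn.prepareStatement(
                "DELETE FROM trees WHERE tree_id = ?")) {
            for (Long treeId : treeIds) {
                delete.setLong(1, treeId);
                delete.executeUpdate();
                conn.commit(); // commit per tree: a later time-out keeps earlier deletes
                deleted++;
            }
        }
        return deleted;
    }
}
```

Proposal (1), keeping only the first 30 trees of an incoming tree block, would amount to a simple counter guard in the NEXUS parsing loop and is not sketched here.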
From: <hs...@us...> - 2011-04-08 17:53:24

Revision: 781
http://treebase.svn.sourceforge.net/treebase/?rev=781&view=rev
Author: hshyket
Date: 2011-04-08 17:53:18 +0000 (Fri, 08 Apr 2011)

Log Message:
-----------
Fixed bug where the Dryad import was not populating the study_nexusfile nexus text field. Added a Unix timestamp to file uploads so that each name is unique.

Modified Paths:
--------------
    trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/ProcessUserController.java
    trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/UploadFileController.java

Modified: trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/ProcessUserController.java
===================================================================
--- trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/ProcessUserController.java 2011-03-30 18:58:25 UTC (rev 780)
+++ trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/ProcessUserController.java 2011-04-08 17:53:18 UTC (rev 781)
@@ -8,6 +8,7 @@
 import javax.servlet.http.HttpServletRequest;
 import javax.servlet.http.HttpServletResponse;
+import org.apache.commons.io.FileUtils;
 import org.apache.log4j.Logger;
 import org.springframework.web.servlet.ModelAndView;
 import org.springframework.web.servlet.mvc.Controller;
@@ -91,10 +92,24 @@
 study.setCitation(citation);
 citation.setStudy(study);
 Submission submission = mSubmissionService.createSubmission(user, study);
+long unixTime = System.currentTimeMillis() / 1000L;
 List<File> files = DryadUtil.getDataFiles(dataPath);
-for(int i=0; i<files.size(); i++ )
-    submission.getStudy().addNexusFile(files.get(i).getName(), "NEXUS");
+for(int i=0; i<files.size(); i++ ) {
+    String copyDir = request.getSession().getServletContext()
+        .getRealPath(TreebaseUtil.FILESEP + "NexusFileUpload")
+        + TreebaseUtil.FILESEP + request.getRemoteUser();
+
+    File originalFile = new File(files.get(i).getAbsolutePath());
+    File copyFile = new File(copyDir + TreebaseUtil.FILESEP + unixTime + "_" + files.get(i).getName());
+
+    FileUtils.copyFile(originalFile, copyFile);
+
+    files.remove(i);
+    files.add(i,copyFile);
+
+    submission.getStudy().addNexusFile(files.get(i).getName(), TreebaseUtil.readFileToString(files.get(i)));
+}
 MyProgressionListener listener = new MyProgressionListener();
 getSubmissionService().addNexusFilesJDBC(submission, files, listener);

Modified: trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/UploadFileController.java
===================================================================
--- trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/UploadFileController.java 2011-03-30 18:58:25 UTC (rev 780)
+++ trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/UploadFileController.java 2011-04-08 17:53:18 UTC (rev 781)
@@ -167,13 +167,19 @@
 List<File> files = new ArrayList<File>();
 String firstFile = null;
+
+long unixTime = System.currentTimeMillis() / 1000L;
+
 for (FileBean file : getFiles(request)) {
     if (LOGGER.isDebugEnabled()) {
         LOGGER
             .debug("Uploading file to =>" + uploadDir + TreebaseUtil.FILESEP + file.getName()); //$NON-NLS-1$
     }
+
+    file.setName(unixTime + "_" + file.getName());
     File uploadedFile = new File(uploadDir + TreebaseUtil.FILESEP + file.getName());
+
     FileCopyUtils.copy(file.getData(), uploadedFile);
     files.add(uploadedFile);

This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site.
From: SourceForge.net <no...@so...> - 2011-04-01 16:18:21

Bugs item #3244573, was opened at 2011-03-25 14:24
Message generated for change (Comment added) made by sfrgpiel
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3244573&group_id=248804

Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update.

Category: APIs
Group: None
Status: Open
Priority: 5
Private: No
Submitted By: Kevin S. Clarke (ksclarke)
Assigned to: Nobody/Anonymous (nobody)
Summary: OAI Harvesting Exception

Initial Comment:
Dryad periodically tries to harvest TreeBASE. Today we received this exception:

<title>Java Uncaught Exception</title> <content tag="heading">Uncaught Exception Encountered</content>

java.lang.reflect.InvocationTargetException at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:616) at org.cipres.treebase.web.controllers.OAIPMHController.handle(OAIPMHController.java:117) at org.springframework.web.servlet.mvc.AbstractCommandController.handleRequestInternal(AbstractCommandController.java:84) at org.springframework.web.servlet.mvc.AbstractController.handleRequest(AbstractController.java:153) at org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:48) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:858) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:792) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:476) at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:431) at javax.servlet.http.HttpServlet.service(HttpServlet.java:627) at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:269) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188) at org.tuckey.web.filters.urlrewrite.RuleChain.handleRewrite(RuleChain.java:164) at org.tuckey.web.filters.urlrewrite.RuleChain.doRules(RuleChain.java:141) at org.tuckey.web.filters.urlrewrite.UrlRewriter.processRequest(UrlRewriter.java:90) at org.tuckey.web.filters.urlrewrite.UrlRewriteFilter.doFilter(UrlRewriteFilter.java:406) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188) at com.opensymphony.module.sitemesh.filter.PageFilter.parsePage(PageFilter.java:119) at com.opensymphony.module.sitemesh.filter.PageFilter.doFilter(PageFilter.java:55) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:264) at org.acegisecurity.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:107) at org.acegisecurity.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:72) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:274) at org.acegisecurity.ui.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:110) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:274) at org.acegisecurity.wrapper.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:81) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:274) at org.acegisecurity.ui.AbstractProcessingFilter.doFilter(AbstractProcessingFilter.java:217) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:274) at org.acegisecurity.context.HttpSessionContextIntegrationFilter.doFilter(HttpSessionContextIntegrationFilter.java:191) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:274) at org.acegisecurity.util.FilterChainProxy.doFilter(FilterChainProxy.java:148) at org.acegisecurity.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:90) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:172) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:174) at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:873) at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:665) at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:528) at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:81) at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:689) at java.lang.Thread.run(Thread.java:636)

Caused by: org.springframework.transaction.CannotCreateTransactionException: Could not open Hibernate Session for transaction; nested exception is org.hibernate.exception.GenericJDBCException: Cannot open connection at org.springframework.orm.hibernate3.HibernateTransactionManager.doBegin(HibernateTransactionManager.java:541) at org.springframework.transaction.support.AbstractPlatformTransactionManager.getTransaction(AbstractPlatformTransactionManager.java:350) at org.springframework.transaction.interceptor.TransactionAspectSupport.createTransactionIfNecessary(TransactionAspectSupport.java:262) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:101) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204) at $Proxy350.findSubmissionByLastModifiedDateRange(Unknown Source) at org.cipres.treebase.web.controllers.OAIPMHController.ListRecords(OAIPMHController.java:125) ... 54 more

Caused by: org.hibernate.exception.GenericJDBCException: Cannot open connection at org.hibernate.exception.SQLStateConverter.handledNonSpecificException(SQLStateConverter.java:103) at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:91) at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:43) at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:29) at org.hibernate.jdbc.ConnectionManager.openConnection(ConnectionManager.java:426) at org.hibernate.jdbc.ConnectionManager.getConnection(ConnectionManager.java:144) at org.hibernate.jdbc.JDBCContext.connection(JDBCContext.java:119) at org.hibernate.transaction.JDBCTransaction.begin(JDBCTransaction.java:57) at org.hibernate.impl.SessionImpl.beginTransaction(SessionImpl.java:1326) at org.springframework.orm.hibernate3.HibernateTransactionManager.doBegin(HibernateTransactionManager.java:510) ... 61 more

Caused by: org.apache.tomcat.dbcp.dbcp.SQLNestedException: Cannot create JDBC driver of class '' for connect URL 'null' at org.apache.tomcat.dbcp.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1150) at org.apache.tomcat.dbcp.dbcp.BasicDataSource.getConnection(BasicDataSource.java:880) at org.springframework.orm.hibernate3.LocalDataSourceConnectionProvider.getConnection(LocalDataSourceConnectionProvider.java:81) at org.hibernate.jdbc.ConnectionManager.openConnection(ConnectionManager.java:423) ... 66 more

Caused by: java.lang.NullPointerException at org.postgresql.Driver.parseURL(Driver.java:552) at org.postgresql.Driver.acceptsURL(Driver.java:405) at java.sql.DriverManager.getDriver(DriverManager.java:268) at org.apache.tomcat.dbcp.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1143) ... 69 more

Cookies:

The URL that was tried (and that produced the exception) is:
http://www.treebase.org/treebase-web/top/oai?verb=ListRecords&from=2011-03-25T05:00:49Z&until=2011-03-25T17:02:50Z&metadataPrefix=oai_dc

I pasted it into my browser and received the same exception message.

----------------------------------------------------------------------

>Comment By: William Piel (sfrgpiel)
Date: 2011-04-01 12:18

Message:
This is a case of downtime for maintenance. For the web user interface, Jon has an "unavailable for maintenance" message pop up. Is there some equivalent message to tell OAI-PMH harvesters?

----------------------------------------------------------------------

Comment By: William Piel (sfrgpiel)
Date: 2011-04-01 12:18

Message:
Thanks for reporting this bug. We'll look into it as soon as possible.

----------------------------------------------------------------------

You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3244573&group_id=248804
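
[Editor's note: the thread leaves the harvester question open, and no maintenance-mode response exists in the OAI-PMH controller shown above. One common convention -- an assumption here, not existing TreeBASE code -- is to answer harvesters with HTTP 503 plus a Retry-After header while the database is being serviced. The guard class, the MAINTENANCE flag, and the idea of calling it from OAIPMHController are all hypothetical.]

```java
import java.io.IOException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/**
 * Illustrative sketch only: tell OAI-PMH harvesters the repository is
 * temporarily down instead of letting an uncaught exception surface.
 */
public class OaiMaintenanceGuard {

    /** Hypothetical flag; would be flipped on while the database is being serviced. */
    static volatile boolean MAINTENANCE = false;

    /**
     * Returns true if the request was answered with a 503; the caller
     * (e.g. the OAI-PMH controller) should then skip normal processing.
     */
    public static boolean answerIfDown(HttpServletRequest request, HttpServletResponse response)
            throws IOException {
        if (!MAINTENANCE) {
            return false;
        }
        // 503 Service Unavailable plus a Retry-After hint in seconds.
        response.setHeader("Retry-After", "3600");
        response.sendError(HttpServletResponse.SC_SERVICE_UNAVAILABLE,
                "TreeBASE is temporarily unavailable for maintenance; please retry later.");
        return true;
    }
}
```

OAI-PMH explicitly allows repositories to respond with 503 and Retry-After for flow control, so well-behaved harvesters treat it as "try again later" rather than as a failed harvest.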
From: SourceForge.net <no...@so...> - 2011-04-01 16:13:28

Bugs item #3101147, was opened at 2010-11-01 20:34
Message generated for change (Settings changed) made by sfrgpiel
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3101147&group_id=248804

Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update.

Category: ui
Group: None
Status: Open
Priority: 7
Private: No
Submitted By: William Piel (sfrgpiel)
>Assigned to: hshyket (hshyket)
Summary: Phylows uses wrong version of Phylowidget

Initial Comment:
When a tree is viewed from the search/browse web page, the link causes the latest build of Phylowidget to launch from the phylowidget.org website, e.g.:

http://www.phylowidget.org/full/?tree='http://www.treebase.org/treebase-web/tree_for_phylowidget/TB2:Tr1000'

This results in an instance of a fully-featured Phylowidget, and it usually performs well. By contrast, when a tree is viewed from /phylows/, a special (but older) copy of Phylowidget is launched, which (a) has fewer features, but (b) also has features for saving modified trees back to the database. For example:

http://purl.org/phylo/treebase/phylows/tree/TB2:Tr1000?format=html

This is not ideal because this older version of Phylowidget is buggy, and the added "Save to database" feature might be a security hole. It would be better to have it behave like the other link, or alternatively to use some other viewer altogether (e.g. jsPhyloSVG).

----------------------------------------------------------------------

You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3101147&group_id=248804
From: SourceForge.net <no...@so...> - 2011-04-01 16:11:53

Bugs item #3267613, was opened at 2011-04-01 12:11
Message generated for change (Tracker Item Submitted) made by sfrgpiel
You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3267613&group_id=248804

Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update.

Category: ui
Group: None
Status: Open
Priority: 8
Private: No
Submitted By: William Piel (sfrgpiel)
Assigned to: hshyket (hshyket)
Summary: Downloading original file problem

Initial Comment:
TreeBASE retains the original uploaded NEXUS files so that users can see what the submitter originally uploaded. However, if a submitter uploads several files with exactly the same file name, only the oldest version is available for download. One solution is to prefix or suffix the file name with a time/date stamp, or a count (plus 1) of the number of files already uploaded. This will ensure that each filename is unique and available for download.

----------------------------------------------------------------------

You can respond by visiting:
https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3267613&group_id=248804
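
[Editor's note: the Unix-timestamp prefix suggested in this ticket is what the revision 781 commit earlier in this archive implements for uploads. The snippet below only illustrates the naming scheme itself, using a hypothetical helper class rather than the actual UploadFileController code.]

```java
import java.io.File;

/** Illustrative sketch only: make an uploaded file name unique by prefixing a Unix timestamp. */
public class UniqueUploadNames {

    /** For example, "matrix.nex" uploaded at Unix time 1301673113 becomes "1301673113_matrix.nex". */
    public static File uniqueTarget(File uploadDir, String originalName) {
        long unixTime = System.currentTimeMillis() / 1000L;
        return new File(uploadDir, unixTime + "_" + originalName);
    }

    public static void main(String[] args) {
        // Hypothetical directory and file name, for demonstration only.
        System.out.println(uniqueTarget(new File("/tmp/NexusFileUpload"), "matrix.nex"));
    }
}
```

Two uploads of the same file within the same second would still collide, so the ticket's alternative suggestion (a running count of previously uploaded files) could be appended as a tiebreaker.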
From: <sfr...@us...> - 2011-03-30 18:58:32
Revision: 780 http://treebase.svn.sourceforge.net/treebase/?rev=780&view=rev Author: sfrgpiel Date: 2011-03-30 18:58:25 +0000 (Wed, 30 Mar 2011) Log Message: ----------- Schema and scripts for creating a vToL. Added Paths: ----------- trunk/treebase-derivatives/vToL/db/vtol.zip trunk/treebase-derivatives/vToL/src/build_vtol.pl trunk/treebase-derivatives/vToL/src/classification_to_tree.pl trunk/treebase-derivatives/vToL/src/download_vtol.pl trunk/treebase-derivatives/vToL/src/query_vtol.pl trunk/treebase-derivatives/vToL/src/tree_to_classification.pl trunk/treebase-derivatives/vToL/src/upload_tree.pl Added: trunk/treebase-derivatives/vToL/db/vtol.zip =================================================================== (Binary files differ) Property changes on: trunk/treebase-derivatives/vToL/db/vtol.zip ___________________________________________________________________ Added: svn:mime-type + application/octet-stream Added: trunk/treebase-derivatives/vToL/src/build_vtol.pl =================================================================== --- trunk/treebase-derivatives/vToL/src/build_vtol.pl (rev 0) +++ trunk/treebase-derivatives/vToL/src/build_vtol.pl 2011-03-30 18:58:25 UTC (rev 780) @@ -0,0 +1,263 @@ +#!/usr/bin/perl + +# Script finds species present in the classification, yet missing +# from the phylogeny, and maps them to the most recent compatible node +# on the phylogeny. The most recent compatible node is the smallest clade +# on the phylogeny whose descendants intersect with the descendants of a +# clade on the classification but do not contain any that intersect with +# names on the classification that are outside of the clade. + +use strict; +use DBI; + +# check that the right number of arguments are listed +die "Input error! Usage: perl build_vtol.pl tree_id\n" if (@ARGV < 1); + +# Fill in the database name and access credentials +my $database = ""; +my $username = ""; +my $password = ""; +my $dbh = &ConnectToPg($database, $username, $password); + +my $tree_id = shift @ARGV; +my $new_nodes_table = "my_edited_classification"; + + +# STEP 1 +# find MRCA in the classification for all leaf nodes in tree_id x + +my $statement = "SELECT tax_id, name_txt, left_id, right_id +FROM $new_nodes_table JOIN ncbi_names USING (tax_id) +WHERE left_id <= ( SELECT MIN(left_id) FROM $new_nodes_table WHERE tax_id IN ( + SELECT tx.taxid + FROM nodes nds JOIN taxon_variants tv USING (taxon_variant_id) + JOIN taxa tx ON (tv.taxon_id = tx.taxon_id) + WHERE tx.taxid > 0 + AND nds.tree_id = $tree_id + AND (nds.right_id - nds.left_id = 1) + ) ) +AND right_id >= ( SELECT MAX(right_id) FROM $new_nodes_table WHERE tax_id IN ( + SELECT tx.taxid + FROM nodes nds JOIN taxon_variants tv USING (taxon_variant_id) + JOIN taxa tx ON (tv.taxon_id = tx.taxon_id) + WHERE tx.taxid > 0 + AND nds.tree_id = $tree_id + AND (nds.right_id - nds.left_id = 1) + ) ) +AND name_class = 'scientific name' +ORDER BY right_id +LIMIT 1"; + + +my ($mrca_ncbi_taxid, $mrca_ncbi_name, $mrca_ncbi_left_id, $mrca_ncbi_right_id) = $dbh->selectrow_array ($statement); + +print "\n\nTree tree_id = $tree_id\n\n"; +print "taxid of the MRCA in NCBI = $mrca_ncbi_taxid\n"; +print "name of the MRCA in NCBI = $mrca_ncbi_name\n\n"; + +if ($mrca_ncbi_taxid == 0) { + my $sc = $dbh->disconnect; + exit; +} + + +# STEP 2 +# list of classification names that potentially could be mapped +# to the tree with the given tree_id + +$statement = "SELECT nnds.tax_id +FROM $new_nodes_table nnds JOIN $new_nodes_table inc + ON (nnds.left_id BETWEEN inc.left_id AND 
inc.right_id) +WHERE inc.tax_id = ( + SELECT tax_id + FROM $new_nodes_table + WHERE left_id <= ( SELECT MIN(left_id) FROM $new_nodes_table WHERE tax_id IN ( + SELECT tx.taxid + FROM nodes nds JOIN taxon_variants tv USING (taxon_variant_id) + JOIN taxa tx ON (tv.taxon_id = tx.taxon_id) + WHERE tx.taxid > 0 + AND nds.tree_id = $tree_id + AND (nds.right_id - nds.left_id = 1) + ) ) + AND right_id >= ( SELECT MAX(right_id) FROM $new_nodes_table WHERE tax_id IN ( + SELECT tx.taxid + FROM nodes nds JOIN taxon_variants tv USING (taxon_variant_id) + JOIN taxa tx ON (tv.taxon_id = tx.taxon_id) + WHERE tx.taxid > 0 + AND nds.tree_id = $tree_id + AND (nds.right_id - nds.left_id = 1) + ) ) + ORDER BY right_id + LIMIT 1 +) +EXCEPT +SELECT tx.taxid +FROM nodes nds JOIN taxon_variants tv USING (taxon_variant_id) +JOIN taxa tx ON (tv.taxon_id = tx.taxon_id) +WHERE nds.tree_id = $tree_id +AND tx.taxid > 0 +AND (nds.right_id - nds.left_id = 1)"; + +my $sth = $dbh->prepare($statement) +or die "Can't prepare $statement: $dbh->errstr\n"; +my $rv = $sth->execute +or die "can't execute the query: $sth->errstr\n"; + +my @ncbi_list; +my %ncbi_hash; +while(my @row = $sth->fetchrow_array) { + push(@ncbi_list, $row[0]); + $ncbi_hash{ $row[0] } = $row[0]; +} +my $rd = $sth->finish; + +print "Number of classification names that potentially could be mapped to tree $tree_id: $#ncbi_list \n"; +print "i.e., they are all descendants from the MRCA of tree $tree_id (i.e. $mrca_ncbi_name), \n"; +print "excluding names that are already in tree $tree_id.\n\n"; + +# STEP 3 +# List all classification internal nodes bounded by the phylogeny. +# Order by smallest clade to largest + +print "List all ncbi internal nodes in the NCBI $mrca_ncbi_name \n"; +print "subtree order by smallest clade to largest: \n"; +print " taxid left_id right_id clade size\n"; + +$statement = "SELECT nnds.tax_id, nnds.left_id, nnds.right_id, (nnds.right_id - nnds.left_id) - 1 AS clade_size +FROM $new_nodes_table nnds +WHERE nnds.left_id >= $mrca_ncbi_left_id +AND nnds.right_id <= $mrca_ncbi_right_id +AND (nnds.right_id - nnds.left_id) > 1 +ORDER BY (nnds.right_id - nnds.left_id) "; + +my $sth = $dbh->prepare($statement) +or die "Can't prepare $statement: $dbh->errstr\n"; +my $rv = $sth->execute +or die "can't execute the query: $sth->errstr\n"; + +my @ncbi_internalnode_list; +my %ncbi_internalnode_left; +my %ncbi_internalnode_right; +while(my @row = $sth->fetchrow_array) { + printf (" %10d %10d %10d %10d\n", @row ); + push(@ncbi_internalnode_list, $row[0]); + $ncbi_internalnode_left{ $row[0] } = $row[1]; + $ncbi_internalnode_right{ $row[0] } = $row[2]; +} +my $rd = $sth->finish; +print "\n"; + +# STEP 4 +# For each classification clade, see if there is a MRCA equivalent in tree_id $tree_id + +print "For each ncbi clade, see if there is an equivalent clade in tree_id $tree_id:\n\n"; +print "ncbi_taxid -> tree node_id\n"; + +$statement = "SELECT onds.node_id +FROM nodes onds +WHERE onds.tree_id = ? +AND onds.left_id <= ( + SELECT MIN(nds.left_id) + FROM nodes nds + JOIN taxon_variants tv USING (taxon_variant_id) + JOIN taxa tx ON (tv.taxon_id = tx.taxon_id) + JOIN $new_nodes_table nnds ON (tx.taxid = nnds.tax_id) + WHERE nnds.left_id >= ? + AND nnds.left_id <= ? + AND nds.tree_id = ? +) +AND onds.right_id >= ( + SELECT MAX(nds.right_id) + FROM nodes nds + JOIN taxon_variants tv USING (taxon_variant_id) + JOIN taxa tx ON (tv.taxon_id = tx.taxon_id) + JOIN $new_nodes_table nnds ON (tx.taxid = nnds.tax_id) + WHERE nnds.left_id >= ? + AND nnds.left_id <= ? 
+ AND nds.tree_id = ? +) +ORDER BY onds.right_id +LIMIT 1"; + +my $sth = $dbh->prepare($statement) +or die "Can't prepare $statement: $dbh->errstr\n"; + +my %mapped_node; +my $node_in_tree; +for ( my $j=0; $j < @ncbi_internalnode_list; $j++ ) { + my $rv = $sth->execute( $tree_id, $ncbi_internalnode_left{ $ncbi_internalnode_list[$j] }, $ncbi_internalnode_right{ $ncbi_internalnode_list[$j] }, $tree_id, $ncbi_internalnode_left{ $ncbi_internalnode_list[$j] }, $ncbi_internalnode_right{ $ncbi_internalnode_list[$j] }, $tree_id ); + ($node_in_tree) = $sth->fetchrow_array; + $mapped_node{ $ncbi_internalnode_list[$j] } = $node_in_tree if ($node_in_tree); + print " $ncbi_internalnode_list[$j] -> $node_in_tree\n" if ($node_in_tree); +} +my $rd = $sth->finish; +print "\n"; + + +# STEP 5 +# For each classification clade, going from smallest to largest, see if there are +# any descendants that can be mapped + +print "For each ncbi clade going from smallest to largest, take all descendants and \n"; +print "attach them to the equivalent clade in the tree:\n\n"; +print "ncbi_taxid -> tree node_id\n"; + +$statement = "SELECT nnds.tax_id, nna.name_txt +FROM ncbi_names nna JOIN $new_nodes_table nnds USING (tax_id) +JOIN $new_nodes_table ninc ON (nnds.left_id BETWEEN ninc.left_id AND ninc.right_id) +WHERE (nnds.right_id - nnds.left_id = 1) +AND nna.name_class = 'scientific name' +AND ninc.tax_id = ? "; +my $sth = $dbh->prepare($statement) +or die "Can't prepare $statement: $dbh->errstr\n"; + + +for ( my $j=0; $j < @ncbi_internalnode_list; $j++ ) { + if ( defined ( $mapped_node{ $ncbi_internalnode_list[$j] } ) ) { + my $rv = $sth->execute( $ncbi_internalnode_list[$j] ); + while(my @row = $sth->fetchrow_array) { + if ( defined($ncbi_hash{ $row[0] }) ) { + print "map $row[0] ($row[1]) to $mapped_node{ $ncbi_internalnode_list[$j] } \n"; + + my $pq_id; + + $pq_id = $dbh->selectrow_array("SELECT pq_id FROM pq WHERE tax_id = $row[0] "); + + if ( !($pq_id) ) { + $dbh->do( "INSERT INTO pq (taxon_name, tax_id) VALUES (?, ?) ", undef, $row[1], $row[0] ); + $pq_id = $dbh->last_insert_id(undef,undef,undef,undef,{sequence=>'pq_id_seq'}); + } + + $dbh->do( "DELETE FROM pq_subtree WHERE pq_id = ? AND tree_id = ? ", undef, $pq_id, $tree_id ); + $dbh->do( "INSERT INTO pq_subtree (pq_id, tree_id, node_id) VALUES (?, ?, ?) ", undef, $pq_id, $tree_id, $mapped_node{ $ncbi_internalnode_list[$j] } ); + + delete($ncbi_hash{ $row[0] }); + } + } + } +} + +print "\n"; + + + + +my $sc = $dbh->disconnect; +exit; + + + +# Connect to Postgres using DBI +#============================================================== +sub ConnectToPg { + + my ($cstr, $user, $pass) = @_; + + $cstr = "DBI:Pg:dbname="."$cstr"; + #$cstr .= ";host=dev.nescent.org"; + + my $dbh = DBI->connect($cstr, $user, $pass, {PrintError => 0, RaiseError => 1}); + $dbh || &error("DBI connect failed : ",$dbh->errstr); + + return($dbh); +} Added: trunk/treebase-derivatives/vToL/src/classification_to_tree.pl =================================================================== --- trunk/treebase-derivatives/vToL/src/classification_to_tree.pl (rev 0) +++ trunk/treebase-derivatives/vToL/src/classification_to_tree.pl 2011-03-30 18:58:25 UTC (rev 780) @@ -0,0 +1,118 @@ +#!/usr/bin/perl + +# Script exports a clade from the a classification (NCBI's by +# default) to a NEXUS style tree file. This is useful if you'd like to +# use Mesquite to improve the classification -- e.g. you can +# create unnamed ranks that are not present in NCBI, or can +# add in missing species. 
After editing the classification tree, you +# can re-import it into the database to use with your vToL + +use strict; +use DBI; + +my $rootid = 999999; +my $new_nodes_table = "my_edited_classification"; + +my $file = "ncbi_tree_out.tre"; +open (OUTPUT, ">$file") || die "Cannot open $file!: $!"; + +print OUTPUT "#NEXUS\n\nBEGIN TREES;\n\n"; + +# Fill in the database name and access credentials +my $database = ""; +my $username = ""; +my $password = ""; +my $dbh = &ConnectToPg($database, $username, $password); + +my $count = "SELECT COUNT(*) FROM ncbi_names NATURAL INNER JOIN $new_nodes_table "; +$count .= "WHERE parent_tax_id = ? AND name_class = 'scientific name'"; +#$count .= "WHERE parent_tax_id = ? "; + + +my $select = "SELECT tax_id, name_txt FROM ncbi_names NATURAL INNER JOIN $new_nodes_table "; +$select .= "WHERE parent_tax_id = ? AND name_class = 'scientific name'"; +#$select .= "WHERE parent_tax_id = ? "; + +my $children = $dbh->prepare($select); + +print OUTPUT "\tTREE mytree = "; + +&walktree($dbh, $rootid); + +print OUTPUT ";\n"; + +my $rc = $dbh->disconnect; + +print OUTPUT "\nEND;\n"; +exit; + + +# walktree +#============================================================== +sub walktree { + my $dbh = shift; + my $parent_id = shift; + my $parent_name = shift; + + + my $totRec = $dbh->selectrow_array ($count, undef, $parent_id); + + if ($totRec > 0) { + # still more children -- print a new open parenthesis + print OUTPUT "("; + } else { + # no more children -- print the OTU + print OUTPUT tokenize("$parent_name"); + } + + # get some children + $children->execute($parent_id); + + my $br = 0; + for my $row (@{$children->fetchall_arrayref}) { + $br++; + my ($child_id, $child_name) = @$row; + + #treat each child as a parent and walk the tree some more + walktree($dbh, $child_id, $child_name); + print OUTPUT "," if ($br < $totRec); + } + + if ($totRec > 0) { + print OUTPUT ")"; + print OUTPUT tokenize("$parent_name"); + } +} + +# tokenize +#============================================================== +sub tokenize { + + my $token = shift; + + $token =~ s/\'/\"/g; + + if ($token =~ m/[-\/\?\<\>\*\%\&\$\#\@\!\"\(\)]/) { + $token = "\'$token\'"; + } else { + $token =~ s/\s/_/g; + } + + return ($token); + +} + +# Connect to Postgres using DBI +#============================================================== +sub ConnectToPg { + + my ($cstr, $user, $pass) = @_; + + $cstr = "DBI:Pg:dbname="."$cstr"; + #$cstr .= ";host=dev.nescent.org"; + + my $dbh = DBI->connect($cstr, $user, $pass, {PrintError => 1, RaiseError => 1}); + $dbh || &error("DBI connect failed : ",$dbh->errstr); + + return($dbh); +} Added: trunk/treebase-derivatives/vToL/src/download_vtol.pl =================================================================== --- trunk/treebase-derivatives/vToL/src/download_vtol.pl (rev 0) +++ trunk/treebase-derivatives/vToL/src/download_vtol.pl 2011-03-30 18:58:25 UTC (rev 780) @@ -0,0 +1,176 @@ +#!/usr/bin/perl + +# This script outputs a vToL as a NEXUS tree file, attaching all +# missing species (from the classification) to an extra node that +# is parent to the subtree where the missing species attaches. +# Additionally, the added species are prefixed with "O" so that +# they can be easily highlighted in a different color using PhyloWidget + +use strict; +use DBI; + +my @inputs = @ARGV; +my $map = shift(@inputs); + +if (length($inputs[0]) < 2) { + print "Input error! 
Usage: perl download_vToL.pl [mapping] [list of tree_ids] \n"; + exit; +} + + +# Fill in the database name and access credentials +my $database = ""; +my $username = ""; +my $password = ""; +my $dbh = &ConnectToPg($database, $username, $password); + +&getTrees ($dbh, $map, @inputs); + +my $rc = $dbh->disconnect; + +exit; + + + + +# get trees +#============================================================== +sub getTrees { + + my ($dbh, $map, @treeItems) = @_; + + if (@treeItems) { + + print "#NEXUS\n\nBEGIN TREES;\n\n"; + foreach my $treeid (@treeItems) { + + print " [tr_id: $treeid]\n"; + + my $statement = "SELECT root, tree_label FROM trees WHERE ( tree_id = ? )"; + my $sth = $dbh->prepare ($statement); + $sth->execute( $treeid ); + + $statement = "SELECT child_id, node_label, edge_length, edge_support "; + $statement .= "FROM edges INNER JOIN nodes ON edges.child_ID = nodes.node_id "; + $statement .= "WHERE parent_id = ?"; + my $children = $dbh->prepare ($statement); + + # return a list of mapped children + $statement = "SELECT taxon_name, tax_id FROM pq_subtree JOIN pq USING (pq_id) WHERE node_id = ? "; + my $mapping = $dbh->prepare ($statement); + + while (my @row = $sth->fetchrow_array()) { + + print "\tTREE ".&tokenize($row[1])." = "; + + &walktree($dbh, $map, $children, $mapping, $row[0]); + + print ";\n"; + + } + $sth->finish(); + } + print "\nEND;\n"; + + + } else { + print "Error: No trees requested\n"; + exit; + } + +} + +# walktree +#============================================================== +sub walktree { + my $dbh = shift; + my $map = shift; + my $children = shift; + my $mapping = shift; + my $id = shift; + my $support = shift; + my $length = shift; + + my $statement = "SELECT COUNT(*) FROM edges WHERE parent_id = $id"; + my $totRec = $dbh->selectrow_array ($statement); + + $statement = "SELECT COUNT(*) FROM pq_subtree WHERE node_id = $id"; + my $totMaps = $dbh->selectrow_array ($statement); + + if ($totRec) { + print "("; + } + if (($totMaps) && ($map)) { + print "("; + } + + + $children->execute($id); + + my $br = 0; + for my $row (@{$children->fetchall_arrayref}) { + $br++; + my ($id, $label, $edge_length, $edge_support) = @$row; + $label = "O $label" if (($map) && ($label)); + print &tokenize($label); + walktree($dbh, $map, $children, $mapping, $id, $edge_support, $edge_length); + print "," if ($br < $totRec); + } + + if ($totRec) { + + print ")"; + print &tokenize($support) if ($support); + print ":". &tokenize($length) if ($length); + } else { + print ":". &tokenize($length) if ($length); + } + + if (($totMaps) && ($map)) { + $mapping->execute($id); + for my $row (@{$mapping->fetchall_arrayref}) { + my ($mapped_label, $tax_id) = @$row; + $mapped_label = "$mapped_label"; + print ",". &tokenize($mapped_label); + } + print ")"; + } + + +} + +# tokenize -- encapsulate tokens according to nexus rules +# technically speaking, single quotes should be repeated +# e.g. 
change "It's nice" to "'It''s nice'", but I'd rather not +# mess with that, so I'm changing all single quotes to double +#============================================================== +sub tokenize { + + my $token = shift; + + $token =~ s/\'/\"/g; + + if ($token =~ m/[-\/\?\<\>\*\%\&\$\#\@\!\"\:]/) { + $token = "\'$token\'"; + } else { + $token =~ s/\s/_/g; + } + + return ($token); + +} + +# Connect to Postgres using DBI +#============================================================== +sub ConnectToPg { + + my ($cstr, $user, $pass) = @_; + + $cstr = "DBI:Pg:dbname="."$cstr"; + #$cstr .= ";host=dev.nescent.org"; + + my $dbh = DBI->connect($cstr, $user, $pass, {PrintError => 1, RaiseError => 1}); + $dbh || &error("DBI connect failed : ",$dbh->errstr); + + return($dbh); +} \ No newline at end of file Added: trunk/treebase-derivatives/vToL/src/query_vtol.pl =================================================================== --- trunk/treebase-derivatives/vToL/src/query_vtol.pl (rev 0) +++ trunk/treebase-derivatives/vToL/src/query_vtol.pl 2011-03-30 18:58:25 UTC (rev 780) @@ -0,0 +1,143 @@ +#!/usr/bin/perl + +# Searches for the MRCA on a phylogeny based on a list of taxon names. +# Returns all the descendants of the MRCA, including taxa that were +# missing from the phylogeny but that descend from a classification rank that +# maps to the subtree with the MRCA at its origin. Additionally, it returns +# taxa from the classification that map to the direct ancestor of the MRCA + +use strict; +use DBI; + +# check that the right number of arguments are listed +die "Input error! Usage: perl query_vtol.pl tree_id <list of taxa> +e.g.: perl query_taxa.pl 9365 'Dipelta floribunda' 'Zabelia biflora' \n" if (@ARGV < 2); + +my $tree_id = shift; +my @taxonlabels = @ARGV; + +my $list_size = $#taxonlabels; +$list_size++; + +my $list_string = join ("', '", @taxonlabels ); +$list_string = "'$list_string'"; + +# Fill in the database name and access credentials +my $database = ""; +my $username = ""; +my $password = ""; +my $dbh = &ConnectToPg($database, $username, $password); + +print "\nFind all leaf names that descend from the MRCA of ($list_string) in tree $tree_id:\n\n"; + +my $statement = "SELECT nds.node_id, nds.node_label +FROM nodes nds JOIN nodes inc ON (nds.left_id BETWEEN inc.left_id AND inc.right_id) +WHERE nds.tree_id = $tree_id +AND inc.tree_id = nds.tree_id +AND (nds.right_id - nds.left_id = 1) +AND inc.node_id = ( + SELECT ndsP.node_id + FROM nodes ndsC JOIN node_path npth ON (ndsC.node_id = npth.child_node_id) + JOIN nodes ndsP ON (npth.parent_node_id = ndsP.node_id) + JOIN taxon_variants tv ON (ndsC.taxon_variant_id = tv.taxon_variant_id) + JOIN taxa tx ON (tv.taxon_id = tx.taxon_id) + WHERE tx.namestring IN ($list_string) + AND ndsC.tree_id = $tree_id + GROUP BY ndsP.node_id, ndsP.right_id + HAVING COUNT(npth.child_node_id) >= $list_size + ORDER BY ndsP.right_id + LIMIT 1 +)"; + +my $query = $dbh->prepare ($statement); +$query->execute; + +for my $row (@{$query->fetchall_arrayref}) { + my ($node_id, $label) = @$row; + printf ( "%45s\n", $label); +} +$query->finish; + +print "\nFind all mapped names that descend from the MRCA of ($list_string) in tree $tree_id:\n\n"; + +$statement = "SELECT nds.node_id, p.taxon_name +FROM nodes nds JOIN nodes inc ON (nds.left_id BETWEEN inc.left_id AND inc.right_id) +JOIN pq_subtree pqs ON (nds.node_id = pqs.node_id) +JOIN pq p USING (pq_id) +WHERE nds.tree_id = $tree_id +AND inc.tree_id = nds.tree_id +AND inc.node_id = ( + SELECT ndsP.node_id + FROM nodes 
ndsC JOIN node_path npth ON (ndsC.node_id = npth.child_node_id) + JOIN nodes ndsP ON (npth.parent_node_id = ndsP.node_id) + JOIN taxon_variants tv ON (ndsC.taxon_variant_id = tv.taxon_variant_id) + JOIN taxa tx ON (tv.taxon_id = tx.taxon_id) + WHERE tx.namestring IN ($list_string) + AND ndsC.tree_id = $tree_id + GROUP BY ndsP.node_id, ndsP.right_id + HAVING COUNT(npth.child_node_id) >= $list_size + ORDER BY ndsP.right_id + LIMIT 1 +)"; + +my $query = $dbh->prepare ($statement); +$query->execute; + +for my $row (@{$query->fetchall_arrayref}) { + my ($node_id, $label) = @$row; + printf ( "%45s\n", $label); +} +$query->finish; + +print "\nFind all mapped names that *might* descend from the MRCA of ($list_string) in tree $tree_id:\n\n"; + +$statement = "SELECT nds.node_id, p.taxon_name, (100.0 * (ndsize.right_id - ndsize.left_id) / (nds.right_id - nds.left_id) ) as clade_span +FROM nodes nds JOIN pq_subtree pqs ON (nds.node_id = pqs.node_id) +JOIN pq p USING (pq_id), nodes ndsize +WHERE nds.tree_id = $tree_id +AND nds.left_id < ndsize.left_id +AND nds.right_id > ndsize.right_id +AND ndsize.node_id = ( + SELECT ndsP.node_id + FROM nodes ndsC JOIN node_path npth ON (ndsC.node_id = npth.child_node_id) + JOIN nodes ndsP ON (npth.parent_node_id = ndsP.node_id) + JOIN taxon_variants tv ON (ndsC.taxon_variant_id = tv.taxon_variant_id) + JOIN taxa tx ON (tv.taxon_id = tx.taxon_id) + WHERE tx.namestring IN ($list_string) + AND ndsC.tree_id = $tree_id + GROUP BY ndsP.node_id, ndsP.right_id + HAVING COUNT(npth.child_node_id) >= $list_size + ORDER BY ndsP.right_id + LIMIT 1 +) +ORDER BY (nds.right_id - nds.left_id) DESC;"; + +my $query = $dbh->prepare ($statement); +$query->execute; + +for my $row (@{$query->fetchall_arrayref}) { + my ($node_id, $label, $clade_span) = @$row; + printf ( "%45s %5.1f\%\n", $label, $clade_span); +} +$query->finish; + + +my $rc = $dbh->disconnect; + + + + +# Connect to Postgres using DBI +#============================================================== +sub ConnectToPg { + + my ($cstr, $user, $pass) = @_; + + $cstr = "DBI:Pg:dbname="."$cstr"; + # $cstr .= ";host=10.9.1.1"; + + my $dbh = DBI->connect($cstr, $user, $pass, {PrintError => 1, RaiseError => 1}); + $dbh || &error("DBI connect failed : ",$dbh->errstr); + + return($dbh); +} \ No newline at end of file Added: trunk/treebase-derivatives/vToL/src/tree_to_classification.pl =================================================================== --- trunk/treebase-derivatives/vToL/src/tree_to_classification.pl (rev 0) +++ trunk/treebase-derivatives/vToL/src/tree_to_classification.pl 2011-03-30 18:58:25 UTC (rev 780) @@ -0,0 +1,272 @@ +#!/usr/bin/perl + + +# This script takes a tree from a NEXUS file and creates a +# table to store the tree as a classification, whereupon +# you can build a vToL. So, for example, if you're unhappy with +# NCBI's classification for a certain group, using the "classification_to_tree.pl" +# script to dump the NCBI clade to a tree file, then use Mesquite +# to edit the classification tree, then use this script to re-import it. +# A new table is created to accomodate your edited classification +# so as not to disturb the original NCBI classification. 
+ +use strict; +use warnings; +use DBI; +use Bio::Phylo::IO 'parse'; + +my ( $taxa, $matrix, $forest ); + +my $file = shift @ARGV; +my $blocks = parse( + '-format' => 'nexus', + '-file' => $file, +); + +for my $block ( @{ $blocks } ) { + $forest = $block if $block->isa('Bio::Phylo::Forest'); +} + +# Fill in the database name and access credentials +my $database = ""; +my $username = ""; +my $password = ""; +my $dbh = &ConnectToPg($database, $username, $password); + +my ($sth, $rv, $statement); + +my $new_nodes_table = "my_edited_classification"; +my $new_nodes_seq = "my_edited_classification_seq_taxid"; +my $highest_ncbi = 1000000; +my $start_taxid = 4199; #ncbi taxid for the Dipsacales +my $division_id = 4; +my $inherited_div_flag = 1; +my $genetic_code_id = 1; +my $inherited_gc_flag = 1; +my $mitochondrial_genetic_code_id = 1; +my $inherited_mgc_flag = 1; +my $genbank_hidden_flag = 0; +my $hidden_subtree_root_flag = 0; + +# highest tax_id = 869615 +$dbh->do( "DELETE FROM NCBI_NAMES WHERE TAX_ID > 869615" ); + +$dbh->do( "DROP TABLE IF EXISTS $new_nodes_table CASCADE" ); +$dbh->do( "DROP SEQUENCE IF EXISTS $new_nodes_seq " ); + +$statement = <<STATEMENT; +CREATE SEQUENCE $new_nodes_seq + START WITH $highest_ncbi + INCREMENT BY 1 + NO MAXVALUE + NO MINVALUE + CACHE 1 +STATEMENT +$dbh->do( "$statement" ); + +$statement = <<STATEMENT; +CREATE TABLE $new_nodes_table ( + tax_id integer DEFAULT nextval('$new_nodes_seq'::regclass) NOT NULL, + parent_tax_id integer NOT NULL, + rank character varying(32), + embl_code character varying(16), + division_id integer NOT NULL, + inherited_div_flag integer NOT NULL, + genetic_code_id integer NOT NULL, + inherited_gc_flag integer NOT NULL, + mitochondrial_genetic_code_id integer NOT NULL, + inherited_mgc_flag integer NOT NULL, + genbank_hidden_flag integer NOT NULL, + hidden_subtree_root_flag integer NOT NULL, + comments character varying(255) DEFAULT NULL::character varying, + left_id integer, + right_id integer +) +STATEMENT + +$dbh->do( "$statement" ); +$statement = "ALTER TABLE ONLY $new_nodes_table ADD CONSTRAINT $new_nodes_table" . "_pkey PRIMARY KEY (tax_id)"; +$dbh->do( "$statement" ); +$statement = "CREATE INDEX $new_nodes_table" . "_left_id ON $new_nodes_table USING btree (left_id)"; +$dbh->do( "$statement" ); +$statement = "CREATE INDEX $new_nodes_table" . "_right_id ON $new_nodes_table USING btree (right_id)"; +$dbh->do( "$statement" ); +$statement = "CREATE INDEX $new_nodes_table" . "_parent_tax_id ON $new_nodes_table USING btree (parent_tax_id)"; +$dbh->do( "$statement" ); + +$dbh->do( "DROP TABLE IF EXISTS $new_nodes_table"."_path CASCADE" ); +$statement = "CREATE TABLE $new_nodes_table"."_path (\n"; +$statement .= <<STATEMENT; + child_node_id bigint DEFAULT (0)::bigint NOT NULL, + parent_node_id bigint DEFAULT (0)::bigint NOT NULL, + distance integer DEFAULT 0 NOT NULL +) +STATEMENT +$dbh->do( "$statement" ); +$statement = "CREATE INDEX $new_nodes_table" . "_path_parent_id ON $new_nodes_table"."_path USING btree (parent_node_id)"; +$dbh->do( "$statement" ); +$statement = "CREATE INDEX $new_nodes_table" . "_path_child_id ON $new_nodes_table"."_path USING btree (child_node_id)"; +$dbh->do( "$statement" ); +$statement = "CREATE INDEX $new_nodes_table" . "_path_distance ON $new_nodes_table"."_path USING btree (distance)"; +$dbh->do( "$statement" ); + +# global statements for setting the left-right indexing +my $setleft = $dbh->prepare("UPDATE $new_nodes_table SET left_id = ? 
WHERE tax_id = ?"); +my $setright = $dbh->prepare("UPDATE $new_nodes_table SET right_id = ? WHERE tax_id = ?"); +my $ctr; + +# global statements for calculating the transitive closure +my $deletepaths = $dbh->prepare("DELETE FROM $new_nodes_table"."_path"); + +my $init_sql = "INSERT INTO $new_nodes_table"."_path (child_node_id, parent_node_id, distance) "; + $init_sql .= "SELECT tax_id, parent_tax_id, 1 FROM $new_nodes_table "; +my $initialize_paths = $dbh->prepare("$init_sql"); + +my $path_sql = "INSERT INTO $new_nodes_table"."_path (child_node_id, parent_node_id, distance) "; + $path_sql .= "SELECT n.tax_id, p.parent_node_id, p.distance+1 "; + $path_sql .= "FROM $new_nodes_table"."_path p, $new_nodes_table n "; + $path_sql .= "WHERE p.child_node_id = n.parent_tax_id "; + $path_sql .= "AND p.distance = ?"; +my $calc_paths = $dbh->prepare("$path_sql"); + +# get only the first tree +my $tree = shift ( @{ $forest->get_entities } ); + +print "###########################\n"; +print $tree->get_name . "\n"; +my $tree_name = $tree->get_name; + +# create a new node record, which will be the root of this tree +my $tree_id = 1; +my $root_node_id = $highest_ncbi - 1; +$statement = "INSERT INTO $new_nodes_table "; +$statement .= "(tax_id, parent_tax_id, division_id, inherited_div_flag, genetic_code_id, inherited_gc_flag, mitochondrial_genetic_code_id, inherited_mgc_flag, genbank_hidden_flag, hidden_subtree_root_flag) VALUES "; +$statement .= "($root_node_id, 0, $division_id, $inherited_div_flag, $genetic_code_id, $inherited_gc_flag, $mitochondrial_genetic_code_id, $inherited_mgc_flag, $genbank_hidden_flag, $hidden_subtree_root_flag)"; +$dbh->do( "$statement" ); + +# set a counter for the left-right indexing +$ctr = ($highest_ncbi * 100); +walktree( $tree->get_root , $root_node_id); + +compute_tc(); + +my $rc = $dbh->disconnect; + +exit; + + +#=================================== +sub walktree { + my $parent = shift; + my $parent_id = shift; + + $setleft->execute($ctr++, $parent_id); + + for my $child ( @{ $parent->get_children } ) { + + my $branch_length; + my $edge_support; + my $child_id; + + if (($child->get_name) && !($child->get_name =~ m/^\d*\.?\d+$/)) { + my $taxon_label = &detokenize( $child->get_name ); + + my $statement = "SELECT COUNT(*) FROM ncbi_names WHERE name_txt = " . $dbh->quote( $taxon_label ); + my $totRec = $dbh->selectrow_array ($statement); + + if ($totRec > 0) { + # there is a node name and this name already exists + $statement = "SELECT tax_id FROM ncbi_names WHERE name_txt = " . $dbh->quote( $taxon_label ) . " LIMIT 1 "; + $child_id = $dbh->selectrow_array ($statement); + + $statement = "INSERT INTO $new_nodes_table "; + $statement .= "(tax_id, parent_tax_id, division_id, inherited_div_flag, genetic_code_id, inherited_gc_flag, "; + $statement .= "mitochondrial_genetic_code_id, inherited_mgc_flag, genbank_hidden_flag, hidden_subtree_root_flag) "; + $statement .= "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?) 
"; + $dbh->do( "$statement", undef, $child_id, $parent_id, $division_id, $inherited_div_flag, $genetic_code_id, $inherited_gc_flag, $mitochondrial_genetic_code_id, $inherited_mgc_flag, $genbank_hidden_flag, $hidden_subtree_root_flag ); + + } else { + # there is a node name, but it doesn't exist + $statement = "INSERT INTO $new_nodes_table "; + $statement .= "(parent_tax_id, division_id, inherited_div_flag, genetic_code_id, inherited_gc_flag, "; + $statement .= "mitochondrial_genetic_code_id, inherited_mgc_flag, genbank_hidden_flag, hidden_subtree_root_flag) "; + $statement .= "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?) "; + $dbh->do( "$statement", undef, $parent_id, $division_id, $inherited_div_flag, $genetic_code_id, $inherited_gc_flag, $mitochondrial_genetic_code_id, $inherited_mgc_flag, $genbank_hidden_flag, $hidden_subtree_root_flag ); + + $child_id = $dbh->last_insert_id(undef,undef,undef,undef,{sequence=>"$new_nodes_seq"}); + $dbh->do( "INSERT INTO ncbi_names (tax_id, name_txt, name_class) VALUES (?, ?, ?) ", undef, $child_id, $taxon_label, 'scientific name' ); + } + + } else { + # there is no node name for this node + $statement = "INSERT INTO $new_nodes_table "; + $statement .= "(parent_tax_id, division_id, inherited_div_flag, genetic_code_id, inherited_gc_flag, "; + $statement .= "mitochondrial_genetic_code_id, inherited_mgc_flag, genbank_hidden_flag, hidden_subtree_root_flag) "; + $statement .= "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?) "; + $dbh->do( "$statement", undef, $parent_id, $division_id, $inherited_div_flag, $genetic_code_id, $inherited_gc_flag, $mitochondrial_genetic_code_id, $inherited_mgc_flag, $genbank_hidden_flag, $hidden_subtree_root_flag ); + + $child_id = $dbh->last_insert_id(undef,undef,undef,undef,{sequence=>"$new_nodes_seq"}); + $dbh->do( "INSERT INTO ncbi_names (tax_id, name_txt, name_class) VALUES (?, ?, ?) 
", undef, $child_id, "Unnamed Rank $child_id", 'scientific name' );
+        }
+
+        walktree( $child, $child_id );
+    }
+
+    $setright->execute($ctr++, $parent_id);
+}
+
+# Remove nexus tokenization
+#==============================================================
+sub detokenize {
+    my $token = shift;
+
+    $token =~ s/_/ /g;
+    $token =~ s/^\'//g;
+    $token =~ s/\'$//g;
+    $token =~ s/''/'/g;
+
+    return($token);
+}
+
+# Compute the transitive closure
+#==============================================================
+sub compute_tc {
+
+    $deletepaths->execute();
+    $initialize_paths->execute();
+
+    my $dist = 1;
+    my $rv = 1;
+    while ($rv > 0) {
+        $rv = $calc_paths->execute($dist);
+        $dist++;
+    }
+}
+
+# Connect to Postgres using DBI
+#==============================================================
+sub ConnectToPg {
+
+    my ($cstr, $user, $pass) = @_;
+
+    $cstr = "DBI:Pg:dbname="."$cstr";
+    #$cstr .= ";host=10.9.1.1";
+
+    my $dbh = DBI->connect($cstr, $user, $pass, {PrintError => 1, RaiseError => 1});
+    $dbh || &error("DBI connect failed : ",$dbh->errstr);
+
+    return($dbh);
+}
+
+# Get the auto-added id
+#==============================================================
+sub last_insert_id {
+    my($dbh, $sequence_name) = @_;
+
+    my $driver = $dbh->{Driver}->{Name};
+    if (lc($driver) eq 'mysql') {
+        return $dbh->{'mysql_insertid'};
+    } else {
+        return $dbh->last_insert_id(undef,undef,undef,undef,{sequence=>$sequence_name});
+    }
+}

Added: trunk/treebase-derivatives/vToL/src/upload_tree.pl
===================================================================
--- trunk/treebase-derivatives/vToL/src/upload_tree.pl (rev 0)
+++ trunk/treebase-derivatives/vToL/src/upload_tree.pl 2011-03-30 18:58:25 UTC (rev 780)
@@ -0,0 +1,189 @@
+#!/usr/bin/perl
+
+# This script parses a NEXUS file and uploads any phylogenies in it
+# to the database. When this is done, you'll want to make a mapping
+# between the new taxon labels brought in with your tree and the existing
+# taxonomy:
+#
+# UPDATE nodes SET taxon_variant_id = tv.taxon_variant_id
+# FROM nodes nds JOIN taxon_variants tv ON (nds.node_label = tv.fullnamestring)
+# WHERE nds.tree_id = 9365
+# AND (nds.right_id - nds.left_id = 1)
+# AND nodes.node_id = nds.node_id;
+
+use strict;
+use warnings;
+use DBI;
+use Bio::Phylo::IO 'parse';
+
+my ( $taxa, $matrix, $forest );
+
+my $file = shift @ARGV;
+my $blocks = parse(
+    '-format' => 'nexus',
+    '-file' => $file,
+);
+
+for my $block ( @{ $blocks } ) {
+    $forest = $block if $block->isa('Bio::Phylo::Forest');
+}
+
+# Fill in the database name and access credentials
+my $database = "";
+my $username = "";
+my $password = "";
+my $dbh = &ConnectToPg($database, $username, $password);
+
+my ($sth, $rv);
+
+# global statements for setting the left-right indexing
+my $setleft = $dbh->prepare("UPDATE nodes SET left_id = ? WHERE node_id = ?");
+my $setright = $dbh->prepare("UPDATE nodes SET right_id = ? WHERE node_id = ?");
+my $ctr;
+
+# global statements for calculating the transitive closure
+my $deletepaths = $dbh->prepare("DELETE FROM node_path WHERE child_node_id IN (SELECT node_id FROM nodes WHERE tree_id = ?)");
+
+my $init_sql = "INSERT INTO node_path (child_node_id, parent_node_id, distance) ";
+   $init_sql .= "SELECT e.child_id, e.parent_id, 1 FROM edges e, nodes n ";
+   $init_sql .= "WHERE e.child_id = n.node_id AND n.tree_id = ?";
+my $initialize_paths = $dbh->prepare("$init_sql");
+
+my $path_sql = "INSERT INTO node_path (child_node_id, parent_node_id, distance)";
+   $path_sql .= "SELECT e.child_id, p.parent_node_id, p.distance+1 ";
+   $path_sql .= "FROM node_path p, edges e, nodes n ";
+   $path_sql .= "WHERE p.child_node_id = e.parent_id ";
+   $path_sql .= "AND n.node_id = e.child_id AND n.tree_id = ? ";
+   $path_sql .= "AND p.distance = ?";
+my $calc_paths = $dbh->prepare("$path_sql");
+
+foreach my $tree ( @{ $forest->get_entities } ) {
+    print "###########################\n";
+    print $tree->get_name . "\n";
+    my $tree_name = $tree->get_name;
+
+    # create a new node record, which will be the root of this tree
+    $dbh->do( "INSERT INTO nodes (node_label) VALUES (NULL) " );
+    my $root_node_id = $dbh->last_insert_id(undef,undef,undef,undef,{sequence=>'nodes_node_id'});
+
+    # create a new tree record, specifying the tree's name and its root node
+    $dbh->do( "INSERT INTO trees (tree_label, root) VALUES ('$tree_name','$root_node_id')" );
+    my $tree_id = $dbh->last_insert_id(undef,undef,undef,undef,{sequence=>'trees_tree_id'});
+
+    # update the newly created node so that it knows what tree it belongs to
+    $dbh->do( "UPDATE nodes SET tree_id = '$tree_id' WHERE node_id = $root_node_id " );
+
+    # set a counter for the left-right indexing
+    $ctr = 1;
+    walktree( $tree->get_root , $tree_id, $root_node_id);
+
+    compute_tc($tree_id);
+}
+my $rc = $dbh->disconnect;
+
+exit;
+
+
+#===================================
+sub walktree {
+    my $parent = shift;
+    my $tree_id = shift;
+    my $parent_id = shift;
+
+    $setleft->execute($ctr++, $parent_id);
+
+    for my $child ( @{ $parent->get_children } ) {
+
+        my $branch_length;
+        my $edge_support;
+
+        # create a new child record, but only use the label if it doesn't look like a
+        # clade support value (i.e. a number)
+        if (($child->get_name) && !($child->get_name =~ m/^\d*\.?\d+$/)) {
+            my $taxon_label = $child->get_name;
+            $dbh->do( "INSERT INTO nodes (node_label, tree_id) VALUES (?, ?) ", undef, &detokenize($taxon_label), $tree_id );
+        } else {
+            $dbh->do( "INSERT INTO nodes (node_label, tree_id) VALUES (NULL, $tree_id) " );
+        }
+        my $child_id = $dbh->last_insert_id(undef,undef,undef,undef,{sequence=>'nodes_node_id'});
+
+        # capture the branch length if there is one
+        if ($child->get_branch_length) {
+            $branch_length = $child->get_branch_length;
+        } else {
+            $branch_length = undef;
+        }
+
+        # capture the edge support if the internal node label looks like a number
+        if (($child->is_internal) && ($child->get_internal_name =~ m/^\d*\.?\d+$/)) {
+            $edge_support = $child->get_internal_name;
+        } else {
+            $edge_support = undef;
+        }
+
+        # create an edge record between parent and child
+        my @values = ("$parent_id", "$child_id", "$branch_length", "$edge_support");
+        $dbh->do( "INSERT INTO edges (parent_id, child_id, edge_length, edge_support) VALUES (?, ?, ?, ?) ", undef, @values);
+
+        walktree( $child, $tree_id, $child_id );
+    }
+
+    $setright->execute($ctr++, $parent_id);
+}
+
+# Remove nexus tokenization
+#==============================================================
+sub detokenize {
+    my $token = shift;
+
+    $token =~ s/_/ /g;
+    $token =~ s/^\'//g;
+    $token =~ s/\'$//g;
+    $token =~ s/''/'/g;
+
+    return($token);
+}
+
+# Compute the transitive closure
+#==============================================================
+sub compute_tc {
+    my $tree_id = shift;
+
+    $deletepaths->execute($tree_id);
+    $initialize_paths->execute($tree_id);
+
+    my $dist = 1;
+    my $rv = 1;
+    while ($rv > 0) {
+        $rv = $calc_paths->execute($tree_id, $dist);
+        $dist++;
+    }
+}
+
+# Connect to Postgres using DBI
+#==============================================================
+sub ConnectToPg {
+
+    my ($cstr, $user, $pass) = @_;
+
+    $cstr = "DBI:Pg:dbname="."$cstr";
+    #$cstr .= ";host=10.9.1.1";
+
+    my $dbh = DBI->connect($cstr, $user, $pass, {PrintError => 1, RaiseError => 1});
+    $dbh || &error("DBI connect failed : ",$dbh->errstr);
+
+    return($dbh);
+}
+
+# Get the auto-added id
+#==============================================================
+sub last_insert_id {
+    my($dbh, $sequence_name) = @_;
+
+    my $driver = $dbh->{Driver}->{Name};
+    if (lc($driver) eq 'mysql') {
+        return $dbh->{'mysql_insertid'};
+    } else {
+        return $dbh->last_insert_id(undef,undef,undef,undef,{sequence=>$sequence_name});
+    }
+}
\ No newline at end of file

This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site. |
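The walktree() routine above implements a nested-set index (left_id/right_id assigned from a single counter on entry and exit), and compute_tc() fills the node_path table with the transitive closure of the edges table. A minimal sketch of the queries these two structures are built to serve, using the table and column names that appear in the script; the node id is a hypothetical placeholder and the queries are illustrative, not part of revision 780:

-- all descendants of a node via the nested-set index:
-- a node lies inside another node's subtree when its left/right interval is nested
SELECT child.node_id, child.node_label
FROM nodes parent
JOIN nodes child ON child.tree_id = parent.tree_id
                AND child.left_id > parent.left_id
                AND child.right_id < parent.right_id
WHERE parent.node_id = 12345;  -- hypothetical node_id

-- the same subtree via the transitive closure, with edge counts available:
SELECT np.child_node_id, np.distance
FROM node_path np
WHERE np.parent_node_id = 12345  -- hypothetical node_id
ORDER BY np.distance;

The nested-set form answers a subtree question with one range comparison per row, while node_path also records how many edges separate the two nodes, which is what the iterative $calc_paths statement accumulates one distance level at a time.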
From: SourceForge.net <no...@so...> - 2011-03-25 18:24:32
|
Bugs item #3244573, was opened at 2011-03-25 11:24 Message generated for change (Tracker Item Submitted) made by ksclarke You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3244573&group_id=248804 Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update. Category: APIs Group: None Status: Open Priority: 5 Private: No Submitted By: Kevin S. Clarke (ksclarke) Assigned to: Nobody/Anonymous (nobody) Summary: OAI Harvesting Exception Initial Comment: Dryad periodically tries to harvest TreeBASE. Today we received this exception: <title>Java Uncaught Exception</title> <content tag="heading">Uncaught Exception Encountered</content> <p> java.lang.reflect.InvocationTargetException at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:616) at org.cipres.treebase.web.controllers.OAIPMHController.handle(OAIPMHController.java:117) at org.springframework.web.servlet.mvc.AbstractCommandController.handleRequestInternal(AbstractCommandController.java:84) at org.springframework.web.servlet.mvc.AbstractController.handleRequest(AbstractController.java:153) at org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:48) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:858) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:792) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:476) at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:431) at javax.servlet.http.HttpServlet.service(HttpServlet.java:627) at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:269) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188) at org.tuckey.web.filters.urlrewrite.RuleChain.handleRewrite(RuleChain.java:164) at org.tuckey.web.filters.urlrewrite.RuleChain.doRules(RuleChain.java:141) at org.tuckey.web.filters.urlrewrite.UrlRewriter.processRequest(UrlRewriter.java:90) at org.tuckey.web.filters.urlrewrite.UrlRewriteFilter.doFilter(UrlRewriteFilter.java:406) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188) at com.opensymphony.module.sitemesh.filter.PageFilter.parsePage(PageFilter.java:119) at com.opensymphony.module.sitemesh.filter.PageFilter.doFilter(PageFilter.java:55) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:264) at org.acegisecurity.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:107) at org.acegisecurity.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:72) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:274) at 
org.acegisecurity.ui.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:110) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:274) at org.acegisecurity.wrapper.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:81) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:274) at org.acegisecurity.ui.AbstractProcessingFilter.doFilter(AbstractProcessingFilter.java:217) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:274) at org.acegisecurity.context.HttpSessionContextIntegrationFilter.doFilter(HttpSessionContextIntegrationFilter.java:191) at org.acegisecurity.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:274) at org.acegisecurity.util.FilterChainProxy.doFilter(FilterChainProxy.java:148) at org.acegisecurity.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:90) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:172) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:174) at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:873) at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:665) at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:528) at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:81) at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:689) at java.lang.Thread.run(Thread.java:636) Caused by: org.springframework.transaction.CannotCreateTransactionException: Could not open Hibernate Session for transaction; nested exception is org.hibernate.exception.GenericJDBCException: Cannot open connection at org.springframework.orm.hibernate3.HibernateTransactionManager.doBegin(HibernateTransactionManager.java:541) at org.springframework.transaction.support.AbstractPlatformTransactionManager.getTransaction(AbstractPlatformTransactionManager.java:350) at org.springframework.transaction.interceptor.TransactionAspectSupport.createTransactionIfNecessary(TransactionAspectSupport.java:262) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:101) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204) at $Proxy350.findSubmissionByLastModifiedDateRange(Unknown Source) at org.cipres.treebase.web.controllers.OAIPMHController.ListRecords(OAIPMHController.java:125) ... 
54 more Caused by: org.hibernate.exception.GenericJDBCException: Cannot open connection at org.hibernate.exception.SQLStateConverter.handledNonSpecificException(SQLStateConverter.java:103) at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:91) at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:43) at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:29) at org.hibernate.jdbc.ConnectionManager.openConnection(ConnectionManager.java:426) at org.hibernate.jdbc.ConnectionManager.getConnection(ConnectionManager.java:144) at org.hibernate.jdbc.JDBCContext.connection(JDBCContext.java:119) at org.hibernate.transaction.JDBCTransaction.begin(JDBCTransaction.java:57) at org.hibernate.impl.SessionImpl.beginTransaction(SessionImpl.java:1326) at org.springframework.orm.hibernate3.HibernateTransactionManager.doBegin(HibernateTransactionManager.java:510) ... 61 more Caused by: org.apache.tomcat.dbcp.dbcp.SQLNestedException: Cannot create JDBC driver of class '' for connect URL 'null' at org.apache.tomcat.dbcp.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1150) at org.apache.tomcat.dbcp.dbcp.BasicDataSource.getConnection(BasicDataSource.java:880) at org.springframework.orm.hibernate3.LocalDataSourceConnectionProvider.getConnection(LocalDataSourceConnectionProvider.java:81) at org.hibernate.jdbc.ConnectionManager.openConnection(ConnectionManager.java:423) ... 66 more Caused by: java.lang.NullPointerException at org.postgresql.Driver.parseURL(Driver.java:552) at org.postgresql.Driver.acceptsURL(Driver.java:405) at java.sql.DriverManager.getDriver(DriverManager.java:268) at org.apache.tomcat.dbcp.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1143) ... 69 more Cookies: <p> The URL that was tried (and that produced the exception) is: http://www.treebase.org/treebase-web/top/oai?verb=ListRecords&from=2011-03-25T05:00:49Z&until=2011-03-25T17:02:50Z&metadataPrefix=oai_dc I pasted it into my browser and received the same exception message. ---------------------------------------------------------------------- You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3244573&group_id=248804 |
From: <sfr...@us...> - 2011-03-25 16:27:00
|
Revision: 779 http://treebase.svn.sourceforge.net/treebase/?rev=779&view=rev Author: sfrgpiel Date: 2011-03-25 16:26:54 +0000 (Fri, 25 Mar 2011) Log Message: ----------- adding a directory for derivative projects Added Paths: ----------- trunk/treebase-derivatives/ trunk/treebase-derivatives/LICENSE.txt trunk/treebase-derivatives/vToL/ trunk/treebase-derivatives/vToL/LICENSE.txt trunk/treebase-derivatives/vToL/db/ trunk/treebase-derivatives/vToL/src/ Added: trunk/treebase-derivatives/LICENSE.txt =================================================================== --- trunk/treebase-derivatives/LICENSE.txt (rev 0) +++ trunk/treebase-derivatives/LICENSE.txt 2011-03-25 16:26:54 UTC (rev 779) @@ -0,0 +1,27 @@ +Copyright (c) 2009, Phyloinformatics Research Foundation +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are +met: + +* Redistributions of source code must retain the above copyright notice, + this list of conditions and the following disclaimer. +* Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in the + documentation and/or other materials provided with the distribution. +* Neither the name of the Phyloinformatics Research Foundation nor the + names of its contributors may be used to endorse or promote products + derived from this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS +IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED +TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A +PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT +HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, +SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED +TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR +PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF +LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING +NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS +SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. \ No newline at end of file Added: trunk/treebase-derivatives/vToL/LICENSE.txt =================================================================== --- trunk/treebase-derivatives/vToL/LICENSE.txt (rev 0) +++ trunk/treebase-derivatives/vToL/LICENSE.txt 2011-03-25 16:26:54 UTC (rev 779) @@ -0,0 +1,27 @@ +Copyright (c) 2009, Phyloinformatics Research Foundation +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are +met: + +* Redistributions of source code must retain the above copyright notice, + this list of conditions and the following disclaimer. +* Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in the + documentation and/or other materials provided with the distribution. +* Neither the name of the Phyloinformatics Research Foundation nor the + names of its contributors may be used to endorse or promote products + derived from this software without specific prior written permission. 
+ +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS +IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED +TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A +PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT +HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, +SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED +TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR +PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF +LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING +NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS +SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. \ No newline at end of file This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site. |
From: <rv...@us...> - 2011-03-24 11:28:05
|
Revision: 778 http://treebase.svn.sourceforge.net/treebase/?rev=778&view=rev Author: rvos Date: 2011-03-24 11:27:59 +0000 (Thu, 24 Mar 2011) Log Message: ----------- Added annotations to TaxonLabel to include TaxonVariant and Taxon ID in results. Modified Paths: -------------- trunk/treebase-core/src/main/java/org/cipres/treebase/domain/taxon/TaxonLabel.java Modified: trunk/treebase-core/src/main/java/org/cipres/treebase/domain/taxon/TaxonLabel.java =================================================================== --- trunk/treebase-core/src/main/java/org/cipres/treebase/domain/taxon/TaxonLabel.java 2011-03-22 17:45:05 UTC (rev 777) +++ trunk/treebase-core/src/main/java/org/cipres/treebase/domain/taxon/TaxonLabel.java 2011-03-24 11:27:59 UTC (rev 778) @@ -277,7 +277,12 @@ } if ( null != tv.getTB1LegacyId() ) { annotations.add(new Annotation(Constants.TBTermsURI, "tb:identifier.taxonVariant.tb1", tv.getTB1LegacyId())); - } + } + annotations.add(new Annotation(Constants.TBTermsURI, "tb:identifier.taxonVariant", tv.getId())); + Taxon taxon = tv.getTaxon(); + if ( null != taxon ) { + annotations.add(new Annotation(Constants.TBTermsURI, "tb:identifier.taxon", taxon.getId())); + } } } catch ( Exception e) { This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site. |
From: <rv...@us...> - 2011-03-22 17:45:12
|
Revision: 777 http://treebase.svn.sourceforge.net/treebase/?rev=777&view=rev Author: rvos Date: 2011-03-22 17:45:05 +0000 (Tue, 22 Mar 2011) Log Message: ----------- Added lat/lon coordinates to continuous matrices Modified Paths: -------------- trunk/treebase-core/src/main/java/org/cipres/treebase/domain/nexus/nexml/NexmlMatrixConverter.java Modified: trunk/treebase-core/src/main/java/org/cipres/treebase/domain/nexus/nexml/NexmlMatrixConverter.java =================================================================== --- trunk/treebase-core/src/main/java/org/cipres/treebase/domain/nexus/nexml/NexmlMatrixConverter.java 2011-03-21 20:06:58 UTC (rev 776) +++ trunk/treebase-core/src/main/java/org/cipres/treebase/domain/nexus/nexml/NexmlMatrixConverter.java 2011-03-22 17:45:05 UTC (rev 777) @@ -358,9 +358,9 @@ ContinuousMatrix tbMatrix) { List<org.nexml.model.Character> characterList = xmlMatrix.getCharacters(); OTUs xmlOTUs = xmlMatrix.getOTUs(); - for ( MatrixRow row : tbMatrix.getRowsReadOnly() ) { - List<MatrixElement> elements = row.getElements(); - OTU xmlOTU = getOTUById(xmlOTUs, row.getTaxonLabel().getId()); + for ( MatrixRow tbRow : tbMatrix.getRowsReadOnly() ) { + List<MatrixElement> elements = tbRow.getElements(); + OTU xmlOTU = getOTUById(xmlOTUs, tbRow.getTaxonLabel().getId()); if ( characterList.size() <= MAX_GRANULAR_NCHAR && xmlOTUs.getAllOTUs().size() <= MAX_GRANULAR_NTAX ) { for ( int elementIndex = 0; elementIndex < tbMatrix.getnChar(); elementIndex++ ) { ContinuousMatrixElement tbCell = (ContinuousMatrixElement)elements.get(elementIndex); @@ -370,9 +370,20 @@ } } else { - String seq = row.buildElementAsString(); + String seq = tbRow.buildElementAsString(); xmlMatrix.setSeq(seq,xmlOTU); } + Set<RowSegment> tbSegments = tbRow.getSegmentsReadOnly(); + for ( RowSegment tbSegment : tbSegments ) { + Double latitude = tbSegment.getSpecimenLabel().getLatitude(); + Double longitude = tbSegment.getSpecimenLabel().getLongitude(); + if ( null != latitude ) { + xmlOTU.addAnnotationValue("DwC:DecimalLatitude", Constants.DwCURI, latitude); + } + if ( null != longitude ) { + xmlOTU.addAnnotationValue("DwC:DecimalLongitude", Constants.DwCURI, longitude); + } + } } } This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site. |
From: <sfr...@us...> - 2011-03-21 20:07:08
|
Revision: 776 http://treebase.svn.sourceforge.net/treebase/?rev=776&view=rev Author: sfrgpiel Date: 2011-03-21 20:06:58 +0000 (Mon, 21 Mar 2011) Log Message: ----------- These are modifications to reduce the footprint of the database and increase the performance by creating new indices and removing matrixelement records for all discrete character matrices (which at this instance includes all matrices in the database). This deletion should be performed only after a new build has been applied to production, seeing as only the new build will not create new matrixelement records. Added Paths: ----------- trunk/treebase-core/db/cleaning/2011-03-21_truncate_matrixelement_table/ trunk/treebase-core/db/cleaning/2011-03-21_truncate_matrixelement_table/README.txt trunk/treebase-core/db/cleaning/2011-03-21_truncate_matrixelement_table/truncate_matrixelement.sql trunk/treebase-core/db/schema/patches/0006_create-indices.sql Added: trunk/treebase-core/db/cleaning/2011-03-21_truncate_matrixelement_table/README.txt =================================================================== --- trunk/treebase-core/db/cleaning/2011-03-21_truncate_matrixelement_table/README.txt (rev 0) +++ trunk/treebase-core/db/cleaning/2011-03-21_truncate_matrixelement_table/README.txt 2011-03-21 20:06:58 UTC (rev 776) @@ -0,0 +1,11 @@ +Serious performance and stability issues are predicted to arise because the +matrixelement table is highly normalized and requires an excessive footprint +when storing data matrices. The c. 6000+ matrices in TreeBASE are taking up +nearly 200GB of space in this table alone. Empirical testing determined that +the contents of the matrixelement table are not used when downloading discrete +character data (whether as NEXUS or as NeXML) because the information is +obtained from the symbolstring field in the matrixrow table. This is not the case +for matrices of datatype continuous. Since there are no matrices of this type +currently in TreeBASE, we propose to delete all records in the matrixelement table +and modify the code so as not to create new ones except in cases of continuous data +types. \ No newline at end of file Added: trunk/treebase-core/db/cleaning/2011-03-21_truncate_matrixelement_table/truncate_matrixelement.sql =================================================================== --- trunk/treebase-core/db/cleaning/2011-03-21_truncate_matrixelement_table/truncate_matrixelement.sql (rev 0) +++ trunk/treebase-core/db/cleaning/2011-03-21_truncate_matrixelement_table/truncate_matrixelement.sql 2011-03-21 20:06:58 UTC (rev 776) @@ -0,0 +1,20 @@ +-- make a backup of the matrixelement table just in +-- case, edit the -h and database name as needed: +-- pg_dump -U treebase_app -h treebase.nescent.org -t matrixelement -a -O treebaseprod > matrixelement_bkup.sql + +-- remove foreign key constraints +ALTER TABLE ONLY compound_element DROP CONSTRAINT compound_element_fkto_compound; +ALTER TABLE ONLY compound_element DROP CONSTRAINT compound_element_fkto_element; +ALTER TABLE ONLY itemvalue DROP CONSTRAINT itemvalue_fkto_element; +ALTER TABLE ONLY statemodifier DROP CONSTRAINT statemodifier_fkto_element; + +-- delete all records in matrixelement.
Be sure to use TRUNCATE instead +-- of DELETE FROM because this makes it infinitely faster +TRUNCATE matrixelement; + +-- reapply all foreign key constraints +ALTER TABLE ONLY compound_element ADD CONSTRAINT compound_element_fkto_compound FOREIGN KEY (compound_id) REFERENCES matrixelement(matrixelement_id); +ALTER TABLE ONLY compound_element ADD CONSTRAINT compound_element_fkto_element FOREIGN KEY (element_id) REFERENCES matrixelement(matrixelement_id); +ALTER TABLE ONLY itemvalue ADD CONSTRAINT itemvalue_fkto_element FOREIGN KEY (element_id) REFERENCES matrixelement(matrixelement_id); +ALTER TABLE ONLY statemodifier ADD CONSTRAINT statemodifier_fkto_element FOREIGN KEY (element_id) REFERENCES matrixelement(matrixelement_id); + Added: trunk/treebase-core/db/schema/patches/0006_create-indices.sql =================================================================== --- trunk/treebase-core/db/schema/patches/0006_create-indices.sql (rev 0) +++ trunk/treebase-core/db/schema/patches/0006_create-indices.sql 2011-03-21 20:06:58 UTC (rev 776) @@ -0,0 +1,7 @@ +insert into versionhistory(patchnumber, patchlabel, patchdescription) + values (6, 'create-indices', + 'Create additional indices to improve query performance.'); + +CREATE INDEX discretecharstate_phylochar_id_idx ON discretecharstate USING btree (phylochar_id); +CREATE INDEX matrixcolumn_matrix_id_idx ON matrixcolumn USING btree (matrix_id); + This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site. |
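Before running the truncation above, it may be worth confirming the premises stated in the README, namely that matrixelement dominates the database footprint and that no continuous matrices currently depend on it. A rough pre-flight sketch in psql: pg_size_pretty and pg_total_relation_size are standard PostgreSQL functions, but the matrix/datatype join is only an assumption about the TreeBASE schema and the names may need adjusting:

-- space used by matrixelement, including its indexes and TOAST data
SELECT pg_size_pretty(pg_total_relation_size('matrixelement'));

-- hypothetical check that no continuous matrices exist before truncating;
-- the table and column names in this query are assumed, not confirmed
SELECT count(*)
FROM matrix m
JOIN datatype d ON d.datatype_id = m.datatype_id
WHERE d.description ILIKE '%continuous%';

If the second count is not zero, the pg_dump backup suggested in the comments of truncate_matrixelement.sql becomes essential rather than optional.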
From: SourceForge.net <no...@so...> - 2011-03-21 19:03:53
|
Bugs item #3232373, was opened at 2011-03-21 10:51 Message generated for change (Comment added) made by rscherle You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3232373&group_id=248804 Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update. Category: ui Group: None Status: Open Priority: 8 Private: No Submitted By: William Piel (sfrgpiel) Assigned to: hshyket (hshyket) Summary: Files from Dryad are empty Initial Comment: When someone creates a Dryad submission and the data are pushed onto TreeBASE, the source data file (i.e. the original data file that can normally be obtained when you click on the "Download Original" icons) only says "NEXUS" with all remaining data having been deleted. The "Download Original" buttons ought to download the original files that were pushed onto TreeBASE from Dryad. This can be tested by going to http://datadryad.org/ and creating a new user account and submission (but naming it "Testing Only" to inform Dryad editors that it is just for testing), and then submitting a NEXUS file and pushing it on to TreeBASE. ---------------------------------------------------------------------- >Comment By: Ryan Scherle (rscherle) Date: 2011-03-21 14:03 Message: Instead of submitting content to the production instance of Dryad, you can submit test content to the demo instance of Dryad at demo.datadryad.org. This way, you don't need to worry about having your content come up for review by Dryad curators. ---------------------------------------------------------------------- You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3232373&group_id=248804 |
From: SourceForge.net <no...@so...> - 2011-03-21 16:00:16
|
Bugs item #3232414, was opened at 2011-03-21 12:00 Message generated for change (Tracker Item Submitted) made by sfrgpiel You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3232414&group_id=248804 Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update. Category: ui Group: None Status: Open Priority: 7 Private: No Submitted By: William Piel (sfrgpiel) Assigned to: hshyket (hshyket) Summary: Citation created by Dryad has duplicate author names Initial Comment: When a submission is created in Dryad and the data are pushed to TreeBASE, the citation is auto-entered into TreeBASE. However, typically it seems to duplicate one of the authors (namely the submitter's name). I'm inclined to not simply solve the bug as it is, but rather to list the authors (in their proper order) in the notes field for the submission with instructions for the submitter to enter the author's names de-novo. The reason for this is because when author names are auto-entered from Dryad, they tend to create redundant entries in the event that the authors already exist in TreeBASE. So we really need to prompt the submitter to enter author names using the TreeBASE interface. My recommendation then is to remove the feature that auto-enters author names and instead store the author names (in proper order) in the Notes field, and append the following text: "Please enter these author names into the TreeBASE citation by clicking on the highlighted "Authors" item in the Tool Box" ---------------------------------------------------------------------- You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3232414&group_id=248804 |
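The redundancy concern in this report, that auto-entered Dryad authors create duplicate person records, can be spot-checked on the database side. A hedged sketch only: the person table and its firstname/lastname columns are guesses at the TreeBASE schema, used to illustrate the kind of duplicate scan meant here:

-- hypothetical duplicate-author scan; person, firstname and lastname are assumed names
SELECT lower(lastname) AS last_name, lower(firstname) AS first_name, count(*) AS copies
FROM person
GROUP BY lower(lastname), lower(firstname)
HAVING count(*) > 1
ORDER BY copies DESC;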
From: SourceForge.net <no...@so...> - 2011-03-21 15:51:28
|
Bugs item #3232373, was opened at 2011-03-21 11:51 Message generated for change (Tracker Item Submitted) made by sfrgpiel You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3232373&group_id=248804 Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update. Category: ui Group: None Status: Open Priority: 8 Private: No Submitted By: William Piel (sfrgpiel) Assigned to: hshyket (hshyket) Summary: Files from Dryad are empty Initial Comment: When someone creates a Dryad submission and the data are pushed onto TreeBASE, the source data file (i.e. the original data file that can normally be obtained when you click on the "Download Original" icons) only says "NEXUS" with all remaining data having been deleted. The "Download Original" buttons ought to download the original files that were pushed onto TreeBASE from Dryad. This can be tested by going to http://datadryad.org/ and creating a new user account and submission (but naming it "Testing Only" to inform Dryad editors that it is just for testing), and then submitting a NEXUS file and pushing it on to TreeBASE. ---------------------------------------------------------------------- You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3232373&group_id=248804 |
From: <sfr...@us...> - 2011-03-16 20:18:24
|
Revision: 775 http://treebase.svn.sourceforge.net/treebase/?rev=775&view=rev Author: sfrgpiel Date: 2011-03-16 20:18:18 +0000 (Wed, 16 Mar 2011) Log Message: ----------- Added blurb and icon for AJB Modified Paths: -------------- trunk/treebase-web/src/main/webapp/WEB-INF/pages/journal.jsp Added Paths: ----------- trunk/treebase-web/src/main/webapp/images/journal_files/image022.gif Modified: trunk/treebase-web/src/main/webapp/WEB-INF/pages/journal.jsp =================================================================== --- trunk/treebase-web/src/main/webapp/WEB-INF/pages/journal.jsp 2011-03-16 16:21:59 UTC (rev 774) +++ trunk/treebase-web/src/main/webapp/WEB-INF/pages/journal.jsp 2011-03-16 20:18:18 UTC (rev 775) @@ -28,6 +28,23 @@ </tr> <tr> <td> + <p><a href="http://www.amjbot.org/" + title="American Journal of Botany"> + <img class="journal" + src="images/journal_files/image022.gif" alt="AJB"/> + </a> + </p> + </td> + <td> + <p><a + href="<%=purlBase%>study/find?query=prism.publicationName%3D%3D%22American%20Journal%20of%20Botany%22" + title="Find records in TreeBASE for articles published in the American Journal of Botany"> + <%=purlBase%>study/find?query=prism.publicationName%3D%3D%22American%20Journal%20of%20Botany%22 + </a></p> + </td> + </tr> + <tr> + <td> <p><a href="http://www.wiley.com/bw/journal.asp?ref=0014-3820" title="Evolution"> <img class="journal" @@ -332,8 +349,6 @@ </table> <p><b>Other Journals with a Significant Presence in TreeBASE</b>: <a - href="<%=purlBase%>study/find?query=prism.publicationName%3D%3D%2American+Journal+of+Botany%22" - title="American Journal of Botany">American Journal of Botany</a>; <a href="<%=purlBase%>study/find?query=prism.publicationName%3D%3D%22Annals+of+the+Missouri+Botanical+Garden%22" title="Annals of the Missouri Botanical Garden">Annals of the Missouri Botanical Garden</a>; <a Added: trunk/treebase-web/src/main/webapp/images/journal_files/image022.gif =================================================================== (Binary files differ) Property changes on: trunk/treebase-web/src/main/webapp/images/journal_files/image022.gif ___________________________________________________________________ Added: svn:mime-type + application/octet-stream This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site. |
From: <hs...@us...> - 2011-03-16 16:22:05
|
Revision: 774 http://treebase.svn.sourceforge.net/treebase/?rev=774&view=rev Author: hshyket Date: 2011-03-16 16:21:59 +0000 (Wed, 16 Mar 2011) Log Message: ----------- Does not allow the Matrixelement table to be populated when the matrix is either of a type sequence or standard Modified Paths: -------------- trunk/treebase-core/src/main/java/org/cipres/treebase/domain/nexus/mesquite/MesquiteStandardMatrixConverter.java Modified: trunk/treebase-core/src/main/java/org/cipres/treebase/domain/nexus/mesquite/MesquiteStandardMatrixConverter.java =================================================================== --- trunk/treebase-core/src/main/java/org/cipres/treebase/domain/nexus/mesquite/MesquiteStandardMatrixConverter.java 2011-02-04 22:02:15 UTC (rev 773) +++ trunk/treebase-core/src/main/java/org/cipres/treebase/domain/nexus/mesquite/MesquiteStandardMatrixConverter.java 2011-03-16 16:21:59 UTC (rev 774) @@ -507,7 +507,10 @@ } discreteMatrixJDBC.batchUpdateRowSymbol(pCon); - DiscreteMatrixElementJDBC.batchDiscreteElements(elements, pCon); + if (!(discreteMatrixJDBC.getCharacterMatrix().getDataType().isSequence()) + && !(discreteMatrixJDBC.getCharacterMatrix().getDataType().isStandard())) { + DiscreteMatrixElementJDBC.batchDiscreteElements(elements, pCon); + } } /** This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site. |
From: SourceForge.net <no...@so...> - 2011-02-26 00:06:02
|
Bugs item #3192771, was opened at 2011-02-25 19:06 Message generated for change (Tracker Item Submitted) made by balhoff You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3192771&group_id=248804 Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update. Category: APIs Group: None Status: Open Priority: 5 Private: No Submitted By: Jim Balhoff (balhoff) Assigned to: Nobody/Anonymous (nobody) Summary: RDF link returns nothing Initial Comment: Clicking on an RDF link seems to work on something for a while but returns an empty document (nothing, not even empty RDF). For example, on this page: http://www.treebase.org/treebase-web/search/study/summary.html?id=423 click on the RDF link which goes here: http://purl.org/phylo/treebase/phylows/study/TB2:S423?format=rdf and redirects to here: http://treebase.org/treebase-web/search/downloadAStudy.html?id=423&format=rdf ---------------------------------------------------------------------- You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3192771&group_id=248804 |
From: SourceForge.net <no...@so...> - 2011-02-07 16:43:57
|
Bugs item #3172061, was opened at 2011-02-03 17:17 Message generated for change (Settings changed) made by sfrgpiel You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3172061&group_id=248804 Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update. Category: ui Group: None >Status: Closed Priority: 7 Private: No Submitted By: William Piel (sfrgpiel) Assigned to: hshyket (hshyket) Summary: Some DOI hyperlinks fail Initial Comment: Search for author 'Lutzoni' - several of the rows have DOI links that go to http://dx.doi.org without the actual DOI appended. The "http://dx.doi.org" link is provided for all non-NULL values, however when users enter citation information in which the DOI entry is left blank, the field is filled with "blank" rather than NULL. Hence a hyperlink to nowhere is displayed. Either the entry form should be designed to enter NULL for blank values, or the hyperlink should not be presented to users when the DOI is blank. If the latter solution, this can probably be fixed in the jsp by testing (<c:if/>) for the ${variable} yields false for null and empty string: treebase-web/src/main/webapp/WEB-INF/pages/search/studyList.jsp (line 48 indeed tests for null instead of true) ---------------------------------------------------------------------- You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3172061&group_id=248804 |
From: <hs...@us...> - 2011-02-04 22:02:21
|
Revision: 773 http://treebase.svn.sourceforge.net/treebase/?rev=773&view=rev Author: hshyket Date: 2011-02-04 22:02:15 +0000 (Fri, 04 Feb 2011) Log Message: ----------- Setting the DOI property to check not empty instead of just NULL. The empty string results were getting through on the row layout. Modified Paths: -------------- trunk/treebase-web/src/main/webapp/WEB-INF/pages/search/studyList.jsp Modified: trunk/treebase-web/src/main/webapp/WEB-INF/pages/search/studyList.jsp =================================================================== --- trunk/treebase-web/src/main/webapp/WEB-INF/pages/search/studyList.jsp 2011-01-25 14:43:48 UTC (rev 772) +++ trunk/treebase-web/src/main/webapp/WEB-INF/pages/search/studyList.jsp 2011-02-04 22:02:15 UTC (rev 773) @@ -45,7 +45,7 @@ </display:column> <display:column class="iconColumn noBreak" headerClass="iconColumn" sortable="false"> - <c:if test="${study.citation.doi != null}"> + <c:if test="${not empty study.citation.doi}"> <c:set var="DOIResource" value="${DOIResolver}${study.citation.doi}"/> <a href="${DOIResource}" target="_blank"> <img class="iconButton" src="<fmt:message key="icons.weblink"/>" /> This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site. |
From: SourceForge.net <no...@so...> - 2011-02-03 22:17:58
|
Bugs item #3172061, was opened at 2011-02-03 17:17 Message generated for change (Tracker Item Submitted) made by sfrgpiel You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3172061&group_id=248804 Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update. Category: ui Group: None Status: Open Priority: 7 Private: No Submitted By: William Piel (sfrgpiel) Assigned to: hshyket (hshyket) Summary: Some DOI hyperlinks fail Initial Comment: Search for author 'Lutzoni' - several of the rows have DOI links that go to http://dx.doi.org without the actual DOI appended. The "http://dx.doi.org" link is provided for all non-NULL values, however when users enter citation information in which the DOI entry is left blank, the field is filled with "blank" rather than NULL. Hence a hyperlink to nowhere is displayed. Either the entry form should be designed to enter NULL for blank values, or the hyperlink should not be presented to users when the DOI is blank. If the latter solution, this can probably be fixed in the jsp by testing (<c:if/>) for the ${variable} yields false for null and empty string: treebase-web/src/main/webapp/WEB-INF/pages/search/studyList.jsp (line 48 indeed tests for null instead of true) ---------------------------------------------------------------------- You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3172061&group_id=248804 |
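The JSP-side fix in revision 773 above hides the link when the DOI is empty; the other half of the suggestion, storing NULL instead of a blank string, could be handled with a one-off cleanup. A sketch under the assumption that the DOI lives in a citation table with a doi column, mirroring the ${study.citation.doi} expression in the JSP but not confirmed against the schema:

-- hypothetical cleanup turning blank DOIs into NULLs; table and column names are assumed
UPDATE citation
SET doi = NULL
WHERE doi IS NOT NULL AND trim(doi) = '';

With the data normalized this way, either form of the JSTL test (null or not empty) would behave the same.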
From: SourceForge.net <no...@so...> - 2011-01-26 15:59:58
|
Bugs item #2992930, was opened at 2010-04-27 09:04 Message generated for change (Comment added) made by sfrgpiel You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=2992930&group_id=248804 Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update. Category: ui Group: None >Status: Closed Priority: 8 Private: No Submitted By: William Piel (sfrgpiel) Assigned to: hshyket (hshyket) Summary: Uploading metadata with missing values causes exception Initial Comment: The process for uploading metadata is to first download a row segment template, then fill it in in Excel, then upload it. i.e. from this page: http://www.treebase.org/treebase-web/user/uploadRowSegmentData.html On the following page, designate which columns to be uploaded. The problem is that if any of the designated columns have no data for one or more rows, the parser throws an exception. What it should do is treat empty fields as NULL. ---------------------------------------------------------------------- >Comment By: William Piel (sfrgpiel) Date: 2011-01-26 10:59 Message: This bug behavior has vanished. I'm not sure who fixed it, but I can't recreate the problem. ---------------------------------------------------------------------- Comment By: Hilmar Lapp (hlapp) Date: 2010-04-27 11:33 Message: There is no good workaround, and this is bad enough that it could cause people to give up on submitting data. ---------------------------------------------------------------------- You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=2992930&group_id=248804 |
From: SourceForge.net <no...@so...> - 2011-01-26 15:48:11
|
Bugs item #3163638, was opened at 2011-01-21 16:45 Message generated for change (Settings changed) made by sfrgpiel You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3163638&group_id=248804 Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update. Category: APIs Group: None >Status: Closed Priority: 7 Private: No Submitted By: William Piel (sfrgpiel) Assigned to: hshyket (hshyket) Summary: OAI-PMH datestamp searches createDate not lastModified Initial Comment: The "from" or "until" parameters in the OAI API are searching based on submission.createdate instead of study.lastmodifieddate. For example: http://treebase-dev.nescent.org/treebase-web/top/oai?verb=ListRecords&metadataPrefix=oai_dc&from=2010-05-18T00:00:00Z results in datestamp: <datestamp>2010-05-19</datestamp>, yet in the database: study.lastmodifieddate = 2011-01-19 study.releasedate = 2011-01-19 submission.createdate = 2010-05-18 It would be better if this searched on study.lastmodifieddate, so that Dryad can be alerted to modifications in TreeBASE studies. ---------------------------------------------------------------------- You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3163638&group_id=248804 |
From: <hs...@us...> - 2011-01-25 14:43:54
|
Revision: 772 http://treebase.svn.sourceforge.net/treebase/?rev=772&view=rev Author: hshyket Date: 2011-01-25 14:43:48 +0000 (Tue, 25 Jan 2011) Log Message: ----------- Changing the OAI-PMH date search. Using the last modified date (Study table) instead of the created date (Submission table) for the results. Modified Paths: -------------- trunk/treebase-core/src/main/java/org/cipres/treebase/dao/study/SubmissionDAO.java trunk/treebase-core/src/main/java/org/cipres/treebase/domain/study/SubmissionHome.java trunk/treebase-core/src/main/java/org/cipres/treebase/domain/study/SubmissionService.java trunk/treebase-core/src/main/java/org/cipres/treebase/service/study/SubmissionServiceImpl.java trunk/treebase-core/src/test/java/org/cipres/treebase/dao/study/SubmissionDAOTest.java trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/OAIPMHController.java Modified: trunk/treebase-core/src/main/java/org/cipres/treebase/dao/study/SubmissionDAO.java =================================================================== --- trunk/treebase-core/src/main/java/org/cipres/treebase/dao/study/SubmissionDAO.java 2011-01-19 19:59:02 UTC (rev 771) +++ trunk/treebase-core/src/main/java/org/cipres/treebase/dao/study/SubmissionDAO.java 2011-01-25 14:43:48 UTC (rev 772) @@ -192,7 +192,17 @@ q.setDate("end", until); return q.list(); } + + public Collection<Submission> findByLastModifiedDateRange(Date from, Date until) { + Query q = getSession().createQuery( + "from Submission sub where sub.study.lastModifiedDate between :begin and :end"); + q.setDate("begin", from); + q.setDate("end", until); + return q.list(); + + } + public Submission findByStudyID(Long pID) { // TODO Auto-generated method stub Submission returnVal = null; Modified: trunk/treebase-core/src/main/java/org/cipres/treebase/domain/study/SubmissionHome.java =================================================================== --- trunk/treebase-core/src/main/java/org/cipres/treebase/domain/study/SubmissionHome.java 2011-01-19 19:59:02 UTC (rev 771) +++ trunk/treebase-core/src/main/java/org/cipres/treebase/domain/study/SubmissionHome.java 2011-01-25 14:43:48 UTC (rev 772) @@ -72,4 +72,5 @@ Collection<Submission> findByInProgressState(); Collection<Submission> findByCreateDateRange(Date from, Date until); + Collection<Submission> findByLastModifiedDateRange(Date from, Date until); } Modified: trunk/treebase-core/src/main/java/org/cipres/treebase/domain/study/SubmissionService.java =================================================================== --- trunk/treebase-core/src/main/java/org/cipres/treebase/domain/study/SubmissionService.java 2011-01-19 19:59:02 UTC (rev 771) +++ trunk/treebase-core/src/main/java/org/cipres/treebase/domain/study/SubmissionService.java 2011-01-25 14:43:48 UTC (rev 772) @@ -42,6 +42,8 @@ Collection<Submission> findPublishedSubmissions(); Collection<Submission> findSubmissionByCreateDateRange(Date from, Date until); + + Collection<Submission> findSubmissionByLastModifiedDateRange(Date from, Date until); /** * Create a submission, which associates with a new study. A submitter is required. 
* Modified: trunk/treebase-core/src/main/java/org/cipres/treebase/service/study/SubmissionServiceImpl.java =================================================================== --- trunk/treebase-core/src/main/java/org/cipres/treebase/service/study/SubmissionServiceImpl.java 2011-01-19 19:59:02 UTC (rev 771) +++ trunk/treebase-core/src/main/java/org/cipres/treebase/service/study/SubmissionServiceImpl.java 2011-01-25 14:43:48 UTC (rev 772) @@ -934,7 +934,13 @@ return getSubmissionHome().findByCreateDateRange(from, until); } + + public Collection<Submission> findSubmissionByLastModifiedDateRange(Date from, Date until) { + + return getSubmissionHome().findByLastModifiedDateRange(from, until); + } + @Override Modified: trunk/treebase-core/src/test/java/org/cipres/treebase/dao/study/SubmissionDAOTest.java =================================================================== --- trunk/treebase-core/src/test/java/org/cipres/treebase/dao/study/SubmissionDAOTest.java 2011-01-19 19:59:02 UTC (rev 771) +++ trunk/treebase-core/src/test/java/org/cipres/treebase/dao/study/SubmissionDAOTest.java 2011-01-25 14:43:48 UTC (rev 772) @@ -184,6 +184,24 @@ } } + public void testFindByLastModifiedDateRange() { + String testName = "testFindByLastModifiedDateRange"; + if (logger.isInfoEnabled()) { + logger.info("\n\t\tRunning Test: " + testName); + } + + Date from = (new GregorianCalendar(2011,1,1)).getTime(); + Date until = (new GregorianCalendar(2011,3,1)).getTime(); + + Collection<Submission> s = getFixture().findByLastModifiedDateRange(from, until); + + + + assertTrue(s.size() > 0); + if (logger.isInfoEnabled()) { + logger.info("\n\t\tRunning Test: found " + s.size()); + } + } } Modified: trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/OAIPMHController.java =================================================================== --- trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/OAIPMHController.java 2011-01-19 19:59:02 UTC (rev 771) +++ trunk/treebase-web/src/main/java/org/cipres/treebase/web/controllers/OAIPMHController.java 2011-01-25 14:43:48 UTC (rev 772) @@ -122,7 +122,7 @@ List<Submission> list=null; try { - list = (List)submissionService.findSubmissionByCreateDateRange(IdentifyUtil.parseGranularity(identify.getGranularityPattern(),params.getModifiedFrom()), + list = (List)submissionService.findSubmissionByLastModifiedDateRange(IdentifyUtil.parseGranularity(identify.getGranularityPattern(),params.getModifiedFrom()), IdentifyUtil.parseGranularity(identify.getGranularityPattern(),params.getModifiedUntil())); } catch (ParseException e) { model.put("error_code", "badArgument"); @@ -139,7 +139,7 @@ List<Submission> list=null; try { - list = (List)submissionService.findSubmissionByCreateDateRange(IdentifyUtil.parseGranularity(identify.getGranularityPattern(),params.getModifiedFrom()), + list = (List)submissionService.findSubmissionByLastModifiedDateRange(IdentifyUtil.parseGranularity(identify.getGranularityPattern(),params.getModifiedFrom()), IdentifyUtil.parseGranularity(identify.getGranularityPattern(), params.getModifiedUntil())); } catch (ParseException e) { model.put("error_code", "badArgument"); This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site. |
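The new findByLastModifiedDateRange query keys the OAI-PMH window on study.lastmodifieddate, which is what a harvester such as Dryad sees through the from/until parameters. A plain-SQL approximation of what the HQL above resolves to, handy for checking by hand which studies a given harvest window should return; the submission-to-study join column is an assumption, while the date column is the one named in bug #3163638:

-- approximate SQL behind the new HQL; the study_id join column is assumed
SELECT s.study_id, s.lastmodifieddate
FROM submission sub
JOIN study s ON s.study_id = sub.study_id
WHERE s.lastmodifieddate BETWEEN DATE '2011-01-01' AND DATE '2011-03-01'
ORDER BY s.lastmodifieddate;

The corresponding harvest request would follow the shape quoted in the bug report, e.g. /treebase-web/top/oai?verb=ListRecords&metadataPrefix=oai_dc&from=2011-01-01T00:00:00Z&until=2011-03-01T00:00:00Z.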
From: SourceForge.net <no...@so...> - 2011-01-21 21:45:40
|
Bugs item #3163638, was opened at 2011-01-21 16:45 Message generated for change (Tracker Item Submitted) made by sfrgpiel You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3163638&group_id=248804 Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update. Category: APIs Group: None Status: Open Priority: 7 Private: No Submitted By: William Piel (sfrgpiel) Assigned to: hshyket (hshyket) Summary: OAI-PMH datestamp searches createDate not lastModified Initial Comment: The "from" or "until" parameters in the OAI API are searching based on submission.createdate instead of study.lastmodifieddate. For example: http://treebase-dev.nescent.org/treebase-web/top/oai?verb=ListRecords&metadataPrefix=oai_dc&from=2010-05-18T00:00:00Z results in datestamp: <datestamp>2010-05-19</datestamp>, yet in the database: study.lastmodifieddate = 2011-01-19 study.releasedate = 2011-01-19 submission.createdate = 2010-05-18 It would be better if this searched on study.lastmodifieddate, so that Dryad can be alerted to modifications in TreeBASE studies. ---------------------------------------------------------------------- You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=3163638&group_id=248804 |
From: SourceForge.net <no...@so...> - 2011-01-21 21:37:32
|
Bugs item #2992930, was opened at 2010-04-27 09:04 Message generated for change (Settings changed) made by sfrgpiel You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=2992930&group_id=248804 Please note that this message will contain a full copy of the comment thread, including the initial issue submission, for this request, not just the latest update. Category: ui Group: None Status: Open Priority: 8 Private: No Submitted By: William Piel (sfrgpiel) >Assigned to: hshyket (hshyket) Summary: Uploading metadata with missing values causes exception Initial Comment: The process for uploading metadata is to first download a row segment template, then fill it in in Excel, then upload it. i.e. from this page: http://www.treebase.org/treebase-web/user/uploadRowSegmentData.html On the following page, designate which columns to be uploaded. The problem is that if any of the designated columns have no data for one or more rows, the parser throws an exception. What it should do is treat empty fields as NULL. ---------------------------------------------------------------------- Comment By: Hilmar Lapp (hlapp) Date: 2010-04-27 11:33 Message: There is no good workaround, and this is bad enough that it could cause people to give up on submitting data. ---------------------------------------------------------------------- You can respond by visiting: https://sourceforge.net/tracker/?func=detail&atid=1126676&aid=2992930&group_id=248804 |