Hello,

I've integrated SourceForge tracker item 3037168 (database inconsistencies fixed by patch to VersioningTrigger) into trunk. This also fixes an issue where versioning of XQuery updates was not tracked, caused by the late copying of the original document to create the base.

I have also integrated SourceForge tracker item 3107412 (corruption of versions fix), but have not yet committed the change to trunk pending further testing.

JUnit tests are needed for 3037168 and 3107412, particularly the latter. It would be most helpful if the submitters of those two patches could help with the test cases.

Thanks,
Chris



On Nov 15, 2010, at 5:34 AM, Adam Retter wrote:

Guys, these patches are great stuff and I will try to get them into
trunk, but can I please also beg for JUnit tests? Without test cases
we may fix this issue today, but there is nothing to stop a regression
occurring at any time in the future, when the code is adjusted again
for some other reason...

OK, I have some XQueries which reveal the bugs, so it shouldn't be too hard
to invoke them from a JUnit test case.

Is there a good example of a test case in the eXist source that invokes
XQueries, just so I know what best practice looks like?

$EXIST_HOME/test/src/org/exist/xquery/XQueryTest.java

...should be a gold mine of examples ;-)


Cheers

Alistair


On 12 November 2010 10:26, Alistair Miles <alimanfoo@googlemail.com> wrote:
Hi Patrick,

Thanks for this, I'm using the versioning module heavily also and will run
some tests to see if I can reproduce the behaviour you observe.

Btw I also found a separate issue in the versioning trigger causing database
inconsistencies and submitted a patch:

http://sourceforge.net/tracker/?func=detail&aid=3037168&group_id=17691&atid=317691

I don't know if this has made it into trunk yet.

Cheers

Alistair

On Thu, Nov 11, 2010 at 12:53:37PM -0500, Patrick Bosek wrote:
Hello everyone,

I've been working on a bug in the versioning module for the last week or so,
and I believe I have fixed it. The issue arises when two (or more) nodes
with the same qname are siblings and the first one is deleted. In this case
the second node's attributes are all lost, which progressively corrupts the
versioning record each time the file is saved, eventually giving you a
version record that bears no resemblance to the actual versions of the
file.
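
To make the misalignment concrete: the patch shifts the reported deletion range one node to the left when the node just before the range and the last node in the range are both start tags with the same qname. Here is a small self-contained Java sketch of that range correction (the `Node` and `Type` types below are illustrative stand-ins for eXist's `DiffNode` and the StAX event constants, not eXist code):

```java
// Stand-alone model of the patched range correction in StandardDiff.getChanges().
public class DeleteRangeFix {
    enum Type { START_ELEMENT, END_ELEMENT, TEXT }
    record Node(Type type, String qname) {}

    // The diff reports a deleted node range [start0, lastDeleted). When two
    // same-qname siblings exist and the first is deleted, a naive diff matches
    // the first sibling's start tag to the second's, so the range wrongly
    // covers [END of first, START of second]. Shifting it left by one makes it
    // cover [START of first, END of first], keeping the second element (and
    // its attributes) intact.
    static int[] shift(Node[] nodes, int start0, int lastDeleted) {
        if (start0 > 0) { // guard added for this sketch
            Node before = nodes[start0 - 1];
            Node last = nodes[lastDeleted - 1];
            if (before.qname() != null && last.qname() != null
                    && before.qname().equals(last.qname())
                    && before.type() == Type.START_ELEMENT
                    && last.type() == Type.START_ELEMENT) {
                return new int[] { start0 - 1, lastDeleted - 1 };
            }
        }
        return new int[] { start0, lastDeleted };
    }

    public static void main(String[] args) {
        // <root><a .../><a .../></root> flattened into a node stream;
        // the first <a> is deleted, but the diff reported range [1, 3).
        Node[] nodes = {
            new Node(Type.START_ELEMENT, "a"), // 0: start of first <a>
            new Node(Type.END_ELEMENT, "a"),   // 1: end of first <a>
            new Node(Type.START_ELEMENT, "a"), // 2: start of second <a>
            new Node(Type.END_ELEMENT, "a"),   // 3: end of second <a>
        };
        int[] fixed = shift(nodes, 1, 3);
        System.out.println(fixed[0] + "," + fixed[1]); // prints "0,2"
    }
}
```

With the shift applied, the deletion removes the whole first sibling rather than splicing the first start tag onto the second element.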

I'm not 100% sure that there aren't other issues in the versioning module,
but I have tested it pretty extensively and haven't found any. So at this
point, I'm comfortable putting it into production for our company.

I've attached the patch file in case anyone wants to just grab it and go.
I've also added it to SourceForge.

Let me know if anyone has any questions.

Cheers,

--
Patrick Bosek
Jorsek Software
Cell (585) 820 9634
Office (585) 239 6060
Jorsek.com

*** StandardDiff_old.java     2010-11-11 12:34:48.348352497 -0500
--- StandardDiff.java 2010-11-11 10:54:55.624351394 -0500
***************
*** 57,63 ****
--- 57,65 ----

              Diff diff = new Diff(nodesA, nodesB);
              Diff.change script = diff.diff_2(false);
+
              changes = getChanges(script, docA, docB, nodesA, nodesB);
+
          } catch (XMLStreamException e) {
              throw new DiffException(e.getMessage(), e);
          } catch (IOException e) {
***************
*** 112,133 ****
          List changes = new ArrayList();
          Map inserts = new TreeMap();
          Diff.change next = script;
!         while (next != null) {
              int start0 = next.line0;
              int start = next.line1;
              int last = start + next.inserted;
              int lastDeleted = start0 + next.deleted;

!             if (next.inserted > 0) {
!                 if (next.deleted == 0) {
                      // Simplify edit script: if there's a set of start tags at the end of the
                      // insertion, check if they correspond to similar start tags *before* the
                      // inserted section. If yes, move the inserted range to match the entire
                      // inserted element instead of a sequence of end/start tags.
                      int offsetFix = 0;
!                     for (int i = last - 1; i > start; i--) {
                          DiffNode node = nodesB[i];
!                         if (node.nodeType == XMLStreamReader.START_ELEMENT && start - (last - i) > 0) {
                              DiffNode before = nodesB[start - (last - i)];
                              if (before.nodeType == XMLStreamReader.START_ELEMENT &&
                                  before.qname.equals(node.qname))
--- 114,143 ----
          List changes = new ArrayList();
          Map inserts = new TreeMap();
          Diff.change next = script;
!
!         //int deleteCorrection = 0;
!
!         while (next != null)
!         {
              int start0 = next.line0;
              int start = next.line1;
              int last = start + next.inserted;
              int lastDeleted = start0 + next.deleted;

!             if (next.inserted > 0)
!             {
!                 if (next.deleted == 0)
!                 {
                      // Simplify edit script: if there's a set of start tags at the end of the
                      // insertion, check if they correspond to similar start tags *before* the
                      // inserted section. If yes, move the inserted range to match the entire
                      // inserted element instead of a sequence of end/start tags.
                      int offsetFix = 0;
!                     for (int i = last - 1; i > start; i--)
!                     {
                          DiffNode node = nodesB[i];
!                         if (node.nodeType == XMLStreamReader.START_ELEMENT && start - (last - i) > 0)
!                         {
                              DiffNode before = nodesB[start - (last - i)];
                              if (before.nodeType == XMLStreamReader.START_ELEMENT &&
                                  before.qname.equals(node.qname))
***************
*** 135,151 ****
                          } else
                              break;
                      }
                      if (offsetFix > 0) {
                          start = start - offsetFix;
                          start0 = start0 - offsetFix;
                          last = start + next.inserted;
                      }
                  }
                  Difference.Insert diff;
!                 if (nodesA[start0].nodeType == XMLStreamReader.END_ELEMENT) {
                      diff = new Difference.Append(new NodeProxy(docA, nodesA[start0].nodeId), docB);
                      changes.add(diff);
!                 } else {
                      diff = (Difference.Insert) inserts.get(nodesA[start0].nodeId);
                      if (diff == null) {
                          diff = new Difference.Insert(new NodeProxy(docA, nodesA[start0].nodeId), docB);
--- 145,166 ----
                          } else
                              break;
                      }
+
                      if (offsetFix > 0) {
                          start = start - offsetFix;
                          start0 = start0 - offsetFix;
                          last = start + next.inserted;
                      }
                  }
+
                  Difference.Insert diff;
!
!                 if (nodesA[start0].nodeType == XMLStreamReader.END_ELEMENT)
!                 {
                      diff = new Difference.Append(new NodeProxy(docA, nodesA[start0].nodeId), docB);
                      changes.add(diff);
!                 } else
!                 {
                      diff = (Difference.Insert) inserts.get(nodesA[start0].nodeId);
                      if (diff == null) {
                          diff = new Difference.Insert(new NodeProxy(docA, nodesA[start0].nodeId), docB);
***************
*** 156,177 ****
                  // now scan the chunk and collect the nodes
                  DiffNode[] nodes = new DiffNode[last - start];
                  int j = 0;
!                 for (int i = start; i < last; i++, j++) {
                      if (LOG.isTraceEnabled())
                          LOG.trace(Integer.toString(i) + " " + nodesB[i]);
                      nodes[j] = nodesB[i];
                  }
                  diff.addNodes(nodes);
              }
!             if (next.deleted > 0) {
                  if (LOG.isTraceEnabled())
                      LOG.trace("Deleted: " + start0 + " last: " + lastDeleted);
!                 for (int i = start0; i < lastDeleted; i++) {
                      boolean elementDeleted = false;
!                     if (nodesA[i].nodeType == XMLStreamReader.START_ELEMENT) {
                          for (int j = i; j < lastDeleted; j++) {
!                             if (nodesA[j].nodeType == XMLStreamReader.END_ELEMENT &&
!                                     nodesA[j].nodeId.equals(nodesA[i].nodeId)) {
                                  Difference.Delete diff = new Difference.Delete(new NodeProxy(docA, nodesA[i].nodeId));
                                  changes.add(diff);
                                  i = j;
--- 171,221 ----
                  // now scan the chunk and collect the nodes
                  DiffNode[] nodes = new DiffNode[last - start];
                  int j = 0;
!                 for (int i = start; i < last; i++, j++)
!                 {
                      if (LOG.isTraceEnabled())
                          LOG.trace(Integer.toString(i) + " " + nodesB[i]);
                      nodes[j] = nodesB[i];
                  }
                  diff.addNodes(nodes);
              }
!
!             if (next.deleted > 0)
!             {
!                     // This is a simple test to correct an issue when two nodes of the same
!                     // node-name are siblings and the first is deleted. Without this fix, the
!                     // first element's start node is not deleted while the second's is, so the
!                     // second element effectively ends up with the first one's start element,
!                     // which corrupts the second element's attributes.
!                     DiffNode beforeElement = nodesA[start0 - 1];
!                     DiffNode lastElement = nodesA[lastDeleted - 1];
!
!                     if(beforeElement.qname != null
!                             && lastElement.qname != null
!                             && beforeElement.qname.equals(lastElement.qname)
!                             && beforeElement.nodeType == XMLStreamReader.START_ELEMENT
!                             && lastElement.nodeType == XMLStreamReader.START_ELEMENT)
!                     {
!                             start0--;
!                             lastDeleted--;
!                     }
!
!
                  if (LOG.isTraceEnabled())
                      LOG.trace("Deleted: " + start0 + " last: " + lastDeleted);
!
!                 for (int i = start0; i < lastDeleted; i++)
!                 {
                      boolean elementDeleted = false;
!                     if (nodesA[i].nodeType == XMLStreamReader.START_ELEMENT)
!                     {
                          for (int j = i; j < lastDeleted; j++) {
!                             if (nodesA[j].nodeType == XMLStreamReader.END_ELEMENT && nodesA[j].nodeId.equals(nodesA[i].nodeId))
!                             {
!
!                                 if (LOG.isTraceEnabled())
!                                     LOG.trace("LEGIT Deleted "+Integer.toString(i) + " " + nodesA[i]);
!
                                  Difference.Delete diff = new Difference.Delete(new NodeProxy(docA, nodesA[i].nodeId));
                                  changes.add(diff);
                                  i = j;
***************
*** 180,193 ****
                              }
                          }
                      }
!                     if (!elementDeleted) {
                          Difference.Delete diff = new Difference.Delete(nodesA[i].nodeType, new NodeProxy(docA, nodesA[i].nodeId));
                          changes.add(diff);
                      }
                  }
              }
              next = next.link;
          }
          for (Iterator i = inserts.values().iterator(); i.hasNext();) {
              changes.add(i.next());
          }
--- 224,245 ----
                              }
                          }
                      }
!
!                     if (!elementDeleted)
!                     {
!
!                         if (LOG.isTraceEnabled())
!                             LOG.trace("Element Not Deleted "+Integer.toString(i) + " " + nodesA[i]);
!
                          Difference.Delete diff = new Difference.Delete(nodesA[i].nodeType, new NodeProxy(docA, nodesA[i].nodeId));
                          changes.add(diff);
                      }
                  }
              }
+
              next = next.link;
          }
+
          for (Iterator i = inserts.values().iterator(); i.hasNext();) {
              changes.add(i.next());
          }

------------------------------------------------------------------------------
Centralized Desktop Delivery: Dell and VMware Reference Architecture
Simplifying enterprise desktop deployment and management using
Dell EqualLogic storage and VMware View: A highly scalable, end-to-end
client virtualization framework. Read more!
http://p.sf.net/sfu/dell-eql-dev2dev

_______________________________________________
Exist-open mailing list
Exist-open@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/exist-open


--
Alistair Miles
Head of Epidemiological Informatics
Centre for Genomics and Global Health <http://cggh.org>
The Wellcome Trust Centre for Human Genetics
Roosevelt Drive
Oxford
OX3 7BN
United Kingdom
Web: http://purl.org/net/aliman
Email: alimanfoo@gmail.com
Tel: +44 (0)1865 287669





--
Adam Retter

eXist Developer
{ United Kingdom }
adam@exist-db.org
irc://irc.freenode.net/existdb




