From: Wolfgang M. M. <wol...@us...> - 2004-07-12 17:18:22
Update of /cvsroot/exist/eXist-1.0/src/org/exist/collections In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv32480/src/org/exist/collections Modified Files: Collection.java Log Message: Various concurrency-related bug fixes: * there has been a conflicting access to the owner property in class DOMFile: in some cases, a second thread set the property to itself while another thread has been in the process of writing data. As the owner object is used to determine the current page in the document, the writing thread used a wrong data page (for a very short period). Thus, one or two document nodes got lost. * a number of small caching problems in dom.dbx led to inconsistencies in the db. Also, queries using string-equality comparisons did not use the cache, so increasing the cache size had no positive effect on query speed. Index: Collection.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/collections/Collection.java,v retrieving revision 1.39 retrieving revision 1.40 diff -C2 -d -r1.39 -r1.40 *** Collection.java 10 Jul 2004 15:14:55 -0000 1.39 --- Collection.java 12 Jul 2004 17:17:40 -0000 1.40 *************** *** 710,715 **** --- 710,717 ---- if (broker.isReadOnly()) throw new PermissionDeniedException("Database is read-only"); + DocumentImpl document, oldDoc = null; XMLReader reader; + InputSource source = new InputSource(new StringReader(data)); oldDoc = getDocument(broker, name); *************** *** 917,925 **** if (hasDocument(name) && (oldDoc ) != null) { - // jmv: Note: this was only in addDocument(DBBroker broker, String name, String data,) - if(oldDoc.isLockedForWrite()) - throw new PermissionDeniedException("Document " + name + - " is locked for write"); - // check if the document is locked by another user User lockUser = oldDoc.getUserLock(); --- 919,922 ---- |
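The change above keeps the two-pass store: the document is parsed once to determine the tree structure and a second time to store it, and because the InputSource wraps a StringReader, the source has to be recreated before the second parse (the diff's "reset the input source" step). A minimal sketch of that pattern using only the JDK SAX API; the handlers below are invented stand-ins, not eXist's Indexer:

import java.io.StringReader;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.DefaultHandler;

public class TwoPassParse {
    public static void main(String[] args) throws Exception {
        String data = "<root><a/><b><c/></b></root>";
        XMLReader reader = SAXParserFactory.newInstance().newSAXParser().getXMLReader();

        // First pass: determine the tree structure (here: just count elements).
        final int[] count = {0};
        reader.setContentHandler(new DefaultHandler() {
            public void startElement(String uri, String local, String qName, Attributes atts) {
                count[0]++;
            }
        });
        reader.parse(new InputSource(new StringReader(data)));

        // A StringReader is consumed by the first parse, so the InputSource
        // must be recreated before the second pass.
        reader.setContentHandler(new DefaultHandler() {
            public void startElement(String uri, String local, String qName, Attributes atts) {
                System.out.println("storing element " + qName);
            }
        });
        reader.parse(new InputSource(new StringReader(data)));

        System.out.println(count[0] + " elements");
    }
}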
From: Wolfgang M. M. <wol...@us...> - 2004-07-12 17:18:20
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv32480/src/org/exist/xquery/value Modified Files: IntegerValue.java Log Message: Various concurrency-related bug fixes: * there has been a conflicting access to the owner property in class DOMFile: in some cases, a second thread set the property to itself while another thread has been in the process of writing data. As the owner object is used to determine the current page in the document, the writing thread used a wrong data page (for a very short period). Thus, one or two document nodes got lost. * a number of small caching problems in dom.dbx led to inconsistencies in the db. Also, queries using string-equality comparisons did not use the cache, so increasing the cache size had no positive effect on query speed. Index: IntegerValue.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value/IntegerValue.java,v retrieving revision 1.6 retrieving revision 1.7 diff -C2 -d -r1.6 -r1.7 *** IntegerValue.java 7 Jul 2004 17:15:46 -0000 1.6 --- IntegerValue.java 12 Jul 2004 17:17:40 -0000 1.7 *************** *** 26,33 **** import org.exist.xquery.XPathException; ! /** [Definition:] integer is ·derived· from decimal by fixing the value of ·fractionDigits· to be 0. * This results in the standard mathematical concept of the integer numbers. ! * The ·value space· of integer is the infinite set {...,-2,-1,0,1,2,...}. ! * The ·base type· of integer is decimal. * cf http://www.w3.org/TR/xmlschema-2/#integer */ --- 26,33 ---- import org.exist.xquery.XPathException; ! /** [Definition:] integer is �derived� from decimal by fixing the value of �fractionDigits� to be 0. * This results in the standard mathematical concept of the integer numbers. ! * The �value space� of integer is the infinite set {...,-2,-1,0,1,2,...}. ! * The �base type� of integer is decimal. * cf http://www.w3.org/TR/xmlschema-2/#integer */ |
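The javadoc quoted above makes the point that matters for the related parser fix: xs:integer has an unbounded value space, so funnelling literals through a Java int (via Integer.parseInt) silently limits it. A tiny illustration of why an arbitrary-precision representation is needed, unrelated to eXist's own classes:

import java.math.BigInteger;

public class IntegerLiteralDemo {
    public static void main(String[] args) {
        String literal = "12345678901234567890";   // larger than Integer.MAX_VALUE

        try {
            Integer.parseInt(literal);              // overflows the int range
        } catch (NumberFormatException e) {
            System.out.println("Integer.parseInt rejects it: " + e.getMessage());
        }

        // xs:integer's value space is infinite, so keep the full precision.
        BigInteger value = new BigInteger(literal);
        System.out.println("BigInteger keeps it: " + value);
    }
}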
From: Wolfgang M. M. <wol...@us...> - 2004-07-12 17:17:54
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/parser In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv32480/src/org/exist/xquery/parser Modified Files: XQueryParser.java XQueryTreeParser.java Log Message: Various concurrency-related bug fixes: * there has been a conflicting access to the owner property in class DOMFile: in some cases, a second thread set the property to itself while another thread has been in the process of writing data. As the owner object is used to determine the current page in the document, the writing thread used a wrong data page (for a very short period). Thus, one or two document nodes got lost. * a number of small caching problems in dom.dbx led to inconsistencies in the db. Also, queries using string-equality comparisons did not use the cache, so increasing the cache size had no positive effect on query speed. Index: XQueryTreeParser.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/parser/XQueryTreeParser.java,v retrieving revision 1.14 retrieving revision 1.15 diff -C2 -d -r1.14 -r1.15 *** XQueryTreeParser.java 6 Jul 2004 16:10:41 -0000 1.14 --- XQueryTreeParser.java 12 Jul 2004 17:17:42 -0000 1.15 *************** *** 4932,4938 **** _t = _t.getNextSibling(); ! // jmv: trouble with bIg integer in XQuery source: step= new LiteralValue(context, new IntegerValue(Integer.parseInt(i.getText()))); ! step= new LiteralValue(context, new IntegerValue( i.getText() )); ! step.setASTNode(i); --- 4932,4936 ---- _t = _t.getNextSibling(); ! step= new LiteralValue(context, new IntegerValue(i.getText())); step.setASTNode(i); Index: XQueryParser.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/parser/XQueryParser.java,v retrieving revision 1.14 retrieving revision 1.15 diff -C2 -d -r1.14 -r1.15 *** XQueryParser.java 29 Jun 2004 14:23:25 -0000 1.14 --- XQueryParser.java 12 Jul 2004 17:17:42 -0000 1.15 *************** *** 1688,1692 **** exprSingle_AST = (org.exist.xquery.parser.XQueryAST)currentAST.root; } ! else if ((_tokenSet_5.member(LA(1)))) { orExpr(); astFactory.addASTChild(currentAST, returnAST); --- 1688,1692 ---- exprSingle_AST = (org.exist.xquery.parser.XQueryAST)currentAST.root; } ! else if ((_tokenSet_0.member(LA(1)))) { orExpr(); astFactory.addASTChild(currentAST, returnAST); *************** *** 2634,2638 **** } } ! else if ((_tokenSet_6.member(LA(1)))) { { { --- 2634,2638 ---- } } ! else if ((_tokenSet_5.member(LA(1)))) { { { *************** *** 2833,2837 **** _loop138: do { ! if ((_tokenSet_7.member(LA(1)))) { { switch ( LA(1)) { --- 2833,2837 ---- _loop138: do { ! if ((_tokenSet_6.member(LA(1)))) { { switch ( LA(1)) { *************** *** 2953,2956 **** --- 2953,2957 ---- case XQUERY: case VERSION: + case LITERAL_declare: case LITERAL_default: case LITERAL_function: *************** *** 3196,3199 **** --- 3197,3201 ---- case XQUERY: case VERSION: + case LITERAL_declare: case LITERAL_default: case LITERAL_function: *************** *** 3398,3402 **** boolean synPredMatched157 = false; ! if (((_tokenSet_8.member(LA(1))))) { int _m157 = mark(); synPredMatched157 = true; --- 3400,3404 ---- boolean synPredMatched157 = false; ! if (((_tokenSet_7.member(LA(1))))) { int _m157 = mark(); synPredMatched157 = true; *************** *** 3463,3467 **** else { boolean synPredMatched160 = false; ! 
if (((_tokenSet_9.member(LA(1))))) { int _m160 = mark(); synPredMatched160 = true; --- 3465,3469 ---- else { boolean synPredMatched160 = false; ! if (((_tokenSet_8.member(LA(1))))) { int _m160 = mark(); synPredMatched160 = true; *************** *** 3523,3527 **** else { boolean synPredMatched163 = false; ! if (((_tokenSet_9.member(LA(1))))) { int _m163 = mark(); synPredMatched163 = true; --- 3525,3529 ---- else { boolean synPredMatched163 = false; ! if (((_tokenSet_8.member(LA(1))))) { int _m163 = mark(); synPredMatched163 = true; *************** *** 3574,3578 **** else { boolean synPredMatched166 = false; ! if (((_tokenSet_9.member(LA(1))))) { int _m166 = mark(); synPredMatched166 = true; --- 3576,3580 ---- else { boolean synPredMatched166 = false; ! if (((_tokenSet_8.member(LA(1))))) { int _m166 = mark(); synPredMatched166 = true; *************** *** 3591,3594 **** --- 3593,3597 ---- case XQUERY: case VERSION: + case LITERAL_declare: case LITERAL_default: case LITERAL_function: *************** *** 3699,3703 **** stepExpr_AST = (org.exist.xquery.parser.XQueryAST)currentAST.root; } ! else if ((_tokenSet_8.member(LA(1)))) { axisStep(); astFactory.addASTChild(currentAST, returnAST); --- 3702,3706 ---- stepExpr_AST = (org.exist.xquery.parser.XQueryAST)currentAST.root; } ! else if ((_tokenSet_7.member(LA(1)))) { axisStep(); astFactory.addASTChild(currentAST, returnAST); *************** *** 3781,3785 **** boolean synPredMatched175 = false; ! if (((_tokenSet_10.member(LA(1))))) { int _m175 = mark(); synPredMatched175 = true; --- 3784,3788 ---- boolean synPredMatched175 = false; ! if (((_tokenSet_9.member(LA(1))))) { int _m175 = mark(); synPredMatched175 = true; *************** *** 3829,3833 **** forwardOrReverseStep_AST = (org.exist.xquery.parser.XQueryAST)currentAST.root; } ! else if ((_tokenSet_8.member(LA(1)))) { abbrevStep(); astFactory.addASTChild(currentAST, returnAST); --- 3832,3836 ---- forwardOrReverseStep_AST = (org.exist.xquery.parser.XQueryAST)currentAST.root; } ! else if ((_tokenSet_7.member(LA(1)))) { abbrevStep(); astFactory.addASTChild(currentAST, returnAST); *************** *** 4008,4012 **** nodeTest_AST = (org.exist.xquery.parser.XQueryAST)currentAST.root; } ! else if ((_tokenSet_11.member(LA(1)))) { nameTest(); astFactory.addASTChild(currentAST, returnAST); --- 4011,4015 ---- nodeTest_AST = (org.exist.xquery.parser.XQueryAST)currentAST.root; } ! else if ((_tokenSet_10.member(LA(1)))) { nameTest(); astFactory.addASTChild(currentAST, returnAST); *************** *** 4097,4100 **** --- 4100,4104 ---- case XQUERY: case VERSION: + case LITERAL_declare: case LITERAL_default: case LITERAL_function: *************** *** 4166,4169 **** --- 4170,4174 ---- case XQUERY: case VERSION: + case LITERAL_declare: case LITERAL_default: case LITERAL_function: *************** *** 4257,4261 **** boolean synPredMatched190 = false; ! if (((_tokenSet_11.member(LA(1))))) { int _m190 = mark(); synPredMatched190 = true; --- 4262,4266 ---- boolean synPredMatched190 = false; ! if (((_tokenSet_10.member(LA(1))))) { int _m190 = mark(); synPredMatched190 = true; *************** *** 4422,4425 **** --- 4427,4431 ---- case XQUERY: case VERSION: + case LITERAL_declare: case LITERAL_default: case LITERAL_function: *************** *** 4551,4555 **** default: boolean synPredMatched198 = false; ! if (((_tokenSet_12.member(LA(1))))) { int _m198 = mark(); synPredMatched198 = true; --- 4557,4561 ---- default: boolean synPredMatched198 = false; ! 
if (((_tokenSet_11.member(LA(1))))) { int _m198 = mark(); synPredMatched198 = true; *************** *** 4611,4615 **** else { boolean synPredMatched201 = false; ! if (((_tokenSet_12.member(LA(1))))) { int _m201 = mark(); synPredMatched201 = true; --- 4617,4621 ---- else { boolean synPredMatched201 = false; ! if (((_tokenSet_11.member(LA(1))))) { int _m201 = mark(); synPredMatched201 = true; *************** *** 4806,4809 **** --- 4812,4816 ---- case XQUERY: case VERSION: + case LITERAL_declare: case LITERAL_default: case LITERAL_function: *************** *** 4936,4939 **** --- 4943,4947 ---- case XQUERY: case VERSION: + case LITERAL_declare: case LITERAL_default: case LITERAL_function: *************** *** 5239,5243 **** qName(); { ! match(_tokenSet_13); } } --- 5247,5251 ---- qName(); { ! match(_tokenSet_12); } } *************** *** 5837,5841 **** _loop259: do { ! if ((_tokenSet_14.member(LA(1)))) { elementContent(); astFactory.addASTChild(currentAST, returnAST); --- 5845,5849 ---- _loop259: do { ! if ((_tokenSet_13.member(LA(1)))) { elementContent(); astFactory.addASTChild(currentAST, returnAST); *************** *** 6739,6742 **** --- 6747,6762 ---- break; } + case LITERAL_declare: + { + org.exist.xquery.parser.XQueryAST tmp421_AST = null; + tmp421_AST = (org.exist.xquery.parser.XQueryAST)astFactory.create(LT(1)); + astFactory.addASTChild(currentAST, tmp421_AST); + match(LITERAL_declare); + if ( inputState.guessing==0 ) { + name = "declare"; + } + reservedKeywords_AST = (org.exist.xquery.parser.XQueryAST)currentAST.root; + break; + } default: { *************** *** 6940,6949 **** public static final BitSet _tokenSet_1 = new BitSet(mk_tokenSet_1()); private static final long[] mk_tokenSet_2() { ! long[] data = { -4057250683051573248L, 8013775106948149189L, 100728768L, 0L, 0L, 0L}; return data; } public static final BitSet _tokenSet_2 = new BitSet(mk_tokenSet_2()); private static final long[] mk_tokenSet_3() { ! long[] data = { -4057250683051573248L, 9166696611554996165L, 100728768L, 0L, 0L, 0L}; return data; } --- 6960,6969 ---- public static final BitSet _tokenSet_1 = new BitSet(mk_tokenSet_1()); private static final long[] mk_tokenSet_2() { ! long[] data = { -4039236284542091264L, 8013775106948149189L, 100728768L, 0L, 0L, 0L}; return data; } public static final BitSet _tokenSet_2 = new BitSet(mk_tokenSet_2()); private static final long[] mk_tokenSet_3() { ! long[] data = { -4039236284542091264L, 9166696611554996165L, 100728768L, 0L, 0L, 0L}; return data; } *************** *** 6955,6998 **** public static final BitSet _tokenSet_4 = new BitSet(mk_tokenSet_4()); private static final long[] mk_tokenSet_5() { ! long[] data = { -3479646438655262720L, -2561859953414155L, 101187571L, 0L, 0L, 0L}; return data; } public static final BitSet _tokenSet_5 = new BitSet(mk_tokenSet_5()); private static final long[] mk_tokenSet_6() { ! long[] data = { 562949953421312L, 1927366574080L, 0L, 0L}; return data; } public static final BitSet _tokenSet_6 = new BitSet(mk_tokenSet_6()); private static final long[] mk_tokenSet_7() { ! long[] data = { 0L, 985162418487312L, 0L, 0L}; return data; } public static final BitSet _tokenSet_7 = new BitSet(mk_tokenSet_7()); private static final long[] mk_tokenSet_8() { ! long[] data = { -4057250683051573248L, 9166696611554996181L, 100728816L, 0L, 0L, 0L}; return data; } public static final BitSet _tokenSet_8 = new BitSet(mk_tokenSet_8()); private static final long[] mk_tokenSet_9() { ! 
long[] data = { -3479646438655262720L, -56675424226037819L, 101187523L, 0L, 0L, 0L}; return data; } public static final BitSet _tokenSet_9 = new BitSet(mk_tokenSet_9()); private static final long[] mk_tokenSet_10() { ! long[] data = { 0L, 288230376151711744L, 4032L, 0L, 0L, 0L}; return data; } public static final BitSet _tokenSet_10 = new BitSet(mk_tokenSet_10()); private static final long[] mk_tokenSet_11() { ! long[] data = { -4057250683051573248L, 8013775106948149205L, 100728768L, 0L, 0L, 0L}; return data; } public static final BitSet _tokenSet_11 = new BitSet(mk_tokenSet_11()); private static final long[] mk_tokenSet_12() { - long[] data = { 288511851128422400L, 6701356245527298048L, 0L, 0L}; - return data; - } - public static final BitSet _tokenSet_12 = new BitSet(mk_tokenSet_12()); - private static final long[] mk_tokenSet_13() { long[] data = new long[8]; data[0]=-16L; --- 6975,7013 ---- public static final BitSet _tokenSet_4 = new BitSet(mk_tokenSet_4()); private static final long[] mk_tokenSet_5() { ! long[] data = { 562949953421312L, 1927366574080L, 0L, 0L}; return data; } public static final BitSet _tokenSet_5 = new BitSet(mk_tokenSet_5()); private static final long[] mk_tokenSet_6() { ! long[] data = { 0L, 985162418487312L, 0L, 0L}; return data; } public static final BitSet _tokenSet_6 = new BitSet(mk_tokenSet_6()); private static final long[] mk_tokenSet_7() { ! long[] data = { -4039236284542091264L, 9166696611554996181L, 100728816L, 0L, 0L, 0L}; return data; } public static final BitSet _tokenSet_7 = new BitSet(mk_tokenSet_7()); private static final long[] mk_tokenSet_8() { ! long[] data = { -3461632040145780736L, -56675424226037819L, 101187523L, 0L, 0L, 0L}; return data; } public static final BitSet _tokenSet_8 = new BitSet(mk_tokenSet_8()); private static final long[] mk_tokenSet_9() { ! long[] data = { 0L, 288230376151711744L, 4032L, 0L, 0L, 0L}; return data; } public static final BitSet _tokenSet_9 = new BitSet(mk_tokenSet_9()); private static final long[] mk_tokenSet_10() { ! long[] data = { -4039236284542091264L, 8013775106948149205L, 100728768L, 0L, 0L, 0L}; return data; } public static final BitSet _tokenSet_10 = new BitSet(mk_tokenSet_10()); private static final long[] mk_tokenSet_11() { ! long[] data = { 288511851128422400L, 6701356245527298048L, 0L, 0L}; return data; } public static final BitSet _tokenSet_11 = new BitSet(mk_tokenSet_11()); private static final long[] mk_tokenSet_12() { long[] data = new long[8]; data[0]=-16L; *************** *** 7001,7010 **** return data; } ! public static final BitSet _tokenSet_13 = new BitSet(mk_tokenSet_13()); ! private static final long[] mk_tokenSet_14() { long[] data = { 1152921504606846976L, 1073741824L, 4194307L, 0L, 0L, 0L}; return data; } ! public static final BitSet _tokenSet_14 = new BitSet(mk_tokenSet_14()); } --- 7016,7025 ---- return data; } ! public static final BitSet _tokenSet_12 = new BitSet(mk_tokenSet_12()); ! private static final long[] mk_tokenSet_13() { long[] data = { 1152921504606846976L, 1073741824L, 4194307L, 0L, 0L, 0L}; return data; } ! public static final BitSet _tokenSet_13 = new BitSet(mk_tokenSet_13()); } |
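Most of the churn in this generated parser is renumbered _tokenSet_N tables: arrays of packed 64-bit words describing which token types may begin a construct, consulted through _tokenSet_N.member(LA(1)). A rough, self-contained sketch of how such a bit-set membership test works (the token codes below are made up, and ANTLR uses its own BitSet class rather than java.util.BitSet):

import java.util.BitSet;

public class TokenSetDemo {
    // Hypothetical token type codes, in the spirit of a generated parser.
    static final int LITERAL_declare  = 5;
    static final int LITERAL_default  = 6;
    static final int LITERAL_function = 7;
    static final int STRING_LITERAL   = 40;

    public static void main(String[] args) {
        // Pack the allowed token types into 64-bit words, as the mk_tokenSet_N() methods do.
        long[] words = new long[2];
        for (int tok : new int[] { LITERAL_declare, LITERAL_default, LITERAL_function, STRING_LITERAL }) {
            words[tok / 64] |= 1L << (tok % 64);
        }
        BitSet tokenSet = BitSet.valueOf(words);

        // member(LA(1)) boils down to a constant-time bit test on the lookahead token.
        int lookahead = LITERAL_declare;
        System.out.println("'declare' can start this rule: " + tokenSet.get(lookahead));
        System.out.println("token 63 can start this rule: " + tokenSet.get(63));
    }
}

Adding LITERAL_declare to the reservedKeywords rule and to these sets is what lets "declare" appear where an ordinary name is expected.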
From: Wolfgang M. M. <wol...@us...> - 2004-07-12 17:17:53
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv32480/src/org/exist/storage Modified Files: NativeBroker.java Log Message: Various concurrency-related bug fixes: * there has been a conflicting access to the owner property in class DOMFile: in some cases, a second thread set the property to itself while another thread has been in the process of writing data. As the owner object is used to determine the current page in the document, the writing thread used a wrong data page (for a very short period). Thus, one or two document nodes got lost. * a number of small caching problems in dom.dbx led to inconsistencies in the db. Also, queries using string-equality comparisons did not use the cache, so increasing the cache size had no positive effect on query speed. Index: NativeBroker.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/NativeBroker.java,v retrieving revision 1.87 retrieving revision 1.88 diff -C2 -d -r1.87 -r1.88 *** NativeBroker.java 5 Jul 2004 20:02:47 -0000 1.87 --- NativeBroker.java 12 Jul 2004 17:17:43 -0000 1.88 *************** *** 896,900 **** * the document if node is null. */ ! public void reindex(DocumentImpl oldDoc, DocumentImpl doc, NodeImpl node) { int idxLevel = doc.reindexRequired(); if (idxLevel < 0) { --- 896,901 ---- * the document if node is null. */ ! public void reindex(final DocumentImpl oldDoc, final DocumentImpl doc, ! final NodeImpl node) { int idxLevel = doc.reindexRequired(); if (idxLevel < 0) { *************** *** 905,942 **** if (node == null) LOG.debug("reindexing level " + idxLevel + " of document " + doc.getDocId()); final long start = System.currentTimeMillis(); ! // remove all old index keys from the btree ! Value ref = new NodeRef(doc.getDocId()); ! final IndexQuery query = new IndexQuery(IndexQuery.TRUNC_RIGHT, ref); ! final Lock lock = domDb.getLock(); ! // try to acquire a lock on the file ! try { ! lock.acquire(Lock.WRITE_LOCK); ! domDb.setOwnerObject(this); ! final ArrayList nodes = domDb.findKeys(query); ! long gid; ! for (Iterator i = nodes.iterator(); i.hasNext();) { ! ref = (Value) i.next(); ! gid = ByteConversion.byteToLong(ref.data(), ref.start() + 4); ! if (oldDoc.getTreeLevel(gid) >= doc.reindexRequired()) { ! if (node != null) { ! if (XMLUtil.isDescendant(oldDoc, node.getGID(), gid)) { ! domDb.removeValue(ref); } ! } else ! domDb.removeValue(ref); } } ! } catch (DBException e) { ! LOG.warn("db error during reindex", e); ! } catch (IOException e) { ! LOG.warn("io error during reindex", e); ! } catch (LockException e) { ! // timed out ! LOG.warn("lock timed out during reindex", e); ! return; ! } finally { ! lock.release(); ! } try { // now reindex the nodes --- 906,940 ---- if (node == null) LOG.debug("reindexing level " + idxLevel + " of document " + doc.getDocId()); + // checkTree(doc); + final long start = System.currentTimeMillis(); ! // remove all old index keys from the btree ! new DOMTransaction(this, domDb, Lock.WRITE_LOCK) { ! public Object start() throws ReadOnlyException { ! try { ! Value ref = new NodeRef(doc.getDocId()); ! IndexQuery query = new IndexQuery(IndexQuery.TRUNC_RIGHT, ref); ! final ArrayList nodes = domDb.findKeys(query); ! long gid; ! for (Iterator i = nodes.iterator(); i.hasNext();) { ! ref = (Value) i.next(); ! gid = ByteConversion.byteToLong(ref.data(), ref.start() + 4); ! if (oldDoc.getTreeLevel(gid) >= doc.reindexRequired()) { ! if (node != null) { ! 
if (XMLUtil.isDescendant(oldDoc, node.getGID(), gid)) { ! domDb.removeValue(ref); ! } ! } else ! domDb.removeValue(ref); } ! } ! } catch (BTreeException e) { ! LOG.debug("Exception while reindexing document: " + e.getMessage(), e); ! } catch (IOException e) { ! LOG.debug("Exception while reindexing document: " + e.getMessage(), e); } + return null; } ! }.run(); try { // now reindex the nodes *************** *** 962,970 **** } catch(Exception e) { LOG.error("Error occured while reindexing document: " + e.getMessage(), e); - LOG.debug(domDb.debugPages(doc)); } elementIndex.reindex(oldDoc, node); textEngine.reindex(oldDoc, node); doc.setReindexRequired(-1); LOG.debug("reindex took " + (System.currentTimeMillis() - start) + "ms."); } --- 960,968 ---- } catch(Exception e) { LOG.error("Error occured while reindexing document: " + e.getMessage(), e); } elementIndex.reindex(oldDoc, node); textEngine.reindex(oldDoc, node); doc.setReindexRequired(-1); + // checkTree(doc); LOG.debug("reindex took " + (System.currentTimeMillis() - start) + "ms."); } *************** *** 1209,1214 **** try { // checkTree(doc); ! final NodeImpl firstChild = (NodeImpl)doc.getFirstChild(); ! // LOG.debug(domDb.debugPages(doc)); // dropping old structure index --- 1207,1213 ---- try { // checkTree(doc); ! ! // remember this for later remove ! final long firstChild = doc.getFirstChildAddress(); // dropping old structure index *************** *** 1222,1225 **** --- 1221,1225 ---- try { domDb.remove(idx, null); + domDb.flush(); } catch (BTreeException e) { LOG.warn("start() - " + "error while removing doc", e); *************** *** 1228,1232 **** } catch (TerminatedException e) { LOG.warn("method terminated", e); ! } return null; } --- 1228,1234 ---- } catch (TerminatedException e) { LOG.warn("method terminated", e); ! } catch (DBException e) { ! LOG.warn("start() - " + "error while removing doc", e); ! } return null; } *************** *** 1253,1260 **** flush(); // remove the old nodes new DOMTransaction(this, domDb) { public Object start() { ! domDb.removeAll(firstChild.getInternalAddress()); return null; } --- 1255,1269 ---- flush(); + // checkTree(tempDoc); + // remove the old nodes new DOMTransaction(this, domDb) { public Object start() { ! domDb.removeAll(firstChild); ! try { ! domDb.flush(); ! } catch (DBException e) { ! LOG.warn("start() - " + "error while removing doc", e); ! } return null; } *************** *** 1262,1266 **** .run(); ! // LOG.debug(domDb.debugPages(tempDoc)); doc.copyChildren(tempDoc); --- 1271,1275 ---- .run(); ! // checkTree(tempDoc); doc.copyChildren(tempDoc); *************** *** 1272,1276 **** LOG.debug("new doc address = " + StorageAddress.toString(doc.getAddress())); closeDocument(); ! saveCollection(doc.getCollection()); LOG.debug("Defragmentation took " + (System.currentTimeMillis() - start) + "ms."); --- 1281,1290 ---- LOG.debug("new doc address = " + StorageAddress.toString(doc.getAddress())); closeDocument(); ! // new DOMTransaction(this, domDb, Lock.READ_LOCK) { ! // public Object start() throws ReadOnlyException { ! // LOG.debug("Pages used: " + domDb.debugPages(doc)); ! // return null; ! // } ! // }.run(); saveCollection(doc.getCollection()); LOG.debug("Defragmentation took " + (System.currentTimeMillis() - start) + "ms."); *************** *** 1321,1326 **** } ! public void checkTree(DocumentImpl doc) { LOG.debug("Checking DOM tree for document " + doc.getFileName()); NodeList nodes = doc.getChildNodes(); NodeImpl n; --- 1335,1347 ---- } ! 
public void checkTree(final DocumentImpl doc) { LOG.debug("Checking DOM tree for document " + doc.getFileName()); + new DOMTransaction(this, domDb, Lock.READ_LOCK) { + public Object start() throws ReadOnlyException { + LOG.debug("Pages used: " + domDb.debugPages(doc)); + return null; + } + }.run(); + NodeList nodes = doc.getChildNodes(); NodeImpl n; *************** *** 1333,1336 **** --- 1354,1372 ---- checkTree(iterator, n); } + NodeRef ref = new NodeRef(doc.getDocId()); + final IndexQuery idx = new IndexQuery(IndexQuery.TRUNC_RIGHT, ref); + new DOMTransaction(this, domDb) { + public Object start() { + try { + domDb.findKeys(idx); + } catch (BTreeException e) { + LOG.warn("start() - " + "error while removing doc", e); + } catch (IOException e) { + LOG.warn("start() - " + "error while removing doc", e); + } + return null; + } + } + .run(); } *************** *** 1523,1529 **** Value val = domDb.get(new NodeProxy((DocumentImpl) doc, gid)); if (val == null) { ! if(LOG.isDebugEnabled()) LOG.debug("node " + gid + " not found in document " + ((DocumentImpl)doc).getDocId()); ! //throw new RuntimeException("node " + gid + " not found"); return null; } --- 1559,1566 ---- Value val = domDb.get(new NodeProxy((DocumentImpl) doc, gid)); if (val == null) { ! if(LOG.isDebugEnabled()) { LOG.debug("node " + gid + " not found in document " + ((DocumentImpl)doc).getDocId()); ! Thread.dumpStack(); ! } return null; } *************** *** 1550,1557 **** Value val = domDb.get(p.getInternalAddress()); if (val == null) { ! LOG.debug("node " + p.gid + " not found in document " + p.doc.getCollection().getName() + '/' + p.doc.getFileName()); Thread.dumpStack(); ! return null; } NodeImpl node = --- 1587,1595 ---- Value val = domDb.get(p.getInternalAddress()); if (val == null) { ! LOG.debug("Node " + p.gid + " not found in document " + p.doc.getCollection().getName() + '/' + p.doc.getFileName()); + LOG.debug(domDb.debugPages(p.doc)); Thread.dumpStack(); ! return objectWith(p.doc, p.gid); } NodeImpl node = *************** *** 2160,2165 **** p = (NodeProxy) i.next(); try { - domDb.setOwnerObject(this); domDb.getLock().acquire(Lock.READ_LOCK); content = domDb.getNodeValue(p); } catch (LockException e) { --- 2198,2203 ---- p = (NodeProxy) i.next(); try { domDb.getLock().acquire(Lock.READ_LOCK); + domDb.setOwnerObject(this); content = domDb.getNodeValue(p); } catch (LockException e) { |
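Two patterns are worth pulling out of this diff. First, domDb.setOwnerObject(this) now runs only after the lock has been acquired, which closes the owner-property race described in the log message. Second, the hand-written acquire/try/finally/release blocks are replaced by DOMTransaction objects whose start() method runs under the lock. A generic sketch of both ideas built on java.util.concurrent; class and method names here are illustrative, not eXist's:

import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class LockedFileDemo {

    static class PagedFile {
        final ReentrantReadWriteLock lock = new ReentrantReadWriteLock();
        private Object owner;                 // must only change while a lock is held

        void setOwnerObject(Object o) { owner = o; }
        String readPage() { return "page data read on behalf of " + owner; }
    }

    /** Template in the spirit of DOMTransaction: subclasses supply start(). */
    static abstract class FileTransaction<T> {
        private final PagedFile file;
        private final boolean write;

        FileTransaction(PagedFile file, boolean write) {
            this.file = file;
            this.write = write;
        }

        abstract T start(PagedFile file);

        T run(Object caller) {
            Lock l = write ? file.lock.writeLock() : file.lock.readLock();
            l.lock();                         // 1. acquire the lock first ...
            try {
                file.setOwnerObject(caller);  // 2. ... only then claim ownership
                return start(file);
            } finally {
                l.unlock();                   // 3. always released, even on error
            }
        }
    }

    public static void main(String[] args) {
        PagedFile domDb = new PagedFile();
        String result = new FileTransaction<String>(domDb, false) {
            String start(PagedFile f) { return f.readPage(); }
        }.run("main");
        System.out.println(result);
    }
}

Centralizing acquire/release in one run() method means every caller gets the same ordering and the same guaranteed unlock.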
From: Wolfgang M. M. <wol...@us...> - 2004-07-12 17:17:52
Update of /cvsroot/exist/eXist-1.0/src/org/dbxml/core/filer In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv32480/src/org/dbxml/core/filer Modified Files: Paged.java Log Message: Various concurrency-related bug fixes: * there has been a conflicting access to the owner property in class DOMFile: in some cases, a second thread set the property to itself while another thread has been in the process of writing data. As the owner object is used to determine the current page in the document, the writing thread used a wrong data page (for a very short period). Thus, one or two document nodes got lost. * a number of small caching problems in dom.dbx led to inconsistencies in the db. Also, queries using string-equality comparisons did not use the cache, so increasing the cache size had no positive effect on query speed. Index: Paged.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/dbxml/core/filer/Paged.java,v retrieving revision 1.23 retrieving revision 1.24 diff -C2 -d -r1.23 -r1.24 *** Paged.java 14 Apr 2004 12:17:23 -0000 1.23 --- Paged.java 12 Jul 2004 17:17:43 -0000 1.24 *************** *** 456,463 **** protected final void writeValue(Page page, Value value) throws IOException { byte[] data = value.getData(); PageHeader hdr = page.getPageHeader(); hdr.dataLen = fileHeader.workSize; ! if (data.length < hdr.dataLen) hdr.dataLen = data.length; page.write(data); } --- 456,468 ---- protected final void writeValue(Page page, Value value) throws IOException { byte[] data = value.getData(); + writeValue(page, data); + } + + protected final void writeValue(Page page, byte[] data) throws IOException { PageHeader hdr = page.getPageHeader(); hdr.dataLen = fileHeader.workSize; ! if (data.length < hdr.dataLen) { hdr.dataLen = data.length; + } page.write(data); } *************** *** 820,824 **** this(); if(pageNum < 0) ! Thread.dumpStack(); setPageNum(pageNum); } --- 825,829 ---- this(); if(pageNum < 0) ! throw new IOException("Illegal page num: " + pageNum); setPageNum(pageNum); } *************** *** 865,882 **** } - /** - * Gets the pageNum attribute of the Page object - * - *@return The pageNum value - */ public long getPageNum() { return pageNum; } - /** - * Gets the refCount attribute of the Page object - * - *@return The refCount value - */ public int getRefCount() { return refCount; --- 870,877 ---- *************** *** 887,913 **** } - /** Description of the Method */ public void incRefCount() { refCount++; } - /** - * Description of the Method - * - *@exception IOException Description of the Exception - */ public byte[] read() throws IOException { try { if (raf.getFilePointer() != offset) { raf.seek(offset); } Arrays.fill(tempHeaderData, (byte)0); raf.read(tempHeaderData); ! // Read in the header header.read(tempHeaderData, 0); // Read the working data final byte[] workData = new byte[header.dataLen]; raf.read(workData); return workData; } catch(Exception e) { --- 882,906 ---- } public void incRefCount() { refCount++; } public byte[] read() throws IOException { try { + // dumpPage(); if (raf.getFilePointer() != offset) { raf.seek(offset); } + Arrays.fill(tempHeaderData, (byte)0); raf.read(tempHeaderData); ! // Read in the header header.read(tempHeaderData, 0); + // Read the working data final byte[] workData = new byte[header.dataLen]; raf.read(workData); + return workData; } catch(Exception e) { *************** *** 916,925 **** } } ! ! /** ! * Sets the pageNum attribute of the Page object ! * ! 
*@param pageNum The new pageNum value ! */ public void setPageNum(long pageNum) { this.pageNum = pageNum; --- 909,913 ---- } } ! public void setPageNum(long pageNum) { this.pageNum = pageNum; *************** *** 930,940 **** write(null); } ! private final void write(byte[] data) throws IOException { ! //System.out.println(getFile().getName() + " writing page " + pageNum); if(data == null) // Removed page: fill with 0 Arrays.fill(tempPageData, (byte)0); ! // Write out the header header.write(tempPageData, 0); --- 918,928 ---- write(null); } ! private final void write(byte[] data) throws IOException { ! if(data == null) // Removed page: fill with 0 Arrays.fill(tempPageData, (byte)0); ! // Write out the header header.write(tempPageData, 0); *************** *** 972,975 **** --- 960,971 ---- return -1; } + + public void dumpPage() throws IOException { + if (raf.getFilePointer() != offset) + raf.seek(offset); + byte[] data = new byte[fileHeader.pageSize]; + raf.read(data); + LOG.debug("Contents of page " + pageNum + ": " + hexDump(data)); + } } *************** *** 1035,1038 **** --- 1031,1044 ---- } + public int write(byte[] data, int offset) throws IOException { + data[offset++] = status; + ByteConversion.intToByte(dataLen, data, offset); + offset += 4; + ByteConversion.longToByte(nextPage, data, offset); + offset += 8; + dirty = false; + return offset; + } + /** * The length of the Data *************** *** 1068,1082 **** dirty = true; } ! ! public int write(byte[] data, int offset) throws IOException { ! data[offset++] = status; ! ByteConversion.intToByte(dataLen, data, offset); ! offset += 4; ! ByteConversion.longToByte(nextPage, data, offset); ! offset += 8; ! dirty = false; ! return offset; } } } --- 1074,1108 ---- dirty = true; } ! ! } ! ! private static String[] hex = {"0", "1", "2", "3", "4", "5", "6", "7", ! "8", "9", "a", "b", "c", "d", "e", "f"}; ! ! public static String hexDump(byte[] data) { ! StringBuffer buf = new StringBuffer(); ! buf.append("\r\n"); ! int columns = 0; ! for(int i = 0; i < data.length; i++, columns++) { ! byteToHex(buf, data[i]); ! if(columns == 16) { ! buf.append("\r\n"); ! columns = 0; ! } else ! buf.append(' '); } + return buf.toString(); } + private static void byteToHex( StringBuffer buf, byte b ) { + int n = b; + if ( n < 0 ) { + n = 256 + n; + } + int d1 = n / 16; + int d2 = n % 16; + buf.append( hex[d1] ); + buf.append( hex[d2] ); + } + } |
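The Paged.java change adds hexDump()/byteToHex() helpers so a page's raw contents can be logged while chasing the cache inconsistencies mentioned in the log message. A trimmed standalone helper in the same spirit (not the exact eXist code):

public class HexDumpDemo {

    /** Render a byte array as rows of 16 two-digit hex values. */
    static String hexDump(byte[] data) {
        StringBuilder buf = new StringBuilder("\r\n");
        for (int i = 0; i < data.length; i++) {
            int n = data[i] & 0xff;                       // avoid negative byte values
            buf.append(Character.forDigit(n >> 4, 16));
            buf.append(Character.forDigit(n & 0xf, 16));
            buf.append((i % 16 == 15) ? "\r\n" : " ");
        }
        return buf.toString();
    }

    public static void main(String[] args) {
        byte[] page = "Hello, page!".getBytes();
        System.out.println(hexDump(page));
    }
}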
From: Wolfgang M. M. <wol...@us...> - 2004-07-12 17:17:52
Update of /cvsroot/exist/eXist-1.0/src/org/exist/util/hashtable In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv32480/src/org/exist/util/hashtable Modified Files: Long2ObjectHashMap.java Log Message: Various concurrency-related bug fixes: * there has been a conflicting access to the owner property in class DOMFile: in some cases, a second thread set the property to itself while another thread has been in the process of writing data. As the owner object is used to determine the current page in the document, the writing thread used a wrong data page (for a very short period). Thus, one or two document nodes got lost. * a number of small caching problems in dom.dbx led to inconsistencies in the db. Also, queries using string-equality comparisons did not use the cache, so increasing the cache size had no positive effect on query speed. Index: Long2ObjectHashMap.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/util/hashtable/Long2ObjectHashMap.java,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** Long2ObjectHashMap.java 13 Dec 2003 12:33:21 -0000 1.2 --- Long2ObjectHashMap.java 12 Jul 2004 17:17:43 -0000 1.3 *************** *** 100,104 **** return null; } ! public Object remove(long key) { int idx = hash(key) % tabSize; --- 100,104 ---- return null; } ! public Object remove(long key) { int idx = hash(key) % tabSize; |
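The hunk above only touches whitespace, but the visible line hash(key) % tabSize shows the map's basic indexing step. For reference, a minimal standalone illustration of turning a long key into a table bucket (this is not eXist's actual hash function):

public class LongHashDemo {

    static final int TAB_SIZE = 31;   // a prime table size, as open hash maps often use

    /** Fold both halves of the long into an int, like Long.hashCode(). */
    static int hash(long key) {
        return (int) (key ^ (key >>> 32));
    }

    static int bucket(long key) {
        // Mask off the sign bit so the modulo result is a valid array index.
        return (hash(key) & 0x7fffffff) % TAB_SIZE;
    }

    public static void main(String[] args) {
        for (long key : new long[] { 42L, -42L, 1234567890123L }) {
            System.out.println(key + " -> bucket " + bucket(key));
        }
    }
}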
From: Wolfgang M. M. <wol...@us...> - 2004-07-12 17:17:50
Update of /cvsroot/exist/eXist-1.0/src/org/exist/storage/cache In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv32480/src/org/exist/storage/cache Modified Files: ClockCache.java Log Message: Various concurrency-related bug fixes: * there has been a conflicting access to the owner property in class DOMFile: in some cases, a second thread set the property to itself while another thread has been in the process of writing data. As the owner object is used to determine the current page in the document, the writing thread used a wrong data page (for a very short period). Thus, one or two document nodes got lost. * a number of small caching problems in dom.dbx led to inconsistencies in the db. Also, queries using string-equality comparisons did not use the cache, so increasing the cache size had no positive effect on query speed. Index: ClockCache.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/storage/cache/ClockCache.java,v retrieving revision 1.17 retrieving revision 1.18 diff -C2 -d -r1.17 -r1.18 *** ClockCache.java 21 Jun 2004 15:27:36 -0000 1.17 --- ClockCache.java 12 Jul 2004 17:17:41 -0000 1.18 *************** *** 99,103 **** old.sync(); } - // System.out.println(old.getKey() + " -> " + item.getKey()); items[bucket] = item; map.put(item.getKey(), item); --- 99,102 ---- *************** *** 111,114 **** --- 110,114 ---- Cacheable item = (Cacheable) map.get(key); if (item == null) { + // LOG.debug("Page " + key + " not found in cache"); fails++; } else *************** *** 132,135 **** --- 132,136 ---- if (cacheable == null) return; + // LOG.debug("Removing from cache: " + key); for (int i = 0; i < count; i++) { if (items[i] != null && items[i].getKey() == key) { *************** *** 194,197 **** public void setFileName(String fileName) { } - } --- 195,197 ---- |
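ClockCache is, as the name and the refCount handling above suggest, a clock (second-chance) page cache: each hit bumps a counter, and the eviction hand sweeps the slots, decrementing counters until it finds a victim at zero. A compact generic sketch of that policy, my own simplification rather than the eXist implementation:

import java.util.HashMap;
import java.util.Map;

public class ClockCacheDemo<K, V> {
    private final Object[] keys;
    private final Object[] values;
    private final int[] refCount;           // "second chance" counters
    private final Map<K, Integer> index = new HashMap<>();
    private int hand;                       // the clock hand

    public ClockCacheDemo(int size) {
        keys = new Object[size];
        values = new Object[size];
        refCount = new int[size];
    }

    @SuppressWarnings("unchecked")
    public V get(K key) {
        Integer slot = index.get(key);
        if (slot == null) return null;      // a cache miss ("fails++" in ClockCache)
        refCount[slot]++;                   // reward recent use
        return (V) values[slot];
    }

    public void add(K key, V value) {
        // Sweep forward, decrementing counters, until a slot with refCount == 0
        // is found; that slot is reused for the new entry.
        while (refCount[hand] > 0) {
            refCount[hand]--;
            hand = (hand + 1) % keys.length;
        }
        if (keys[hand] != null) index.remove(keys[hand]);
        keys[hand] = key;
        values[hand] = value;
        refCount[hand] = 1;
        index.put(key, hand);
        hand = (hand + 1) % keys.length;
    }

    public static void main(String[] args) {
        ClockCacheDemo<Long, String> cache = new ClockCacheDemo<>(2);
        cache.add(1L, "page 1");
        cache.add(2L, "page 2");
        cache.get(1L);                      // give page 1 a second chance
        cache.add(3L, "page 3");            // evicts page 2, not page 1
        System.out.println(cache.get(1L) + " / " + cache.get(2L) + " / " + cache.get(3L));
    }
}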
From: Wolfgang M. M. <wol...@us...> - 2004-07-12 16:58:04
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xmlrpc In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv28627/src/org/exist/xmlrpc Modified Files: RpcServer.java RpcAPI.java Log Message: Modified the query method in the XML-RPC interface to take the query as byte[] instead of a string. Index: RpcServer.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xmlrpc/RpcServer.java,v retrieving revision 1.40 retrieving revision 1.41 diff -C2 -d -r1.40 -r1.41 *** RpcServer.java 5 Jul 2004 20:02:48 -0000 1.40 --- RpcServer.java 12 Jul 2004 16:57:55 -0000 1.41 *************** *** 33,38 **** import java.util.Stack; import java.util.Vector; - import java.util.zip.GZIPInputStream; - import java.util.zip.GZIPOutputStream; import java.util.zip.ZipEntry; import java.util.zip.ZipInputStream; --- 33,36 ---- *************** *** 819,823 **** * */ ! public String query(User user, byte[] xquery, int howmany, int start, Hashtable parameters) throws EXistException, PermissionDeniedException { --- 817,821 ---- * */ ! public byte[] query(User user, byte[] xquery, int howmany, int start, Hashtable parameters) throws EXistException, PermissionDeniedException { *************** *** 833,837 **** try { result = con.query(user, xqueryStr, howmany, start, parameters); ! return result; } catch (Exception e) { handleException(e); --- 831,835 ---- try { result = con.query(user, xqueryStr, howmany, start, parameters); ! return result.getBytes("UTF-8"); } catch (Exception e) { handleException(e); Index: RpcAPI.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xmlrpc/RpcAPI.java,v retrieving revision 1.31 retrieving revision 1.32 diff -C2 -d -r1.31 -r1.32 *** RpcAPI.java 5 Jul 2004 20:02:48 -0000 1.31 --- RpcAPI.java 12 Jul 2004 16:57:55 -0000 1.32 *************** *** 280,284 **** * executeQuery() instead */ ! String query( User user, byte[] xquery, --- 280,284 ---- * executeQuery() instead */ ! byte[] query( User user, byte[] xquery, |
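The interface change ships the query, and now also the result, as UTF-8 byte arrays rather than XML-RPC strings, which keeps the transport from mangling non-ASCII characters. The essential round trip, independent of eXist and XML-RPC (the sample query string is invented):

import java.io.UnsupportedEncodingException;

public class Utf8RoundTrip {
    public static void main(String[] args) throws UnsupportedEncodingException {
        String xquery = "//règle[@prix > 10]";          // non-ASCII survives intact

        // Caller side: ship the query as bytes, as RpcAPI.query(User, byte[], ...) expects.
        byte[] wire = xquery.getBytes("UTF-8");

        // Server side: decode, run the query, and return the result the same way,
        // mirroring result.getBytes("UTF-8") in the diff above.
        String received = new String(wire, "UTF-8");
        byte[] result = ("<result>" + received + "</result>").getBytes("UTF-8");

        System.out.println(new String(result, "UTF-8"));
    }
}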
From: Jean-Marc V. <jm...@us...> - 2004-07-10 15:15:05
Update of /cvsroot/exist/eXist-1.0/src/org/exist/collections In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv7469/src/org/exist/collections Modified Files: Collection.java Log Message: Refactoring: extract method: private DocumentImpl determineTreeStructure(DBBroker broker, String name, DocumentImpl document, DocumentImpl oldDoc, XMLReader reader, InputSource source) Index: Collection.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/collections/Collection.java,v retrieving revision 1.38 retrieving revision 1.39 diff -C2 -d -r1.38 -r1.39 *** Collection.java 10 Jul 2004 13:57:15 -0000 1.38 --- Collection.java 10 Jul 2004 15:14:55 -0000 1.39 *************** *** 713,721 **** XMLReader reader; InputSource source = new InputSource(new StringReader(data)); try { ! oldDoc = checkPermissions(broker, name); - document = new DocumentImpl(broker, name, this); manageDocumentInformation(broker, name, oldDoc, document ); --- 713,775 ---- XMLReader reader; InputSource source = new InputSource(new StringReader(data)); + oldDoc = getDocument(broker, name); + document = new DocumentImpl(broker, name, this); + reader = getReader(broker); + // first pass: parse the document to determine tree structure + document = determineTreeStructure(broker, name, document, oldDoc, reader, source); + + // reset the input source + source = new InputSource(new StringReader(data)); + + // second pass: store the document + LOG.debug("storing document " + document.getDocId() + " ..."); try { ! try { ! reader.parse(source); ! } catch (IOException e) { ! throw new EXistException(e); ! } ! ! if(oldDoc == null) ! addDocument(broker, document); ! // broker.checkTree(document); ! broker.addDocument(this, document); ! broker.closeDocument(); ! broker.flush(); ! LOG.debug("document stored."); ! // if we are running in privileged mode (e.g. backup/restore) ! // notify the SecurityManager about changes ! if (getName().equals(SecurityManager.SYSTEM) && document.getFileName().equals(SecurityManager.ACL_FILE) ! && privileged == false) { ! // inform the security manager that system data has changed ! LOG.debug("users.xml changed"); ! broker.getBrokerPool().reloadSecurityManager(broker); ! } ! } finally { ! document.getUpdateLock().release(Lock.WRITE_LOCK); ! } ! broker.deleteObservers(); ! return document; ! } ! ! /** ! * @param broker ! * @param name ! * @param document ! * @param oldDoc ! * @param reader ! * @param source ! * @return ! * @throws LockException ! * @throws EXistException ! * @throws SAXException ! * @throws PermissionDeniedException ! * @throws TriggerException ! */ ! private DocumentImpl determineTreeStructure(DBBroker broker, String name, DocumentImpl document, DocumentImpl oldDoc, XMLReader reader, InputSource source) throws LockException, EXistException, SAXException, PermissionDeniedException, TriggerException { ! try { ! checkPermissions(broker, name, oldDoc); manageDocumentInformation(broker, name, oldDoc, document ); *************** *** 725,729 **** addObserversToIndexer(broker, indexer); ! reader = prepareSAXParser(broker, name, oldDoc, trigger, indexer); // first pass: parse the document to determine tree structure --- 779,783 ---- addObserversToIndexer(broker, indexer); ! 
prepareSAXParser(broker, name, oldDoc, trigger, indexer, reader ); // first pass: parse the document to determine tree structure *************** *** 770,805 **** lock.release(); } - - // reset the input source - source = new InputSource(new StringReader(data)); - - // second pass: store the document - LOG.debug("storing document " + document.getDocId() + " ..."); - try { - try { - reader.parse(source); - } catch (IOException e) { - throw new EXistException(e); - } - - if(oldDoc == null) - addDocument(broker, document); - // broker.checkTree(document); - broker.addDocument(this, document); - broker.closeDocument(); - broker.flush(); - LOG.debug("document stored."); - // if we are running in privileged mode (e.g. backup/restore) - // notify the SecurityManager about changes - if (getName().equals(SecurityManager.SYSTEM) && document.getFileName().equals(SecurityManager.ACL_FILE) - && privileged == false) { - // inform the security manager that system data has changed - LOG.debug("users.xml changed"); - broker.getBrokerPool().reloadSecurityManager(broker); - } - } finally { - document.getUpdateLock().release(Lock.WRITE_LOCK); - } - broker.deleteObservers(); return document; } --- 824,827 ---- *************** *** 811,815 **** * @param trigger * @param indexer ! * @return * @throws EXistException * @throws SAXException --- 833,837 ---- * @param trigger * @param indexer ! * @param reader the real source of the XML data * @throws EXistException * @throws SAXException *************** *** 818,825 **** * @throws TriggerException */ ! private XMLReader prepareSAXParser(DBBroker broker, String name, DocumentImpl oldDoc, Trigger trigger, Indexer indexer) throws EXistException, SAXException, SAXNotRecognizedException, SAXNotSupportedException, TriggerException { ! XMLReader reader; indexer.setValidating(true); ! reader = getReader(broker); reader.setEntityResolver(this); --- 840,848 ---- * @throws TriggerException */ ! private void prepareSAXParser(DBBroker broker, String name, DocumentImpl oldDoc, ! Trigger trigger, Indexer indexer, XMLReader reader) throws EXistException, SAXException, SAXNotRecognizedException, SAXNotSupportedException, TriggerException { ! //XMLReader reader; indexer.setValidating(true); ! // reader = getReader(broker); reader.setEntityResolver(this); *************** *** 844,848 **** } reader.setErrorHandler(indexer); ! return reader; } --- 867,871 ---- } reader.setErrorHandler(indexer); ! //return reader; } *************** *** 886,897 **** * @param broker * @param name ! * @return * @throws LockException * @throws PermissionDeniedException */ ! private DocumentImpl checkPermissions(DBBroker broker, String name) throws LockException, PermissionDeniedException { ! DocumentImpl oldDoc = null; lock.acquire(Lock.WRITE_LOCK); ! if (hasDocument(name) && (oldDoc = getDocument(broker, name)) != null) { // jmv: Note: this was only in addDocument(DBBroker broker, String name, String data,) --- 909,919 ---- * @param broker * @param name ! * @param oldDoc old Document existing in database prior to adding a new one with same name. * @throws LockException * @throws PermissionDeniedException */ ! private void checkPermissions(DBBroker broker, String name, DocumentImpl oldDoc) throws LockException, PermissionDeniedException { lock.acquire(Lock.WRITE_LOCK); ! 
if (hasDocument(name) && (oldDoc ) != null) { // jmv: Note: this was only in addDocument(DBBroker broker, String name, String data,) *************** *** 924,928 **** throw new PermissionDeniedException( "Not allowed to write to collection " + getName()); - return oldDoc; } --- 946,949 ---- *************** *** 941,1000 **** DocumentImpl document = null, oldDoc = null; XMLReader reader; ! try { ! oldDoc = checkPermissions(broker, name); ! ! document = new DocumentImpl(broker, name, this); ! manageDocumentInformation(broker, name, oldDoc, document ); ! ! Trigger trigger = setupTriggers(broker, name, oldDoc); ! Indexer indexer = new Indexer(broker); ! indexer.setDocument(document); ! ! addObserversToIndexer(broker, indexer); ! ! reader = prepareSAXParser(broker, name, oldDoc, trigger, indexer); ! ! // first pass: parse the document to determine tree structure ! LOG.debug("validating document " + name); ! try { ! reader.parse(source); ! } catch (IOException e) { ! throw new EXistException(e); ! } ! document.setMaxDepth(document.getMaxDepth() + 1);//ddddddddddddddddddddddddddddddd ! document.calculateTreeLevelStartPoints(); ! // new document is valid: remove old document ! if (oldDoc != null) { ! LOG.debug("removing old document " + oldDoc.getFileName()); ! if (oldDoc.getResourceType() == DocumentImpl.BINARY_FILE) ! broker.removeBinaryResource((BinaryDocument) oldDoc); ! else ! broker.removeDocument(getName() + '/' + oldDoc.getFileName(), false); ! oldDoc.copyOf(document); ! indexer.setDocumentObject(oldDoc); ! document = oldDoc; ! } else { ! document.getUpdateLock().acquire(Lock.WRITE_LOCK); ! document.setDocId(broker.getNextDocId(this)); ! } ! indexer.setValidating(false); ! if (trigger != null) ! trigger.setValidating(false); ! } catch(EXistException e) { ! if(oldDoc != null) oldDoc.getUpdateLock().release(Lock.WRITE_LOCK); ! throw e; ! } catch(SAXException e) { ! if(oldDoc != null) oldDoc.getUpdateLock().release(Lock.WRITE_LOCK); ! throw e; ! } catch(PermissionDeniedException e) { ! if(oldDoc != null) oldDoc.getUpdateLock().release(Lock.WRITE_LOCK); ! throw e; ! } catch(TriggerException e) { ! if(oldDoc != null) oldDoc.getUpdateLock().release(Lock.WRITE_LOCK); ! throw e; ! } finally { ! lock.release(); ! }//ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff // reset the input source --- 962,971 ---- DocumentImpl document = null, oldDoc = null; XMLReader reader; ! oldDoc = getDocument(broker, name); ! document = new DocumentImpl(broker, name, this); ! reader = getReader(broker); ! // first pass: parse the document to determine tree structure ! document = determineTreeStructure(broker, name, document, oldDoc, reader, source); // reset the input source *************** *** 1056,1064 **** DocumentImpl document, oldDoc = null; DOMStreamer streamer; try { - oldDoc = checkPermissions(broker, name); - - document = new DocumentImpl(broker, name, this); manageDocumentInformation(broker, name, oldDoc, document ); --- 1027,1036 ---- DocumentImpl document, oldDoc = null; DOMStreamer streamer; + oldDoc = getDocument(broker, name); + document = new DocumentImpl(broker, name, this); + try { + checkPermissions(broker, name, oldDoc); manageDocumentInformation(broker, name, oldDoc, document ); *************** *** 1180,1188 **** throw new PermissionDeniedException("Database is read-only"); BinaryDocument blob = null; try { ! ! DocumentImpl oldDoc = checkPermissions(broker, name); ! ! 
blob = new BinaryDocument(broker, name, this); manageDocumentInformation(broker, name, oldDoc, blob ); --- 1152,1159 ---- throw new PermissionDeniedException("Database is read-only"); BinaryDocument blob = null; + DocumentImpl oldDoc = getDocument(broker, name); + blob = new BinaryDocument(broker, name, this); try { ! checkPermissions(broker, name, oldDoc); manageDocumentInformation(broker, name, oldDoc, blob ); |
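The refactoring extracts the first-pass logic shared by the different addDocument variants into a single private determineTreeStructure(...) helper. A toy example of the same extract-method move, with invented names, showing why the duplication is worth removing:

public class ExtractMethodDemo {

    // Before: store(String) and store(byte[]) each repeated the same
    // validate-then-prepare sequence inline.  After: the shared steps live
    // in one private helper, so a fix there applies to every caller.

    public void store(String data) {
        prepareForStorage(data.length());
        System.out.println("storing text document");
    }

    public void store(byte[] data) {
        prepareForStorage(data.length);
        System.out.println("storing binary document");
    }

    /** Extracted helper: the single place where the shared rules change. */
    private void prepareForStorage(int size) {
        if (size == 0) {
            throw new IllegalArgumentException("empty document");
        }
        System.out.println("validated " + size + " bytes, structure determined");
    }

    public static void main(String[] args) {
        ExtractMethodDemo demo = new ExtractMethodDemo();
        demo.store("<root/>");
        demo.store(new byte[] { 1, 2, 3 });
    }
}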
From: Jean-Marc V. <jm...@us...> - 2004-07-10 13:57:31
Update of /cvsroot/exist/eXist-1.0/src/org/exist/collections In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv26941/src/org/exist/collections Modified Files: Collection.java Log Message: Refactoring: extract method: private XMLReader prepareSAXParser(DBBroker broker, String name, DocumentImpl oldDoc, Trigger trigger, Indexer indexer) Index: Collection.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/collections/Collection.java,v retrieving revision 1.37 retrieving revision 1.38 diff -C2 -d -r1.37 -r1.38 *** Collection.java 10 Jul 2004 11:01:09 -0000 1.37 --- Collection.java 10 Jul 2004 13:57:15 -0000 1.38 *************** *** 712,716 **** DocumentImpl document, oldDoc = null; XMLReader reader; ! InputSource source; try { oldDoc = checkPermissions(broker, name); --- 712,717 ---- DocumentImpl document, oldDoc = null; XMLReader reader; ! InputSource source = new InputSource(new StringReader(data)); ! try { oldDoc = checkPermissions(broker, name); *************** *** 719,765 **** manageDocumentInformation(broker, name, oldDoc, document ); - // setup triggers Trigger trigger = setupTriggers(broker, name, oldDoc); Indexer indexer = new Indexer(broker); indexer.setDocument(document); ! // add observers to the indexer ! Observer observer; ! broker.deleteObservers(); ! if (observers != null) { ! for (Iterator i = observers.iterator(); i.hasNext(); ) { ! observer = (Observer) i.next(); ! indexer.addObserver(observer); ! broker.addObserver(observer); ! } ! } ! // prepare the SAX parser ! indexer.setValidating(true); ! reader = getReader(broker); ! reader.setEntityResolver(this); ! ! if (trigger != null && triggersEnabled) { ! reader.setContentHandler(trigger.getInputHandler()); ! reader.setProperty( ! "http://xml.org/sax/properties/lexical-handler", ! trigger.getLexicalInputHandler()); ! trigger.setOutputHandler(indexer); ! trigger.setValidating(true); ! // prepare the trigger ! trigger.prepare(oldDoc == null ! ? Trigger.STORE_DOCUMENT_EVENT ! : Trigger.UPDATE_DOCUMENT_EVENT, broker, name, oldDoc); ! } else { ! reader.setContentHandler(indexer); ! reader ! .setProperty( ! "http://xml.org/sax/properties/lexical-handler", ! indexer); ! } ! reader.setErrorHandler(indexer); // first pass: parse the document to determine tree structure LOG.debug("validating document " + name); - source = new InputSource(new StringReader(data)); try { reader.parse(source); --- 720,732 ---- manageDocumentInformation(broker, name, oldDoc, document ); Trigger trigger = setupTriggers(broker, name, oldDoc); Indexer indexer = new Indexer(broker); indexer.setDocument(document); ! addObserversToIndexer(broker, indexer); ! 
reader = prepareSAXParser(broker, name, oldDoc, trigger, indexer); // first pass: parse the document to determine tree structure LOG.debug("validating document " + name); try { reader.parse(source); *************** *** 803,806 **** --- 770,774 ---- lock.release(); } + // reset the input source source = new InputSource(new StringReader(data)); *************** *** 837,840 **** --- 805,866 ---- } + /** prepare the SAX parser + * @param broker + * @param name + * @param oldDoc + * @param trigger + * @param indexer + * @return + * @throws EXistException + * @throws SAXException + * @throws SAXNotRecognizedException + * @throws SAXNotSupportedException + * @throws TriggerException + */ + private XMLReader prepareSAXParser(DBBroker broker, String name, DocumentImpl oldDoc, Trigger trigger, Indexer indexer) throws EXistException, SAXException, SAXNotRecognizedException, SAXNotSupportedException, TriggerException { + XMLReader reader; + indexer.setValidating(true); + reader = getReader(broker); + reader.setEntityResolver(this); + + if (trigger != null && triggersEnabled) { + reader.setContentHandler(trigger.getInputHandler()); + reader.setProperty( + "http://xml.org/sax/properties/lexical-handler", + trigger.getLexicalInputHandler()); + trigger.setOutputHandler(indexer); + trigger.setLexicalOutputHandler(indexer); + trigger.setValidating(true); + // prepare the trigger + trigger.prepare(oldDoc == null + ? Trigger.STORE_DOCUMENT_EVENT + : Trigger.UPDATE_DOCUMENT_EVENT, broker, name, oldDoc); + } else { + reader.setContentHandler(indexer); + reader + .setProperty( + "http://xml.org/sax/properties/lexical-handler", + indexer); + } + reader.setErrorHandler(indexer); + return reader; + } + + /** add observers to the indexer + * @param broker + * @param indexer + */ + private void addObserversToIndexer(DBBroker broker, Indexer indexer) { + Observer observer; + broker.deleteObservers(); + if (observers != null) { + for (Iterator i = observers.iterator(); i.hasNext(); ) { + observer = (Observer) i.next(); + indexer.addObserver(observer); + broker.addObserver(observer); + } + } + } + /** If an old document exists, keep information about the document. * @param broker *************** *** 921,964 **** manageDocumentInformation(broker, name, oldDoc, document ); - // setup triggers Trigger trigger = setupTriggers(broker, name, oldDoc); ! Indexer parser = new Indexer(broker); ! parser.setDocument(document); ! // add observers to the indexer ! Observer observer; ! broker.deleteObservers(); ! if (observers != null) { ! for (Iterator i = observers.iterator(); i.hasNext(); ) { ! observer = (Observer) i.next(); ! parser.addObserver(observer); ! broker.addObserver(observer); ! } ! } ! // prepare the SAX parser ! parser.setValidating(true); ! reader = getReader(broker); ! reader.setEntityResolver(this); ! if (trigger != null && triggersEnabled) { ! reader.setContentHandler(trigger.getInputHandler()); ! reader.setProperty( ! "http://xml.org/sax/properties/lexical-handler", ! trigger.getLexicalInputHandler()); ! trigger.setOutputHandler(parser); ! trigger.setLexicalOutputHandler(parser); ! trigger.setValidating(true); ! // prepare the trigger ! trigger.prepare(oldDoc == null ! ? Trigger.STORE_DOCUMENT_EVENT ! : Trigger.UPDATE_DOCUMENT_EVENT, broker, name, oldDoc); ! } else { ! reader.setContentHandler(parser); ! reader ! .setProperty( ! "http://xml.org/sax/properties/lexical-handler", ! parser); ! } ! 
reader.setErrorHandler(parser); // first pass: parse the document to determine tree structure --- 947,957 ---- manageDocumentInformation(broker, name, oldDoc, document ); Trigger trigger = setupTriggers(broker, name, oldDoc); ! Indexer indexer = new Indexer(broker); ! indexer.setDocument(document); ! addObserversToIndexer(broker, indexer); ! reader = prepareSAXParser(broker, name, oldDoc, trigger, indexer); // first pass: parse the document to determine tree structure *************** *** 979,983 **** broker.removeDocument(getName() + '/' + oldDoc.getFileName(), false); oldDoc.copyOf(document); ! parser.setDocumentObject(oldDoc); document = oldDoc; } else { --- 972,976 ---- broker.removeDocument(getName() + '/' + oldDoc.getFileName(), false); oldDoc.copyOf(document); ! indexer.setDocumentObject(oldDoc); document = oldDoc; } else { *************** *** 986,990 **** } ! parser.setValidating(false); if (trigger != null) trigger.setValidating(false); --- 979,983 ---- } ! indexer.setValidating(false); if (trigger != null) trigger.setValidating(false); *************** *** 1058,1062 **** boolean privileged) throws EXistException, LockException, PermissionDeniedException, TriggerException, SAXException { ! Indexer parser = new Indexer(broker); if (broker.isReadOnly()) throw new PermissionDeniedException("Database is read-only"); --- 1051,1055 ---- boolean privileged) throws EXistException, LockException, PermissionDeniedException, TriggerException, SAXException { ! Indexer indexer = new Indexer(broker); if (broker.isReadOnly()) throw new PermissionDeniedException("Database is read-only"); *************** *** 1071,1092 **** Trigger trigger = setupTriggers(broker, name, oldDoc); ! parser.setDocument(document); ! // add observers to the indexer ! Observer observer; ! broker.deleteObservers(); ! if (observers != null) { ! for (Iterator i = observers.iterator(); i.hasNext(); ) { ! observer = (Observer) i.next(); ! parser.addObserver(observer); ! broker.addObserver(observer); ! } ! } ! parser.setValidating(true); streamer = new DOMStreamer(); if (trigger != null && triggersEnabled) { streamer.setContentHandler(trigger.getInputHandler()); streamer.setLexicalHandler(trigger.getLexicalInputHandler()); ! trigger.setOutputHandler(parser); trigger.setValidating(true); // prepare the trigger --- 1064,1076 ---- Trigger trigger = setupTriggers(broker, name, oldDoc); ! indexer.setDocument(document); ! addObserversToIndexer(broker, indexer); ! indexer.setValidating(true); streamer = new DOMStreamer(); if (trigger != null && triggersEnabled) { streamer.setContentHandler(trigger.getInputHandler()); streamer.setLexicalHandler(trigger.getLexicalInputHandler()); ! trigger.setOutputHandler(indexer); trigger.setValidating(true); // prepare the trigger *************** *** 1095,1100 **** : Trigger.UPDATE_DOCUMENT_EVENT, broker, name, oldDoc); } else { ! streamer.setContentHandler(parser); ! streamer.setLexicalHandler(parser); } --- 1079,1084 ---- : Trigger.UPDATE_DOCUMENT_EVENT, broker, name, oldDoc); } else { ! streamer.setContentHandler(indexer); ! streamer.setLexicalHandler(indexer); } *************** *** 1112,1116 **** broker.removeDocument(getName() + '/' + oldDoc.getFileName(), false); oldDoc.copyOf(document); ! parser.setDocumentObject(oldDoc); document = oldDoc; } else { --- 1096,1100 ---- broker.removeDocument(getName() + '/' + oldDoc.getFileName(), false); oldDoc.copyOf(document); ! indexer.setDocumentObject(oldDoc); document = oldDoc; } else { *************** *** 1119,1123 **** } ! 
parser.setValidating(false); if (trigger != null) trigger.setValidating(false); --- 1103,1107 ---- } ! indexer.setValidating(false); if (trigger != null) trigger.setValidating(false); *************** *** 1171,1175 **** */ private Trigger setupTriggers(DBBroker broker, String name, DocumentImpl oldDoc) { - // setup triggers Trigger trigger = null; if (triggersEnabled && !name.equals(COLLECTION_CONFIG_FILE)) { --- 1155,1158 ---- |
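For readers who want to see, outside of eXist, the SAX wiring that the extracted prepareSAXParser() method now encapsulates, a minimal sketch using only the standard JAXP/SAX API follows. The class name SaxWiringSketch and the use of a single DefaultHandler2 in place of eXist's trigger and indexer handlers are illustrative assumptions; getReader(broker), Trigger and Indexer are eXist-specific and are not reproduced here.

    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.XMLReader;
    import org.xml.sax.ext.DefaultHandler2;

    // Sketch only: standard SAX wiring corresponding to what prepareSAXParser()
    // does with eXist's trigger/indexer handlers (replaced here by one handler).
    public class SaxWiringSketch {
        public static XMLReader prepare(DefaultHandler2 handler) throws Exception {
            SAXParserFactory factory = SAXParserFactory.newInstance();
            factory.setNamespaceAware(true);
            XMLReader reader = factory.newSAXParser().getXMLReader();
            // content events and lexical events (comments, CDATA sections) are
            // sent to the same object, as they are for the Indexer in Collection.java
            reader.setContentHandler(handler);
            reader.setProperty("http://xml.org/sax/properties/lexical-handler", handler);
            reader.setErrorHandler(handler);
            return reader;
        }
    }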
From: Jean-Marc V. <jm...@us...> - 2004-07-10 11:01:23
Update of /cvsroot/exist/eXist-1.0/src/org/exist/collections In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv32538/src/org/exist/collections Modified Files: Collection.java Log Message: Refactoring: extract method (by eclipse): private Trigger setupTriggers(DBBroker broker, String name, DocumentImpl oldDoc) { Index: Collection.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/collections/Collection.java,v retrieving revision 1.36 retrieving revision 1.37 diff -C2 -d -r1.36 -r1.37 *** Collection.java 9 Jul 2004 17:27:20 -0000 1.36 --- Collection.java 10 Jul 2004 11:01:09 -0000 1.37 *************** *** 720,739 **** // setup triggers ! Trigger trigger = null; ! if (triggersEnabled && !name.equals(COLLECTION_CONFIG_FILE)) { ! if (triggersEnabled) { ! CollectionConfiguration config = getConfiguration(broker); ! if (config != null) { ! if (oldDoc == null) ! trigger = config ! .getTrigger(Trigger.STORE_DOCUMENT_EVENT); ! else ! trigger = config ! .getTrigger(Trigger.UPDATE_DOCUMENT_EVENT); ! } ! } ! } else ! // set configuration to null if we are updating collection.xconf ! configuration = null; Indexer indexer = new Indexer(broker); indexer.setDocument(document); --- 720,724 ---- // setup triggers ! Trigger trigger = setupTriggers(broker, name, oldDoc); Indexer indexer = new Indexer(broker); indexer.setDocument(document); *************** *** 937,956 **** // setup triggers ! Trigger trigger = null; ! if (triggersEnabled && !name.equals(COLLECTION_CONFIG_FILE)) { ! if (triggersEnabled) { ! CollectionConfiguration config = getConfiguration(broker); ! if (config != null) { ! if (oldDoc == null) ! trigger = config ! .getTrigger(Trigger.STORE_DOCUMENT_EVENT); ! else ! trigger = config ! .getTrigger(Trigger.UPDATE_DOCUMENT_EVENT); ! } ! } ! } else ! // set configuration to null if we are updating collection.xconf ! configuration = null; Indexer parser = new Indexer(broker); parser.setDocument(document); --- 922,926 ---- // setup triggers ! Trigger trigger = setupTriggers(broker, name, oldDoc); Indexer parser = new Indexer(broker); parser.setDocument(document); *************** *** 1033,1037 **** } finally { lock.release(); ! } // reset the input source --- 1003,1007 ---- } finally { lock.release(); ! }//ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff // reset the input source *************** *** 1100,1120 **** manageDocumentInformation(broker, name, oldDoc, document ); ! // setup triggers ! Trigger trigger = null; ! if (triggersEnabled && !name.equals(COLLECTION_CONFIG_FILE)) { ! if (triggersEnabled) { ! CollectionConfiguration config = getConfiguration(broker); ! if (config != null) { ! if (oldDoc == null) ! trigger = config ! .getTrigger(Trigger.STORE_DOCUMENT_EVENT); ! else ! trigger = config ! .getTrigger(Trigger.UPDATE_DOCUMENT_EVENT); ! } ! } ! } else ! // set configuration to null if we are updating collection.xconf ! configuration = null; parser.setDocument(document); --- 1070,1074 ---- manageDocumentInformation(broker, name, oldDoc, document ); ! Trigger trigger = setupTriggers(broker, name, oldDoc); parser.setDocument(document); *************** *** 1148,1152 **** LOG.debug("validating document " + name); streamer.serialize(node, true); ! document.setMaxDepth(document.getMaxDepth() + 1); document.calculateTreeLevelStartPoints(); // new document is valid: remove old document --- 1102,1106 ---- LOG.debug("validating document " + name); streamer.serialize(node, true); ! 
document.setMaxDepth(document.getMaxDepth() + 1);//ddddddddddddddddddd document.calculateTreeLevelStartPoints(); // new document is valid: remove old document *************** *** 1182,1186 **** } finally { lock.release(); ! } try { // second pass: store the document --- 1136,1140 ---- } finally { lock.release(); ! }//ffffffffffffffffffffffffffffffffffffffffffffffff try { // second pass: store the document *************** *** 1210,1213 **** --- 1164,1194 ---- } + /** + * @param broker + * @param name + * @param oldDoc + * @return + */ + private Trigger setupTriggers(DBBroker broker, String name, DocumentImpl oldDoc) { + // setup triggers + Trigger trigger = null; + if (triggersEnabled && !name.equals(COLLECTION_CONFIG_FILE)) { + if (triggersEnabled) { + CollectionConfiguration config = getConfiguration(broker); + if (config != null) { + if (oldDoc == null) + trigger = config + .getTrigger(Trigger.STORE_DOCUMENT_EVENT); + else + trigger = config + .getTrigger(Trigger.UPDATE_DOCUMENT_EVENT); + } + } + } else + // set configuration to null if we are updating collection.xconf + configuration = null; + return trigger; + } + public BinaryDocument addBinaryResource(DBBroker broker, String name, byte[] data) throws EXistException, |
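One further simplification the diff suggests but does not make: the inner if (triggersEnabled) test inside setupTriggers() is redundant, because the outer condition already guarantees it. A sketch of a tightened, behaviourally identical method body, built only from identifiers that appear in the diff above, could read as follows (this is a sketch, not the committed code):

    private Trigger setupTriggers(DBBroker broker, String name, DocumentImpl oldDoc) {
        Trigger trigger = null;
        if (triggersEnabled && !name.equals(COLLECTION_CONFIG_FILE)) {
            CollectionConfiguration config = getConfiguration(broker);
            if (config != null)
                trigger = config.getTrigger(oldDoc == null
                        ? Trigger.STORE_DOCUMENT_EVENT
                        : Trigger.UPDATE_DOCUMENT_EVENT);
        } else
            // set configuration to null if we are updating collection.xconf
            // (or if triggers are disabled, exactly as in the original else branch)
            configuration = null;
        return trigger;
    }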
From: Jean-Marc V. <jm...@us...> - 2004-07-09 17:27:29
Update of /cvsroot/exist/eXist-1.0/src/org/exist/collections In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv20007/src/org/exist/collections Modified Files: Collection.java Log Message: Refactoring: extract method: checkPermissions(DBBroker broker, String name) Index: Collection.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/collections/Collection.java,v retrieving revision 1.35 retrieving revision 1.36 diff -C2 -d -r1.35 -r1.36 *** Collection.java 9 Jul 2004 09:19:32 -0000 1.35 --- Collection.java 9 Jul 2004 17:27:20 -0000 1.36 *************** *** 714,746 **** InputSource source; try { ! lock.acquire(Lock.WRITE_LOCK); ! if (hasDocument(name) && (oldDoc = getDocument(broker, name)) != null) { ! if(oldDoc.isLockedForWrite()) ! throw new PermissionDeniedException("Document " + name + ! " is locked for write"); ! // check if the document is locked by another user ! User lockUser = oldDoc.getUserLock(); ! if(lockUser != null && !lockUser.equals(broker.getUser())) ! throw new PermissionDeniedException("The document is locked by user " + ! lockUser.getName()); ! ! // check if the document is currently being changed by someone else ! Lock oldLock = oldDoc.getUpdateLock(); ! oldLock.acquire(Lock.WRITE_LOCK); ! ! // do we have permissions for update? ! if (!oldDoc.getPermissions().validate(broker.getUser(), ! Permission.UPDATE)) ! throw new PermissionDeniedException( ! "Document \""+name+"\" exists and update is not allowed"); ! if (!(getPermissions().validate(broker.getUser(), Permission.UPDATE) || ! getPermissions().validate(broker.getUser(), Permission.WRITE))) ! throw new PermissionDeniedException( ! "Document exists and update is not allowed for the collection"); ! // do we have write permissions? ! } else if (!getPermissions().validate(broker.getUser(), ! Permission.WRITE)) ! throw new PermissionDeniedException( ! "Not allowed to write to collection " + getName()); document = new DocumentImpl(broker, name, this); --- 714,718 ---- InputSource source; try { ! oldDoc = checkPermissions(broker, name); document = new DocumentImpl(broker, name, this); *************** *** 880,884 **** } ! /** If an old document exists, keep information about the document * @param broker * @param name --- 852,856 ---- } ! /** If an old document exists, keep information about the document. * @param broker * @param name *************** *** 900,903 **** --- 872,919 ---- } + /** Check Permissions about user and document, and throw exceptions if necessary. 
+ * @param broker + * @param name + * @return + * @throws LockException + * @throws PermissionDeniedException + */ + private DocumentImpl checkPermissions(DBBroker broker, String name) throws LockException, PermissionDeniedException { + DocumentImpl oldDoc = null; + lock.acquire(Lock.WRITE_LOCK); + if (hasDocument(name) && (oldDoc = getDocument(broker, name)) != null) { + + // jmv: Note: this was only in addDocument(DBBroker broker, String name, String data,) + if(oldDoc.isLockedForWrite()) + throw new PermissionDeniedException("Document " + name + + " is locked for write"); + + // check if the document is locked by another user + User lockUser = oldDoc.getUserLock(); + if(lockUser != null && !lockUser.equals(broker.getUser())) + throw new PermissionDeniedException("The document is locked by user " + + lockUser.getName()); + + // check if the document is currently being changed by someone else + Lock oldLock = oldDoc.getUpdateLock(); + oldLock.acquire(Lock.WRITE_LOCK); + + // do we have permissions for update? + if (!oldDoc.getPermissions().validate(broker.getUser(), + Permission.UPDATE)) + throw new PermissionDeniedException( + "Document exists and update is not allowed"); + if (!(getPermissions().validate(broker.getUser(), Permission.UPDATE) || + getPermissions().validate(broker.getUser(), Permission.WRITE))) + throw new PermissionDeniedException( + "Document exists and update is not allowed for the collection"); + // do we have write permissions? + } else if (!getPermissions().validate(broker.getUser(), + Permission.WRITE)) + throw new PermissionDeniedException( + "Not allowed to write to collection " + getName()); + return oldDoc; + } + public DocumentImpl addDocument(DBBroker broker, String name, InputSource source) throws EXistException, LockException, *************** *** 915,944 **** XMLReader reader; try { ! lock.acquire(Lock.WRITE_LOCK); ! if (hasDocument(name) && (oldDoc = getDocument(broker, name)) != null) { ! // check if the document is locked by another user ! User lockUser = oldDoc.getUserLock(); ! if(lockUser != null && !lockUser.equals(broker.getUser())) ! throw new PermissionDeniedException("The document is locked by user " + ! lockUser.getName()); ! ! // check if the document is currently being changed by someone else ! Lock oldLock = oldDoc.getUpdateLock(); ! oldLock.acquire(Lock.WRITE_LOCK); ! ! // do we have permissions for update? ! if (!oldDoc.getPermissions().validate(broker.getUser(), ! Permission.UPDATE)) ! throw new PermissionDeniedException( ! "Document exists and update is not allowed"); ! if (!(getPermissions().validate(broker.getUser(), Permission.UPDATE) || ! getPermissions().validate(broker.getUser(), Permission.WRITE))) ! throw new PermissionDeniedException( ! "Document exists and update is not allowed for the collection"); ! // do we have write permissions? ! } else if (!getPermissions().validate(broker.getUser(), ! Permission.WRITE)) ! throw new PermissionDeniedException( ! "Not allowed to write to collection " + getName()); document = new DocumentImpl(broker, name, this); --- 931,935 ---- XMLReader reader; try { ! oldDoc = checkPermissions(broker, name); document = new DocumentImpl(broker, name, this); *************** *** 1008,1012 **** throw new EXistException(e); } ! document.setMaxDepth(document.getMaxDepth() + 1); document.calculateTreeLevelStartPoints(); // new document is valid: remove old document --- 999,1003 ---- throw new EXistException(e); } ! 
document.setMaxDepth(document.getMaxDepth() + 1);//ddddddddddddddddddddddddddddddd document.calculateTreeLevelStartPoints(); // new document is valid: remove old document *************** *** 1103,1131 **** DOMStreamer streamer; try { ! lock.acquire(Lock.WRITE_LOCK); ! if (hasDocument(name) && (oldDoc = getDocument(broker, name)) != null) { ! // check if the document is locked by another user ! User lockUser = oldDoc.getUserLock(); ! if(lockUser != null && !lockUser.equals(broker.getUser())) ! throw new PermissionDeniedException("The document is locked by user " + ! lockUser.getName()); ! ! // check if the document is currently being changed by someone else ! oldDoc.getUpdateLock().acquire(Lock.WRITE_LOCK); ! ! // do we have permissions for update? ! if (!oldDoc.getPermissions().validate(broker.getUser(), ! Permission.UPDATE)) ! throw new PermissionDeniedException( ! "document exists and update " + "is not allowed"); ! if (!(getPermissions().validate(broker.getUser(), Permission.UPDATE) || ! getPermissions().validate(broker.getUser(), Permission.WRITE))) ! throw new PermissionDeniedException( ! "Document exists and update is not allowed for the collection"); ! // no: do we have write permissions? ! } else if (!getPermissions().validate(broker.getUser(), ! Permission.WRITE)) ! throw new PermissionDeniedException( ! "not allowed to write to collection " + getName()); document = new DocumentImpl(broker, name, this); --- 1094,1099 ---- DOMStreamer streamer; try { ! ! oldDoc = checkPermissions(broker, name); document = new DocumentImpl(broker, name, this); *************** *** 1249,1291 **** BinaryDocument blob = null; try { ! lock.acquire(Lock.WRITE_LOCK); ! DocumentImpl oldDoc = getDocument(broker, name); ! if (oldDoc != null) { ! if(oldDoc.isLockedForWrite()) ! throw new PermissionDeniedException("Document " + name + ! " is already locked for write"); ! // check if the document is locked by another user ! User lockUser = oldDoc.getUserLock(); ! if(lockUser != null && !lockUser.equals(broker.getUser())) ! throw new PermissionDeniedException("The document is locked by user " + ! lockUser.getName()); ! // do we have permissions for update? ! if (!oldDoc.getPermissions().validate(broker.getUser(), ! Permission.UPDATE)) ! throw new PermissionDeniedException( ! "document exists and update is not allowed"); ! // no: do we have write permissions? ! } else if (!getPermissions().validate(broker.getUser(), ! Permission.WRITE)) ! throw new PermissionDeniedException( ! "not allowed to write to collection " + getName()); blob = new BinaryDocument(broker, name, this); - // if (oldDoc != null) { - // blob.setCreated(oldDoc.getCreated()); - // blob.setLastModified(System.currentTimeMillis()); - // blob.setPermissions(oldDoc.getPermissions()); - // - // LOG.debug("removing old document " + oldDoc.getFileName()); - // if (oldDoc instanceof BinaryDocument) - // broker.removeBinaryResource((BinaryDocument) oldDoc); - // else - // broker.removeDocument(getName() + '/' + oldDoc.getFileName()); - // } else { - // blob.setCreated(System.currentTimeMillis()); - // blob.getPermissions().setOwner(broker.getUser()); - // blob.getPermissions().setGroup( - // broker.getUser().getPrimaryGroup()); - // } manageDocumentInformation(broker, name, oldDoc, blob ); --- 1217,1224 ---- BinaryDocument blob = null; try { ! ! DocumentImpl oldDoc = checkPermissions(broker, name); blob = new BinaryDocument(broker, name, this); manageDocumentInformation(broker, name, oldDoc, blob ); |
From: Jean-Marc V. <jm...@us...> - 2004-07-09 16:44:55
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xmldb/test In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv7437/src/org/exist/xmldb/test Modified Files: ResourceSetTest.java ResourceTest.java AllTests.java Log Message: add more tests in AllTests Index: ResourceSetTest.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xmldb/test/ResourceSetTest.java,v retrieving revision 1.1 retrieving revision 1.2 diff -C2 -d -r1.1 -r1.2 *** ResourceSetTest.java 2 Apr 2003 13:11:36 -0000 1.1 --- ResourceSetTest.java 9 Jul 2004 16:44:41 -0000 1.2 *************** *** 45,52 **** // Currently (2003-04-02) fires an exception in FunPosition: ! XPathPrefix = "document('/db/test/hamlet.xml')/*/*"; query1 = XPathPrefix + "[position()>=5 ]"; query2 = XPathPrefix + "[position()<=10]"; ! expected = 6; // This validates OK: // XPathPrefix = "document('/db/test/hamlet.xml')//LINE"; --- 45,52 ---- // Currently (2003-04-02) fires an exception in FunPosition: ! XPathPrefix = "document('/db/test/shakes.xsl')/*/*"; // "document('/db/test/macbeth.xml')/*/*"; query1 = XPathPrefix + "[position()>=5 ]"; query2 = XPathPrefix + "[position()<=10]"; ! expected = 87; // This validates OK: // XPathPrefix = "document('/db/test/hamlet.xml')//LINE"; Index: AllTests.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xmldb/test/AllTests.java,v retrieving revision 1.3 retrieving revision 1.4 diff -C2 -d -r1.3 -r1.4 *** AllTests.java 2 Oct 2003 12:20:20 -0000 1.3 --- AllTests.java 9 Jul 2004 16:44:41 -0000 1.4 *************** *** 11,14 **** --- 11,16 ---- suite.addTest(new TestSuite(CreateCollectionsTest.class)); suite.addTest(new TestSuite(ResourceTest.class)); + suite.addTest(new TestSuite(ResourceSetTest.class)); + suite.addTest(new TestSuite(TestEXistXMLSerialize.class)); //$JUnit-END$ return suite; Index: ResourceTest.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xmldb/test/ResourceTest.java,v retrieving revision 1.7 retrieving revision 1.8 diff -C2 -d -r1.7 -r1.8 *** ResourceTest.java 29 Jan 2004 15:06:49 -0000 1.7 --- ResourceTest.java 9 Jul 2004 16:44:41 -0000 1.8 *************** *** 59,62 **** --- 59,63 ---- System.out.println("----------------------------------------"); } catch (Exception e) { + System.out.println("testReadResource(): Exception: " + e); fail(e.getMessage()); } |
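The enlarged suite can be run from the command line with the JUnit 3 text runner. The wrapper class RunXmldbTests below is illustrative, and it assumes org.exist.xmldb.test.AllTests exposes the conventional public static Test suite() method indicated by the //$JUnit-BEGIN$ / //$JUnit-END$ markers.

    import junit.framework.Test;
    import junit.textui.TestRunner;

    // Illustrative runner for the suite assembled above (assumes the usual
    // "public static Test suite()" signature in org.exist.xmldb.test.AllTests).
    public class RunXmldbTests {
        public static void main(String[] args) {
            Test suite = org.exist.xmldb.test.AllTests.suite();
            TestRunner.run(suite);
        }
    }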
From: Jean-Marc V. <jm...@us...> - 2004-07-09 14:44:05
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/test In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv12979/src/org/exist/xquery/test Modified Files: XQueryUseCasesTest.java AllTests.java XPathQueryTest.java Log Message: add all available tests in AllTests; comment out in UseCasesTest the 2 that don't (yet) pass Index: XPathQueryTest.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/test/XPathQueryTest.java,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** XPathQueryTest.java 7 Jul 2004 08:29:27 -0000 1.2 --- XPathQueryTest.java 9 Jul 2004 14:43:55 -0000 1.3 *************** *** 157,160 **** --- 157,161 ---- assertEquals(2, result.getSize()); } catch (XMLDBException e) { + System.out.println("testStrings(): XMLDBException: "+e); fail(e.getMessage()); } Index: XQueryUseCasesTest.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/test/XQueryUseCasesTest.java,v retrieving revision 1.1 retrieving revision 1.2 diff -C2 -d -r1.1 -r1.2 *** XQueryUseCasesTest.java 29 Jan 2004 15:06:42 -0000 1.1 --- XQueryUseCasesTest.java 9 Jul 2004 14:43:55 -0000 1.2 *************** *** 49,55 **** } ! public void testXMP() throws Exception { ! useCase.doTest("xmp"); ! } public void testSGML() throws Exception { --- 49,56 ---- } ! // jmv: to activate when we'll have function deep-equal() ! // public void testXMP() throws Exception { ! // useCase.doTest("xmp"); ! // } public void testSGML() throws Exception { *************** *** 76,82 **** useCase.doTest("seq"); } ! ! public void testR() throws Exception { ! useCase.doTest("r"); ! } } --- 77,85 ---- useCase.doTest("seq"); } ! ! // jmv: to activate when implemented ! // org.xmldb.api.base.XMLDBException: Cannot query constructed nodes. ! // public void testR() throws Exception { ! // useCase.doTest("r"); ! // } } Index: AllTests.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/test/AllTests.java,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** AllTests.java 28 May 2004 10:54:24 -0000 1.2 --- AllTests.java 9 Jul 2004 14:43:55 -0000 1.3 *************** *** 38,43 **** TestSuite suite = new TestSuite("Test for org.exist.xquery.test"); //$JUnit-BEGIN$ - suite.addTestSuite(LexerTest.class); suite.addTestSuite(XPathQueryTest.class); //$JUnit-END$ return suite; --- 38,44 ---- TestSuite suite = new TestSuite("Test for org.exist.xquery.test"); //$JUnit-BEGIN$ suite.addTestSuite(XPathQueryTest.class); + suite.addTestSuite(LexerTest.class); // jmv: Note: LexerTest needs /db/test created by XPathQueryTest + suite.addTestSuite(XQueryUseCasesTest.class); //$JUnit-END$ return suite; |
From: Jean-Marc V. <jm...@us...> - 2004-07-09 12:20:40
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xmldb/test In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv18026/src/org/exist/xmldb/test Modified Files: CreateCollectionsTest.java Log Message: add test for storing binary resource Index: CreateCollectionsTest.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xmldb/test/CreateCollectionsTest.java,v retrieving revision 1.7 retrieving revision 1.8 diff -C2 -d -r1.7 -r1.8 *** CreateCollectionsTest.java 9 Jul 2004 08:31:40 -0000 1.7 --- CreateCollectionsTest.java 9 Jul 2004 12:20:31 -0000 1.8 *************** *** 1,5 **** --- 1,7 ---- package org.exist.xmldb.test; + import java.io.BufferedInputStream; import java.io.File; + import java.io.FileInputStream; import java.io.IOException; import java.util.ArrayList; *************** *** 134,137 **** --- 136,145 ---- testCollection); + byte[] data = storeBinaryResourceFromFile( new File( "../webapp/logo.jpg"), testCollection); + Object content = testCollection.getResource("logo.jpg").getContent(); + byte[] dataStored = (byte[])content; + assertTrue("After storing binary resource, data out==data in", + Arrays.equals(dataStored, data) ); + } catch (XMLDBException e) { e.printStackTrace(); *************** *** 162,165 **** --- 170,199 ---- } + private byte[] storeBinaryResourceFromFile( + File file, + Collection testCollection) + throws XMLDBException, IOException { + System.out.println("storing " + file.getAbsolutePath()); + + Resource res = + (BinaryResource) testCollection.createResource( + file.getName(), + "BinaryResource" ); + assertNotNull("store binary Resource From File", res); + + // Get an array of bytes from the file: + FileInputStream istr = new FileInputStream(file); + BufferedInputStream bstr = new BufferedInputStream( istr ); // promote + int size = (int) file.length(); // get the file size (in bytes) + byte[] data = new byte[size]; // allocate byte array of right size + bstr.read( data, 0, size ); // read into byte array + bstr.close(); + + res.setContent(data); + testCollection.storeResource(res); + System.out.println("stored " + file.getAbsolutePath()); + return data; + } + public void testMultipleCreates() { try { |
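One caveat about the new helper: a single call to bstr.read(data, 0, size) is not guaranteed to fill the buffer, because InputStream.read() may return fewer bytes than requested. A defensive variant that loops until the array is filled is sketched below; the class name FileBytes is illustrative and not part of the commit.

    import java.io.DataInputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;

    // Sketch: read a whole file into a byte array, letting
    // DataInputStream.readFully() loop instead of relying on one read() call.
    public class FileBytes {
        public static byte[] readAll(File file) throws IOException {
            byte[] data = new byte[(int) file.length()];
            DataInputStream in = new DataInputStream(new FileInputStream(file));
            try {
                in.readFully(data); // throws EOFException if the file is shorter than expected
            } finally {
                in.close();
            }
            return data;
        }
    }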
From: Jean-Marc V. <jm...@us...> - 2004-07-09 09:19:42
Update of /cvsroot/exist/eXist-1.0/src/org/exist/collections In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv21188/src/org/exist/collections Modified Files: Collection.java Log Message: Refactoring: extract method private void manageDocumentInformation(DBBroker broker, String name, DocumentImpl oldDoc, DocumentImpl document ) Index: Collection.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/collections/Collection.java,v retrieving revision 1.34 retrieving revision 1.35 diff -C2 -d -r1.34 -r1.35 *** Collection.java 5 Jul 2004 20:02:46 -0000 1.34 --- Collection.java 9 Jul 2004 09:19:32 -0000 1.35 *************** *** 733,737 **** Permission.UPDATE)) throw new PermissionDeniedException( ! "Document exists and update is not allowed"); if (!(getPermissions().validate(broker.getUser(), Permission.UPDATE) || getPermissions().validate(broker.getUser(), Permission.WRITE))) --- 733,737 ---- Permission.UPDATE)) throw new PermissionDeniedException( ! "Document \""+name+"\" exists and update is not allowed"); if (!(getPermissions().validate(broker.getUser(), Permission.UPDATE) || getPermissions().validate(broker.getUser(), Permission.WRITE))) *************** *** 743,760 **** throw new PermissionDeniedException( "Not allowed to write to collection " + getName()); ! // if an old document exists, save the new document with a temporary ! // document name ! if (oldDoc != null) { ! document = new DocumentImpl(broker, name, this); ! document.setCreated(oldDoc.getCreated()); ! document.setLastModified(System.currentTimeMillis()); ! document.setPermissions(oldDoc.getPermissions()); ! } else { ! document = new DocumentImpl(broker, name, this); ! document.setCreated(System.currentTimeMillis()); ! document.getPermissions().setOwner(broker.getUser()); ! document.getPermissions().setGroup( ! broker.getUser().getPrimaryGroup()); ! } // setup triggers Trigger trigger = null; --- 743,750 ---- throw new PermissionDeniedException( "Not allowed to write to collection " + getName()); ! ! document = new DocumentImpl(broker, name, this); ! manageDocumentInformation(broker, name, oldDoc, document ); ! // setup triggers Trigger trigger = null; *************** *** 890,893 **** --- 880,903 ---- } + /** If an old document exists, keep information about the document + * @param broker + * @param name + * @param oldDoc + * @param document + */ + private void manageDocumentInformation(DBBroker broker, String name, DocumentImpl oldDoc, + DocumentImpl document ) { + if (oldDoc != null) { + document.setCreated(oldDoc.getCreated()); + document.setLastModified(System.currentTimeMillis()); + document.setPermissions(oldDoc.getPermissions()); + } else { + document.setCreated(System.currentTimeMillis()); + document.getPermissions().setOwner(broker.getUser()); + document.getPermissions().setGroup( + broker.getUser().getPrimaryGroup()); + } + } + public DocumentImpl addDocument(DBBroker broker, String name, InputSource source) throws EXistException, LockException, *************** *** 932,949 **** "Not allowed to write to collection " + getName()); ! // if an old document exists, save the new document with a temporary ! // document name ! if (oldDoc != null) { ! document = new DocumentImpl(broker, name, this); ! document.setCreated(oldDoc.getCreated()); ! document.setLastModified(System.currentTimeMillis()); ! document.setPermissions(oldDoc.getPermissions()); ! } else { ! document = new DocumentImpl(broker, name, this); ! 
document.setCreated(System.currentTimeMillis()); ! document.getPermissions().setOwner(broker.getUser()); ! document.getPermissions().setGroup( ! broker.getUser().getPrimaryGroup()); ! } // setup triggers Trigger trigger = null; --- 942,948 ---- "Not allowed to write to collection " + getName()); ! document = new DocumentImpl(broker, name, this); ! manageDocumentInformation(broker, name, oldDoc, document ); ! // setup triggers Trigger trigger = null; *************** *** 1129,1148 **** throw new PermissionDeniedException( "not allowed to write to collection " + getName()); ! // if an old document exists, save the new document with a temporary ! // document name ! if (oldDoc != null) { ! document = new DocumentImpl(broker, name, ! this); ! document.setCreated(oldDoc.getCreated()); ! document.setLastModified(System.currentTimeMillis()); ! document.setPermissions(oldDoc.getPermissions()); ! } else { ! document = new DocumentImpl(broker, name, ! this); ! document.setCreated(System.currentTimeMillis()); ! document.getPermissions().setOwner(broker.getUser()); ! document.getPermissions().setGroup( ! broker.getUser().getPrimaryGroup()); ! } // setup triggers Trigger trigger = null; --- 1128,1135 ---- throw new PermissionDeniedException( "not allowed to write to collection " + getName()); ! ! document = new DocumentImpl(broker, name, this); ! manageDocumentInformation(broker, name, oldDoc, document ); ! // setup triggers Trigger trigger = null; *************** *** 1285,1304 **** blob = new BinaryDocument(broker, name, this); ! if (oldDoc != null) { ! blob.setCreated(oldDoc.getCreated()); ! blob.setLastModified(System.currentTimeMillis()); ! blob.setPermissions(oldDoc.getPermissions()); ! LOG.debug("removing old document " + oldDoc.getFileName()); ! if (oldDoc instanceof BinaryDocument) ! broker.removeBinaryResource((BinaryDocument) oldDoc); ! else ! broker.removeDocument(getName() + '/' + oldDoc.getFileName()); ! } else { ! blob.setCreated(System.currentTimeMillis()); ! blob.getPermissions().setOwner(broker.getUser()); ! blob.getPermissions().setGroup( ! broker.getUser().getPrimaryGroup()); } broker.storeBinaryResource(blob, data); addDocument(broker, blob); --- 1272,1302 ---- blob = new BinaryDocument(broker, name, this); ! // if (oldDoc != null) { ! // blob.setCreated(oldDoc.getCreated()); ! // blob.setLastModified(System.currentTimeMillis()); ! // blob.setPermissions(oldDoc.getPermissions()); ! // ! // LOG.debug("removing old document " + oldDoc.getFileName()); ! // if (oldDoc instanceof BinaryDocument) ! // broker.removeBinaryResource((BinaryDocument) oldDoc); ! // else ! // broker.removeDocument(getName() + '/' + oldDoc.getFileName()); ! // } else { ! // blob.setCreated(System.currentTimeMillis()); ! // blob.getPermissions().setOwner(broker.getUser()); ! // blob.getPermissions().setGroup( ! // broker.getUser().getPrimaryGroup()); ! // } ! manageDocumentInformation(broker, name, oldDoc, blob ); ! ! if (oldDoc != null) { ! LOG.debug("removing old document " + oldDoc.getFileName()); ! if (oldDoc instanceof BinaryDocument) ! broker.removeBinaryResource((BinaryDocument) oldDoc); ! else ! broker.removeDocument(getName() + '/' + oldDoc.getFileName()); } + broker.storeBinaryResource(blob, data); addDocument(broker, blob); |
From: Jean-Marc V. <jm...@us...> - 2004-07-09 08:31:49
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xmldb/test In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv13344/src/org/exist/xmldb/test Modified Files: CreateCollectionsTest.java Log Message: This test was unable to pass a second time without manually erasing the DB. Index: CreateCollectionsTest.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xmldb/test/CreateCollectionsTest.java,v retrieving revision 1.6 retrieving revision 1.7 diff -C2 -d -r1.6 -r1.7 *** CreateCollectionsTest.java 2 Oct 2003 12:20:20 -0000 1.6 --- CreateCollectionsTest.java 9 Jul 2004 08:31:40 -0000 1.7 *************** *** 3,7 **** --- 3,11 ---- import java.io.File; import java.io.IOException; + import java.util.ArrayList; + import java.util.Arrays; + import java.util.Collections; import java.util.HashSet; + import java.util.Iterator; import junit.framework.TestCase; *************** *** 87,119 **** testCollection.isOpen()); System.out.println("---------------------------------------"); ! System.out.println("storing files ..."); System.out.println("---------------------------------------"); ! File f = new File("samples/shakespeare"); File files[] = f.listFiles(new XMLFilenameFilter()); for (int i = 0; i < files.length; i++) { - // XMLResource storeResourceFromFile(File f, Collection col) storeResourceFromFile(files[i], testCollection); } ! HashSet fileNames = new HashSet(); for (int i = 0; i < files.length; i++) { String file = files[i].toString(); int lastSeparator = file.lastIndexOf(File.separatorChar); ! fileNames.add(file.substring(lastSeparator + 1)); } ! System.out.println("fileNames: " + fileNames.toString()); String[] resourcesNames = testCollection.listResources(); int resourceCount = testCollection.getResourceCount(); ! System.out.println( ! "testCollection.getResourceCount()=" + resourceCount); ! for (int i = 0; i < resourceCount; i++) { ! System.out.println("resourcesNames[i]=" + resourcesNames[i]); ! assertTrue( ! "resourcesNames must contain fileNames just stored", ! fileNames.contains(resourcesNames[i])); } String fileToRemove = "macbeth.xml"; --- 91,123 ---- testCollection.isOpen()); + String directory = "samples/shakespeare"; System.out.println("---------------------------------------"); ! System.out.println("storing all XML files in directory " +directory+"..."); System.out.println("---------------------------------------"); ! File f = new File(directory); File files[] = f.listFiles(new XMLFilenameFilter()); for (int i = 0; i < files.length; i++) { storeResourceFromFile(files[i], testCollection); } ! HashSet fileNamesJustStored = new HashSet(); for (int i = 0; i < files.length; i++) { String file = files[i].toString(); int lastSeparator = file.lastIndexOf(File.separatorChar); ! fileNamesJustStored.add(file.substring(lastSeparator + 1)); } ! System.out.println("fileNames stored: " + fileNamesJustStored.toString()); String[] resourcesNames = testCollection.listResources(); int resourceCount = testCollection.getResourceCount(); ! System.out.println( "testCollection.getResourceCount()=" + resourceCount); ! ! ArrayList fileNamesPresentInDatabase = new ArrayList(); ! for (int i = 0; i < resourcesNames.length; i++) { ! fileNamesPresentInDatabase.add( resourcesNames[i]); } + assertTrue( "resourcesNames must contain fileNames just stored", + fileNamesPresentInDatabase. containsAll( fileNamesJustStored) ); String fileToRemove = "macbeth.xml"; *************** *** 127,131 **** storeResourceFromFile( new File( ! 
"samples/shakespeare" + File.separatorChar + fileToRemove), testCollection); --- 131,135 ---- storeResourceFromFile( new File( ! directory + File.separatorChar + fileToRemove), testCollection); |
From: Wolfgang M. M. <wol...@us...> - 2004-07-07 20:54:53
|
Update of /cvsroot/exist/eXist-1.0/src/org/exist/http/webdav/methods In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv28118/src/org/exist/http/webdav/methods Modified Files: Head.java Log Message: WebDAV method HEAD returned error for non-existing resource. Should return HTTP 404 (NOT_FOUND) Index: Head.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/http/webdav/methods/Head.java,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** Head.java 25 May 2004 09:26:04 -0000 1.2 --- Head.java 7 Jul 2004 20:54:44 -0000 1.3 *************** *** 47,52 **** DocumentImpl resource) throws ServletException, IOException { if(resource == null) { ! // GET is not available on collections ! response.sendError(HttpServletResponse.SC_FORBIDDEN, "GET is not available on collections"); return; } --- 47,51 ---- DocumentImpl resource) throws ServletException, IOException { if(resource == null) { ! response.sendError(HttpServletResponse.SC_NOT_FOUND); return; } |
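The change can be verified from any HTTP client by issuing a HEAD request for a resource that does not exist and checking for status 404. The sketch below uses plain java.net; the URL (host, port and WebDAV path) is only an example and must be adjusted to the local installation.

    import java.net.HttpURLConnection;
    import java.net.URL;

    // Sketch: a HEAD request for a missing resource should now return 404
    // instead of the previous 403.
    public class HeadCheck {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://localhost:8080/exist/webdav/db/no-such-resource.xml");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("HEAD");
            System.out.println("HTTP status: " + conn.getResponseCode());
            conn.disconnect();
        }
    }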
From: Jean-Marc V. <jm...@us...> - 2004-07-07 17:15:56
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv12216/src/org/exist/xquery/value Modified Files: IntegerValue.java Log Message: finish implementing: private boolean checkType(BigInteger value2, int type2) Index: IntegerValue.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value/IntegerValue.java,v retrieving revision 1.5 retrieving revision 1.6 diff -C2 -d -r1.5 -r1.6 *** IntegerValue.java 7 Jul 2004 13:56:54 -0000 1.5 --- IntegerValue.java 7 Jul 2004 17:15:46 -0000 1.6 *************** *** 38,43 **** private static final BigInteger ONE_BIGINTEGER = new BigInteger("1"); private static final BigInteger MINUS_ONE_BIGINTEGER = new BigInteger("1"); ! private static final BigInteger LARGEST_INT = new BigInteger("4294967295"); private static final BigInteger SMALLEST_INT = LARGEST_INT.negate(); private BigInteger value; --- 38,49 ---- private static final BigInteger ONE_BIGINTEGER = new BigInteger("1"); private static final BigInteger MINUS_ONE_BIGINTEGER = new BigInteger("1"); ! private static final BigInteger LARGEST_LONG = new BigInteger("9223372036854775808" ); ! private static final BigInteger SMALLEST_LONG = LARGEST_LONG.negate(); ! private static final BigInteger LARGEST_INT = new BigInteger("4294967296"); private static final BigInteger SMALLEST_INT = LARGEST_INT.negate(); + private static final BigInteger LARGEST_SHORT = new BigInteger("65536"); + private static final BigInteger SMALLEST_SHORT = LARGEST_SHORT.negate(); + private static final BigInteger LARGEST_BYTE = new BigInteger("256"); + private static final BigInteger SMALLEST_BYTE = LARGEST_BYTE.negate(); private BigInteger value; *************** *** 62,68 **** value = new BigInteger(stringValue); // Long.parseLong(stringValue); } catch (NumberFormatException e) { - // try { - // value = (long) Double.parseDouble(stringValue); - // } catch (NumberFormatException e1) { throw new XPathException( "failed to convert '" + stringValue + "' to an integer: " + e.getMessage(), e); --- 68,71 ---- *************** *** 76,82 **** value = new BigInteger(stringValue); // Long.parseLong(stringValue); } catch (NumberFormatException e) { - // try { - // value = (long) Double.parseDouble(stringValue); - // } catch (NumberFormatException e1) { throw new XPathException( "failed to convert '" + stringValue + "' to an integer: " + e.getMessage()); --- 79,82 ---- *************** *** 110,114 **** switch (type) { case Type.LONG : ! // jmv: add test LARGEST_LONG SMALLEST_LONG ???? case Type.INTEGER : case Type.DECIMAL : --- 110,116 ---- switch (type) { case Type.LONG : ! // jmv: add test since now long is not the default implementation anymore: ! return value.compareTo(SMALLEST_LONG) == 1 && ! value.compareTo(LARGEST_LONG ) == -1; case Type.INTEGER : case Type.DECIMAL : *************** *** 127,144 **** case Type.INT : return value.compareTo(SMALLEST_INT) == 1 && ! value.compareTo(LARGEST_INT) == -1; ! // >= -4294967295L && value <= 4294967295L; ! // case Type.SHORT : ! // return value >= -65535 && value <= 65535; ! // case Type.BYTE : ! // return value >= -255 && value <= 255; ! // case Type.UNSIGNED_LONG : ! // return value > -1; ! // case Type.UNSIGNED_INT: ! // return value > -1 && value <= 4294967295L; ! // case Type.UNSIGNED_SHORT : ! // return value > -1 && value <= 65535; ! // case Type.UNSIGNED_BYTE : ! 
// return value > -1 && value <= 255; } throw new XPathException("Unknown type: " + Type.getTypeName(type)); --- 129,152 ---- case Type.INT : return value.compareTo(SMALLEST_INT) == 1 && ! value.compareTo(LARGEST_INT) == -1; ! case Type.SHORT : ! return value.compareTo(SMALLEST_SHORT) == 1 && ! value.compareTo(LARGEST_SHORT) == -1; ! case Type.BYTE : ! return value.compareTo(SMALLEST_BYTE) == 1 && ! value.compareTo(LARGEST_BYTE) == -1; ! ! case Type.UNSIGNED_LONG : ! return value.compareTo(MINUS_ONE_BIGINTEGER) == 1 && ! value.compareTo(LARGEST_LONG ) == -1; ! case Type.UNSIGNED_INT: ! return value.compareTo(MINUS_ONE_BIGINTEGER) == 1 && ! value.compareTo(LARGEST_INT) == -1; ! case Type.UNSIGNED_SHORT : ! return value.compareTo(MINUS_ONE_BIGINTEGER) == 1 && ! value.compareTo(LARGEST_SHORT) == -1; ! case Type.UNSIGNED_BYTE : ! return value.compareTo(MINUS_ONE_BIGINTEGER) == 1 && ! value.compareTo(LARGEST_BYTE) == -1; } throw new XPathException("Unknown type: " + Type.getTypeName(type)); |
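The new range tests compare against hand-written power-of-two literals with strict compareTo() results (for example, a value is accepted as xs:long when it lies strictly between -9223372036854775808 and 9223372036854775808). For comparison, the same idiom written against the inclusive bounds that XML Schema defines, taken from the JDK constants, is sketched below; it illustrates the compareTo() pattern and is not the committed IntegerValue code.

    import java.math.BigInteger;

    // Illustration of BigInteger range checks with the inclusive XML Schema
    // bounds derived from the JDK limit constants.
    public class XsdRanges {
        static final BigInteger LONG_MIN  = BigInteger.valueOf(Long.MIN_VALUE);
        static final BigInteger LONG_MAX  = BigInteger.valueOf(Long.MAX_VALUE);
        static final BigInteger INT_MIN   = BigInteger.valueOf(Integer.MIN_VALUE);
        static final BigInteger INT_MAX   = BigInteger.valueOf(Integer.MAX_VALUE);
        static final BigInteger SHORT_MIN = BigInteger.valueOf(Short.MIN_VALUE);
        static final BigInteger SHORT_MAX = BigInteger.valueOf(Short.MAX_VALUE);

        static boolean inRange(BigInteger v, BigInteger min, BigInteger max) {
            return v.compareTo(min) >= 0 && v.compareTo(max) <= 0;
        }

        public static void main(String[] args) {
            BigInteger v = new BigInteger("9223372036854775807"); // Long.MAX_VALUE
            System.out.println("xs:long  : " + inRange(v, LONG_MIN, LONG_MAX));   // true
            System.out.println("xs:int   : " + inRange(v, INT_MIN, INT_MAX));     // false
            System.out.println("xs:short : " + inRange(v, SHORT_MIN, SHORT_MAX)); // false
        }
    }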
From: Jean-Marc V. <jm...@us...> - 2004-07-07 17:14:09
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/functions In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv11771/src/org/exist/xquery/functions Modified Files: FunMax.java FunMin.java Log Message: solves bug: number of arguments to function min doesn't match function signature (expected 1, got 2) Note: design: probably class FuctionPrototype should verify arguments Index: FunMax.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/functions/FunMax.java,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** FunMax.java 28 May 2004 10:54:09 -0000 1.2 --- FunMax.java 7 Jul 2004 17:13:59 -0000 1.3 *************** *** 51,55 **** new SequenceType[] { new SequenceType(Type.ATOMIC, Cardinality.ZERO_OR_MORE)}, ! new SequenceType(Type.ATOMIC, Cardinality.ZERO_OR_ONE)); /** --- 51,55 ---- new SequenceType[] { new SequenceType(Type.ATOMIC, Cardinality.ZERO_OR_MORE)}, ! new SequenceType(Type.ATOMIC, Cardinality.ZERO_OR_ONE), true /* overloaded=true jmv */ ); /** Index: FunMin.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/functions/FunMin.java,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** FunMin.java 28 May 2004 10:54:09 -0000 1.2 --- FunMin.java 7 Jul 2004 17:14:00 -0000 1.3 *************** *** 49,53 **** "the value of every other item in the input sequence.", new SequenceType[] { new SequenceType(Type.ATOMIC, Cardinality.ZERO_OR_MORE)}, ! new SequenceType(Type.ATOMIC, Cardinality.ZERO_OR_ONE)); /** --- 49,53 ---- "the value of every other item in the input sequence.", new SequenceType[] { new SequenceType(Type.ATOMIC, Cardinality.ZERO_OR_MORE)}, ! new SequenceType(Type.ATOMIC, Cardinality.ZERO_OR_ONE), true /* overloaded=true jmv */ ); /** |
From: Jean-Marc V. <jm...@us...> - 2004-07-07 13:57:03
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv5241/src/org/exist/xquery/value Modified Files: IntegerValue.java Log Message: Correct bug causing initialization error for the class Index: IntegerValue.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value/IntegerValue.java,v retrieving revision 1.4 retrieving revision 1.5 diff -C2 -d -r1.4 -r1.5 *** IntegerValue.java 7 Jul 2004 08:29:27 -0000 1.4 --- IntegerValue.java 7 Jul 2004 13:56:54 -0000 1.5 *************** *** 38,42 **** private static final BigInteger ONE_BIGINTEGER = new BigInteger("1"); private static final BigInteger MINUS_ONE_BIGINTEGER = new BigInteger("1"); ! private static final BigInteger LARGEST_INT = new BigInteger("4294967295L"); private static final BigInteger SMALLEST_INT = LARGEST_INT.negate(); --- 38,42 ---- private static final BigInteger ONE_BIGINTEGER = new BigInteger("1"); private static final BigInteger MINUS_ONE_BIGINTEGER = new BigInteger("1"); ! private static final BigInteger LARGEST_INT = new BigInteger("4294967295"); private static final BigInteger SMALLEST_INT = LARGEST_INT.negate(); |
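For context on the bug this fixes: BigInteger's String constructor accepts only an optional sign followed by digits, so the Java-style "L" suffix introduced in revision 1.4 made the constructor throw NumberFormatException, and because that happened in a static field initializer it surfaced as an ExceptionInInitializerError for the IntegerValue class. A self-contained demonstration (the class name SuffixDemo is illustrative):

    import java.math.BigInteger;

    public class SuffixDemo {
        public static void main(String[] args) {
            System.out.println(new BigInteger("4294967295"));   // fine
            try {
                new BigInteger("4294967295L");                  // the revision 1.4 form
            } catch (NumberFormatException e) {
                System.out.println("rejected: " + e);
            }
        }
    }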
From: Jean-Marc V. <jm...@us...> - 2004-07-07 08:29:38
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/test In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv13064/src/org/exist/xquery/test Modified Files: XPathQueryTest.java Log Message: implement the missing constructors taking BigInteger as argument, and private boolean checkType(BigInteger value2, int type2) Index: XPathQueryTest.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/test/XPathQueryTest.java,v retrieving revision 1.1 retrieving revision 1.2 diff -C2 -d -r1.1 -r1.2 *** XPathQueryTest.java 29 Jan 2004 15:06:42 -0000 1.1 --- XPathQueryTest.java 7 Jul 2004 08:29:27 -0000 1.2 *************** *** 114,117 **** --- 114,127 ---- "/test/item[round(price + 3) > 60]"); assertEquals(result.getSize(), 1); + + result = + service.queryResource( + "numbers.xml", + "min( 123456789123456789123456789, " + + "123456789123456789123456789123456789123456789 )"); + assertEquals("minimum of big integers", + result.getResource(0).getContent(), + "123456789123456789123456789" ); + } catch (XMLDBException e) { fail(e.getMessage()); |
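The new assertion exercises exactly the comparison a long-backed IntegerValue could not perform, since both literals exceed Long.MAX_VALUE. Done directly with java.math.BigInteger (the type that now backs IntegerValue), the expected result is the shorter of the two numbers; the class name BigMin is illustrative.

    import java.math.BigInteger;

    public class BigMin {
        public static void main(String[] args) {
            BigInteger a = new BigInteger("123456789123456789123456789");
            BigInteger b = new BigInteger("123456789123456789123456789123456789123456789");
            System.out.println(a.min(b)); // prints 123456789123456789123456789
        }
    }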
From: Jean-Marc V. <jm...@us...> - 2004-07-07 08:29:38
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv13064/src/org/exist/xquery/value Modified Files: IntegerValue.java Log Message: implement the missing constructors taking BigInteger as argument, and private boolean checkType(BigInteger value2, int type2) Index: IntegerValue.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/value/IntegerValue.java,v retrieving revision 1.3 retrieving revision 1.4 diff -C2 -d -r1.3 -r1.4 *** IntegerValue.java 6 Jul 2004 16:10:41 -0000 1.3 --- IntegerValue.java 7 Jul 2004 08:29:27 -0000 1.4 *************** *** 36,39 **** --- 36,43 ---- public final static IntegerValue ZERO = new IntegerValue(0); private static final BigInteger ZERO_BIGINTEGER = new BigInteger("0"); + private static final BigInteger ONE_BIGINTEGER = new BigInteger("1"); + private static final BigInteger MINUS_ONE_BIGINTEGER = new BigInteger("1"); + private static final BigInteger LARGEST_INT = new BigInteger("4294967295L"); + private static final BigInteger SMALLEST_INT = LARGEST_INT.negate(); private BigInteger value; *************** *** 83,92 **** /** ! * @param value2 * @param requiredType */ ! public IntegerValue(BigInteger value2, int requiredType) { ! ! // TODO Auto-generated constructor stub } --- 87,96 ---- /** ! * @param value * @param requiredType */ ! public IntegerValue(BigInteger value, int requiredType) { ! this.value = value; ! type = requiredType; } *************** *** 95,100 **** */ public IntegerValue(BigInteger integer) { ! ! // TODO Auto-generated constructor stub } --- 99,103 ---- */ public IntegerValue(BigInteger integer) { ! this.value = integer; } *************** *** 102,109 **** * @param value2 * @param type2 */ ! private void checkType(BigInteger value2, int type2) { ! // TODO Auto-generated method stub } --- 105,146 ---- * @param value2 * @param type2 + * @throws XPathException */ ! private boolean checkType(BigInteger value2, int type2) throws XPathException { ! switch (type) { ! case Type.LONG : ! // jmv: add test LARGEST_LONG SMALLEST_LONG ???? ! case Type.INTEGER : ! case Type.DECIMAL : ! return true; + case Type.POSITIVE_INTEGER : + return value.compareTo(ZERO_BIGINTEGER) == 1; // >0 + case Type.NON_NEGATIVE_INTEGER : + return value.compareTo(MINUS_ONE_BIGINTEGER) == 1; // > -1 + + case Type.NEGATIVE_INTEGER : + return value.compareTo(ZERO_BIGINTEGER) == -1; // <0 + case Type.NON_POSITIVE_INTEGER : + return value.compareTo(ONE_BIGINTEGER) == -1; // <1 + + case Type.INT : + return value.compareTo(SMALLEST_INT) == 1 && + value.compareTo(LARGEST_INT) == -1; + // >= -4294967295L && value <= 4294967295L; + // case Type.SHORT : + // return value >= -65535 && value <= 65535; + // case Type.BYTE : + // return value >= -255 && value <= 255; + // case Type.UNSIGNED_LONG : + // return value > -1; + // case Type.UNSIGNED_INT: + // return value > -1 && value <= 4294967295L; + // case Type.UNSIGNED_SHORT : + // return value > -1 && value <= 65535; + // case Type.UNSIGNED_BYTE : + // return value > -1 && value <= 255; + } + throw new XPathException("Unknown type: " + Type.getTypeName(type)); } *************** *** 135,139 **** return value > -1 && value <= 255; case Type.POSITIVE_INTEGER : ! return value >= 0; } throw new XPathException("Unknown type: " + Type.getTypeName(type)); --- 172,176 ---- return value > -1 && value <= 255; case Type.POSITIVE_INTEGER : ! 
return value > 0; // jmv >= 0; } throw new XPathException("Unknown type: " + Type.getTypeName(type)); |
From: Wolfgang M. M. <wol...@us...> - 2004-07-06 19:49:08
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/parser In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv17191/src/org/exist/xquery/parser Modified Files: XQuery.g Log Message: Index: XQuery.g =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/parser/XQuery.g,v retrieving revision 1.17 retrieving revision 1.18 diff -C2 -d -r1.17 -r1.18 *** XQuery.g 6 Jul 2004 12:05:30 -0000 1.17 --- XQuery.g 6 Jul 2004 19:48:56 -0000 1.18 *************** *** 2002,2006 **** i:INTEGER_LITERAL { ! step= new LiteralValue(context, new IntegerValue(Integer.parseInt(i.getText()))); step.setASTNode(i); } --- 2002,2006 ---- i:INTEGER_LITERAL { ! step= new LiteralValue(context, new IntegerValue(i.getText())); step.setASTNode(i); } |
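The one-line grammar change is easy to miss but significant: up to revision 1.17 every integer literal was first run through Integer.parseInt(), which throws for anything outside the 32-bit range, so large literals could never reach the new BigInteger-backed IntegerValue. A short demonstration of the difference (the class name LiteralParsing is illustrative):

    import java.math.BigInteger;

    public class LiteralParsing {
        public static void main(String[] args) {
            String literal = "123456789123456789123456789";
            System.out.println(new BigInteger(literal));       // accepted as-is
            try {
                Integer.parseInt(literal);                     // the form used up to revision 1.17
            } catch (NumberFormatException e) {
                System.out.println("Integer.parseInt rejects it: " + e.getMessage());
            }
        }
    }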
From: Jean-Marc V. <jm...@us...> - 2004-07-06 16:10:51
Update of /cvsroot/exist/eXist-1.0/src/org/exist/xquery/parser In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv5576/src/org/exist/xquery/parser Modified Files: XQueryTreeParser.java Log Message: To be conformant with the spec, class org.exist.xquery.value.IntegerValue was modified to use a BigInteger instead of long. Index: XQueryTreeParser.java =================================================================== RCS file: /cvsroot/exist/eXist-1.0/src/org/exist/xquery/parser/XQueryTreeParser.java,v retrieving revision 1.13 retrieving revision 1.14 diff -C2 -d -r1.13 -r1.14 *** XQueryTreeParser.java 2 Jul 2004 16:50:57 -0000 1.13 --- XQueryTreeParser.java 6 Jul 2004 16:10:41 -0000 1.14 *************** *** 4932,4936 **** _t = _t.getNextSibling(); ! step= new LiteralValue(context, new IntegerValue(Integer.parseInt(i.getText()))); step.setASTNode(i); --- 4932,4938 ---- _t = _t.getNextSibling(); ! // jmv: trouble with bIg integer in XQuery source: step= new LiteralValue(context, new IntegerValue(Integer.parseInt(i.getText()))); ! step= new LiteralValue(context, new IntegerValue( i.getText() )); ! step.setASTNode(i); |