From: <tho...@us...> - 2011-02-27 21:14:11
Revision: 4256
http://bigdata.svn.sourceforge.net/bigdata/?rev=4256&view=rev
Author: thompsonbry
Date: 2011-02-27 21:14:03 +0000 (Sun, 27 Feb 2011)

Log Message:
-----------
Progress on the runtime optimizer.

- Tracking more statistics in the JGraph trace.
- Tracking the expected #of tuples read as well as the expected cardinality. This lets us look at a proxy for IO costs, but real IO estimates are trickier.
- Made the limit dynamic on a per-path basis and responsive when there is cardinality estimate underflow.
- Never prune a path if the cardinality estimate has underflowed. Instead, increase the sample limit more quickly.

Modified Paths:
--------------
    branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/EdgeSample.java
    branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/EstimatedCardinalityComparator.java
    branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/JGraph.java
    branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/Path.java
    branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/SampleBase.java
    branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/Vertex.java
    branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/VertexSample.java
    branches/QUADS_QUERY_BRANCH/bigdata-rdf/src/test/com/bigdata/bop/rdf/joinGraph/AbstractJoinGraphTestCase.java
    branches/QUADS_QUERY_BRANCH/bigdata-rdf/src/test/com/bigdata/bop/rdf/joinGraph/TestJoinGraphOnBSBMData.java
    branches/QUADS_QUERY_BRANCH/bigdata-rdf/src/test/com/bigdata/bop/rdf/joinGraph/TestJoinGraphOnBarData.java

Modified: branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/EdgeSample.java
===================================================================
--- branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/EdgeSample.java 2011-02-27 21:11:23 UTC (rev 4255)
+++ 
branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/EdgeSample.java 2011-02-27 21:14:03 UTC (rev 4256) @@ -45,6 +45,11 @@ public final int inputCount; /** + * The #of tuples read from the access path when processing the cutoff join. + */ + public final long tuplesRead; + + /** * The #of binding sets generated before the join was cutoff. * <p> * Note: If the outputCount is zero then this is a good indicator that there @@ -53,10 +58,11 @@ */ public final long outputCount; - /** - * The #of tuples read from the access path when processing the cutoff join. - */ - public final long tuplesRead; + /** + * The adjusted cardinality estimate for the cutoff join (this is + * {@link #outputCount} as adjusted for a variety of edge conditions). + */ + public final long adjCard; /** * The ratio of the #of input samples consumed to the #of output samples @@ -64,37 +70,71 @@ */ public final double f; - /** - * Create an object which encapsulates a sample of an edge. - * - * @param sourceSample - * The input sample. - * @param limit - * The limit used to sample the edge (this is the limit on the - * #of solutions generated by the cutoff join used when this - * sample was taken). - * @param inputCount - * The #of binding sets out of the source sample vertex sample - * which were consumed. - * @param outputCount - * The #of binding sets generated before the join was cutoff. - * @param tuplesRead - * The #of tuples read from the access path when processing the - * cutoff join. - */ + /** + * The sum of the fast range count for each access path tested. + * <p> + * Note: We use pipeline joins to sample cutoff joins so there will be one + * access path read for each solution in. However, a hash join could be used + * when the operator is fully executed. The hash join will have one access + * path on which we read for all input solutions and the range count of the + * access path will be larger since the access path will be less + * constrained. 
+ */ + public final long sumRangeCount; + + /** + * Estimated tuples read if the operator were fully executed. This is in + * contrast to {@link SampleBase#estCard}, which is the estimated output + * cardinality if the operator were fully executed. + * + * TODO The actual IOs depend on the join type (hash join versus pipeline + * join) and whether or not the file has index order (segment versus + * journal). A hash join will read once on the AP. A pipeline join will read + * once per input solution. A key-range read on a segment uses multi-block + * IO while a key-range read on a journal uses random IO. Also, remote + * access path reads are more expensive than sharded or hash partitioned + * access path reads in scale-out. + */ + public final long estRead; + + /** + * Create an object which encapsulates a sample of an edge. + * + * @param sourceSample + * The input sample. + * @param limit + * The limit used to sample the edge (this is the limit on the + * #of solutions generated by the cutoff join used when this + * sample was taken). + * @param inputCount + * The #of binding sets out of the source sample vertex sample + * which were consumed. + * @param tuplesRead + * The #of tuples read from the access path when processing the + * cutoff join. + * @param outputCount + * The #of binding sets generated before the join was cutoff. + * @param adjustedCard + * The adjusted cardinality estimate for the cutoff join (this is + * <i>outputCount</i> as adjusted for a variety of edge + * conditions). 
+ */ EdgeSample(final SampleBase sourceSample,// final int inputCount, // + final long tuplesRead,// + final long sumRangeCount,// final long outputCount,// - final long tuplesRead,// + final long adjustedCard,// final double f, // // args to SampleBase - final long estimatedCardinality,// + final long estCard,// + final long estRead,// final int limit,// final EstimateEnum estimateEnum,// final IBindingSet[] sample// ) { - super(estimatedCardinality, limit, estimateEnum, sample); + super(estCard, limit, estimateEnum, sample); if (sourceSample == null) throw new IllegalArgumentException(); @@ -103,22 +143,31 @@ this.inputCount = inputCount; + this.tuplesRead = tuplesRead; + + this.sumRangeCount = sumRangeCount; + this.outputCount = outputCount; + + this.adjCard = adjustedCard; - this.tuplesRead = tuplesRead; + this.f = f; - this.f = f; + this.estRead = estRead; } @Override protected void toString(final StringBuilder sb) { - sb.append(", sourceEstimatedCardinality=" - + sourceSample.estimatedCardinality); - sb.append(", sourceEstimateEnum=" + sourceSample.estimateEnum); - sb.append(", inputCount=" + inputCount); - sb.append(", outputCount=" + outputCount); - sb.append(", f=" + f); - } + sb.append(", sourceEstCard=" + sourceSample.estCard); + sb.append(", sourceEstimateEnum=" + sourceSample.estimateEnum); + sb.append(", inputCount=" + inputCount); + sb.append(", tuplesRead=" + tuplesRead); + sb.append(", sumRangeCount=" + sumRangeCount); + sb.append(", outputCount=" + outputCount); + sb.append(", adjustedCard=" + adjCard); + sb.append(", f=" + f); + sb.append(", estRead=" + estRead); + } } Modified: branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/EstimatedCardinalityComparator.java =================================================================== --- branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/EstimatedCardinalityComparator.java 2011-02-27 21:11:23 UTC (rev 4255) +++ 
branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/EstimatedCardinalityComparator.java 2011-02-27 21:14:03 UTC (rev 4256) @@ -50,8 +50,8 @@ // o2 is not weighted. sort o2 to the end. return -1; } - final long id1 = o1.edgeSample.estimatedCardinality; - final long id2 = o2.edgeSample.estimatedCardinality; + final long id1 = o1.edgeSample.estCard; + final long id2 = o2.edgeSample.estCard; if (id1 < id2) return -1; if (id1 > id2) Modified: branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/JGraph.java =================================================================== --- branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/JGraph.java 2011-02-27 21:11:23 UTC (rev 4255) +++ branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/JGraph.java 2011-02-27 21:14:03 UTC (rev 4256) @@ -83,9 +83,9 @@ * dominated paths. * * TODO Compare the cumulative expected cardinality of a join path with the - * expected cost of a join path. The latter allows us to also explore - * alternative join strategies, such as the parallel subquery versus scan and - * filter decision for named graph and default graph SPARQL queries. + * tuples read of a join path. The latter allows us to also explore alternative + * join strategies, such as the parallel subquery versus scan and filter + * decision for named graph and default graph SPARQL queries. * * TODO Coalescing duplicate access paths can dramatically reduce the work * performed by a pipelined nested index subquery. (A hash join eliminates all @@ -93,14 +93,6 @@ * pipeline nested index subquery join, then should the runtime query optimizer * prefer paths with duplicate access paths? * - * TODO How can we handle things like lexicon joins. A lexicon join is is only - * evaluated when the dynamic type of a variable binding indicates that the RDF - * Value must be materialized by a join against the ID2T index. 
Binding sets - * having inlined values can simply be routed around the join against the ID2T - * index. Routing around saves network IO in scale-out where otherwise we would - * route binding sets having identifiers which do not need to be materialized to - * the ID2T shards. - * * @todo Examine the overhead of the runtime optimizer. Look at ways to prune * its costs. For example, by pruning the search, by recognizing when the * query is simple enough to execute directly, by recognizing when we have @@ -217,8 +209,10 @@ */ public class JGraph { - private static final transient Logger log = Logger.getLogger(JGraph.class); + private static final String NA = "N/A"; + private static final transient Logger log = Logger.getLogger(JGraph.class); + /** * Vertices of the join graph. */ @@ -254,32 +248,37 @@ return sb.toString(); } - /** - * - * @param v - * The vertices of the join graph. These are - * {@link IPredicate}s associated with required joins. - * @param constraints - * The constraints of the join graph (optional). Since all - * joins in the join graph are required, constraints are - * dynamically attached to the first join in which all of - * their variables are bound. - * - * @throws IllegalArgumentException - * if the vertices is <code>null</code>. - * @throws IllegalArgumentException - * if the vertices is an empty array. - * @throws IllegalArgumentException - * if any element of the vertices is <code>null</code>. - * @throws IllegalArgumentException - * if any constraint uses a variable which is never bound by - * the given predicates. - * @throws IllegalArgumentException - * if <i>sampleType</i> is <code>null</code>. - * - * @todo unit test for a constraint using a variable which is never - * bound. - */ + /** + * + * @param v + * The vertices of the join graph. These are {@link IPredicate}s + * associated with required joins. + * @param constraints + * The constraints of the join graph (optional). 
Since all joins + * in the join graph are required, constraints are dynamically + * attached to the first join in which all of their variables are + * bound. + * + * @throws IllegalArgumentException + * if the vertices is <code>null</code>. + * @throws IllegalArgumentException + * if the vertices is an empty array. + * @throws IllegalArgumentException + * if any element of the vertices is <code>null</code>. + * @throws IllegalArgumentException + * if any constraint uses a variable which is never bound by the + * given predicates. + * @throws IllegalArgumentException + * if <i>sampleType</i> is <code>null</code>. + * + * @todo unit test for a constraint using a variable which is never bound. + * the constraint should be attached at the last vertex in the join + * path. this will cause the query to fail unless the variable was + * already bound, e.g., by a parent query or in the solutions pumped + * into the {@link JoinGraph} operator. + * + * @todo unit test when the join graph has a single vertex. + */ public JGraph(final IPredicate<?>[] v, final IConstraint[] constraints, final SampleType sampleType) { @@ -557,30 +556,35 @@ */ sampleAllVertices(queryEngine, limit); - if (log.isDebugEnabled()) { - final StringBuilder sb = new StringBuilder(); - sb.append("Vertices:\n"); - for (Vertex v : V) { - sb.append(v.toString()); - sb.append("\n"); - } - log.debug(sb.toString()); - } + if (log.isInfoEnabled()) { + final StringBuilder sb = new StringBuilder(); + sb.append("Sampled vertices:\n"); + for (Vertex v : V) { + if (v.sample != null) { + sb.append("id="+v.pred.getId()+" : "); + sb.append(v.sample.toString()); + sb.append("\n"); + } + } + log.info(sb.toString()); + } /* * Estimate the cardinality for each edge. 
*/ final Path[] a = estimateInitialEdgeWeights(queryEngine, limit); - if (log.isDebugEnabled()) { - final StringBuilder sb = new StringBuilder(); - sb.append("All possible initial paths:\n"); - for (Path x : a) { - sb.append(x.toString()); - sb.append("\n"); - } - log.debug(sb.toString()); - } +// if (log.isDebugEnabled()) { +// final StringBuilder sb = new StringBuilder(); +// sb.append("All possible initial paths:\n"); +// for (Path x : a) { +// sb.append(x.toString()); +// sb.append("\n"); +// } +// log.debug(sb.toString()); +// } + if (log.isInfoEnabled()) + log.info("\n*** Initial Paths\n" + JGraph.showTable(a)); /* * Choose the initial set of paths. @@ -681,12 +685,11 @@ if (a.length == 0) throw new IllegalArgumentException(); - // increment the limit by itself in each round. - final int limit = (round + 1) * limitIn; +// // increment the limit by itself in each round. +// final int limit = (round + 1) * limitIn; if (log.isDebugEnabled()) - log.debug("round=" + round + ", limit=" + limit - + ", #paths(in)=" + a.length); + log.debug("round=" + round + ", #paths(in)=" + a.length); /* * Re-sample the vertices which are the initial vertex of any of the @@ -703,10 +706,12 @@ * a NOP if the vertex has been fully materialized. */ if (log.isDebugEnabled()) - log.debug("Re-sampling in-use vertices: limit=" + limit); + log.debug("Re-sampling in-use vertices."); for (Path x : a) { + final int limit = x.getNewLimit(limitIn); + x.vertices[0].sample(queryEngine, limit, sampleType); } @@ -725,11 +730,22 @@ * a given path prefix more than once per round. */ if (log.isDebugEnabled()) - log.debug("Re-sampling in-use path segments: limit=" + limit); + log.debug("Re-sampling in-use path segments."); for (Path x : a) { - // The cutoff join sample of the one step shorter path segment. + /* + * Get the new sample limit for the path. 
+ * + * TODO We only need to increase the sample limit starting at the + * vertex where we have a cardinality underflow or variability in + * the cardinality estimate. This is increasing the limit in each + * round of expansion, which means that we are reading more data + * than we really need to read. + */ + final int limit = x.getNewLimit(limitIn); + + // The cutoff join sample of the one step shorter path segment. EdgeSample priorEdgeSample = null; for (int segmentLength = 2; segmentLength <= x.vertices.length; segmentLength++) { @@ -775,6 +791,7 @@ queryEngine, limit,// x.getPathSegment(2),// 1st edge. C,// constraints + V.length == 2,// pathIsComplete x.vertices[0].sample// source sample. ); @@ -812,6 +829,7 @@ limit,// x.getPathSegment(ids.length()),// C, // constraints + V.length == ids.length(), // pathIsComplete priorEdgeSample// ); @@ -844,14 +862,19 @@ */ if (log.isDebugEnabled()) - log.debug("Expanding paths: limit=" + limit + ", #paths(in)=" - + a.length); + log.debug("Expanding paths: #paths(in)=" + a.length); final List<Path> tmp = new LinkedList<Path>(); for (Path x : a) { - /* + /* + * We already increased the sample limit for the path in the loop + * above. + */ + final int limit = x.edgeSample.limit; + + /* * The set of vertices used to expand this path in this round. */ final Set<Vertex> used = new LinkedHashSet<Vertex>(); @@ -916,9 +939,10 @@ // add the new vertex to the set of used vertices. used.add(tVertex); - // Extend the path to the new vertex. - final Path p = x.addEdge(queryEngine, limit, - tVertex, /*dynamicEdge,*/ C); + // Extend the path to the new vertex. + final Path p = x + .addEdge(queryEngine, limit, tVertex, /* dynamicEdge, */ + C, x.getVertexCount() + 1 == V.length/* pathIsComplete */); // Add to the set of paths for this round. tmp.add(p); @@ -954,8 +978,9 @@ final Vertex tVertex = nothingShared.iterator().next(); // Extend the path to the new vertex. 
- final Path p = x.addEdge(queryEngine, limit, - tVertex,/*dynamicEdge*/ C); + final Path p = x + .addEdge(queryEngine, limit, tVertex,/* dynamicEdge */ + C, x.getVertexCount() + 1 == V.length/* pathIsComplete */); // Add to the set of paths for this round. tmp.add(p); @@ -981,12 +1006,12 @@ final Path[] paths_tp1_pruned = pruneJoinPaths(paths_tp1, edgeSamples); if (log.isDebugEnabled()) - log.debug("\n*** round=" + round + ", limit=" + limit + log.debug("\n*** round=" + round + " : generated paths\n" + JGraph.showTable(paths_tp1, paths_tp1_pruned)); if (log.isInfoEnabled()) - log.info("\n*** round=" + round + ", limit=" + limit + log.info("\n*** round=" + round + ": paths{in=" + a.length + ",considered=" + paths_tp1.length + ",out=" + paths_tp1_pruned.length + "}\n" + JGraph.showTable(paths_tp1_pruned)); @@ -1012,15 +1037,31 @@ return null; } - /** - * Obtain a sample and estimated cardinality (fast range count) for each - * vertex. - * - * @param queryEngine - * The query engine. - * @param limit - * The sample size. - */ + /** + * Obtain a sample and estimated cardinality (fast range count) for each + * vertex. + * + * @param queryEngine + * The query engine. + * @param limit + * The sample size. + * + * TODO Only sample vertices with an index. + * + * TODO Consider other cases where we can avoid sampling a vertex + * or an initial edge. + * <p> + * Be careful about rejecting high cardinality vertices here as + * they can lead to good solutions (see the "bar" data set + * example). + * <p> + * BSBM Q5 provides a counter example where (unless we translate + * it into a key-range constraint on an index) some vertices do + * not share a variable directly and hence will materialize the + * full cross product before filtering which is *really* + * expensive. 
+ * + */ public void sampleAllVertices(final QueryEngine queryEngine, final int limit) { for (Vertex v : V) { @@ -1068,7 +1109,9 @@ * create a join path with a single edge (v,vp) using the sample * obtained from the cutoff join. */ - + + final boolean pathIsComplete = 2 == V.length; + for (int i = 0; i < V.length; i++) { final Vertex v1 = V[i]; @@ -1106,7 +1149,7 @@ */ final Vertex v, vp; - if (v1.sample.estimatedCardinality < v2.sample.estimatedCardinality) { + if (v1.sample.estCard < v2.sample.estCard) { v = v1; vp = v2; } else { @@ -1143,6 +1186,7 @@ limit, // sample limit preds, // ordered path segment. C, // constraints + pathIsComplete,// v.sample // sourceSample ); @@ -1181,6 +1225,7 @@ */ public Path[] pruneJoinPaths(final Path[] a, final Map<PathIds, EdgeSample> edgeSamples) { + final boolean neverPruneUnderflow = true; /* * Find the length of the longest path(s). All shorter paths are * dropped in each round. @@ -1198,7 +1243,12 @@ final Path Pi = a[i]; if (Pi.edgeSample == null) throw new RuntimeException("Not sampled: " + Pi); - if (Pi.vertices.length < maxPathLen) { + if (neverPruneUnderflow + && Pi.edgeSample.estimateEnum == EstimateEnum.Underflow) { + // Do not prune if path has cardinality underflow. + continue; + } + if (Pi.vertices.length < maxPathLen) { /* * Only the most recently generated set of paths survive to * the next round. @@ -1214,16 +1264,21 @@ final Path Pj = a[j]; if (Pj.edgeSample == null) throw new RuntimeException("Not sampled: " + Pj); - if (pruned.contains(Pj)) + if (neverPruneUnderflow + && Pj.edgeSample.estimateEnum == EstimateEnum.Underflow) { + // Do not prune if path has cardinality underflow. + continue; + } + if (pruned.contains(Pj)) continue; final boolean isPiSuperSet = Pi.isUnorderedVariant(Pj); if (!isPiSuperSet) { // Can not directly compare these join paths. 
continue; } - final long costPi = Pi.cumulativeEstimatedCardinality; - final long costPj = Pj.cumulativeEstimatedCardinality; - final boolean lte = costPi <= costPj; + final long costPi = Pi.sumEstCard; + final long costPj = Pj.sumEstCard; + final boolean lte = costPi <= costPj; List<Integer> prunedByThisPath = null; if (lte) { prunedByThisPath = new LinkedList<Integer>(); @@ -1363,17 +1418,24 @@ static public String showTable(final Path[] a,final Path[] pruned) { final StringBuilder sb = new StringBuilder(); final Formatter f = new Formatter(sb); - f.format("%-6s %10s%1s * %10s (%6s %6s %6s) = %10s%1s : %10s %10s", + f.format("%-4s %10s%1s * %10s (%8s %8s %8s %8s %8s %8s) = %10s %10s%1s : %10s %10s %10s", "path",// - "sourceCard",// + "srcCard",// "",// sourceSampleExact "f",// + // ( "in",// - "read",// + "sumRgCt",//sumRangeCount + "tplsRead",// "out",// + "limit",// + "adjCard",// + // ) = + "estRead",// "estCard",// "",// estimateIs(Exact|LowerBound|UpperBound) - "sumEstCard",// + "sumEstRead",// sumEstimatedTuplesRead + "sumEstCard",// sumEstimatedCardinality "joinPath\n" ); for (int i = 0; i < a.length; i++) { @@ -1391,21 +1453,27 @@ } final EdgeSample edgeSample = x.edgeSample; if (edgeSample == null) { - f.format("%6d %10s%1s * %10s (%6s %6s %6s) = %10s%1s : %10s",// - i, "N/A", "", "N/A", "N/A", "N/A", "N/A", "N/A", "", - "N/A"); + f.format("%4d %10s%1s * %10s (%8s %8s %8s %8s %8s %8s) = %10s %10s%1s : %10s %10s",// + i, NA, "", NA, NA, NA, NA, NA, NA, NA, NA, NA, "", NA, + NA); } else { - f.format("%6d %10d%1s * % 10.2f (%6d %6d %6d) = % 10d%1s : % 10d", // + f.format("%4d %10d%1s * % 10.2f (%8d %8d %8d %8d %8d %8d) = %10d % 10d%1s : % 10d % 10d", // i,// - edgeSample.sourceSample.estimatedCardinality,// + edgeSample.sourceSample.estCard,// edgeSample.sourceSample.estimateEnum.getCode(),// edgeSample.f,// edgeSample.inputCount,// + edgeSample.sumRangeCount,// edgeSample.tuplesRead,// edgeSample.outputCount,// - edgeSample.estimatedCardinality,// + 
edgeSample.limit,// + edgeSample.adjCard,// + // = + edgeSample.estRead,// + edgeSample.estCard,// edgeSample.estimateEnum.getCode(),// - x.cumulativeEstimatedCardinality// + x.sumEstRead,// + x.sumEstCard// ); } sb.append(" ["); @@ -1443,70 +1511,103 @@ /* * @todo show limit on samples? */ - f.format("%6s %10s%1s * %10s (%6s %6s %6s) = %10s%1s : %10s",// - "vertex", - "sourceCard",// + f.format("%4s %10s%1s * %10s (%8s %8s %8s %8s %8s %8s) = %10s %10s%1s : %10s %10s",// %10s %10s",// + "vert", + "srcCard",// "",// sourceSampleExact "f",// + // ( "in",// - "read",// + "sumRgCt",// sumRangeCount + "tplsRead",// tuplesRead "out",// + "limit",// + "adjCard",// + // ) = + "estRead",// "estCard",// "",// estimateIs(Exact|LowerBound|UpperBound) + "sumEstRead",// "sumEstCard"// ); - long sumEstCard = 0; + long sumEstRead = 0; // sum(estRead), where estRead := tuplesRead*f + long sumEstCard = 0; // sum(estCard) +// double sumEstCost = 0; // sum(f(estCard,estRead)) for (int i = 0; i < x.vertices.length; i++) { final int[] ids = BOpUtility .getPredIds(x.getPathSegment(i + 1)); final int predId = x.vertices[i].pred.getId(); final SampleBase sample; - if(i==0) { - sample = x.vertices[i].sample; - } else { - // edge sample from the caller's map. - sample = edgeSamples.get(new PathIds(ids)); - } - if (sample != null) { - sumEstCard += sample.estimatedCardinality; - if (sample instanceof EdgeSample) - sumEstCard += ((EdgeSample) sample).tuplesRead; - } + if (i == 0) { + sample = x.vertices[i].sample; + if (sample != null) { + sumEstRead = sample.estCard; // dbls as estRead for vtx + sumEstCard = sample.estCard; + } + } else { + // edge sample from the caller's map. 
+ sample = edgeSamples.get(new PathIds(ids)); + if (sample != null) { + sumEstRead+= ((EdgeSample) sample).estRead; + sumEstCard += ((EdgeSample) sample).estCard; + } + } sb.append("\n"); if (sample == null) { - f.format("% 6d %10s%1s * %10s (%6s %6s %6s) = %10s%1s : %10s",// + f.format("% 4d %10s%1s * %10s (%8s %8s %8s %8s %8s %8s) = %10s %10s%1s : %10s %10s",// %10s %10s",// predId,// - "N/A", "", "N/A", "N/A", "N/A", "N/A", "N/A", "", "N/A"); + NA, "", NA, NA, NA, NA, NA, NA, NA, NA, NA, "", NA, NA);//,NA,NA); } else if(sample instanceof VertexSample) { - // Show the vertex sample for the initial vertex. - f.format("% 6d %10s%1s * %10s (%6s %6s %6s) = % 10d%1s : %10d",// + /* + * Show the vertex sample for the initial vertex. + * + * Note: we do not store all fields for a vertex sample + * which are stored for an edge sample because so many of + * the values are redundant for a vertex sample. Therefore, + * this sets up local variables which are equivalent to the + * various edge sample columns that we will display. + */ + final long sumRangeCount = sample.estCard; + final long estRead = sample.estCard; + final long tuplesRead = Math.min(sample.estCard, sample.limit); + final long outputCount = Math.min(sample.estCard, sample.limit); + final long adjCard = Math.min(sample.estCard, sample.limit); + f.format("% 4d %10s%1s * %10s (%8s %8s %8s %8s %8s %8s) = % 10d % 10d%1s : %10d %10d",// %10d %10s",// predId,// - "N/A",//sample.sourceSample.estimatedCardinality,// - " ",//sample.sourceSample.isExact() ? 
"E" : "",// + " ",//srcSample.estCard + " ",//srcSample.estimateEnum " ",//sample.f,// - "N/A",//sample.inputCount,// - "N/A",//sample.tuplesRead,// - "N/A",//sample.outputCount,// - sample.estimatedCardinality,// + " ",//sample.inputCount, + sumRangeCount,// + tuplesRead,// + outputCount,// + sample.limit,// limit + adjCard,// adjustedCard + estRead,// estRead + sample.estCard,// estCard sample.estimateEnum.getCode(),// + sumEstRead,// sumEstCard// -// e.cumulativeEstimatedCardinality// ); } else { // Show the sample for a cutoff join with the 2nd+ vertex. final EdgeSample edgeSample = (EdgeSample)sample; - f.format("% 6d %10d%1s * % 10.2f (%6d %6d %6d) = % 10d%1s : %10d",// + f.format("% 4d %10d%1s * % 10.2f (%8d %8d %8d %8d %8d %8d) = % 10d % 10d%1s : %10d %10d",// %10d %10",// predId,// - edgeSample.sourceSample.estimatedCardinality,// + edgeSample.sourceSample.estCard,// edgeSample.sourceSample.estimateEnum.getCode(),// edgeSample.f,// edgeSample.inputCount,// + edgeSample.sumRangeCount,// edgeSample.tuplesRead,// edgeSample.outputCount,// - edgeSample.estimatedCardinality,// + edgeSample.limit,// + edgeSample.adjCard,// + edgeSample.estRead,// + edgeSample.estCard,// edgeSample.estimateEnum.getCode(),// + sumEstRead,// sumEstCard// -// e.cumulativeEstimatedCardinality// ); } } Modified: branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/Path.java =================================================================== --- branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/Path.java 2011-02-27 21:11:23 UTC (rev 4255) +++ branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/Path.java 2011-02-27 21:14:03 UTC (rev 4256) @@ -91,19 +91,43 @@ * is computed by the constructor and cached as it is used repeatedly. */ private final IPredicate<?>[] preds; + + /** + * The sample obtained by the step-wise cutoff evaluation of the ordered + * edges of the path. 
+ * <p> + * Note: This sample is generated one edge at a time rather than by + * attempting the cutoff evaluation of the entire join path (the latter + * approach does allow us to limit the amount of work to be done to satisfy + * the cutoff). + * <p> + * Note: This is updated when we resample the path prior to expanding the + * path with another vertex. + */ + EdgeSample edgeSample;// TODO rename pathSample? + + /** + * Examine the path. If there is a cardinality underflow, then boost the + * sampling limit. Otherwise, increase the sample by the caller's value. + * + * @param limitIn + * The default increment for the sample limit. + * + * @return The limit to use when resampling this path. + */ + public int getNewLimit(final int limitIn) { + + if (edgeSample.estimateEnum == EstimateEnum.Underflow) { + + return edgeSample.limit * 2; + + } + + return edgeSample.limit + limitIn; + + } /** - * The sample obtained by the step-wise cutoff evaluation of the ordered - * edges of the path. - * <p> - * Note: This sample is generated one edge at a time rather than by - * attempting the cutoff evaluation of the entire join path (the latter - * approach does allow us to limit the amount of work to be done to - * satisfy the cutoff). - */ - EdgeSample edgeSample; - - /** * The cumulative estimated cardinality of the path. This is zero for an * empty path. For a path consisting of a single edge, this is the estimated * cardinality of that edge. When creating a new path by adding an edge to @@ -118,29 +142,36 @@ * vertex in path order. The EdgeSamples are maintained in a map * managed by JGraph during optimization. */ - final public long cumulativeEstimatedCardinality; + final public long sumEstCard; + /** + * The cumulative estimated #of tuples that would be read for this path if + * it were to be fully executed (sum(tuplesRead*f) for each step in the + * path). + */ + final public long sumEstRead; + + /** + * The expected cost of this join path if it were to be fully executed. 
This + * is a function of {@link #sumEstCard} and {@link #sumEstRead}. The + * former reflects the #of intermediate solutions generated. The latter + * reflects the #of tuples read from the disk. These two measures are + * tracked separately and then combined into the {@link #sumEstCost}. + */ + final public double sumEstCost; + /** - * Combine the cumulative estimated cost of the source path with the cost of - * the edge sample and return the cumulative estimated cost of the new path. + * Combine the cumulative expected cardinality and the cumulative expected + * tuples read to produce an overall measure of the expected cost of the + * join path if it were to be fully executed. * - * @param cumulativeEstimatedCardinality - * The cumulative estimated cost of the source path. - * @param edgeSample - * The cost estimate for the cutoff join required to extend the - * source path to the new path. - * @return The cumulative estimated cost of the new path. - * - * FIXME Figure out how to properly combine/weight the #of tuples - * read and the #of solutions produced! + * @return The cumulative estimated cost of the join path. 
*/ - static private long add(final long cumulativeEstimatedCardinality, - final EdgeSample edgeSample) { + private static double getTotalCost(final Path p) { - final long total = cumulativeEstimatedCardinality + // - edgeSample.estimatedCardinality // -// + edgeSample.tuplesRead // - ; + final long total = p.sumEstCard + // + p.sumEstRead// + ; return total; @@ -162,7 +193,7 @@ // sb.append(e.getLabel()); // first = false; // } - sb.append("],cumEstCard=" + cumulativeEstimatedCardinality + sb.append("],cumEstCard=" + sumEstCard + ",sample=" + edgeSample + "}"); return sb.toString(); } @@ -205,48 +236,52 @@ if (edgeSample.getSample() == null) throw new IllegalArgumentException(); -// this.edges = Collections.singletonList(e); + this.vertices = new Vertex[]{v0,v1}; - this.vertices = new Vertex[]{v0,v1};//getVertices(edges); - this.preds = getPredicates(vertices); this.edgeSample = edgeSample; - this.cumulativeEstimatedCardinality = add(0L/*cumulativeEstimatedCardinality*/,edgeSample); -// edgeSample.estimatedCardinality +// -// edgeSample.tuplesRead// this is part of the cost too. -// ; + /* + * The estimated cardinality of the cutoff join of (v0,v1). + */ + this.sumEstCard = edgeSample.estCard; -// this.cumulativeEstimatedCardinality = // -// edgeSample.estimatedCardinality +// -// edgeSample.tuplesRead// this is part of the cost too. -// ; + /* + * The expected #of tuples read for the full join of (v0,v1). This is + * everything which could be visited for [v0] plus the #of tuples read + * from [v1] during the cutoff join times the (adjusted) join hit ratio. + */ + this.sumEstRead = v0.sample.estCard + edgeSample.estRead; + this.sumEstCost = getTotalCost(this); + } - /** - * Private constructor used when we extend a path. - * - * @param vertices - * The ordered array of vertices in the new path. The last entry - * in this array is the vertex which was used to extend the path. 
- * @param preds - * The ordered array of predicates in the new path (correlated - * with the vertices and passed in since it is already computed - * by the caller). - * @param cumulativeEstimatedCardinality - * The cumulative estimated cardinality of the new path. - * @param edgeSample - * The sample from the cutoff join of the last vertex added to - * this path. - */ + /** + * Private constructor used when we extend a path. + * + * @param vertices + * The ordered array of vertices in the new path. The last entry + * in this array is the vertex which was used to extend the path. + * @param preds + * The ordered array of predicates in the new path (correlated + * with the vertices and passed in since it is already computed + * by the caller). + * @param edgeSample + * The sample from the cutoff join of the last vertex added to + * this path. + * @param sumEstCard + * The cumulative estimated cardinality of the new path. + * @param sumEstRead + * The cumulative estimated tuples read of the new path. 
+ */ private Path(// final Vertex[] vertices,// final IPredicate<?>[] preds,// -// final List<Edge> edges,// - final long cumulativeEstimatedCardinality,// - final EdgeSample edgeSample// + final EdgeSample edgeSample,// + final long sumEstCard,// + final long sumEstRead// ) { if (vertices == null) @@ -258,7 +293,7 @@ if (vertices.length != preds.length) throw new IllegalArgumentException(); - if (cumulativeEstimatedCardinality < 0) + if (sumEstCard < 0) throw new IllegalArgumentException(); if (edgeSample == null) @@ -267,15 +302,17 @@ if (edgeSample.getSample() == null) throw new IllegalArgumentException(); -// this.edges = Collections.unmodifiableList(edges); + this.vertices = vertices; - this.vertices = vertices; - - this.preds = preds; - - this.cumulativeEstimatedCardinality = cumulativeEstimatedCardinality; + this.preds = preds; - this.edgeSample = edgeSample; + this.edgeSample = edgeSample; + + this.sumEstCard = sumEstCard; + + this.sumEstRead = sumEstRead; + + this.sumEstCost = getTotalCost(this); } @@ -552,6 +589,9 @@ * The new vertex. * @param constraints * The join graph constraints (if any). + * @param pathIsComplete + * <code>true</code> iff all vertices in the join graph are + * incorporated into this path. * * @return The new path. The materialized sample for the new path is the * sample obtained by the cutoff join for the edge added to the @@ -560,7 +600,8 @@ * @throws Exception */ public Path addEdge(final QueryEngine queryEngine, final int limit, - final Vertex vnew, final IConstraint[] constraints) + final Vertex vnew, final IConstraint[] constraints, + final boolean pathIsComplete) throws Exception { if (vnew == null) @@ -626,59 +667,63 @@ limit, // preds2,// constraints,// + pathIsComplete,// this.edgeSample // the source sample. ); - { - final long cumulativeEstimatedCardinality = add( - this.cumulativeEstimatedCardinality, edgeSample2); + // Extend the path. 
+ final Path tmp = new Path(// + vertices2,// + preds2,// + edgeSample2,// + this.sumEstCard + edgeSample2.estCard,// sumEstCard + this.sumEstRead + edgeSample2.estRead // sumEstRead + ); - // Extend the path. - final Path tmp = new Path(vertices2, preds2, - cumulativeEstimatedCardinality, edgeSample2); + return tmp; - return tmp; - - } - } - /** - * Cutoff join of the last vertex in the join path. - * <p> - * <strong>The caller is responsible for protecting against needless - * re-sampling.</strong> This includes cases where a sample already exists - * at the desired sample limit and cases where the sample is already exact. - * - * @param queryEngine - * The query engine. - * @param limit - * The limit for the cutoff join. - * @param path - * The path segment, which must include the target vertex as the - * last component of the path segment. - * @param constraints - * The constraints declared for the join graph (if any). The - * appropriate constraints will be applied based on the variables - * which are known to be bound as of the cutoff join for the last - * vertex in the path segment. - * @param sourceSample - * The input sample for the cutoff join. When this is a one-step - * estimation of the cardinality of the edge, then this sample is - * taken from the {@link VertexSample}. When the edge (vSource, - * vTarget) extends some {@link Path}, then this is taken from - * the {@link EdgeSample} for that {@link Path}. - * - * @return The result of sampling that edge. - * - * @throws Exception - */ + /** + * Cutoff join of the last vertex in the join path. + * <p> + * <strong>The caller is responsible for protecting against needless + * re-sampling.</strong> This includes cases where a sample already exists + * at the desired sample limit and cases where the sample is already exact. + * + * @param queryEngine + * The query engine. + * @param limit + * The limit for the cutoff join. 
+ * @param path + * The path segment, which must include the target vertex as the + * last component of the path segment. + * @param constraints + * The constraints declared for the join graph (if any). The + * appropriate constraints will be applied based on the variables + * which are known to be bound as of the cutoff join for the last + * vertex in the path segment. + * @param pathIsComplete + * <code>true</code> iff all vertices in the join graph are + * incorporated into this path. + * @param sourceSample + * The input sample for the cutoff join. When this is a one-step + * estimation of the cardinality of the edge, then this sample is + * taken from the {@link VertexSample}. When the edge (vSource, + * vTarget) extends some {@link Path}, then this is taken from + * the {@link EdgeSample} for that {@link Path}. + * + * @return The result of sampling that edge. + * + * @throws Exception + */ static public EdgeSample cutoffJoin(// final QueryEngine queryEngine,// final int limit,// final IPredicate<?>[] path,// final IConstraint[] constraints,// + final boolean pathIsComplete,// final SampleBase sourceSample// ) throws Exception { @@ -702,8 +747,8 @@ // Figure out which constraints attach to each predicate. final IConstraint[][] constraintAttachmentArray = PartitionedJoinGroup - .getJoinGraphConstraints(path, constraints,null/*knownVariables*/, - false/*FIXME pathIsComplete*/); + .getJoinGraphConstraints(path, constraints, null/*knownBound*/, + pathIsComplete); // The constraint(s) (if any) for this join. final IConstraint[] c = constraintAttachmentArray[path.length - 1]; @@ -793,6 +838,7 @@ final List<IBindingSet> result = new LinkedList<IBindingSet>(); try { + int nresults = 0; try { IBindingSet bset = null; // Figure out the #of source samples consumed. @@ -801,10 +847,11 @@ while (itr.hasNext()) { bset = itr.next(); result.add(bset); + nresults++; // TODO break out if cutoff join over produces! } } finally { // verify no problems. 
- runningQuery.get(); + runningQuery.get(); // TODO CANCEL query once limit is satisfied THEN check the future for errors. } } finally { runningQuery.cancel(true/* mayInterruptIfRunning */); @@ -822,8 +869,11 @@ final int inputCount = (int) joinStats.inputSolutions.get(); // #of solutions out. - long outputCount = joinStats.outputSolutions.get(); + final long outputCount = joinStats.outputSolutions.get(); + // #of solutions out as adjusted for various edge conditions. + final long adjustedCard; + // cumulative range count of the sampled access paths. final long sumRangeCount = joinStats.accessPathRangeCount.get(); @@ -838,6 +888,7 @@ * number of output solutions. */ estimateEnum = EstimateEnum.Exact; + adjustedCard = outputCount; } else if (inputCount == 1 && outputCount == limit) { /* * If the inputCount is ONE (1) and the outputCount is the limit, @@ -856,11 +907,11 @@ * are really better to be dropped. */ // replace outputCount with the sum of the sampled range counts. - outputCount = sumRangeCount; + adjustedCard = sumRangeCount; estimateEnum = EstimateEnum.LowerBound; } else if ((sourceSample.estimateEnum != EstimateEnum.Exact) - && inputCount == Math.min(sourceSample.limit, - sourceSample.estimatedCardinality) && outputCount == 0) { + /*&& inputCount == Math.min(sourceSample.limit, + sourceSample.estimatedCardinality) */ && outputCount == 0) { /* * When the source sample was not exact, the inputCount is EQ to the * lesser of the source range count and the source sample limit, and @@ -874,10 +925,16 @@ * Note: An apparent join hit ratio of zero does NOT imply that the * join will be empty (unless the source vertex sample is actually * the fully materialized access path - this case is covered above). 
+ * + * path sourceCard * f ( in read out limit adjCard) = estCard : sumEstCard joinPath + * 15 4800L * 0.00 ( 200 200 0 300 0) = 0 : 3633 [ 3 1 6 5 ] + */ estimateEnum = EstimateEnum.Underflow; + adjustedCard = outputCount; } else { estimateEnum = EstimateEnum.Normal; + adjustedCard = outputCount; } /* @@ -891,20 +948,43 @@ * read. */ final long tuplesRead = joinStats.accessPathUnitsIn.get(); - - final double f = outputCount == 0 ? 0 - : (outputCount / (double) inputCount); - final long estimatedCardinality = (long) (sourceSample.estimatedCardinality * f); + /* + * Compute the join hit ratio based on the adjusted cardinality + * estimate. + */ + final double f = adjustedCard == 0 ? 0 + : (adjustedCard / (double) inputCount); +// final double f = outputCount == 0 ? 0 +// : (outputCount / (double) inputCount); + // estimated output cardinality of fully executed operator. + final long estCard = (long) (sourceSample.estCard * f); + + /* + * estimated tuples read for fully executed operator + * + * TODO The actual IOs depend on the join type (hash join versus + * pipeline join) and whether or not the file has index order (segment + * versus journal). A hash join will read once on the AP. A pipeline + * join will read once per input solution. A key-range read on a segment + * uses multi-block IO while a key-range read on a journal uses random + * IO. Also, remote access path reads are more expensive than sharded + * or hash partitioned access path reads in scale-out. + */ + final long estRead = (long) (sumRangeCount * f); + final EdgeSample edgeSample = new EdgeSample(// sourceSample,// inputCount,// + tuplesRead,// + sumRangeCount,// outputCount, // - tuplesRead,// + adjustedCard,// f, // // args to SampleBase - estCard, // estimated output cardinality if fully executed. + estRead, // estimated tuples read if fully executed.
limit, // estimateEnum,// result.toArray(new IBindingSet[result.size()])); Modified: branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/SampleBase.java =================================================================== --- branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/SampleBase.java 2011-02-27 21:11:23 UTC (rev 4255) +++ branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/SampleBase.java 2011-02-27 21:14:03 UTC (rev 4256) @@ -56,11 +56,17 @@ private static final transient Logger log = Logger .getLogger(SampleBase.class); - /** - * The estimated cardinality of the underlying access path (for a vertex) or - * the join (for a cutoff join). - */ - public final long estimatedCardinality; + /** + * The total estimated cardinality of the underlying access path (for a + * vertex) or the join path segment (for a cutoff join). + * + * TODO When using a non-perfect index, the estimated cardinality is only + * part of the cost. The #of tuples scanned is also important. Even when + * scanning and filtering in key order this could trigger random IOs unless + * the file has index order (an IndexSegment file has index order but a + * BTree on a journal does not). + */ + public final long estCard; /** * The limit used to produce the {@link #getSample() sample}. 
@@ -156,7 +162,7 @@ if (sample == null) throw new IllegalArgumentException(); - this.estimatedCardinality = estimatedCardinality; + this.estCard = estimatedCardinality; this.limit = limit; @@ -180,7 +186,7 @@ public String toString() { final StringBuilder sb = new StringBuilder(); sb.append(getClass().getSimpleName()); - sb.append("{estimatedCardinality=" + estimatedCardinality); + sb.append("{estCard=" + estCard); sb.append(",limit=" + limit); sb.append(",estimateEnum=" + estimateEnum); { Modified: branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/Vertex.java =================================================================== --- branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/Vertex.java 2011-02-27 21:11:23 UTC (rev 4255) +++ branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/Vertex.java 2011-02-27 21:14:03 UTC (rev 4256) @@ -153,7 +153,7 @@ final IAccessPath ap = context.getAccessPath(r, pred); final long rangeCount = oldSample == null ? ap - .rangeCount(false/* exact */) : oldSample.estimatedCardinality; + .rangeCount(false/* exact */) : oldSample.estCard; if (rangeCount <= limit) { Modified: branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/VertexSample.java =================================================================== --- branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/VertexSample.java 2011-02-27 21:11:23 UTC (rev 4255) +++ branches/QUADS_QUERY_BRANCH/bigdata/src/java/com/bigdata/bop/joinGraph/rto/VertexSample.java 2011-02-27 21:14:03 UTC (rev 4256) @@ -37,7 +37,7 @@ * historical view or even a time scale of query which is significantly * faster than update). * - * @param estimatedCardinality + * @param estCard * The estimated cardinality. * @param limit * The cutoff limit used to make that cardinality estimate. @@ -49,10 +49,10 @@ * @param sample * The sample. 
*/ - public VertexSample(final long estimatedCardinality, final int limit, + public VertexSample(final long estCard, final int limit, final EstimateEnum estimateEnum, final IBindingSet[] sample) { - super(estimatedCardinality, limit, estimateEnum, sample); + super(estCard, limit, estimateEnum, sample); switch (estimateEnum) { case Normal: Modified: branches/QUADS_QUERY_BRANCH/bigdata-rdf/src/test/com/bigdata/bop/rdf/joinGraph/AbstractJoinGraphTestCase.java =================================================================== --- branches/QUADS_QUERY_BRANCH/bigdata-rdf/src/test/com/bigdata/bop/rdf/joinGraph/AbstractJoinGraphTestCase.java 2011-02-27 21:11:23 UTC (rev 4255) +++ branches/QUADS_QUERY_BRANCH/bigdata-rdf/src/test/com/bigdata/bop/rdf/joinGraph/AbstractJoinGraphTestCase.java 2011-02-27 21:14:03 UTC (rev 4256) @@ -39,6 +39,7 @@ import com.bigdata.bop.BOpContextBase; import com.bigdata.bop.BOpIdFactory; +import com.bigdata.bop.BOpUtility; import com.bigdata.bop.IBindingSet; import com.bigdata.bop.IConstraint; import com.bigdata.bop.IPredicate; @@ -89,8 +90,17 @@ /** The initial sampling limit. */ static private final int limit = 100; - - /** The #of edges considered for the initial paths. */ + + /** + * The #of edges considered for the initial paths. + * + * FIXME We need to consider all of the low cardinality vertices, e.g., BSBM + * Q5 has 3 such vertices. Also, we should not consider vertices when + * looking for the initial edges which are relatively unconstrained (e.g., + * 1-bound). This could be handled by sampling the top-N vertices in reverse + * rank order of their cardinality and any with a cardinality LT 10x the + * initial sample limit. 
+ */ static private final int nedges = 2; static private final SampleType sampleType = SampleType.EVEN; @@ -240,9 +250,14 @@ final IPredicate<?>[] runtimePredOrder = runRuntimeQueryOptimizer( getQueryEngine(), limit, nedges, sampleType, preds, constraints); + long totalGivenTime = 0; long totalRuntimeTime = 0; long totalStaticTime = 0; + long givenSolutions = 0; + long runtimeSolutions = 0; + long staticSolutions = ... [truncated message content] |
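The cost model this revision introduces — a join hit ratio f computed from the adjusted cardinality, estCard and estRead obtained by scaling the source estimates by f, and sumEstCost combining the two running sums — can be sketched in isolation. The following is an illustrative reconstruction with hypothetical numbers, not the actual Path/EdgeSample classes from the patch; the PathCost class and all values in main() are stand-ins:

```java
// Illustrative sketch of the RTO cost model from this revision; names mirror
// the patch (estCard, estRead, f, adjustedCard) but the classes are stand-ins.
public final class CutoffJoinCostSketch {

    /** Minimal stand-in for the per-path running totals kept on Path. */
    static final class PathCost {
        final long sumEstCard; // cumulative estimated cardinality (solutions)
        final long sumEstRead; // cumulative estimated tuples read (IO proxy)
        PathCost(final long sumEstCard, final long sumEstRead) {
            this.sumEstCard = sumEstCard;
            this.sumEstRead = sumEstRead;
        }
        /** Combine the two measures as getTotalCost() does in the patch. */
        double totalCost() {
            return sumEstCard + sumEstRead;
        }
    }

    /** Join hit ratio from the cutoff join, using the adjusted cardinality. */
    static double joinHitRatio(final long adjustedCard, final long inputCount) {
        return adjustedCard == 0 ? 0 : (adjustedCard / (double) inputCount);
    }

    /** Scale a source estimate by f to estimate the fully executed join. */
    static long estimate(final long sourceEstimate, final double f) {
        return (long) (sourceEstimate * f);
    }

    public static void main(final String[] args) {
        // Hypothetical cutoff-join sample: 200 input solutions, 100 output
        // solutions (no edge condition, so adjustedCard == outputCount),
        // source cardinality 4800, summed access path range count 9600.
        final double f = joinHitRatio(100, 200);  // 0.5
        final long estCard = estimate(4800, f);   // 2400
        final long estRead = estimate(9600, f);   // 4800
        // Extend a path whose running totals were (3633, 1000).
        final PathCost extended = new PathCost(3633 + estCard, 1000 + estRead);
        System.out.println(extended.totalCost()); // 11833.0
    }
}
```

Keeping sumEstCard and sumEstRead as separate running sums and combining them only in totalCost() localizes the open question flagged in the log message ("Figure out how to properly combine/weight the #of tuples read and the #of solutions produced!"): a weighted combination would be a one-line change in this sketch.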