<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Recent changes to Home</title><link>https://sourceforge.net/p/jacaddm/wiki/Home/</link><description>Recent changes to Home</description><atom:link href="https://sourceforge.net/p/jacaddm/wiki/Home/feed" rel="self"/><language>en</language><lastBuildDate>Mon, 19 Dec 2016 09:15:02 -0000</lastBuildDate><atom:link href="https://sourceforge.net/p/jacaddm/wiki/Home/feed" rel="self" type="application/rss+xml"/><item><title>Home modified by Xavier</title><link>https://sourceforge.net/p/jacaddm/wiki/Home/</link><description>&lt;div class="markdown_content"&gt;&lt;pre&gt;--- v7
+++ v8
@@ -1,4 +1,3 @@
-

 &lt;h1&gt;Overview&lt;/h1&gt;

@@ -16,8 +15,8 @@
 &lt;ul&gt;
 &lt;li&gt;Installation: &lt;a&gt;https://sourceforge.net/p/jacaddm/wiki/Installation/&lt;/a&gt;&lt;/li&gt;
 &lt;li&gt;Running experiments (GUI): &lt;a&gt;https://sourceforge.net/p/jacaddm/wiki/Running%20experiments&lt;/a&gt;&lt;/li&gt;
-&lt;li&gt;Running experiments (headless):&lt;/li&gt;
-&lt;li&gt;Adding new Learning Strategies&lt;/li&gt;
-&lt;li&gt;Adding new Artifacts&lt;/li&gt;
+&lt;li&gt;Running experiments (headless): https://sourceforge.net/p/jacaddm/wiki/Running%20experiments%20in%20headless%20mode/ &lt;/li&gt;
+&lt;li&gt;Adding new Learning Strategies: https://sourceforge.net/p/jacaddm/wiki/Adding%20new%20Learning%20Strategies/&lt;/li&gt;
+&lt;li&gt;Adding new Artifacts: https://sourceforge.net/p/jacaddm/wiki/Adding%20new%20Artifacts/&lt;/li&gt;
 &lt;/ul&gt;

&lt;/pre&gt;
&lt;/div&gt;</description><dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">Xavier</dc:creator><pubDate>Mon, 19 Dec 2016 09:15:02 -0000</pubDate><guid>https://sourceforge.net4f52d1a465547fdaf09759b28f0193ce8f9df90d</guid></item><item><title>Home modified by Xavier</title><link>https://sourceforge.net/p/jacaddm/wiki/Home/</link><description>&lt;div class="markdown_content"&gt;&lt;pre&gt;--- v6
+++ v7
@@ -1,71 +1,23 @@
-This is a basic tutorial to get you started with JaCaDDM.

-&lt;h1&gt;System requirements&lt;/h1&gt;
- &lt;ul&gt;
-  &lt;li&gt;Java SE Runtime Environment 7 (JRE)&lt;/li&gt;
-   &lt;li&gt;CUDA 7 or greater (only for GPU strategies) &lt;/li&gt;
-   &lt;li&gt;Python2 or Python3 for running experiments from the scripts&lt;/li&gt;
-&lt;/ul&gt; 

- &lt;h1&gt;Quick installation&lt;/h1&gt;
+&lt;h1&gt;Overview&lt;/h1&gt;

- The zip file with JaCaDDM (&lt;a&gt;https://sourceforge.net/projects/jacaddm/files/latest/download&lt;/a&gt;) contains 3 main directories:
- &lt;ul&gt;
-   &lt;li&gt; node0: source, libraries, and binaries related to node0. It can be thought of as the main project, where the configuration GUI is defined. To execute the GUI on Unix-like systems run "run.sh".
-   You only need this directory on the computer that will control the experiment process.&lt;/li&gt;
+JaCaDDM is an agents &amp;amp; artifacts oriented Distributed Data Mining (DDM) tool for the creation and testing of Learning Strategies. A Learning Strategy is an encapsulated DDM workflow modeling the interaction of agents with their environment (which includes DM tools) and other agents, with the objective of creating a classification model from distributed data.&lt;br/&gt;

-   &lt;li&gt;defaultNode: source, libraries, and binaries related to each node_1,...,node_n. A node represents a logical site holding data to be used in a DDM process. To execute on Unix-like systems run "run.sh IP Port", where IP and Port are configurable parameters. You have to copy this directory to each computer of interest (but a single computer may execute various instances of the program with different port numbers). &lt;/li&gt;
+Learning strategies are meant to be general in the sense that they can be applied to any distributed setting; deployment details are managed by the JaCaDDM platform. In this sense, learning strategies are plug-and-play. The JaCaDDM distribution already comes with a set of different learning strategies to try, and it is also possible to add new ones.&lt;br/&gt;

-   &lt;li&gt; sampleProtocols: contains various directories, each of which represents a learning strategy. Each of these directories contains an XML file with the definition of the strategy. You only need this directory on the same computer as the node0 directory. &lt;/li&gt;
-&lt;/ul&gt; 
+JaCaDDM considers as a distributed setting any kind of environment where data is split across various sites (even geographically distributed). With JaCaDDM it is possible to configure and launch a deployment that takes into account the different sites, and their data, that participate in the DDM process. As mentioned, the actual process is encapsulated in the learning strategy, which may have configurable parameters that can be set as part of the general configuration.&lt;br/&gt;

- &lt;h1&gt;Preparations for running an experiment&lt;/h1&gt;
- &lt;ul&gt;
-  &lt;li&gt;First of all, make sure that you followed the installation steps previously outlined&lt;/li&gt;
-  &lt;li&gt;If you intend to do a distributed experiment you have to make sure that your network and security settings are correct, in order to allow remote hosts to communicate with each other through their IP addresses and configured service ports &lt;/li&gt;&lt;br/&gt;
-  &lt;li&gt;Execute node0; this will open the configuration GUI. You can do this through the run.sh script or using the following command (while placed in the node0 directory):&lt;br/&gt;
-  java    -classpath lib/jason.jar:bin/classes:lib/weka.jar:lib/cartago.jar:lib/c4jason.jar:lib/jacamo.jar:lib/moa.jar:lib/moise.jar:lib/pentaho.jar:lib/sizeofag-1.0.0.jar jason.infra.centralised.RunCentralisedMAS experimenter.mas2j&lt;/li&gt;
+JaCaDDM provides a tool to experiment and do research with different DDM approaches, as it evaluates the produced classification model, yielding various performance statistics (total time, classification accuracy, network traffic produced, model complexity, confusion matrix).&lt;br/&gt;

-  &lt;li&gt;Execute a defaultNode in each computer of interest (you can run various instances on the same computer using different ports); you need at least one such node. The execution can be done through the run.sh script inside the defaultNode directory; the IP and Port must be passed. You can also execute a node through the following command (while placed in the defaultNode directory; you also need to pass the IP and Port):&lt;br/&gt;
-    java  -classpath bin:lib/cartago.jar:lib/experimenter.jar:lib/moa.jar:lib/weka.jar:lib/pentaho.jar:lib/modl_gkclass.jar defaultClient.Main IP:Port&lt;/li&gt;  
-&lt;/ul&gt; 
+JaCaDDM can be extended by adding new learning strategies and artifacts. Artifacts are first-class entities in the agent environment that encapsulate services; in the case of JaCaDDM, these services consist of DM-related tools.&lt;br/&gt;

-&lt;h1&gt;How to configure and launch an experiment&lt;/h1&gt;
-The GUI presents several configurable fields that must be set.
-![GUI](https://a.fsdn.com/con/app/proj/jacaddm/screenshots/main.png)
+More details can be found in the following wiki pages:
+&lt;ul&gt;
+&lt;li&gt;Installation: &lt;a&gt;https://sourceforge.net/p/jacaddm/wiki/Installation/&lt;/a&gt;&lt;/li&gt;
+&lt;li&gt;Running experiments (GUI): &lt;a&gt;https://sourceforge.net/p/jacaddm/wiki/Running%20experiments&lt;/a&gt;&lt;/li&gt;
+&lt;li&gt;Running experiments (headless):&lt;/li&gt;
+&lt;li&gt;Adding new Learning Strategies&lt;/li&gt;
+&lt;li&gt;Adding new Artifacts&lt;/li&gt;
+&lt;/ul&gt;

-These fields are divided into the following categories:
-&lt;ul&gt;
-  &lt;li&gt;Main configuration: it is possible to load existing configurations or save the current one to an XML file&lt;/li&gt;
-  &lt;li&gt;Node0 configuration: to establish the IP where node0 is running, this is useful if the computer has more than one network interface.&lt;/li&gt;
-  &lt;li&gt;Nodes Configuration: this configuration allows node0 to recognize which  default nodes participate in an experiment. For each default node some configuration fields are in order:
-    &lt;ul&gt;
-      &lt;li&gt;name: a logical name to refer to the node &lt;/li&gt;
-      &lt;li&gt;IP and port: the IP and port of the node &lt;/li&gt;
-        &lt;li&gt;Data file path: the path where the data of the node is stored. This data is the data that will be used in the DDM process, and it is considered training data. &lt;/li&gt;
-    &lt;/ul&gt;&lt;/li&gt;
-  &lt;li&gt;Strategy Configuration: a strategy can be loaded via its XML definition file. When a strategy is successfully loaded, it is possible to establish the values of the special parameters that it defines.&lt;/li&gt;
-  &lt;li&gt;Agents Distribution: this section can only be configured once the nodes configuration and the strategy configuration are done. In this section it is possible to configure, for each agent program (remember that the strategy defines agent programs), how many copies (if any) of that agent program are going to be assigned to a particular node. Note that this configuration is strategy dependent.&lt;/li&gt;
-  &lt;li&gt;Evaluation configuration: in this section it is possible to configure the data sources and the general evaluation mode. There are several main options (note that currently only WEKA arff files are supported):
-    &lt;ul&gt;
-      &lt;li&gt;Single file: the experiment will only consist of a single data file already present in each node (each node defines a single training file). When configuring the data path of each node, the configured path indicates this file. The test file residing in node0 has to be specified&lt;/li&gt;
-      &lt;li&gt;Round files: the experiment will consist of a series of files in the same directory for each node; the files are already present in each node. The files follow a naming convention: for example, australian1.arff indicates the first file, australian2.arff the second, and so on. The following sub fields must also be configured:
-   &lt;ul&gt;
-     &lt;li&gt;Base train name: indicates the part of the file name that is common to all the data files. Following the naming convention mentioned before, the appropriate value here would be australian.arff &lt;/li&gt;
-     &lt;li&gt;Test file base name with path: as there are several files for training, there are as many test files. In this field a path to the directory on node0 where the test data is located followed by a base name for each test file is supplied. An example of this field could be: /tmp/australian.arff&lt;/li&gt;
-     &lt;li&gt;Number of round files: indicates how many data files there are&lt;/li&gt;
-      &lt;/ul&gt;&lt;/li&gt;
-      &lt;li&gt;Hold out: this option assumes that no training data is already present on each node; the data will be delivered to each node following the configuration:
-   &lt;ul&gt;
-     &lt;li&gt;Dataset File: a WEKA arff file that represents the data that will be partitioned, using the hold out method, into training and test fragments. The training fragment will be further split in several parts, one for each default node.&lt;/li&gt;
-     &lt;li&gt;Test data path: the path on node0 where the test files will be placed. Note that the training files will be placed on each default node on the directory indicated on the configuration of each node&lt;/li&gt;
-     &lt;li&gt;Train percentage: a hold out parameter; indicates the percentage of the data that will be used for training, the rest being reserved for testing.&lt;/li&gt;
-     &lt;li&gt;Number of repetitions: as the name indicates, how many times the process will be repeated. If, for example, the value 10 is indicated, that means that 10 training files will be created in each default node and 10 test files will be created in node0&lt;/li&gt;
-      &lt;/ul&gt;&lt;/li&gt;
-      &lt;li&gt;Cross validation: similar to hold out but using the cross validation method. With this method it is necessary to indicate the number of folds, which also has an impact on the number of files shared. A value of 10 creates 10 training data files in each default node and 10 test files on node0. If the number of repetitions value is greater than 1, then the actual number of files shared is the product of the number of folds and repetitions. &lt;/li&gt;
-  Note that the first two options (single and round files) represent what we call "Static mode" data sources, which means the data files are already present on the nodes; and the last two (Hold out and Cross validation) represent "Dynamic mode" data sources, because these methods deliver data files to each node. Also note that combining Dynamic and Static mode is possible and desirable if the same data partitions are meant to be used with several different strategies in order to compare them. For this, simply run the first experiment with Dynamic mode and then switch to Static mode (most likely Round files) for the remaining experiments.
-  &lt;/ul&gt;&lt;/li&gt;
-  &lt;li&gt;Dispatch Experiment: once all the other configurations have been set, the experiment can be run. The results for each iteration and the final summary are presented in this section.&lt;/li&gt;
-&lt;/ul&gt;  
-
-![GUI](https://a.fsdn.com/con/app/proj/jacaddm/screenshots/experiment.png)
&lt;/pre&gt;
&lt;/div&gt;</description><dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">Xavier</dc:creator><pubDate>Thu, 15 Dec 2016 11:41:04 -0000</pubDate><guid>https://sourceforge.net13c87bd889066abe87eef8432552ba622c769afe</guid></item><item><title>Home modified by Xavier</title><link>https://sourceforge.net/p/jacaddm/wiki/Home/</link><description>&lt;div class="markdown_content"&gt;&lt;pre&gt;--- v5
+++ v6
@@ -4,6 +4,7 @@
  &lt;ul&gt;
   &lt;li&gt;Java SE Runtime Environment 7 (JRE)&lt;/li&gt;
    &lt;li&gt;CUDA 7 or greater (only for GPU strategies) &lt;/li&gt;
+   &lt;li&gt;Python2 or Python3 for running experiments from the scripts&lt;/li&gt;
 &lt;/ul&gt;

  &lt;h1&gt;Quick installation&lt;/h1&gt;
&lt;/pre&gt;
&lt;/div&gt;</description><dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">Xavier</dc:creator><pubDate>Wed, 24 Aug 2016 14:40:08 -0000</pubDate><guid>https://sourceforge.netf5b954f95e0afb62fa31d32b5af8a4f03bac536c</guid></item><item><title>Home modified by Xavier</title><link>https://sourceforge.net/p/jacaddm/wiki/Home/</link><description>&lt;div class="markdown_content"&gt;&lt;pre&gt;--- v4
+++ v5
@@ -3,7 +3,7 @@
 &lt;h1&gt;System requirements&lt;/h1&gt;
  &lt;ul&gt;
   &lt;li&gt;Java SE Runtime Environment 7 (JRE)&lt;/li&gt;
-   &lt;li&gt;CUDA 7 or greater (only for GPU strategies) (JRE)&lt;/li&gt;
+   &lt;li&gt;CUDA 7 or greater (only for GPU strategies) &lt;/li&gt;
 &lt;/ul&gt;

  &lt;h1&gt;Quick installation&lt;/h1&gt;
&lt;/pre&gt;
&lt;/div&gt;</description><dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">Xavier</dc:creator><pubDate>Tue, 09 Aug 2016 18:53:41 -0000</pubDate><guid>https://sourceforge.net2cff248072a355b8423c2c8870258a32ccdef958</guid></item><item><title>Home modified by Xavier</title><link>https://sourceforge.net/p/jacaddm/wiki/Home/</link><description>&lt;div class="markdown_content"&gt;&lt;pre&gt;--- v3
+++ v4
@@ -1,9 +1,9 @@
-
 This is a basic tutorial to get you started with JaCaDDM.

 &lt;h1&gt;System requirements&lt;/h1&gt;
  &lt;ul&gt;
   &lt;li&gt;Java SE Runtime Environment 7 (JRE)&lt;/li&gt;
+   &lt;li&gt;CUDA 7 or greater (only for GPU strategies) (JRE)&lt;/li&gt;
 &lt;/ul&gt; 

  &lt;h1&gt;Quick installation&lt;/h1&gt;
&lt;/pre&gt;
&lt;/div&gt;</description><dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">Xavier</dc:creator><pubDate>Tue, 09 Aug 2016 18:53:19 -0000</pubDate><guid>https://sourceforge.neta9fa6e4294a86abe3f95e7d10f3ec881ec7decd2</guid></item><item><title>Home modified by Xavier</title><link>https://sourceforge.net/p/jacaddm/wiki/Home/</link><description>&lt;div class="markdown_content"&gt;&lt;pre&gt;--- v2
+++ v3
@@ -1,4 +1,3 @@
-

 This is a basic tutorial to get you started with JaCaDDM.

@@ -67,3 +66,5 @@
   
  &lt;li&gt;Dispatch Experiment: once all the other configurations have been set, the experiment can be run. The results for each iteration and the final summary are presented in this section.&lt;/li&gt;
   
+
+![GUI](https://a.fsdn.com/con/app/proj/jacaddm/screenshots/experiment.png)
&lt;/pre&gt;
&lt;/div&gt;</description><dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">Xavier</dc:creator><pubDate>Thu, 26 Feb 2015 03:42:49 -0000</pubDate><guid>https://sourceforge.net8684a5132de87d236580c58f13f6753a2de4bc6f</guid></item><item><title>Home modified by Xavier</title><link>https://sourceforge.net/p/jacaddm/wiki/Home/</link><description>&lt;div class="markdown_content"&gt;&lt;pre&gt;--- v1
+++ v2
@@ -1,8 +1,69 @@
-Welcome to your wiki!

-This is the default page, edit it as you see fit. To add a new page simply reference it within brackets, e.g.: [SamplePage].

-The wiki uses [Markdown](/p/jacaddm/wiki/markdown_syntax/) syntax.
+This is a basic tutorial to get you started with JaCaDDM.

-[[members limit=20]]
-[[download_button]]
+&lt;h1&gt;System requirements&lt;/h1&gt;
+ &lt;ul&gt;
+  &lt;li&gt;Java SE Runtime Environment 7 (JRE)&lt;/li&gt;
+&lt;/ul&gt; 
+
+ &lt;h1&gt;Quick installation&lt;/h1&gt;
+
+ The zip file with JaCaDDM (&lt;a&gt;https://sourceforge.net/projects/jacaddm/files/latest/download&lt;/a&gt;) contains 3 main directories:
+ &lt;ul&gt;
+   &lt;li&gt; node0: source, libraries, and binaries related to node0. It can be thought of as the main project, where the configuration GUI is defined. To execute the GUI on Unix-like systems run "run.sh".
+   You only need this directory on the computer that will control the experiment process.&lt;/li&gt;
+
+   &lt;li&gt;defaultNode: source, libraries, and binaries related to each node_1,...,node_n. A node represents a logical site holding data to be used in a DDM process. To execute on Unix-like systems run "run.sh IP Port", where IP and Port are configurable parameters. You have to copy this directory to each computer of interest (but a single computer may execute various instances of the program with different port numbers). &lt;/li&gt;
+
+   &lt;li&gt; sampleProtocols: contains various directories, each of which represents a learning strategy. Each of these directories contains an XML file with the definition of the strategy. You only need this directory on the same computer as the node0 directory. &lt;/li&gt;
+&lt;/ul&gt; 
+
+ &lt;h1&gt;Preparations for running an experiment&lt;/h1&gt;
+ &lt;ul&gt;
+  &lt;li&gt;First of all, make sure that you followed the installation steps previously outlined&lt;/li&gt;
+  &lt;li&gt;If you intend to do a distributed experiment you have to make sure that your network and security settings are correct, in order to allow remote hosts to communicate with each other through their IP addresses and configured service ports &lt;/li&gt;&lt;br /&gt;
+  &lt;li&gt;Execute node0; this will open the configuration GUI. You can do this through the run.sh script or using the following command (while placed in the node0 directory):&lt;br /&gt;
+  java    -classpath lib/jason.jar:bin/classes:lib/weka.jar:lib/cartago.jar:lib/c4jason.jar:lib/jacamo.jar:lib/moa.jar:lib/moise.jar:lib/pentaho.jar:lib/sizeofag-1.0.0.jar jason.infra.centralised.RunCentralisedMAS experimenter.mas2j&lt;/li&gt;
+
+  &lt;li&gt;Execute a defaultNode in each computer of interest (you can run various instances on the same computer using different ports); you need at least one such node. The execution can be done through the run.sh script inside the defaultNode directory; the IP and Port must be passed. You can also execute a node through the following command (while placed in the defaultNode directory; you also need to pass the IP and Port):&lt;br /&gt;
+    java  -classpath bin:lib/cartago.jar:lib/experimenter.jar:lib/moa.jar:lib/weka.jar:lib/pentaho.jar:lib/modl_gkclass.jar defaultClient.Main IP:Port&lt;/li&gt;  
+&lt;/ul&gt; 
+
+&lt;h1&gt;How to configure and launch an experiment&lt;/h1&gt;
+The GUI presents several configurable fields that must be set.
+![GUI](https://a.fsdn.com/con/app/proj/jacaddm/screenshots/main.png)
+
+These fields are divided into the following categories:
+&lt;ul&gt;
+  &lt;li&gt;Main configuration: it is possible to load existing configurations or save the current one to an XML file&lt;/li&gt;
+  &lt;li&gt;Node0 configuration: to establish the IP where node0 is running, this is useful if the computer has more than one network interface.&lt;/li&gt;
+  &lt;li&gt;Nodes Configuration: this configuration allows node0 to recognize which  default nodes participate in an experiment. For each default node some configuration fields are in order:
+    &lt;ul&gt;
+      &lt;li&gt;name: a logical name to refer to the node &lt;/li&gt;
+      &lt;li&gt;IP and port: the IP and port of the node &lt;/li&gt;
+        &lt;li&gt;Data file path: the path where the data of the node is stored. This data is the data that will be used in the DDM process, and it is considered training data. &lt;/li&gt;
+    &lt;/ul&gt;&lt;/li&gt;
+  &lt;li&gt;Strategy Configuration: a strategy can be loaded via its XML definition file. When a strategy is successfully loaded, it is possible to establish the values of the special parameters that it defines.&lt;/li&gt;
+  &lt;li&gt;Agents Distribution: this section can only be configured once the nodes configuration and the strategy configuration are done. In this section it is possible to configure, for each agent program (remember that the strategy defines agent programs), how many copies (if any) of that agent program are going to be assigned to a particular node. Note that this configuration is strategy dependent.&lt;/li&gt;
+  &lt;li&gt;Evaluation configuration: in this section it is possible to configure the data sources and the general evaluation mode. There are several main options (note that currently only WEKA arff files are supported):
+    &lt;ul&gt;
+      &lt;li&gt;Single file: the experiment will only consist of a single data file already present in each node (each node defines a single training file). When configuring the data path of each node, the configured path indicates this file. The test file residing in node0 has to be specified&lt;/li&gt;
+      &lt;li&gt;Round files: the experiment will consist of a series of files in the same directory for each node; the files are already present in each node. The files follow a naming convention: for example, australian1.arff indicates the first file, australian2.arff the second, and so on. The following sub fields must also be configured:
+   &lt;ul&gt;
+     &lt;li&gt;Base train name: indicates the part of the file name that is common to all the data files. Following the naming convention mentioned before, the appropriate value here would be australian.arff &lt;/li&gt;
+     &lt;li&gt;Test file base name with path: as there are several files for training, there are as many test files. In this field a path to the directory on node0 where the test data is located followed by a base name for each test file is supplied. An example of this field could be: /tmp/australian.arff&lt;/li&gt;
+     &lt;li&gt;Number of round files: indicates how many data files there are&lt;/li&gt;
+      &lt;/ul&gt;&lt;/li&gt;
+      &lt;li&gt;Hold out: this option assumes that no training data is already present on each node; the data will be delivered to each node following the configuration:
+   &lt;ul&gt;
+     &lt;li&gt;Dataset File: a WEKA arff file that represents the data that will be partitioned, using the hold out method, into training and test fragments. The training fragment will be further split in several parts, one for each default node.&lt;/li&gt;
+     &lt;li&gt;Test data path: the path on node0 where the test files will be placed. Note that the training files will be placed on each default node on the directory indicated on the configuration of each node&lt;/li&gt;
+     &lt;li&gt;Train percentage: a hold out parameter; indicates the percentage of the data that will be used for training, the rest being reserved for testing.&lt;/li&gt;
+     &lt;li&gt;Number of repetitions: as the name indicates, how many times the process will be repeated. If, for example, the value 10 is indicated, that means that 10 training files will be created in each default node and 10 test files will be created in node0&lt;/li&gt;
+      &lt;/ul&gt;&lt;/li&gt;
+      &lt;li&gt;Cross validation: similar to hold out but using the cross validation method. With this method it is necessary to indicate the number of folds, which also has an impact on the number of files shared. A value of 10 creates 10 training data files in each default node and 10 test files on node0. If the number of repetitions value is greater than 1, then the actual number of files shared is the product of the number of folds and repetitions. &lt;/li&gt;
+  Note that the first two options (single and round files) represent what we call "Static mode" data sources, which means the data files are already present on the nodes; and the last two (Hold out and Cross validation) represent "Dynamic mode" data sources, because these methods deliver data files to each node. Also note that combining Dynamic and Static mode is possible and desirable if the same data partitions are meant to be used with several different strategies in order to compare them. For this, simply run the first experiment with Dynamic mode and then switch to Static mode (most likely Round files) for the remaining experiments.
+  &lt;/ul&gt;&lt;/li&gt;
+  &lt;li&gt;Dispatch Experiment: once all the other configurations have been set, the experiment can be run. The results for each iteration and the final summary are presented in this section.&lt;/li&gt;
+&lt;/ul&gt;  
&lt;/pre&gt;
&lt;/div&gt;</description><dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">Xavier</dc:creator><pubDate>Thu, 26 Feb 2015 03:39:09 -0000</pubDate><guid>https://sourceforge.net8234cde016a5f177190032854a45711655cd9d87</guid></item><item><title>Discussion for Home page</title><link>https://sourceforge.net/p/jacaddm/wiki/Home/</link><description>&lt;div class="markdown_content"&gt;&lt;h1&gt; JaCa-DDM &lt;/h1&gt;
&lt;/div&gt;</description><dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">Xavier</dc:creator><pubDate>Wed, 25 Feb 2015 18:17:20 -0000</pubDate><guid>https://sourceforge.netc5a19afc7211106219a4c835fc4737943ec280d7</guid></item><item><title>Home modified by Xavier</title><link>https://sourceforge.net/p/jacaddm/wiki/Home/</link><description>&lt;div class="markdown_content"&gt;&lt;p&gt;Welcome to your wiki!&lt;/p&gt;
&lt;p&gt;This is the default page, edit it as you see fit. To add a new page simply reference it within brackets, e.g.: &lt;span&gt;[SamplePage]&lt;/span&gt;.&lt;/p&gt;
&lt;p&gt;The wiki uses &lt;a class="" href="/p/jacaddm/wiki/markdown_syntax"&gt;Markdown&lt;/a&gt; syntax.&lt;/p&gt;
&lt;p&gt;&lt;h6&gt;Project Members:&lt;/h6&gt;
&lt;ul class="md-users-list"&gt;
&lt;li&gt;&lt;a href="/u/xl666"&gt;Xavier&lt;/a&gt; (admin)&lt;/li&gt;
&lt;/ul&gt;&lt;br /&gt;
&lt;/p&gt;&lt;p&gt;&lt;span class="download-button-54ee0a467929e539af52ff45" style="margin-bottom: 1em; display: block;"&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;/p&gt;&lt;/div&gt;</description><dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">Xavier</dc:creator><pubDate>Wed, 25 Feb 2015 17:45:42 -0000</pubDate><guid>https://sourceforge.nete70fc46738779a1828e3d40e2e1d991b5e483b8e</guid></item></channel></rss>