From: Dagmar K. <da...@eb...> - 2009-05-20 15:44:18
Hej Jonathan,

> Rules:
>
> 1. Parts A and C are very similar. I would be inclined to include
>    boundary conditions within A. Peter Hunter's comment refers to
>    PDE-based models, where the same model (in terms of PDEs) can be
>    discretised in many ways (even using different weak forms). How
>    much of this should be considered part of the model, and how much
>    part of the simulation algorithm, is debatable. This is also
>    covered to some extent by 2.E. So it might be best to remove 1.C
>    entirely, and just have an explanatory comment about PDE models.

Hmm, I am not sure whether I agree. In any case it would make sense to
swap the order of the rules, e.g. 1B-1A-1C. Whether or not to merge 1A
and 1C I cannot decide.

> 2. For C I don't think the seed is required if results are only
>    required to be reproducible within a tolerance specified by the
>    simulation author. Of course, if they wish exactly matching
>    results, they will need to provide the seed. D.a is rather
>    repetitive. Perhaps D should change to read:
>
>    D. If a model is referenced as a piece of implementation code,
>       then all information needed to simulate it correctly must be
>       provided.
>       1. Use of open code is encouraged as much as possible, as it
>          is the best way to evaluate the quality of a simulation.
>       2. If closed code (black boxes) is used, then several
>          independent codes should provide the same result.

I'd propose:

D. If a model is referenced as a piece of implementation code, then
   all information needed to simulate it correctly must be provided.
   1. Use of open code is encouraged as much as possible, as it is
      the best way to evaluate the quality of a simulation.
   2. If closed code (a black box) is used, then all information
      needed to simulate it correctly must be provided. Several
      independent codes must provide the same result.

> Information on the Models:
>
> * I'd be inclined to agree with MC's comment that little needs to
>   be said about changing parameters.
>   Perhaps just "Model changes do not only include atomic changes
>   such as the assignment of new values (such as covered under rule
>   1.A.)."

I think mentioning some concrete examples makes it easier for the
reader. But if I am the only one who thinks so, we can well shorten
the paragraph.

> Information on the Simulations:
>
> * Regarding parameter scans, perhaps "such as the range of
>   parameters considered in a parameter scan"

I could not find what this relates to.

> * Regarding KiSAO, perhaps add a comment that such vocabularies are
>   still at a very early stage of development?

Added.

> * Regarding levels of compliance, I think the focus should indeed
>   be on defining a single minimum standard. However, we can still
>   encourage people to provide more information!
> * I think, as I mentioned in my previous email and as MC has
>   commented, a MIASE-compliant description should also specify a
>   tolerance level on results. This would then define what is meant
>   by 'correct'. This could even be made a rule.

I think there is disagreement between people on the list, so both
points need to be discussed :-)

Dagmar

On 08/05/09 09:48, Dagmar Köhn wrote:
>> Dear all,
>>
>> there have been some improvements on the paper (hopefully!), thanks
>> to many helpful comments by Mike Cooling and Frank Bergman
>> (thanks!). I have attached an updated version of the paper draft
>> for you (odt & pdf).
>>
>> Besides some remaining smaller issues, there are also some more
>> striking ones which need further discussion, summarised further
>> down and awaiting your comments. Surely I have forgotten some
>> issues, so feel free to open the discussion on them!
>>
>> If you have time to go through the paper (again), I'd say that the
>> motivation, "what is not in the scope of MIASE", the repressilator
>> example (which will be enhanced, following a suggestion by MC, to
>> also describe post-processing), and the discussion sections are
>> fine for now.
>> But the rules and the sections belonging to them need to be read
>> again (especially "Information on the model" / "Information on the
>> simulation")!
>>
>> Best,
>> Dagmar
>>
>> Issues & required opinions:
>>
>> 1) Future directions: Do you want me to include some comments on
>> other existing approaches for the description of simulation
>> experiments, or is it irrelevant as we are talking about MIs?
>>
>> 2) The glossary: ... needs to be filled with definitions for
>> simulation, experiment, MIASE-compliance, (model, model instance).
>> Suggestions welcome.
>>
>> 3) Results: the notion of correctness/reproducibility.
>> In the paper we currently say that
>> "The Minimum Information About a Simulation Experiment (MIASE) is a
>> set of guidelines that serve as a recommendation for information to
>> be provided together with models in order to enable fast, simple
>> and reproducible simulation."
>> But we neither define nor say what "reproduce"/"reproducible" means
>> to us.
>>
>> Also, we say that
>> "The description of a simulation can lead to different levels of
>> correctness. To be MIASE compliant, a description should reproduce
>> the result of an experiment in a correct manner."
>> But we do not specify what "correct" means for us (when does a
>> simulation result actually correspond to an earlier result,
>> thinking also of stochastic simulations?).
>>
>> This directly relates to the next issue:
>> 5) Information on the simulation: the MIASE-compliance level
>> First, I thought it was a good idea to have different levels of
>> MIASE compliance in order to distinguish between levels of detail
>> in the simulation description. But I think this is
>> counter-productive, as we call the whole thing an "MI", which
>> should either be minimum or not!? So I suppose we should agree on
>> the MI and not on different levels of it.
>> However, there was the idea of using 2 different MIASE levels
>> (which we could re-use as 2 different use cases where
>> MIASE-compliant descriptions differ a lot, but both are "correct"):
>> [comment Mike Cooling]
>> "I suspect there could be two levels of MIASE information (this is
>> orthogonal to what Mike Hucka was suggesting, as in your point on
>> progressive guidelines). One might be conceptual and mention the
>> algorithm used (not the implementation), and the general processes
>> to achieve the example result. The second level might be more of an
>> inventory of how the example result was actually achieved, listing
>> the implementation of said algorithm(s).
>> The first level would be used by those wanting to provide some
>> transparency and quality assessment, and allow those without the
>> exact same technology some kind of shot at reproducing the same (or
>> similar) results. The second ('implementation') level would provide
>> total transparency, which could become useful if a) one doesn't
>> have much time and just wants to exactly reproduce an example
>> output, or b) one tries to use one's own technology to produce the
>> example result via the first level but for some reason can't - the
>> exact details of an example run would be available to help the keen
>> track down the difference between their output and the example."
>>
>> 6) MIASE rules: please, could everyone go through them again and
>> make sure they all make sense, that they are "complete", and that
>> they correspond to what was said in Auckland.
>>
>> 7) Open comments from the Auckland meeting
>> There are some of Nicolas' comments on the Auckland meeting left
>> that I did not yet include in the paper, basically because I was
>> not sure what they meant to say.
>> Those are:
>> - (in the rules section, including) Geometry, Space discretization,
>> Physics and Constitutive parameters (PH)
>> - (in the guidelines section, talking about MIASE applicability)
>> Could domain-specific information be encoded in an external
>> ontology? (PH)
>> - (in information on the simulation) "We need to mention procedures
>> to ensure MIASE-compliance. This may be a problem in the case of
>> proprietary code. A repository or a registry of simulation codes
>> may be a way." -> this actually belongs to the compliance
>> discussion!
>>
>> 8) KiSAO
>> We had a bit of discussion on whether KiSAO in its current state
>> could satisfy the needs of MIASE-compliant algorithm specification.
>> But I think we found a workaround for the example given in the
>> paper, and the rest of the discussion about KiSAO should go to
>> miase-discuss... Frank!?
>>
>> _______________________________________________
>> Miase-discuss mailing list
>> Mia...@li...
>> https://lists.sourceforge.net/lists/listinfo/miase-discuss
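As an aside, the seed-versus-tolerance distinction debated above (rule 2.C and issue 3) can be made concrete with a minimal sketch. This is purely illustrative and not from the MIASE draft; the function name and tolerance value are hypothetical, and a trivial mean of pseudo-random draws stands in for a real stochastic simulation:

```python
# Sketch: two notions of "reproducible" for a stochastic simulation.
# gillespie_like_run is a hypothetical stand-in for a real simulator.
import random
import statistics

def gillespie_like_run(seed=None, n_steps=100_000):
    """Toy stochastic 'simulation': mean of n_steps pseudo-random draws."""
    rng = random.Random(seed)
    return statistics.fmean(rng.random() for _ in range(n_steps))

# 1. Exact reproducibility: only possible if the author publishes the seed,
#    so that the pseudo-random stream is identical run to run.
assert gillespie_like_run(seed=42) == gillespie_like_run(seed=42)

# 2. Reproducibility within an author-specified tolerance: independently
#    seeded runs agree on the statistic of interest, so no seed is needed
#    in the description, only the acceptance threshold.
tolerance = 0.01  # hypothetical author-specified threshold
a = gillespie_like_run(seed=1)
b = gillespie_like_run(seed=2)
assert abs(a - b) < tolerance
```

Under the first notion, a MIASE description must include the seed; under the second, it must instead state the compared quantity and the tolerance, which is the "correctness" criterion the thread asks to be defined.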