From: Roy Stogner <roystgnr@ic...>  2012-07-20 15:59:53

On Fri, 20 Jul 2012, John Peterson wrote:

> On Fri, Jul 20, 2012 at 8:26 AM, Danny Lathouwers - TNW
> <D.Lathouwers@...> wrote:
>
>> The 2 additional variables are handled by h-adaption and I am
>> interested in having libMesh deal with the spatial part (possibly
>> by hp-refinement). The implication of our h-refinement scheme for
>> the non-space coordinates is that we have a varying number of
>> unknowns per spatial element to deal with (it may be highly
>> variable, from 8 to hundreds of unknowns). Can libMesh handle
>> this situation?
>
> libMesh can do subdomain-restricted variables, but that may not be
> fine-grained enough for what you are attempting.

The big limitations:

The code would have to predeclare every variable. E.g. if you wanted to be able to go from a tensor product against zero spherical basis functions up to a tensor product against a couple dozen bases, then you'd need to call "add_variable" two dozen times, and you'd need something in your "additional variable h-adaptation" that would detect and bomb out whenever something tried to do a 25th refinement step. Worse: you don't just want to declare 2000 variables and be done with it, because IIRC we've got a few places in library code where there's a loop over elements nested inside a loop over all variables.

The formulation would either need to be hierarchic (the variables in each refined subdomain would be the same as the variables in the previous subdomain plus independent extras) or would need to have redundant variables plus fancy transformations at every single interface between subdomains. I'd strongly recommend the former.

We were originally planning to do something similar for light radiation, but tabled it when our QoI turned out to be relatively radiation-insensitive. So the underlying code ought to be set up in the right direction, but it's never been tested on this scale.
---
Roy