From: Vijay S. Mahadevan <vijay.m@gm...>  2008-11-07 18:13:45

That makes a lot more sense. I can loop over the elements depending on the
flux type I use for my discretization and infer the coupled dofs I need.
Thanks for the help, Ben!

On Fri, Nov 7, 2008 at 11:50 AM, Benjamin Kirk <benjamin.kirk@...> wrote:
>> A clarification regarding what you said:
>>
>>> The DofMap will add all face neighbor degrees of freedom to an element's
>>> coupled dofs if
>>>
>>> (1) all the variables in the system are discontinuous, or
>>> (2) the command line option 'implicit_neighbor_dofs' is specified.
>>
>> In my system, all variables use a discontinuous basis, and I have now
>> tried running the code with implicit_neighbor_dofs on the command line.
>> But I still get only the local element dofs, not the neighboring element
>> dofs, when I populate dofs with the call "dof_map.dof_indices(...)".
>
> Oh, I see what you want. In your case the sparsity pattern is built up
> properly using the neighbor degrees of freedom, but DofMap::dof_indices(...)
> will always return only the degrees of freedom an element needs to compute
> the solution in that element...
>
>> Is this what is to be expected, or did I try something other than what
>> you suggested? I just need DofMap to tell me which other dofs my current
>> dofs are coupled to. From what I understand, your reply answers that,
>> but somehow I do not see the same in my code. Am I missing something?
>
> So, in your case, when you are computing the face contribution between
> 'elem' and 'neighbor', you need DofMap::dof_indices(elem,...) and
> DofMap::dof_indices(neighbor,...). This is consistent with the fact that
> you will need two FiniteElement objects, one on the local element and one
> for the neighbor.
>
> The total list of dofs coupled to a given element will be the union of
>
> DofMap::dof_indices(elem,...)
> DofMap::dof_indices(neighbor_0,...)
> ...
> DofMap::dof_indices(neighbor_N,...)
>
> Does that help?
>
> Ben
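The looping Ben suggests — take the union of dof_indices over the element and each of its face neighbors — can be sketched in plain C++. Note that `Elem` and `coupled_dofs` below are illustrative stand-ins, not the real libMesh classes; in libMesh the per-element lookup would be `DofMap::dof_indices(elem, di)`.

```cpp
#include <set>
#include <vector>

// Toy stand-in for illustration only -- not the actual libMesh Elem.
struct Elem {
  std::vector<unsigned int> dofs;      // what dof_indices(elem,...) would fill
  std::vector<const Elem*> neighbors;  // face neighbors (nullptr on a boundary side)
};

// The total set of dofs coupled to 'elem' in a DG discretization:
// the union of its own dofs with those of every face neighbor.
std::vector<unsigned int> coupled_dofs(const Elem& elem) {
  std::set<unsigned int> all(elem.dofs.begin(), elem.dofs.end());
  for (const Elem* n : elem.neighbors)
    if (n)  // skip boundary sides, which have no neighbor
      all.insert(n->dofs.begin(), n->dofs.end());
  return {all.begin(), all.end()};  // sorted and duplicate-free
}
```

Using a set keeps the result duplicate-free even when, for higher-order continuous variables mixed in, an element and its neighbor would share dofs.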
From: Benjamin Kirk <benjamin.kirk@na...>  2008-11-07 17:50:27

> A clarification regarding what you said:
>
>> The DofMap will add all face neighbor degrees of freedom to an element's
>> coupled dofs if
>>
>> (1) all the variables in the system are discontinuous, or
>> (2) the command line option 'implicit_neighbor_dofs' is specified.
>
> In my system, all variables use a discontinuous basis, and I have now
> tried running the code with implicit_neighbor_dofs on the command line.
> But I still get only the local element dofs, not the neighboring element
> dofs, when I populate dofs with the call "dof_map.dof_indices(...)".

Oh, I see what you want. In your case the sparsity pattern is built up
properly using the neighbor degrees of freedom, but DofMap::dof_indices(...)
will always return only the degrees of freedom an element needs to compute
the solution in that element...

> Is this what is to be expected, or did I try something other than what
> you suggested? I just need DofMap to tell me which other dofs my current
> dofs are coupled to. From what I understand, your reply answers that,
> but somehow I do not see the same in my code. Am I missing something?

So, in your case, when you are computing the face contribution between
'elem' and 'neighbor', you need DofMap::dof_indices(elem,...) and
DofMap::dof_indices(neighbor,...). This is consistent with the fact that
you will need two FiniteElement objects, one on the local element and one
for the neighbor.

The total list of dofs coupled to a given element will be the union of

DofMap::dof_indices(elem,...)
DofMap::dof_indices(neighbor_0,...)
...
DofMap::dof_indices(neighbor_N,...)

Does that help?

Ben
From: Vijay S. Mahadevan <vijay.m@gm...>  2008-11-07 17:43:14

Ben,

Thanks for the prompt reply. A clarification regarding what you said:

> The DofMap will add all face neighbor degrees of freedom to an element's
> coupled dofs if
>
> (1) all the variables in the system are discontinuous, or
> (2) the command line option 'implicit_neighbor_dofs' is specified.

In my system, all variables use a discontinuous basis, and I have now
tried running the code with implicit_neighbor_dofs on the command line.
But I still get only the local element dofs, not the neighboring element
dofs, when I populate dofs with the call "dof_map.dof_indices(...)".

Is this what is to be expected, or did I try something other than what you
suggested? I just need DofMap to tell me which other dofs my current dofs
are coupled to. From what I understand, your reply answers that, but
somehow I do not see the same in my code. Am I missing something?

Vijay

On Fri, Nov 7, 2008 at 11:31 AM, Benjamin Kirk <benjamin.kirk@...> wrote:
>> I'm not sure if this is a stupid question, but I'm not entirely sure how
>> you would find the nonzeros per row in a DG discretization. This is not
>> obvious to me since, in say an advection system, the fluxes at the
>> boundaries couple the dofs in one cell to the next; hence the dof_map
>> does not know anything about this discretization or coupling of
>> variables. In the continuous FEM case the dof_map does provide the
>> connectivity correctly, since the mesh nodes are the dofs in the
>> discretization.
>
> In DG discretizations the degrees of freedom in any given element are
> typically coupled to all the degrees of freedom in its face neighbors.
>
>> Any ideas as to how I can access the locations of the nonzeros per row
>> by just looking at the final assembled matrix? I saw that there is an
>> update_sparsity_pattern routine in SparseMatrix, but it is not
>> overloaded in PetscMatrix. I guess I am looking for something like the
>> sparsity_pattern object, or am I way off in my interpretation of what
>> this means?
>
> The DofMap will add all face neighbor degrees of freedom to an element's
> coupled dofs if
>
> (1) all the variables in the system are discontinuous, or
> (2) the command line option 'implicit_neighbor_dofs' is specified.
>
> The role of the sparsity pattern is to define the graph structure of the
> system matrix. So the DofMap builds this graph and then passes it off to
> any associated matrices. It is then destroyed, because it takes a fair
> bit of memory and, in theory, should not be needed after the matrices
> are reinitialized with the proper structure.
>
> In the case of PETSc it does not actually need the graph of the matrix;
> the number of on- and off-processor nonzeros per row will suffice. Of
> course, internal to the DofMap the sparsity pattern is computed anyway
> to get this information.
>
> If you look in the LASPACK and Trilinos matrix implementations you will
> see where the sparsity pattern is actually used.
>
> Hope this helps,
>
> Ben
From: Benjamin Kirk <benjamin.kirk@na...>  2008-11-07 17:31:17

> I'm not sure if this is a stupid question, but I'm not entirely sure how
> you would find the nonzeros per row in a DG discretization. This is not
> obvious to me since, in say an advection system, the fluxes at the
> boundaries couple the dofs in one cell to the next; hence the dof_map
> does not know anything about this discretization or coupling of
> variables. In the continuous FEM case the dof_map does provide the
> connectivity correctly, since the mesh nodes are the dofs in the
> discretization.

In DG discretizations the degrees of freedom in any given element are
typically coupled to all the degrees of freedom in its face neighbors.

> Any ideas as to how I can access the locations of the nonzeros per row
> by just looking at the final assembled matrix? I saw that there is an
> update_sparsity_pattern routine in SparseMatrix, but it is not
> overloaded in PetscMatrix. I guess I am looking for something like the
> sparsity_pattern object, or am I way off in my interpretation of what
> this means?

The DofMap will add all face neighbor degrees of freedom to an element's
coupled dofs if

(1) all the variables in the system are discontinuous, or
(2) the command line option 'implicit_neighbor_dofs' is specified.

The role of the sparsity pattern is to define the graph structure of the
system matrix. So the DofMap builds this graph and then passes it off to
any associated matrices. It is then destroyed, because it takes a fair bit
of memory and, in theory, should not be needed after the matrices are
reinitialized with the proper structure.

In the case of PETSc it does not actually need the graph of the matrix;
the number of on- and off-processor nonzeros per row will suffice. Of
course, internal to the DofMap the sparsity pattern is computed anyway to
get this information.

If you look in the LASPACK and Trilinos matrix implementations you will
see where the sparsity pattern is actually used.

Hope this helps,

Ben
From: Vijay S. Mahadevan <vijay.m@gm...>  2008-11-07 16:35:59

Hi,

I'm not sure if this is a stupid question, but I'm not entirely sure how
you would find the nonzeros per row in a DG discretization. This is not
obvious to me since, in say an advection system, the fluxes at the
boundaries couple the dofs in one cell to the next; hence the dof_map does
not know anything about this discretization or coupling of variables. In
the continuous FEM case the dof_map does provide the connectivity
correctly, since the mesh nodes are the dofs in the discretization.

Any ideas as to how I can access the locations of the nonzeros per row by
just looking at the final assembled matrix? I saw that there is an
update_sparsity_pattern routine in SparseMatrix, but it is not overloaded
in PetscMatrix. I guess I am looking for something like the
sparsity_pattern object, or am I way off in my interpretation of what this
means?

Thank you very much for any help in advance.

Vijay
From: Vijay S. Mahadevan <vijay.m@gm...>  2008-11-07 15:51:10

Hi there,

Before I even start, I know this is probably going to be a loaded question,
but nevertheless, any help would be appreciated.

Background: I'm working on the solution of large-scale PDE systems, using
libMesh for discretization and PETSc for the solution procedures. My
physics objects are derived from the libMesh EquationSystems data
structure and can contain one or more implicit systems. I use a
Jacobian-free Krylov scheme for my nonlinear solves, and hence the
efficiency of the method comes from good preconditioning. I've tried
using ILU on the preconditioning matrix (a linearized version of the
original nonlinear system), and this works well for serial runs, but it
does not scale when I run in parallel. So my questions are:

1) Is it possible to use libMesh objects to perform geometric multigrid
preconditioning and completely avoid matrix creation except on the
coarsest level? I've seen some of Ben's presentations that mention using
multigrid for Stokes problems etc., and so I am curious what this would
entail. For example, would I have to create multiple EquationSystems
objects, since the mesh is not associated with just the system? Then for
every multigrid level I would have a new instance of the physics system,
but on a coarser mesh. Is this how you would implement it?

2) If I can afford to create the matrix on the finest level, can algebraic
multigrid procedures be used to precondition this system in parallel?
Does anyone know how that scales?

3) Also, I was reading about Prometheus
(http://www.columbia.edu/~ma2325/prom_intro.html), and it looks promising
both as a black-box solver and as a preconditioner when the matrix is
available, and as geometric multigrid in 2D (not sure about 3D). Has
anyone on this list used this package with PETSc and/or libMesh?

Again, like I said before, there are a lot of things here you can comment
on. Feel free to write your thoughts, because I want to get as much input
as possible before choosing anything specific.

Thanks,
Vijay