From: Subramanya Sadasiva <potaman@ou...> - 2013-11-26 21:09:27

Hi Dmitry,
Thanks very much. I will try if that works.
Subramanya

On Nov 26, 2013, at 4:03 PM, Dmitry Karpeyev <karpeev@...> wrote:
> Ah, that might still be true, although that ought to be changed.
> In any event, you can use the DM to set the bounds for the VI,
> and set up PCFieldSplit as I explained in the previous email.
>
> Let me know if this works for you.
> Cheers,
> Dmitry.
[...]
From: Dmitry Karpeyev <karpeev@mc...> - 2013-11-26 21:04:45

Ah, that might still be true, although that ought to be changed. In any event, you can use the DM to set the bounds for the VI, and set up PCFieldSplit as I explained in the previous email.

Let me know if this works for you.
Cheers,
Dmitry.

On Tue, Nov 26, 2013 at 3:01 PM, Subramanya Sadasiva <potaman@...> wrote:
> Hi Dmitry,
> I was under the impression that I needed to use the DM to use the VI solver. Is that not the case?
> Thanks,
> Subramanya.
[...]
From: Subramanya Sadasiva <potaman@ou...> - 2013-11-26 21:01:34

Hi Dmitry,
I was under the impression that I needed to use the DM to use the VI solver. Is that not the case?
Thanks,
Subramanya.

On Nov 26, 2013, at 3:17 PM, Dmitry Karpeyev <karpeev@...> wrote:
> I assume you are referring to using the Schur-complement flavor of PCFieldSplit?
> How many variables do you have? Are they all C0? Is the basis nodal? If so,
> you don't necessarily need DMlibMesh.
[...]
From: Dmitry Karpeyev <karpeev@mc...> - 2013-11-26 20:53:33

Dear All,

My apologies in advance for a misuse of this list. However, I wanted to draw your attention to a new postdoctoral position at Argonne National Laboratory, which may be of interest to some of you. This job involves research into efficient algorithms for numerical modeling and simulation of solutions of charged particles in confined geometries. Experience with PETSc, libMesh or MOOSE will be a big plus here, hence this message. The formal position description can be found here:
http://web.anl.gov/jobsearch/detail.jsp?userreqid=321577+MCS&lsBrowse=POSTDOC

In plainer terms, however, we are interested in the quantitative understanding of the dynamics of DNA molecules suspended in solutions of varying ionic strength, and confined to channels of 10-100 nm linear cross-section, with charged walls. If you know your way around FEM models of Stokesian fluids in nontrivial geometries, have experience with electrostatic force calculations, and are interested in making these algorithms run fast, this may be the job for you. We are interested in accelerating these simulations to be able to resolve the long-time fluctuation dynamics of the DNA molecules translocating (i.e., flopping their way through) the nanochannels. Some modeling issues will likely need to be addressed as well: at various ionic strengths the finite-size effects of ions in the solution may become significant near the channel walls or the solute (DNA) molecules. Thus, an interest and experience in particle-to-continuum coupling is a plus.

A working knowledge of C/C++ is required. Familiarity with GPU/multicore/manycore architectures is preferred. Experience with the PETSc library (http://www.mcs.anl.gov/petsc) is a huge plus. Ditto libMesh and MOOSE. A successful candidate will work closely with the University of Chicago scientists at the Computation Institute as well as the Institute for Molecular Engineering.

If you have questions, please feel free to contact me directly. I apologize if you are getting this message more than once.

Best regards,
Dmitry.

--
Dmitry Karpeyev
Mathematics and Computer Science
Argonne National Laboratory
Argonne, Illinois, USA
and
Computation Institute
University of Chicago
5735 S. Ellis Avenue
Chicago, IL 60637
--
Phone: 630-252-1229
Fax: 630-252-5986
From: Dmitry Karpeyev <karpeev@mc...> - 2013-11-26 20:18:21

I assume you are referring to using the Schur-complement flavor of PCFieldSplit? How many variables do you have? Are they all C0? Is the basis nodal? If so, you don't necessarily need DMlibMesh.

Assume you have m P1 variables (maybe m==2?). You can lay out your degrees of freedom in variable-major order (by this I mean having contiguous nodal coefficients corresponding to a given node -- maybe that's called 'node-major' :)). Then use these options:

-pc_type fieldsplit -pc_fieldsplit_block_size m -pc_fieldsplit_0_fields 0 -pc_fieldsplit_1_fields 1 \
-pc_fieldsplit_type schur -pc_fieldsplit_schur_factorization_type full

This will put the 0th (in the zero-based C-speak) variable into the 0th split, and the first variable into the first split. You can also permute the column blocks, if necessary, or put more variables into each split. You might try -pc_fieldsplit_schur_factorization_type upper or lower.

The crucial part, however, is to have a good preconditioner for S, which in the terminology of the notes you sent is S = D - C inv(A) B. By default PETSc will use D as the preconditioner for S. That may or may not be good enough. Otherwise you will need to assemble a preconditioner for S, things will get more hairy, and we might need to put more support for that into libMesh.

Hope this helps.
Dmitry.

On Tue, Nov 26, 2013 at 1:56 PM, Subramanya Sadasiva <potaman@...> wrote:
> Hi Dmitry,
> I am using the DM solver to solve a Cahn-Hilliard equation using a C0 discretization. I would like to use the preconditioner used in
> http://www.mcs.anl.gov/~anitescu/Presentations/2011/anitescu2011SIAMCSEDVI.pdf
>
> I have been able to use the PETSc DM solver with the VI solver to solve the Cahn-Hilliard equation, but I haven't been able to get a fieldsplit preconditioner to work.
> Thanks,
> Subramanya
[...]

--
Dmitry Karpeev
Mathematics and Computer Science
Argonne National Laboratory
Argonne, Illinois, USA
and
Computation Institute
University of Chicago
5735 S. Ellis Avenue
Chicago, IL 60637
--
Phone: 630-252-1229
Fax: 630-252-5986
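[Editor's note: the algebra in Dmitry's recipe can be checked on a toy system. The sketch below is plain NumPy with made-up block matrices -- nothing here comes from libMesh or PETSc. It builds the Schur complement S = D - C inv(A) B from the email, verifies that the "full" Schur factorization (the algebra behind -pc_fieldsplit_schur_factorization_type full) reproduces a monolithic solve, and measures how close the default preconditioned operator inv(D) S is to the identity.]

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8  # hypothetical number of dofs per split

# Hypothetical 2x2 block system [[A, B], [C, D]], mirroring the two
# splits that -pc_type fieldsplit would extract (illustrative data only).
A = np.eye(n) + 0.05 * rng.standard_normal((n, n))
B = 0.05 * rng.standard_normal((n, n))
C = 0.05 * rng.standard_normal((n, n))
D = np.eye(n) + 0.05 * rng.standard_normal((n, n))
f, g = rng.standard_normal(n), rng.standard_normal(n)

# Schur complement from the email: S = D - C inv(A) B
S = D - C @ np.linalg.solve(A, B)

# "Full" Schur factorization: eliminate the first split, solve with S,
# then back-substitute.
u = np.linalg.solve(A, f)
y = np.linalg.solve(S, g - C @ u)
x = np.linalg.solve(A, f - B @ y)

# Check against a monolithic solve of the assembled block matrix.
K = np.block([[A, B], [C, D]])
xy = np.linalg.solve(K, np.concatenate([f, g]))
assert np.allclose(np.concatenate([x, y]), xy)

# PETSc's default preconditioner for S is D itself; inv(D) S is close
# to the identity exactly when the coupling blocks B, C are weak --
# which is when that default is "good enough".
dev = np.linalg.norm(np.linalg.solve(D, S) - np.eye(n))
assert dev < 0.5
```

With strong coupling (large B, C) the deviation grows and D stops being a useful preconditioner for S, which is the "more hairy" case Dmitry mentions.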
From: Subramanya Sadasiva <potaman@ou...> - 2013-11-26 19:56:13

Hi Dmitry,
I am using the DM solver to solve a Cahn-Hilliard equation using a C0 discretization. I would like to use the preconditioner used in
http://www.mcs.anl.gov/~anitescu/Presentations/2011/anitescu2011SIAMCSEDVI.pdf

I have been able to use the PETSc DM solver with the VI solver to solve the Cahn-Hilliard equation, but I haven't been able to get a fieldsplit preconditioner to work.

Thanks,
Subramanya

On Nov 26, 2013, at 2:07 PM, Dmitry Karpeyev <karpeev@...> wrote:
> DMlibMesh should work, but it's in need of an overhaul and simplification.
> Could you tell me how you intend to use it? That way we can figure out
> what needs to be done to make it usable.
> Thanks.
> Dmitry.
[...]
From: Derek Gaston <friedmud@gm...> - 2013-11-26 19:43:00

On Tue, Nov 26, 2013 at 11:59 AM, Dmitry Karpeyev <karpeev@...> wrote:
> LU, naturally, should do better, and we should at least see quicker linear convergence.
> If not, it's an indication that your problem is singular.

LU should only take 1(ish) linear iteration. However, I still suspect that the issue here is that his Jacobian is wrong... which accounts for the degraded nonlinear convergence.

Lorenzo: have you double-checked your Jacobian statements? Have you checked them against your friend doing FEAP? Your statements should look very similar to theirs...

Derek
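[Editor's note: the Jacobian check Derek asks for can be automated by comparing the hand-coded Jacobian against a finite-difference approximation, the same idea behind PETSc's Jacobian test options. The sketch below uses an invented two-equation residual, not Lorenzo's elasticity problem.]

```python
import numpy as np

def residual(u):
    # Hypothetical nonlinear residual (stand-in for a libMesh assembly loop).
    return np.array([u[0]**2 + u[1] - 3.0, u[0] * u[1] - 1.0])

def jacobian(u):
    # Hand-coded Jacobian statements -- the analogue of what Derek asks
    # Lorenzo to double-check against his FEAP colleague's.
    return np.array([[2.0 * u[0], 1.0],
                     [u[1], u[0]]])

def fd_jacobian(F, u, eps=1e-7):
    # Column-by-column forward-difference approximation of dF/du.
    n = len(u)
    J = np.zeros((n, n))
    F0 = F(u)
    for j in range(n):
        du = np.zeros(n)
        du[j] = eps
        J[:, j] = (F(u + du) - F0) / eps
    return J

u = np.array([1.5, 0.7])
err = np.abs(jacobian(u) - fd_jacobian(residual, u)).max()
assert err < 1e-5  # a large discrepancy here would explain degraded Newton convergence
```

If the maximum entrywise discrepancy is much larger than the finite-difference truncation error, some Jacobian term is missing or has the wrong sign.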
From: Dmitry Karpeyev <karpeev@mc...> - 2013-11-26 19:08:45

DMlibMesh should work, but it's in need of an overhaul and simplification. Could you tell me how you intend to use it? That way we can figure out what needs to be done to make it usable.
Thanks.
Dmitry.

On Thu, Nov 21, 2013 at 1:32 PM, subramanya sadasiva <potaman@...> wrote:
> Hi, I want to use PETSc fieldsplit preconditioners with the PETSc DM solver that has been implemented as part of libMesh. The last time I tried it, I had trouble because DMCreateFieldDecomposition was not being called. Is there someone that could guide me with trying to correct this in libMesh? Thanks, Subramanya
>
> _______________________________________________
> Libmesh-users mailing list
> Libmesh-users@...
> https://lists.sourceforge.net/lists/listinfo/libmesh-users

--
Dmitry Karpeev
Mathematics and Computer Science
Argonne National Laboratory
Argonne, Illinois, USA
and
Computation Institute
University of Chicago
5735 S. Ellis Avenue
Chicago, IL 60637
--
Phone: 630-252-1229
Fax: 630-252-5986
From: Dmitry Karpeyev <karpeev@mc...> - 2013-11-26 19:00:06

1. It looks like you are using petsc-3.x or older. I would recommend upgrading to petsc-3.4, since we don't really maintain earlier versions.
2. Could you run with -snes_view so we know exactly what solvers you are using?
3. For a small problem running with -pc_type lu, could you post the nonlinear and linear convergence history (-snes_monitor -ksp_monitor)?

The convergence history below seems to have been produced with the default solver options, which for the linear solver are -ksp_type gmres -pc_type ilu. In particular, ILU seems to be a bad preconditioner for this system, but that's not really surprising. LU, naturally, should do better, and we should at least see quicker linear convergence. If not, it's an indication that your problem is singular.

Thanks.
Dmitry.

On Mon, Nov 25, 2013 at 6:23 AM, Lorenzo Zanon <zanon@...> wrote:
> Hello,
>
> Thanks for the detailed answer!
>
> I've just rerun the executable taking off the -snes_mf_operator and LU options, but it doesn't look any better...:
>
> Running ./example-opt -snes_type ls -snes_linesearch_type basic -ksp_rtol 1e-4
> NL step 0, residual_2 = 1.936492e-05
> NL step 1, residual_2 = 1.938856e-05
> NL step 2, residual_2 = 1.941222e-05
> NL step 3, residual_2 = 1.943592e-05
> NL step 4, residual_2 = 1.945964e-05
>
> Then I stopped it because it didn't look like converging, after one hour or so.
>
> Monitoring the linear iterations showed:
>
> Running ./example-opt -ksp_rtol 1e-3 -ksp_monitor
> NL step 0, residual_2 = 1.936492e-05
> 0 KSP Residual norm 7.859921803229e-07
> 1 KSP Residual norm 3.815524538428e-07
> 2 KSP Residual norm 3.364144277753e-07
> 3 KSP Residual norm 2.909576661282e-07
> 4 KSP Residual norm 2.772044053816e-07
> ...
> 1654 KSP Residual norm 8.626050416153e-10
> 1655 KSP Residual norm 8.145093822144e-10
> 1656 KSP Residual norm 6.346659238618e-10
>
> The same happens with the GAMG options (either -pc_gamg_type agg or -pc_gamg_agg_nsmooths 1, from the PETSc manual). But I could still reconfigure PETSc and libMesh with the hypre option.
>
> I still haven't asked my FEAP colleague about the specifications she's using, but she told me it takes 4 or 5 nonlinear iterations to converge, depending on the loading.
>
> Thanks!
> Lorenzo
>
> On Nov 22, 2013, at 7:10 PM, Derek Gaston wrote:
>
> > -snes_mf_operator is a PETSc option specifying that you want to do preconditioned Jacobian-Free Newton Krylov (JFNK). Is the guy using FEAP doing JFNK?
> >
> > JFNK means that every linear iteration inside a Newton step you must recompute your residual. That means a full sweep over the mesh, re-evaluating your material properties and residual statements at every quadrature point to assemble a full residual vector. This is expensive.
> >
> > If you are just doing a single-physics problem (and it sounds like you are) and you have the ability to compute the exact Jacobian (which it sounds like you do), then using -snes_mf_operator will more than likely be MUCH slower than just solving using Newton's method like normal (where you assemble a matrix and a RHS just _once_ per Newton step, then throw a linear solver at it).
> >
> > To use "regular" Newton just leave that option off.
> >
> > Further, you have specified "-pc_type lu"... which is generally a bad idea for performance. With LU you are doing a direct inversion of your Jacobian matrix (old-school style!) and using it as a preconditioner. It is generally much better to use an "inexact" Newton formulation where you don't perfectly invert your Jacobian matrix and instead let your Krylov solver solve to some (fairly loose) tolerance (i.e. use -ksp_rtol 1e-4 or larger) so that you are not "over-solving" your linear problem inside each Newton step.
> >
> > Instead of using LU I highly recommend using an algebraic multigrid preconditioner for solid mechanics. Look into using GAMG in PETSc, or use --download-hypre when configuring PETSc and use: -pc_type hypre -pc_hypre_type boomeramg.
> >
> > Basically: your solver options are non-optimal. Make sure you are solving using _exactly_ the same solver options between libMesh and FEAP before doing any comparisons. You should probably run with -ksp_monitor to show you how many linear iterations you're taking, and then make sure that both your libMesh and FEAP implementations are taking the same (or VERY similar) number of both nonlinear _and_ linear iterations.
> >
> > Derek
> >
> > On Fri, Nov 22, 2013 at 10:08 AM, Lorenzo Zanon <zanon@...> wrote:
> > > Hello,
> > >
> > > I have an example based on miscellaneous_ex3.C, where I implemented a nonlinear elastic problem with a St. Venant stress-strain law on a 3D domain 5x1x1, blocked at x=0 and with an applied load at x=5. The problem is, the computation of the displacement (along x, y and z) is really slow (hours); on our cluster it goes out of CPU time already with a mesh of 64x8x1 (the loading acts only in the y-direction, so no more elements are needed along z). 4 or 5 Newton steps should be enough for solving the problem...
> > >
> > > A colleague of mine implemented the same problem in the software called FEAP, and it takes only a few minutes there. I think my implementation is correct because the results on a very coarse mesh (10x2x1) are roughly similar to the FEAP ones on a finer mesh. I don't have any problems for the 2D case either.
> > >
> > > Is there anything I can do? I'm running in opt mode with the following options concerning the Newton solver:
> > >
> > > -snes_type ls -snes_linesearch_type basic -snes_mf_operator -pc_type lu -pc_factor_nonzeros_along_diagonal
> > >
> > > Thanks!
> > > Lorenzo
>
> _______________________________________________
> Libmesh-users mailing list
> Libmesh-users@...
> https://lists.sourceforge.net/lists/listinfo/libmesh-users

--
Dmitry Karpeev
Mathematics and Computer Science
Argonne National Laboratory
Argonne, Illinois, USA
and
Computation Institute
University of Chicago
5735 S. Ellis Avenue
Chicago, IL 60637
--
Phone: 630-252-1229
Fax: 630-252-5986
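[Editor's note: the convergence behavior discussed in this thread can be illustrated on a toy problem. The sketch below is NumPy with an invented two-equation system, not Lorenzo's elasticity problem. Plain Newton with a correct Jacobian converges in a handful of steps -- the "4 or 5 nonlinear iterations" the FEAP colleague reports -- while a Jacobian with a deliberate sign error still converges here but only linearly, taking noticeably more iterations, matching Derek's suspicion that a wrong Jacobian degrades nonlinear convergence.]

```python
import numpy as np

def newton(F, J, u, tol=1e-10, max_it=60):
    # Plain ("regular") Newton: assemble J and F once per step,
    # solve the linear system, update.
    for it in range(max_it):
        r = F(u)
        if np.linalg.norm(r) < tol:
            return u, it
        u = u - np.linalg.solve(J(u), r)
    return u, max_it

# A made-up two-equation nonlinear system (stand-in for an assembled residual).
F = lambda u: np.array([u[0]**3 + u[1] - 2.0, u[1]**3 - u[0]])
# Correct hand-coded Jacobian...
J_exact = lambda u: np.array([[3.0 * u[0]**2, 1.0],
                              [-1.0, 3.0 * u[1]**2]])
# ...and one with a deliberate sign error in the (1,0) entry,
# the kind of bug suspected in this thread.
J_wrong = lambda u: np.array([[3.0 * u[0]**2, 1.0],
                              [1.0, 3.0 * u[1]**2]])

u0 = np.array([1.5, 0.5])
sol, it_exact = newton(F, J_exact, u0)
_, it_wrong = newton(F, J_wrong, u0)
assert np.linalg.norm(F(sol)) < 1e-10
assert it_exact < it_wrong  # the sign error turns quadratic convergence into linear
```

The same diagnostic applies at scale: if the nonlinear residual shrinks by a roughly constant factor per step instead of quadratically, suspect the Jacobian before the linear solver.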