History log of /petsc/include/petscpc.h (Results 151 – 175 of 1091)
Revision Date Author Comments
# 3cf3a049 20-Jun-2019 Jakub Kruzik <jakub.kruzik@gmail.com>

set projection nullsp


# e924b002 19-Jun-2019 Jakub Kruzik <jakub.kruzik@gmail.com>

set coarse problem mat


# 268c6673 19-Jun-2019 Jakub Kruzik <jakub.kruzik@gmail.com>

get additional PC


# 5558ce3d 17-Jun-2019 Jakub Kruzik <jakub.kruzik@gmail.com>

remove deflation type


# 22b0793e 17-Jun-2019 Jakub Kruzik <jakub.kruzik@gmail.com>

additional PC


# 5c0c31c5 24-May-2019 Jakub Kruzik <jakub.kruzik@gmail.com>

add SetLvl


# e662bc50 07-May-2019 Jakub Kruzik <jakub.kruzik@gmail.com>

set deflation space


# 37eeb815 07-May-2019 Jakub Kruzik <jakub.kruzik@gmail.com>

init commit


# 1c575b32 07-Jul-2019 Barry Smith <bsmith@mcs.anl.gov>

Merge branch 'maint'


# 26bd1501 05-Jul-2019 Barry Smith <bsmith@mcs.anl.gov>

Remove use of _ and __ in front of PETSc include guards. Reason: C99 Reserved Identifiers

Commit-type: portability-fix


# e41697d3 01-Jun-2019 Fande Kong <fdkong.jd@gmail.com>

Merged in Fande-Kong/feature_hmg (pull request #1682)

Hybrid of PETSc preconditioners (such as ASM, BJacobi, SOR, etc.) and Hypre BoomerAMG

Approved-by: BarryFSmith <bsmith@mcs.anl.gov>


# fd2dd295 29-May-2019 Fande Kong <fdkong.jd@gmail.com>

Improve interfaces and make HMG more general

Now HMG can take any PC as an inner PC as long as the PC
provides PCGetInterpolations and PCGetCoarseOperators.

If hypre is available, it will be used by default, otherwise use GAMG.
Users can override the setting using -hmg_inner_pc_type
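The override named in the commit above is an ordinary PETSc runtime option. A minimal sketch of how it might be combined with the standard `-pc_type` selector (option names other than `-hmg_inner_pc_type`, which the commit states explicitly, follow the usual PETSc conventions and are assumptions here):

```
# select the HMG preconditioner for the KSP solve
-pc_type hmg
# override the inner PC used to build the hierarchy (default: hypre if available, else gamg)
-hmg_inner_pc_type gamg
```

Any PC supplied this way must implement PCGetInterpolations and PCGetCoarseOperators, per the commit message.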


# e75ffff3 28-May-2019 Fande Kong <fdkong.jd@gmail.com>

Move declarations to header since the Fortran routine needs these


# 8a2c336b 28-May-2019 Fande Kong <fdkong.jd@gmail.com>

Update interface and support GAMG


# 360ee056 17-Sep-2018 Fande Kong <fdkong.jd@gmail.com>

Implemented an AMG that uses Hypre to coarsen the matrix

and generate a sequence of coarse matrices. These coarse
matrices are used to set up a PETSc PCMG.

There are two benefits:

(1) We can use any PETSc preconditioners such as ASM, BJacobi as
level smoothers (solvers). For some applications, the combination
of ASM and AMG works great while only AMG or AMS does not work at all.

(2) For multicomponent problems, we could just coarsen one submatrix
associated with one particular component. In this way, the setup of the
preconditioner is significantly improved. One typical use case is
neutron transport equations. There are many variables on each mesh
vertex due to the discretization of angle and energy. Each variable,
in fact, corresponds to the same PDEs but with different material
properties.


# 5065da2f 13-May-2019 Barry Smith <bsmith@mcs.anl.gov>

Merge branch 'master' of bitbucket.org:petsc/petsc


# d1240337 12-May-2019 BarryFSmith <bsmith@mcs.anl.gov>

Merged in barry/update-deprecate-functions (pull request #1654)

Change PETSC_DEPRECATED to PETSC_DEPRECATED_FUNCTION and PETSC_DEPRECATED_TYPEDEF for code clarity


# 25ef9dfe 11-May-2019 Barry Smith <bsmith@mcs.anl.gov>

Change PETSC_DEPRECATED to PETSC_DEPRECATED_FUNCTION and PETSC_DEPRECATED_TYPEDEF for code clarity

and to make the macros match the ones for ENUM and MACRO. Add version information for almost all deprecations
in a single consistent format. Remove a couple of unneeded deprecated functions that could be inlined.

Commit-type: style-fix


# f17a9363 24-Apr-2019 Satish Balay <balay@mcs.anl.gov>

Merged in stefano_zampini/fix-matcomputeexplicitop (pull request #1570)

Allow specifying an operator type when computing operators explicitly

Approved-by: BarryFSmith <bsmith@mcs.anl.gov>
Approved-by: Patrick Sanan <patrick.sanan@gmail.com>


# 0bacdada 23-Apr-2019 Stefano Zampini <stefano.zampini@gmail.com>

Deprecate old {Mat|PC|KSP}ComputeExplicitOperator in favor of {Mat|PC|KSP}ComputeOperator


# 186a3e20 21-Apr-2019 Stefano Zampini <stefano.zampini@gmail.com>

{Mat|PC|KSP}ComputeExplicitOperator: add extra argument for matrix type


# 8d0323b9 16-Apr-2019 Pierre Jolivet <pierre.jolivet@enseeiht.fr>

Merged in jolivet/feature-getisbyindex (pull request #1544)

New function PCFieldSplitGetISByIndex

Approved-by: Matthew Knepley <knepley@gmail.com>
Approved-by: BarryFSmith <bsmith@mcs.anl.gov>


# 998f007d 15-Apr-2019 Pierre Jolivet <pierre.jolivet@enseeiht.fr>

Add new function PCFieldSplitGetISByIndex()


# 8000f006 15-Mar-2019 Barry Smith <bsmith@mcs.anl.gov>

Merge branch 'master' of bitbucket.org:petsc/petsc


# af4ac8ca 15-Mar-2019 Karl Rupp <me@karlrupp.net>

Merge branch 'pr1417/wence/feature-patch-all-at-once/master' [PR #1417]

* pr1417/wence/feature-patch-all-at-once/master:
Some new features in PCPatch: interior face terms, and an option for "all at once" assembly.

This adds two new features to PCPatch:

You can now also set callbacks for interior face terms. We need these for problems with IP discretisations.
(Optionally, defaulting to false) do an "all at once" computation of all the element tensors. In our benchmarking this is quite a bit faster (problem dependent).

We have, once again, not hooked up the new callbacks with Plex (there is a TODO + SETERRQ in this case): hopefully it does not require much.
As a result, tests are in Firedrake again: https://github.com/firedrakeproject/firedrake/pull/1383
