Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Commits on Source (17)
Showing with 125 additions and 74 deletions
@@ -1057,15 +1057,15 @@ mswin-gnu:
 #  variables:
 #    TEST_ARCH: arch-ci-mswin-intel

-macos-cxx-cmplx-pkgs-dbg:
+macos-cxx-cmplx-pkgs-dbg-arm:
   extends:
     - .stage-3
     - .macos_test
     - .coverage-disable
   tags:
-    - os:macos-x64
+    - os:macos-arm
   variables:
-    TEST_ARCH: arch-ci-macos-cxx-cmplx-pkgs-dbg
+    TEST_ARCH: arch-ci-macos-cxx-cmplx-pkgs-dbg-arm
     INIT_SCRIPT: .zprofile

 macos-cxx-pkgs-opt-arm:
......
@@ -12,6 +12,12 @@ class Configure(config.package.GNUPackage):
     self.downloadonWindows = 1
     return

+  def setupDependencies(self, framework):
+    config.package.Package.setupDependencies(self, framework)
+    self.mathlib = framework.require('config.packages.mathlib', self)
+    self.deps    = [self.mathlib]
+    return
+
   def Install(self):
     macos_deployment = ''
     if 'MACOSX_DEPLOYMENT_TARGET' in os.environ:
......
@@ -7940,7 +7940,7 @@ PetscErrorCode DMSetFineDM(DM dm, DM fdm)
 . Nv       - The number of `DMLabel` values for constrained points
 . values   - An array of values for constrained points
 . field    - The field to constrain
-. Nc       - The number of constrained field components (0 will constrain all fields)
+. Nc       - The number of constrained field components (0 will constrain all components)
 . comps    - An array of constrained component numbers
 . bcFunc   - A pointwise function giving boundary values
 . bcFunc_t - A pointwise function giving the time derivative of the boundary values, or NULL
......
@@ -999,7 +999,7 @@ static PetscErrorCode MatProductSetFromOptions_SchurComplement_Dense(Mat C)
   Mat_Product *product = C->product;

   PetscFunctionBegin;
-  PetscCheck(product->type == MATPRODUCT_AB, PetscObjectComm((PetscObject)C), PETSC_ERR_PLIB, "Not for product type %s", MatProductTypes[product->type]);
+  if (product->type != MATPRODUCT_AB) PetscFunctionReturn(PETSC_SUCCESS);
   C->ops->productsymbolic = MatProductSymbolic_SchurComplement_Dense;
   PetscFunctionReturn(PETSC_SUCCESS);
 }
......
@@ -715,6 +715,7 @@ static PetscErrorCode PCView_Deflation(PC pc, PetscViewer viewer)
   PetscBool iascii;

   PetscFunctionBegin;
+  if (!pc->setupcalled) PetscFunctionReturn(PETSC_SUCCESS);
   PetscCall(PetscObjectTypeCompare((PetscObject)viewer, PETSCVIEWERASCII, &iascii));
   if (iascii) {
     if (def->correct) PetscCall(PetscViewerASCIIPrintf(viewer, "using CP correction, factor = %g+%gi\n", (double)PetscRealPart(def->correctfact), (double)PetscImaginaryPart(def->correctfact)));
......
@@ -689,9 +689,6 @@ PetscErrorCode PCSetFromOptions_MG(PC pc, PetscOptionItems *PetscOptionsObject)
   mgctype = (PCMGCycleType)mglevels[0]->cycles;
   PetscCall(PetscOptionsEnum("-pc_mg_cycle_type", "V cycle or for W-cycle", "PCMGSetCycleType", PCMGCycleTypes, (PetscEnum)mgctype, (PetscEnum *)&mgctype, &flg));
   if (flg) PetscCall(PCMGSetCycleType(pc, mgctype));
-  gtype = mg->galerkin;
-  PetscCall(PetscOptionsEnum("-pc_mg_galerkin", "Use Galerkin process to compute coarser operators", "PCMGSetGalerkin", PCMGGalerkinTypes, (PetscEnum)gtype, (PetscEnum *)&gtype, &flg));
-  if (flg) PetscCall(PCMGSetGalerkin(pc, gtype));
   coarseSpaceType = mg->coarseSpaceType;
   PetscCall(PetscOptionsEnum("-pc_mg_adapt_interp_coarse_space", "Type of adaptive coarse space: none, polynomial, harmonic, eigenvector, generalized_eigenvector, gdsw", "PCMGSetAdaptCoarseSpaceType", PCMGCoarseSpaceTypes, (PetscEnum)coarseSpaceType, (PetscEnum *)&coarseSpaceType, &flg));
   if (flg) PetscCall(PCMGSetAdaptCoarseSpaceType(pc, coarseSpaceType));
@@ -703,6 +700,8 @@ PetscErrorCode PCSetFromOptions_MG(PC pc, PetscOptionItems *PetscOptionsObject)
   flg = PETSC_FALSE;
   PetscCall(PetscOptionsBool("-pc_mg_distinct_smoothup", "Create separate smoothup KSP and append the prefix _up", "PCMGSetDistinctSmoothUp", PETSC_FALSE, &flg, NULL));
   if (flg) PetscCall(PCMGSetDistinctSmoothUp(pc));
+  PetscCall(PetscOptionsEnum("-pc_mg_galerkin", "Use Galerkin process to compute coarser operators", "PCMGSetGalerkin", PCMGGalerkinTypes, (PetscEnum)mg->galerkin, (PetscEnum *)&gtype, &flg));
+  if (flg) PetscCall(PCMGSetGalerkin(pc, gtype));
   mgtype = mg->am;
   PetscCall(PetscOptionsEnum("-pc_mg_type", "Multigrid type", "PCMGSetType", PCMGTypes, (PetscEnum)mgtype, (PetscEnum *)&mgtype, &flg));
   if (flg) PetscCall(PCMGSetType(pc, mgtype));
......
@@ -1116,6 +1116,7 @@ PetscErrorCode PCSetUpOnBlocks(PC pc)
   PetscValidHeaderSpecific(pc, PC_CLASSID, 1);
   if (!pc->setupcalled) PetscCall(PCSetUp(pc)); /* "if" to prevent -info extra prints */
   if (!pc->ops->setuponblocks) PetscFunctionReturn(PETSC_SUCCESS);
+  PetscCall(MatSetErrorIfFailure(pc->pmat, pc->erroriffailure));
   PetscCall(PetscLogEventBegin(PC_SetUpOnBlocks, pc, 0, 0, 0));
   PetscCall(PCLogEventsDeactivatePush());
   PetscUseTypeMethod(pc, setuponblocks);
......
@@ -21,9 +21,8 @@ typedef struct {
 #endif
   PetscInt     partial_dim;
   fftw_plan    p_forward, p_backward;
-  unsigned     p_flag; /* planner flags, FFTW_ESTIMATE,FFTW_MEASURE, FFTW_PATIENT, FFTW_EXHAUSTIVE */
-  PetscScalar *finarray, *foutarray, *binarray, *boutarray; /* keep track of arrays because fftw plan should be
-                                                               executed for the arrays with which the plan was created */
+  unsigned     p_flag; /* planner flags, FFTW_ESTIMATE, FFTW_MEASURE, FFTW_PATIENT, FFTW_EXHAUSTIVE */
+  PetscScalar *finarray, *foutarray, *binarray, *boutarray; /* keep track of arrays because fftw_execute() can only be executed for the arrays with which the plan was created */
 } Mat_FFTW;

 extern PetscErrorCode MatMult_SeqFFTW(Mat, Vec, Vec);
@@ -42,6 +41,7 @@ PetscErrorCode MatMult_SeqFFTW(Mat A, Vec x, Vec y)
   Mat_FFTW          *fftw = (Mat_FFTW *)fft->data;
   const PetscScalar *x_array;
   PetscScalar       *y_array;
+  Vec                xx;
 #if defined(PETSC_USE_COMPLEX)
 #if defined(PETSC_USE_64BIT_INDICES)
   fftw_iodim64 *iodims;
@@ -53,7 +53,13 @@ PetscErrorCode MatMult_SeqFFTW(Mat A, Vec x, Vec y)
   PetscInt ndim = fft->ndim, *dim = fft->dim;

   PetscFunctionBegin;
-  PetscCall(VecGetArrayRead(x, &x_array));
+  if (!fftw->p_forward && fftw->p_flag != FFTW_ESTIMATE) {
+    /* The data in the in/out arrays is overwritten so need a dummy array for computation, see FFTW manual sec2.1 or sec4 */
+    PetscCall(VecDuplicate(x, &xx));
+    PetscCall(VecGetArrayRead(xx, &x_array));
+  } else {
+    PetscCall(VecGetArrayRead(x, &x_array));
+  }
   PetscCall(VecGetArray(y, &y_array));
   if (!fftw->p_forward) { /* create a plan, then execute it */
     switch (ndim) {
@@ -108,21 +114,25 @@ PetscErrorCode MatMult_SeqFFTW(Mat A, Vec x, Vec y)
 #endif
       break;
     }
-    fftw->finarray  = (PetscScalar *)x_array;
-    fftw->foutarray = y_array;
-    /* Warning: if (fftw->p_flag!==FFTW_ESTIMATE) The data in the in/out arrays is overwritten!
-       planning should be done before x is initialized! See FFTW manual sec2.1 or sec4 */
-    fftw_execute(fftw->p_forward);
-  } else { /* use existing plan */
-    if (fftw->finarray != x_array || fftw->foutarray != y_array) { /* use existing plan on new arrays */
+    if (fftw->p_flag != FFTW_ESTIMATE) {
+      /* The data in the in/out arrays is overwritten so need a dummy array for computation, see FFTW manual sec2.1 or sec4 */
+      PetscCall(VecRestoreArrayRead(xx, &x_array));
+      PetscCall(VecDestroy(&xx));
+      PetscCall(VecGetArrayRead(x, &x_array));
+    } else {
+      fftw->finarray  = (PetscScalar *)x_array;
+      fftw->foutarray = y_array;
+    }
+  }
+  if (fftw->finarray != x_array || fftw->foutarray != y_array) { /* use existing plan on new arrays */
 #if defined(PETSC_USE_COMPLEX)
-      fftw_execute_dft(fftw->p_forward, (fftw_complex *)x_array, (fftw_complex *)y_array);
+    fftw_execute_dft(fftw->p_forward, (fftw_complex *)x_array, (fftw_complex *)y_array);
 #else
-      fftw_execute_dft_r2c(fftw->p_forward, (double *)x_array, (fftw_complex *)y_array);
+    fftw_execute_dft_r2c(fftw->p_forward, (double *)x_array, (fftw_complex *)y_array);
 #endif
-    } else {
-      fftw_execute(fftw->p_forward);
-    }
-  }
+  } else {
+    fftw_execute(fftw->p_forward);
+  }
   PetscCall(VecRestoreArray(y, &y_array));
   PetscCall(VecRestoreArrayRead(x, &x_array));
@@ -136,6 +146,7 @@ PetscErrorCode MatMultTranspose_SeqFFTW(Mat A, Vec x, Vec y)
   const PetscScalar *x_array;
   PetscScalar       *y_array;
   PetscInt           ndim = fft->ndim, *dim = fft->dim;
+  Vec                xx;
 #if defined(PETSC_USE_COMPLEX)
 #if defined(PETSC_USE_64BIT_INDICES)
   fftw_iodim64 *iodims = fftw->iodims;
@@ -145,7 +156,13 @@ PetscErrorCode MatMultTranspose_SeqFFTW(Mat A, Vec x, Vec y)
 #endif

   PetscFunctionBegin;
-  PetscCall(VecGetArrayRead(x, &x_array));
+  if (!fftw->p_backward && fftw->p_flag != FFTW_ESTIMATE) {
+    /* The data in the in/out arrays is overwritten so need a dummy array for computation, see FFTW manual sec2.1 or sec4 */
+    PetscCall(VecDuplicate(x, &xx));
+    PetscCall(VecGetArrayRead(xx, &x_array));
+  } else {
+    PetscCall(VecGetArrayRead(x, &x_array));
+  }
   PetscCall(VecGetArray(y, &y_array));
   if (!fftw->p_backward) { /* create a plan, then execute it */
     switch (ndim) {
@@ -182,19 +199,24 @@ PetscErrorCode MatMultTranspose_SeqFFTW(Mat A, Vec x, Vec y)
 #endif
       break;
     }
-    fftw->binarray  = (PetscScalar *)x_array;
-    fftw->boutarray = y_array;
-    fftw_execute(fftw->p_backward);
-  } else { /* use existing plan */
-    if (fftw->binarray != x_array || fftw->boutarray != y_array) { /* use existing plan on new arrays */
+    if (fftw->p_flag != FFTW_ESTIMATE) {
+      /* The data in the in/out arrays is overwritten so need a dummy array for computation, see FFTW manual sec2.1 or sec4 */
+      PetscCall(VecRestoreArrayRead(xx, &x_array));
+      PetscCall(VecDestroy(&xx));
+      PetscCall(VecGetArrayRead(x, &x_array));
+    } else {
+      fftw->binarray  = (PetscScalar *)x_array;
+      fftw->boutarray = y_array;
+    }
+  }
+  if (fftw->binarray != x_array || fftw->boutarray != y_array) { /* use existing plan on new arrays */
 #if defined(PETSC_USE_COMPLEX)
-      fftw_execute_dft(fftw->p_backward, (fftw_complex *)x_array, (fftw_complex *)y_array);
+    fftw_execute_dft(fftw->p_backward, (fftw_complex *)x_array, (fftw_complex *)y_array);
 #else
-      fftw_execute_dft_c2r(fftw->p_backward, (fftw_complex *)x_array, (double *)y_array);
+    fftw_execute_dft_c2r(fftw->p_backward, (fftw_complex *)x_array, (double *)y_array);
 #endif
-    } else {
-      fftw_execute(fftw->p_backward);
-    }
-  }
+  } else {
+    fftw_execute(fftw->p_backward);
+  }
   PetscCall(VecRestoreArray(y, &y_array));
   PetscCall(VecRestoreArrayRead(x, &x_array));
@@ -210,10 +232,17 @@ PetscErrorCode MatMult_MPIFFTW(Mat A, Vec x, Vec y)
   PetscScalar *y_array;
   PetscInt     ndim = fft->ndim, *dim = fft->dim;
   MPI_Comm     comm;
+  Vec          xx;

   PetscFunctionBegin;
   PetscCall(PetscObjectGetComm((PetscObject)A, &comm));
-  PetscCall(VecGetArrayRead(x, &x_array));
+  if (!fftw->p_forward && fftw->p_flag != FFTW_ESTIMATE) {
+    /* The data in the in/out arrays is overwritten so need a dummy array for computation, see FFTW manual sec2.1 or sec4 */
+    PetscCall(VecDuplicate(x, &xx));
+    PetscCall(VecGetArrayRead(xx, &x_array));
+  } else {
+    PetscCall(VecGetArrayRead(x, &x_array));
+  }
   PetscCall(VecGetArray(y, &y_array));
   if (!fftw->p_forward) { /* create a plan, then execute it */
     switch (ndim) {
@@ -246,18 +275,21 @@ PetscErrorCode MatMult_MPIFFTW(Mat A, Vec x, Vec y)
 #endif
       break;
     }
-    fftw->finarray  = (PetscScalar *)x_array;
-    fftw->foutarray = y_array;
-    /* Warning: if (fftw->p_flag!==FFTW_ESTIMATE) The data in the in/out arrays is overwritten!
-       planning should be done before x is initialized! See FFTW manual sec2.1 or sec4 */
-    fftw_execute(fftw->p_forward);
-  } else { /* use existing plan */
-    if (fftw->finarray != x_array || fftw->foutarray != y_array) { /* use existing plan on new arrays */
-      fftw_execute_dft(fftw->p_forward, (fftw_complex *)x_array, (fftw_complex *)y_array);
-    } else {
-      fftw_execute(fftw->p_forward);
-    }
-  }
+    if (fftw->p_flag != FFTW_ESTIMATE) {
+      /* The data in the in/out arrays is overwritten so need a dummy array for computation, see FFTW manual sec2.1 or sec4 */
+      PetscCall(VecRestoreArrayRead(xx, &x_array));
+      PetscCall(VecDestroy(&xx));
+      PetscCall(VecGetArrayRead(x, &x_array));
+    } else {
+      fftw->finarray  = (PetscScalar *)x_array;
+      fftw->foutarray = y_array;
+    }
+  }
+  if (fftw->finarray != x_array || fftw->foutarray != y_array) { /* use existing plan on new arrays */
+    fftw_execute_dft(fftw->p_forward, (fftw_complex *)x_array, (fftw_complex *)y_array);
+  } else {
+    fftw_execute(fftw->p_forward);
+  }
   PetscCall(VecRestoreArray(y, &y_array));
   PetscCall(VecRestoreArrayRead(x, &x_array));
   PetscFunctionReturn(PETSC_SUCCESS);
@@ -271,10 +303,17 @@ PetscErrorCode MatMultTranspose_MPIFFTW(Mat A, Vec x, Vec y)
   PetscScalar *y_array;
   PetscInt     ndim = fft->ndim, *dim = fft->dim;
   MPI_Comm     comm;
+  Vec          xx;

   PetscFunctionBegin;
   PetscCall(PetscObjectGetComm((PetscObject)A, &comm));
-  PetscCall(VecGetArrayRead(x, &x_array));
+  if (!fftw->p_backward && fftw->p_flag != FFTW_ESTIMATE) {
+    /* The data in the in/out arrays is overwritten so need a dummy array for computation, see FFTW manual sec2.1 or sec4 */
+    PetscCall(VecDuplicate(x, &xx));
+    PetscCall(VecGetArrayRead(xx, &x_array));
+  } else {
+    PetscCall(VecGetArrayRead(x, &x_array));
+  }
   PetscCall(VecGetArray(y, &y_array));
   if (!fftw->p_backward) { /* create a plan, then execute it */
     switch (ndim) {
@@ -307,16 +346,21 @@ PetscErrorCode MatMultTranspose_MPIFFTW(Mat A, Vec x, Vec y)
 #endif
       break;
     }
-    fftw->binarray  = (PetscScalar *)x_array;
-    fftw->boutarray = y_array;
-    fftw_execute(fftw->p_backward);
-  } else { /* use existing plan */
-    if (fftw->binarray != x_array || fftw->boutarray != y_array) { /* use existing plan on new arrays */
-      fftw_execute_dft(fftw->p_backward, (fftw_complex *)x_array, (fftw_complex *)y_array);
-    } else {
-      fftw_execute(fftw->p_backward);
-    }
-  }
+    if (fftw->p_flag != FFTW_ESTIMATE) {
+      /* The data in the in/out arrays is overwritten so need a dummy array for computation, see FFTW manual sec2.1 or sec4 */
+      PetscCall(VecRestoreArrayRead(xx, &x_array));
+      PetscCall(VecDestroy(&xx));
+      PetscCall(VecGetArrayRead(x, &x_array));
+    } else {
+      fftw->binarray  = (PetscScalar *)x_array;
+      fftw->boutarray = y_array;
+    }
+  }
+  if (fftw->binarray != x_array || fftw->boutarray != y_array) { /* use existing plan on new arrays */
+    fftw_execute_dft(fftw->p_backward, (fftw_complex *)x_array, (fftw_complex *)y_array);
+  } else {
+    fftw_execute(fftw->p_backward);
+  }
   PetscCall(VecRestoreArray(y, &y_array));
   PetscCall(VecRestoreArrayRead(x, &x_array));
   PetscFunctionReturn(PETSC_SUCCESS);
......
@@ -2846,11 +2846,14 @@ static PetscErrorCode MatSetLocalToGlobalMapping_IS(Mat A, ISLocalToGlobalMappin
 static PetscErrorCode MatSetUp_IS(Mat A)
 {
   Mat_IS                *is = (Mat_IS *)A->data;
   ISLocalToGlobalMapping rmap, cmap;

   PetscFunctionBegin;
-  PetscCall(MatGetLocalToGlobalMapping(A, &rmap, &cmap));
-  if (!rmap && !cmap) PetscCall(MatSetLocalToGlobalMapping(A, NULL, NULL));
+  if (!is->sf) {
+    PetscCall(MatGetLocalToGlobalMapping(A, &rmap, &cmap));
+    PetscCall(MatSetLocalToGlobalMapping(A, rmap, cmap));
+  }
   PetscFunctionReturn(PETSC_SUCCESS);
 }
......
@@ -15,7 +15,6 @@ int main(int argc, char **args)
   Mat         A;          /* FFT Matrix */
   Vec         x, y, z;    /* Work vectors */
   Vec         x1, y1, z1; /* Duplicate vectors */
-  PetscInt    i, k;       /* for iterating over dimensions */
   PetscRandom rdm;        /* for creating random input */
   PetscScalar a;          /* used to scale output */
   PetscReal   enorm;      /* norm for sanity check */
@@ -31,10 +30,10 @@ int main(int argc, char **args)
   PetscCall(PetscRandomSetFromOptions(rdm));

   /* Iterate over dimensions, use PETSc-FFTW interface */
-  for (i = 1; i < 5; i++) {
+  for (PetscInt i = 1; i < 5; i++) {
     DIM = i;
     N   = 1;
-    for (k = 0; k < i; k++) {
+    for (PetscInt k = 0; k < i; k++) {
       dim[k] = n;
       N *= n;
     }
@@ -72,7 +71,7 @@ int main(int argc, char **args)
     PetscCall(VecScale(z1, a));
     PetscCall(VecAXPY(z1, -1.0, x));
     PetscCall(VecNorm(z1, NORM_1, &enorm));
-    if (enorm > 1.e-9) PetscCall(PetscPrintf(PETSC_COMM_WORLD, "  Error norm of |x - z1| %g\n", enorm));
+    if (enorm > 1.e-9) PetscCall(PetscPrintf(PETSC_COMM_WORLD, "  Error norm of |x - z1| %g dimension %" PetscInt_FMT "\n", enorm, i));

     /* free spaces */
     PetscCall(VecDestroy(&x1));
......
@@ -2,10 +2,7 @@
1 dimensions: FFTW on vector of size 12
2 dimensions: FFTW on vector of size 144
Error norm of |x - z1| 106.931
3 dimensions: FFTW on vector of size 1728
Error norm of |x - z1| 107.435
4 dimensions: FFTW on vector of size 20736
Error norm of |x - z1| 109.66
1 dimensions: FFTW on vector of size 10
Error norm of |x - z1| 7.23451
2 dimensions: FFTW on vector of size 100
Error norm of |x - z1| 74.7894
3 dimensions: FFTW on vector of size 1000
Error norm of |x - z1| 739.169
4 dimensions: FFTW on vector of size 10000
Error norm of |x - z1| 7661.66
1 dimensions: FFTW on vector of size 5
Error norm of |x - z1| 2.77388
2 dimensions: FFTW on vector of size 25
Error norm of |x - z1| 21.1472
3 dimensions: FFTW on vector of size 125
Error norm of |x - z1| 91.4129
4 dimensions: FFTW on vector of size 625
Error norm of |x - z1| 471.544
@@ -227,13 +227,22 @@ static PetscErrorCode TSStep_Theta(TS ts)
     PetscCall(TSAdaptCheckStage(ts->adapt, ts, th->stage_time, th->X, &stageok));
     if (!stageok) goto reject_step;

     th->status = TS_STEP_PENDING;
-    if (th->endpoint || th->Theta == 1) {
+    if (th->endpoint) {
       PetscCall(VecCopy(th->X, ts->vec_sol));
     } else {
-      PetscCall(VecAXPBYPCZ(th->Xdot, -th->shift, th->shift, 0, th->X0, th->X));
-      PetscCall(VecAXPY(ts->vec_sol, ts->time_step, th->Xdot));
+      PetscCall(VecAXPBYPCZ(th->Xdot, -th->shift, th->shift, 0, th->X0, th->X)); /* th->Xdot is needed by TSInterpolate_Theta */
+      if (th->Theta == 1.0) PetscCall(VecCopy(th->X, ts->vec_sol)); /* BEULER, stage already checked */
+      else {
+        PetscCall(VecAXPY(ts->vec_sol, ts->time_step, th->Xdot));
+        PetscCall(TSAdaptCheckStage(ts->adapt, ts, ts->ptime + ts->time_step, ts->vec_sol, &stageok));
+        if (!stageok) {
+          PetscCall(VecCopy(th->X0, ts->vec_sol));
+          goto reject_step;
+        }
+      }
     }
     th->status = TS_STEP_PENDING;
     PetscCall(TSAdaptChoose(ts->adapt, ts, ts->time_step, NULL, &next_time_step, &accept));
     th->status = accept ? TS_STEP_COMPLETE : TS_STEP_INCOMPLETE;
     if (!accept) {
......
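For context on the TSStep_Theta hunk, a short sketch of the quantities involved, in standard theta-method notation (this is a reading of the code above, not taken verbatim from PETSc documentation). The `VecAXPBYPCZ` call forms the stage derivative from the stage value `th->X` and the step start `th->X0`, and the `VecAXPY` call then advances the solution:

```latex
% Stage derivative recovered from the stage value X and start value X_0,
% with th->shift = 1/(\theta \, \Delta t):
\dot X = \texttt{shift}\,(X - X_0) = \frac{X - X_0}{\theta\,\Delta t}
% (VecAXPBYPCZ(Xdot, -shift, shift, 0, X0, X) computes -shift*X0 + shift*X)

% Non-endpoint update (VecAXPY(vec_sol, time_step, Xdot)):
x_{n+1} = x_n + \Delta t\,\dot X

% For \theta = 1 (backward Euler) the stage value IS the new solution,
% hence the VecCopy(th->X, ts->vec_sol) shortcut:
x_{n+1} = X
```

This explains why the rewritten branch copies `th->X` directly when `th->Theta == 1.0`: the `VecAXPY` update would reproduce the same vector, and the stage has already passed `TSAdaptCheckStage`, so the extra check is only needed on the genuinely new combination produced in the `else` branch.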