Now that the compilation is okay, I have tried to solve one benchmark problem using PETSc. I chose KueperProblem-PS from trunk/benchmarks/PETSc/KueperProblem-PS.
[nbeyond@WinterSugar KueperProblem-PS]$ ls
kueper.bc    kueper.msh                      kueper_partitioned_nodes_3.msh
kueper.gli   kueper.num                      kueper.pcs
kueper.ic    kueper.out                      kueper.st
kueper.mesh  kueper_partitioned_cfg3.msh     kueper.tim
kueper.mfp   kueper_partitioned_elems_3.msh
kueper.mmp   kueper_partitioned.msh
[nbeyond@WinterSugar KueperProblem-PS]$ partmesh
Task option (--metis2ogs or --ogs2metis) is not given. Stop now.
[nbeyond@WinterSugar KueperProblem-PS]$ partmesh --ogs2metis kueper
File name is: kueper
File path is: ./
***Total CPU time elapsed: 0.01s
[nbeyond@WinterSugar KueperProblem-PS]$ more kueper.mesh
2240
1 2 59 58
2 3 60 59
3 4 61 60
4 5 62 61
5 6 63 62
6 7 64 63
7 8 65 64
8 9 66 65
9 10 67 66
10 11 68 67
11 12 69 68
12 13 70 69
13 14 71 70
14 15 72 71
15 16 73 72
16 17 74 73
17 18 75 74
18 19 76 75
19 20 77 76
20 21 78 77
21 22 79 78
22 23 80 79
23 24 81 80
24 25 82 81
25 26 83 82
26 27 84 83
27 28 85 84
28 29 86 85
29 30 87 86
30 31 88 87
31 32 89 88
32 33 90 89
33 34 91 90
34 35 92 91
35 36 93 92
36 37 94 93
37 38 95 94
38 39 96 95
39 40 97 96
40 41 98 97
41 42 99 98
42 43 100 99
43 44 101 100
44 45 102 101
45 46 103 102
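The `kueper.mesh` file written by `--ogs2metis` looks like a METIS-style mesh file: the first line is the element count (2240 here), and each following line lists the node numbers of one element, four per line for these quad elements. A minimal C sketch of a reader for that layout (the 4-node assumption and the file layout are inferred from the dump above, not from partmesh documentation):

```c
#include <stdio.h>
#include <stdlib.h>

/* Read a METIS-style mesh file: first line = element count, then one
 * element per line as whitespace-separated node numbers.
 * Assumes 4-node (quad) elements, as in the kueper.mesh dump above. */
int read_metis_mesh(const char *path, int **elems, int *n_elems)
{
    FILE *f = fopen(path, "r");
    if (!f)
        return -1;
    if (fscanf(f, "%d", n_elems) != 1) {
        fclose(f);
        return -1;
    }
    *elems = malloc((size_t)(*n_elems) * 4 * sizeof(int));
    for (int e = 0; e < *n_elems; ++e)
        for (int k = 0; k < 4; ++k)
            if (fscanf(f, "%d", &(*elems)[4 * e + k]) != 1) {
                free(*elems);
                fclose(f);
                return -1;
            }
    fclose(f);
    return 0;
}
```

For the dump above, element 0 would read back as nodes 1, 2, 59, 58.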
[nbeyond@WinterSugar KueperProblem-PS]$ partmesh --metis2ogs -np 4 -n kueper
File name is: kueper
File path is: ./
***Compute mesh topology
CPU time elapsed in constructing topology of grids: 0.02s
******************************************************************************
METIS 5.0 Copyright 1998-11, Regents of the University of Minnesota
size of idx_t: 32bits, real_t: 32bits, idx_t *: 64bits
Mesh Information ------------------------------------------------------------
Name: kueper.mesh, #Elements: 2240, #Nodes: 2337, #Parts: 4
Options ---------------------------------------------------------------------
ptype=kway, objtype=cut, ctype=shem, rtype=greedy, iptype=metisrb
dbglvl=0, ufactor=1.030, minconn=NO, contig=NO, nooutput=NO
seed=-1, niter=10, ncuts=1
gtype=dual, ncommon=1, niter=10, ncuts=1
Direct k-way Partitioning ---------------------------------------------------
- Edgecut: 285.
Timing Information ----------------------------------------------------------
I/O: 0.001 sec
Partitioning: 0.002 sec (METIS time)
Reporting: 0.001 sec
Memory Information ----------------------------------------------------------
Max memory used: 0.457 MB
******************************************************************************
***Prepare subdomain mesh
Process partition: 0
Process partition: 1
Process partition: 2
Process partition: 3
***Total CPU time elapsed: 0.04s
[nbeyond@WinterSugar KueperProblem-PS]$
Then I ran ogs via mpirun as follows:
[nbeyond@WinterSugar KueperProblem-PS]$ mpirun -np 4 ~/workbench/trunk.build.petsc/bin/ogs kueper
Use PETSc solverNumber of CPUs: 4, rank: 0
Use PETSc solverNumber of CPUs: 4, rank: 1
Use PETSc solverNumber of CPUs: 4, rank: 2
Use PETSc solverNumber of CPUs: 4, rank: 3
kueper
---------------------------------------------
Data input:
kueper
---------------------------------------------
Data input:
###################################################
## ##
## OpenGeoSys-Project ##
## ##
## Helmholtz Center for Environmental Research ##
## UFZ Leipzig - Environmental Informatics ##
## TU Dresden ##
## University of Kiel ##
## University of Edinburgh ##
## University of Tuebingen (ZAG) ##
## Federal Institute for Geosciences ##
## and Natural Resources (BGR) ##
## ##
## Version 5.5(WW) Date 22.05.2014 ##
## ##
kueper
---------------------------------------------
Data input:
###################################################
File name (without extension): kueper
---------------------------------------------
Data input:
GEOLIB::readGLIFile open stream from file kueper.gli ... done
read points from stream ... GEOLIB::readGLIFile open stream from file kueper.gli ... done
read points from stream ... GEOLIB::readGLIFile open stream from file kueper.gli ... done
read points from stream ... ok, 8 points read
read polylines from stream ... ok, 8 points read
read polylines from stream ... ok, 5 polylines read
PCSRead ... ok, 5 polylines read
PCSRead ... tag #SURFACE not found or input stream error in GEOObjects
tag #SURFACE not found or input stream error in GEOObjects
ok, 8 points read
read polylines from stream ... done, read 1 processes
MFPRead
GEOLIB::readGLIFile open stream from file kueper.gli ... done
ok, 5 polylines read
PCSRead ... read points from stream ... tag #SURFACE not found or input stream error in GEOObjects
BCRead ... done, read 1 processes
MFPRead
ok, 8 points read
done, read 1 processes
MFPRead
BCRead ... read polylines from stream ... done, read 3 boundary conditions
STRead ... WARNING: in STRead: could not read source term
ok, 5 polylines read
done, read 3 boundary conditions
STRead ... done, read 0 source terms
ICRead
BCRead ... tag #SURFACE not found or input stream error in GEOObjects
WARNING: in STRead: could not read source term
PCSRead ... done, read 0 source terms
ICRead
OUTRead
done, read 1 processes
MFPRead
OUTRead
done, read 3 boundary conditions
TIMRead
STRead ... WARNING: in STRead: could not read source term
MMPRead ... MMPRead ... done, read 0 source terms
ICRead
BCRead ...
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
OUTRead
done, read 3 boundary conditions
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
STRead ... done, read 0 source terms
may enter capillary pressure specific parameters directly.
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
TIMRead
ICRead
WARNING: in STRead: could not read source term
OUTRead
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
NUMRead
-> Mass lumping selected for PS_GLOBAL
--
Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.
--
NUMRead
-> Mass lumping selected for PS_GLOBAL
--
Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.
--
MMPRead ...
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
TIMRead
MMPRead ... --/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
NUMRead
-> Mass lumping selected for PS_GLOBAL
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
--
Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.
may enter capillary pressure specific parameters directly.
--
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
NUMRead
-> Mass lumping selected for PS_GLOBAL
--
Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.
--
[1]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[1]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[1]PETSC ERROR: to get more information on the crash.
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[0]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[0]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[1]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[1]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[1]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
-->Reading binary mesh file ...[3]PETSC ERROR: ------------------------------------------------------------------------
[3]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[3]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[3]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[3]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[3]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[3]PETSC ERROR: to get more information on the crash.
[3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3]PETSC ERROR: Signal received
[3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[3]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[3]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[3]PETSC ERROR: #1 User provided function() line 0 in unknown file
[2]PETSC ERROR: ------------------------------------------------------------------------
[2]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[2]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[2]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[2]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[2]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[2]PETSC ERROR: to get more information on the crash.
[2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[2]PETSC ERROR: Signal received
[2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[2]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[2]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[2]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[2]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 20468 on
node WinterSugar exiting improperly. There are two reasons this could occur:
1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.
2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"
This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[WinterSugar:20467] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[WinterSugar:20467] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[nbeyond@WinterSugar KueperProblem-PS]$
After all this, I have two questions, the second conditional on the first:
1. Is the OGS PETSc installation itself successful? If so, the problem lies within the KueperProblem-PS setup.
2. If not, what is wrong with the installation?
On Wednesday, December 3, 2014 2:06:13 PM UTC+9, Chan-Hee Park wrote:
    After analysing the compilation error quoted below, I revised
    the code to
    PetscSynchronizedFlush(PETSC_COMM_WORLD, NULL);
    This made the compiler happy, but I do not know whether such a
    small revision causes any unintended error.
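For context on that one-line fix: starting with PETSc 3.5, PetscSynchronizedFlush() takes a second argument, the FILE* to which output queued by PetscSynchronizedPrintf() on the other ranks is flushed. A sketch of the new calling convention (this is a standalone illustration, not the OGS source; PETSC_STDOUT is PETSc's documented target, and passing NULL is likely only harmless when no synchronized output is pending):

```c
#include <petscsys.h>

int main(int argc, char **argv)
{
    PetscInitialize(&argc, &argv, NULL, NULL);

    PetscMPIInt rank;
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

    /* Each rank queues a line; output is held until the flush. */
    PetscSynchronizedPrintf(PETSC_COMM_WORLD, "rank %d checking in\n", rank);

    /* PETSc >= 3.5: the flush target must be given explicitly.
     * PETSC_STDOUT is the usual choice; NULL compiles but leaves the
     * destination unspecified if anything is still queued. */
    PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);

    PetscFinalize();
    return 0;
}
```

So the revision should be safe for compilation either way, but using PETSC_STDOUT instead of NULL matches the upstream migration guidance more closely.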