
About OpenGeoSys MPI PETSc version installation

Dear everyone involved with the OpenGeoSys MPI/PETSc version,

I am posting a question and error report from a custom installation of the OpenGeoSys PETSc version on the latest CentOS 7 on one of my workstations. This is not my first attempt: a previous attempt on Fedora 20 failed outright after a couple of days of trying together with the release manager; for some reason an error occurred at the OpenGeoSys compilation stage after PETSc and the other dependencies were installed. To keep the story short for this attempt, here is what I did before hitting the error. I believe this error is different from the one I had on Fedora 20.

  1. For PETSc, I cloned the maint branch with the following command:

git clone -b maint https://bitbucket.org/petsc/petsc petsc

  2. Then I configured, built, and installed PETSc with the following options:
    ./configure --PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre --with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
    make PETSC_DIR=/home/nbeyond/git/petsc PETSC_ARCH=linux-fast
    su -c "make PETSC_DIR=/home/nbeyond/git/petsc PETSC_ARCH=linux-fast install"

  3. Installed mesh_partition:

git clone https://github.com/ufz/mesh_partition.git
cd metis-5.0.2/
git submodule init
git submodule update

  4. Module configuration
    In ~/Modules, a modulefile named petsc_release is created with the following contents:

#%Module1.0

module-whatis "PETSc (v3.2-p7): Portable, Extensible Toolkit for Scientific Computation (http://www.mcs.anl.gov/petsc/)"

conflict petsc
#module load gcc/4.8.1-2
#module load openmpi/gcc/1.7.2-1
set path /opt/petsc

prepend-path PETSC_DIR $path
#prepend-path PETSC_ARCH
prepend-path PATH $path/bin
prepend-path LD_LIBRARY_PATH $path/lib

prepend-path -d " " CPPFLAGS -I$path/include
prepend-path -d " " LDFLAGS -L$path/lib

  5. Added the following to the .bashrc file for the module path:

# for modules
export MODULEPATH="$MODULEPATH:/home/nbeyond/Modules"

  6. Loaded the module:
    [nbeyond@WinterSugar trunk.build.petsc]$ module load petsc_release

  7. Compiled the OpenGeoSys trunk version as of this posting. Configuration then failed with the following error:

[nbeyond@WinterSugar trunk.build.petsc]$ cmake ~/svn/ogs/trunk/sources -DOGS_FEM_PETSC=ON -DCMAKE_BUILD_TYPE=Release
-- The C compiler identification is GNU 4.8.2
-- The CXX compiler identification is GNU 4.8.2
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Found PythonInterp: /usr/bin/python (found version "2.7.5")
-- Found Subversion: /usr/bin/svn (found version "1.7.14")
-- Found Git: /usr/bin/git (found version "1.8.3.1")
-- Found Doxygen: /usr/bin/doxygen (found version "1.8.5")
-- Could NOT find cppcheck (missing: CPPCHECK_EXECUTABLE CPPCHECK_POSSIBLEERROR_ARG CPPCHECK_UNUSEDFUNC_ARG CPPCHECK_STYLE_ARG CPPCHECK_INCLUDEPATH_ARG CPPCHECK_QUIET_ARG)
-- Number of processors: 24
-- Configuring for PETSc
-- Recognized PETSc install with single library for all packages
-- Performing Test MULTIPASS_TEST_1_petsc_works_minimal
-- Performing Test MULTIPASS_TEST_1_petsc_works_minimal - Failed
-- Performing Test MULTIPASS_TEST_2_petsc_works_allincludes
-- Performing Test MULTIPASS_TEST_2_petsc_works_allincludes - Failed
-- Performing Test MULTIPASS_TEST_3_petsc_works_alllibraries
-- Performing Test MULTIPASS_TEST_3_petsc_works_alllibraries - Failed
-- Performing Test MULTIPASS_TEST_4_petsc_works_all
-- Performing Test MULTIPASS_TEST_4_petsc_works_all - Success
-- PETSc requires extra include paths and explicit linking to all dependencies. This probably means you have static libraries and something unexpected in PETSc headers.
-- found version greater 3.3, version is 3.5.2
-- Could NOT find MPI_C (missing: MPI_C_LIBRARIES MPI_C_INCLUDE_PATH)
-- Could NOT find MPI_CXX (missing: MPI_CXX_LIBRARIES MPI_CXX_INCLUDE_PATH)
CMake Error at CMakeLists.txt:149 (MESSAGE):
  Aborting: MPI implementation is not found!

-- Configuring incomplete, errors occurred!
[nbeyond@WinterSugar trunk.build.petsc]$

Any comment or help?

I noticed that I hadn't loaded mpi/openmpi-x86_64, so I loaded it:

[nbeyond@WinterSugar trunk.build.petsc]$ module load mpi/openmpi-x86_64

The configuration then completed, but now I am facing the following compilation error.

[nbeyond@WinterSugar trunk.build.petsc]$ cmake ~/svn/ogs/trunk/sources -DOGS_FEM_PETSC=ON -DCMAKE_BUILD_TYPE=Release
-- The C compiler identification is GNU 4.8.2
-- The CXX compiler identification is GNU 4.8.2
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Found PythonInterp: /usr/bin/python (found version "2.7.5")
-- Found Subversion: /usr/bin/svn (found version "1.7.14")
-- Found Git: /usr/bin/git (found version "1.8.3.1")
-- Found Doxygen: /usr/bin/doxygen (found version "1.8.5")
-- Could NOT find cppcheck (missing: CPPCHECK_EXECUTABLE CPPCHECK_POSSIBLEERROR_ARG CPPCHECK_UNUSEDFUNC_ARG CPPCHECK_STYLE_ARG CPPCHECK_INCLUDEPATH_ARG CPPCHECK_QUIET_ARG)
-- Number of processors: 24
-- Configuring for PETSc
-- Recognized PETSc install with single library for all packages
-- Performing Test MULTIPASS_TEST_1_petsc_works_minimal
-- Performing Test MULTIPASS_TEST_1_petsc_works_minimal - Failed
-- Performing Test MULTIPASS_TEST_2_petsc_works_allincludes
-- Performing Test MULTIPASS_TEST_2_petsc_works_allincludes - Failed
-- Performing Test MULTIPASS_TEST_3_petsc_works_alllibraries
-- Performing Test MULTIPASS_TEST_3_petsc_works_alllibraries - Failed
-- Performing Test MULTIPASS_TEST_4_petsc_works_all
-- Performing Test MULTIPASS_TEST_4_petsc_works_all - Success
-- PETSc requires extra include paths and explicit linking to all dependencies. This probably means you have static libraries and something unexpected in PETSc headers.
-- found version greater 3.3, version is 3.5.2
-- Found MPI_C: /usr/lib64/openmpi/lib/libmpi.so
-- Found MPI_CXX: /usr/lib64/openmpi/lib/libmpi_cxx.so;/usr/lib64/openmpi/lib/libmpi.so
-- Configuring done
-- Generating done
-- Build files have been written to: /home/nbeyond/workbench/trunk.build.petsc
[nbeyond@WinterSugar trunk.build.petsc]$ make
Scanning dependencies of target Base
[  0%] Building CXX object Base/CMakeFiles/Base.dir/DateTools.cpp.o
[  1%] Building CXX object Base/CMakeFiles/Base.dir/MemWatch.cpp.o
[  2%] Building CXX object Base/CMakeFiles/Base.dir/StringTools.cpp.o
[  2%] Building CXX object Base/CMakeFiles/Base.dir/binarySearch.cpp.o
Linking CXX static library ../lib/libBase.a
[  2%] Built target Base
Scanning dependencies of target MathLib
[  3%] Building CXX object MathLib/CMakeFiles/MathLib.dir/MathTools.cpp.o
[  4%] Building CXX object MathLib/CMakeFiles/MathLib.dir/AnalyticalGeometry.cpp.o
[  4%] Building CXX object MathLib/CMakeFiles/MathLib.dir/LinkedTriangle.cpp.o
[  5%] Building CXX object MathLib/CMakeFiles/MathLib.dir/EarClippingTriangulation.cpp.o
[  6%] Building CXX object MathLib/CMakeFiles/MathLib.dir/InterpolationAlgorithms/CubicSpline.cpp.o
[  6%] Building CXX object MathLib/CMakeFiles/MathLib.dir/InterpolationAlgorithms/PiecewiseLinearInterpolation.cpp.o
[  7%] Building CXX object MathLib/CMakeFiles/MathLib.dir/LinAlg/TriangularSolve.cpp.o
[  8%] Building CXX object MathLib/CMakeFiles/MathLib.dir/PETSC/PETScLinearSolver.cpp.o
/home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/PETScLinearSolver.cpp: In member function ‘void petsc_group::PETScLinearSolver::Config(PetscReal, PetscInt, KSPType, PCType)’:
/home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/PETScLinearSolver.cpp:148:59: error: too many arguments to function ‘PetscErrorCode KSPSetOperators(KSP, Mat, Mat)’
     KSPSetOperators(lsolver, A, A,DIFFERENT_NONZERO_PATTERN);
                                                            ^
In file included from /home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/PETScLinearSolver.h:17:0,
                 from /home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/PETScLinearSolver.cpp:7:
/opt/petsc/include/petscksp.h:260:29: note: declared here
 PETSC_EXTERN PetscErrorCode KSPSetOperators(KSP,Mat,Mat);
                             ^
make[2]: *** [MathLib/CMakeFiles/MathLib.dir/PETSC/PETScLinearSolver.cpp.o] Error 1
make[1]: *** [MathLib/CMakeFiles/MathLib.dir/all] Error 2
make: *** [all] Error 2
[nbeyond@WinterSugar trunk.build.petsc]$

Any thought?

···

On Tuesday, December 2, 2014 2:01:47 PM UTC+9, Chan-Hee Park wrote:

For the latest PETSc version, changing

   KSPSetOperators(lsolver, A, A, DIFFERENT_NONZERO_PATTERN);

to

   KSPSetOperators(lsolver, A, A);

should fix the problem.

···

On 12/02/2014 06:16 AM, Chan-Hee Park wrote:

--
You received this message because you are subscribed to the Google Groups "ogs-devs" group.
To unsubscribe from this group and stop receiving emails from it, send an email to ogs-devs+unsubscribe@googlegroups.com <mailto:ogs-devs+unsubscribe@googlegroups.com>.
For more options, visit https://groups.google.com/d/optout.

Thanks, Wenqing. That fixed the error and the build went further. But now I have another problem:

/home/nbeyond/svn/ogs/trunk/sources/FEM/matrix_class.h:298:15: warning: by ‘Math_Group::SymMatrix::multi’ [-Woverloaded-virtual]
 virtual void multi(const SymMatrix& m1, const Matrix& m2, Matrix& m_result);
              ^
Linking CXX static library ../lib/libMSHGEOTOOLS.a
[ 98%] Built target MSHGEOTOOLS
Scanning dependencies of target ogs
[100%] Building CXX object OGS/CMakeFiles/ogs.dir/rf.cpp.o
/home/nbeyond/svn/ogs/trunk/sources/OGS/rf.cpp: In function ‘int main(int, char**)’:
/home/nbeyond/svn/ogs/trunk/sources/OGS/rf.cpp:201:41: error: too few arguments to function ‘PetscErrorCode PetscSynchronizedFlush(MPI_Comm, FILE*)’
     PetscSynchronizedFlush(PETSC_COMM_WORLD);
                                             ^
In file included from /opt/petsc/include/petscis.h:7:0,
from /opt/petsc/include/petscvec.h:9,
from /opt/petsc/include/petscmat.h:6,
from /opt/petsc/include/petscpc.h:6,
from /opt/petsc/include/petscksp.h:6,
from /home/nbeyond/svn/ogs/trunk/sources/OGS/rf.cpp:68:
/opt/petsc/include/petscsys.h:1834:29: note: declared here
 PETSC_EXTERN PetscErrorCode PetscSynchronizedFlush(MPI_Comm,FILE*);
                             ^
make[2]: *** [OGS/CMakeFiles/ogs.dir/rf.cpp.o] Error 1
make[1]: *** [OGS/CMakeFiles/ogs.dir/all] Error 2
make: *** [all] Error 2
[nbeyond@WinterSugar trunk.build.petsc]$

I think I am very close now. Any further thoughts?

···

On Tuesday, December 2, 2014 5:55:36 PM UTC+9, Wenqing Wang wrote:

For the latest PETSc version,

  changing

    KSPSetOperators(lsolver, A, A,DIFFERENT_NONZERO_PATTERN);

  to

    KSPSetOperators(lsolver, A, A);



  could fix the problem.





  On 12/02/2014 06:16 AM, Chan-Hee Park wrote:
    I noticed that I didn't load mpi/openmpi-x86_64. So, I did it with the following


      [nbeyond@WinterSugar trunk.build.petsc]$ module load mpi/openmpi-x86_64
    Then, the configuration seemed to work out. But, I have faced the following compilation error now.


      [nbeyond@WinterSugar trunk.build.petsc]$ cmake ~/svn/ogs/trunk/sources  -DOGS_FEM_PETSC=ON -DCMAKE_BUILD_TYPE=Release

      -- The C compiler identification is GNU 4.8.2

      -- The CXX compiler identification is GNU 4.8.2

      -- Check for working C compiler: /usr/bin/cc

      -- Check for working C compiler: /usr/bin/cc -- works

      -- Detecting C compiler ABI info

      -- Detecting C compiler ABI info - done

      -- Check for working CXX compiler: /usr/bin/c++

      -- Check for working CXX compiler: /usr/bin/c++ -- works

      -- Detecting CXX compiler ABI info

      -- Detecting CXX compiler ABI info - done

      -- Found PythonInterp: /usr/bin/python (found version "2.7.5")

      -- Found Subversion: /usr/bin/svn (found version "1.7.14")

      -- Found Git: /usr/bin/git (found version "1.8.3.1")

      -- Found Doxygen: /usr/bin/doxygen (found version "1.8.5")

      -- Could NOT find cppcheck (missing:  CPPCHECK_EXECUTABLE CPPCHECK_POSSIBLEERROR_ARG CPPCHECK_UNUSEDFUNC_ARG CPPCHECK_STYLE_ARG CPPCHECK_INCLUDEPATH_ARG CPPCHECK_QUIET_ARG)

      -- Number of processors: 24

      -- Configuring for PETSc

      -- Recognized PETSc install with single library for all packages

      -- Performing Test MULTIPASS_TEST_1_petsc_works_minimal

      -- Performing Test MULTIPASS_TEST_1_petsc_works_          minimal - Failed

      -- Performing Test MULTIPASS_TEST_2_petsc_works_allincludes

      -- Performing Test MULTIPASS_TEST_2_petsc_works_          allincludes - Failed

      -- Performing Test MULTIPASS_TEST_3_petsc_works_alllibraries

      -- Performing Test MULTIPASS_TEST_3_petsc_works_          alllibraries - Failed

      -- Performing Test MULTIPASS_TEST_4_petsc_works_all

      -- Performing Test MULTIPASS_TEST_4_petsc_works_all - Success

      -- PETSc requires extra include paths and explicit linking to all dependencies. This probably means you have static libraries and something unexpected in PETSc headers.

      -- found version greater 3.3, version is 3.5.2

      -- Found MPI_C: /usr/lib64/openmpi/lib/libmpi.so 

      -- Found MPI_CXX: /usr/lib64/openmpi/lib/libmpi_cxx.so;/usr/lib64/openmpi/lib/          libmpi.so 

      -- Configuring done

      -- Generating done

      -- Build files have been written to: /home/nbeyond/workbench/trunk.build.petsc

      [nbeyond@WinterSugar trunk.build.petsc]$ make

      Scanning dependencies of target Base

      [  0%] Building CXX object Base/CMakeFiles/Base.dir/DateTools.cpp.o

      [  1%] Building CXX object Base/CMakeFiles/Base.dir/MemWatch.cpp.o

      [  2%] Building CXX object Base/CMakeFiles/Base.dir/StringTools.cpp.o

      [  2%] Building CXX object Base/CMakeFiles/Base.dir/binarySearch.cpp.o

      Linking CXX static library ../lib/libBase.a

      [  2%] Built target Base

      Scanning dependencies of target MathLib

      [  3%] Building CXX object MathLib/CMakeFiles/MathLib.dir/MathTools.cpp.o

      [  4%] Building CXX object MathLib/CMakeFiles/MathLib.dir/AnalyticalGeometry.cpp.o

      [  4%] Building CXX object MathLib/CMakeFiles/MathLib.dir/LinkedTriangle.cpp.o

      [  5%] Building CXX object MathLib/CMakeFiles/MathLib.dir/EarClippingTriangulation.cpp.o

      [  6%] Building CXX object MathLib/CMakeFiles/MathLib.dir/InterpolationAlgorithms/CubicSpline.cpp.o

      [  6%] Building CXX object MathLib/CMakeFiles/MathLib.dir/InterpolationAlgorithms/PiecewiseLinearInterpolation.cpp.o

      [  7%] Building CXX object MathLib/CMakeFiles/MathLib.dir/LinAlg/TriangularSolve.cpp.o

      [  8%] Building CXX object MathLib/CMakeFiles/MathLib.dir/PETSC/PETScLinearSolver.cpp.o

      /home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/          PETScLinearSolver.cpp: In member function ‘void petsc_group::PETScLinearSolver::Config(          PetscReal, PetscInt, KSPType, PCType)’:

      /home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/          PETScLinearSolver.cpp:148:59: error: too many arguments to function ‘PetscErrorCode KSPSetOperators(KSP, Mat, Mat)’

          KSPSetOperators(lsolver, A, A,DIFFERENT_NONZERO_PATTERN);

                                                                 ^

      In file included from /home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/PETScLinearSolver.h:17:0,

                       from /home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/PETScLinearSolver.cpp:7:

      /opt/petsc/include/petscksp.h:260:29: note: declared here

       PETSC_EXTERN PetscErrorCode KSPSetOperators(KSP,Mat,Mat);

                                   ^

      make[2]: *** [MathLib/CMakeFiles/MathLib.dir/PETSC/PETScLinearSolver.          cpp.o] Error 1

      make[1]: *** [MathLib/CMakeFiles/MathLib.dir/all] Error 2

      make: *** [all] Error 2

      [nbeyond@WinterSugar trunk.build.petsc]$
    Any thought?





    On Tuesday, December 2, 2014 2:01:47 PM UTC+9, Chan-Hee Park wrote:
        Dear everyone who's involved OpenGeoSys MPI Petsc Version,



        I am leaving this question or an error report that occurred to me during custom installation of OpenGeoSys PETSc version on the latest CentOS 7 installed on one of my workstations. This was not a first attempt for installation. A previous brutal failure was on Fedora 20 after trying a couple of days with the version releaser together. For some reason an error occurred at the compilation stage of OpenGeoSys after PETSC installation and some others. To make the story short on this trial of installation, first here is what have done to face the error. I think this error is different from the one I had with Fedora 20.





        1. I cloned a branch  by doing the following command for PETSC

git clone -b maint https://bitbucket.org/petsc/petsc petsc

        2. Then, compiled with the following option and installation of PETSC
          ./configure -PETSC_ARCH=linux-fast  --download-f2cblaslapack=1 --download-metis --download-parmetis  --download-superlu_dist  --download-blacs --download-hypre   -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0

          make PETSC_DIR=/home/nbeyond/git/              petsc PETSC_ARCH=linux-fast

          su -c "make PETSC_DIR=/home/nbeyond/git/              petsc PETSC_ARCH=linux-fast install"
        3. Installation of mesh_partition

git clone https://github.com/ufz/mesh_partition.git

          cd metis-5.0.2/

          git submodule init

          git submodule update
        4. Module configuration

In ~/Modules

          **petsc_release** is created to have



          #%Module1.0



          module-whatis "PETSc (v3.2-p7):Portable, Extensible Toolkit for Scientific Computation ([http://www.mcs.anl.gov/petsc/](http://www.mcs.anl.gov/petsc/))"



          conflict                petsc

          #module                  load                    gcc/4.8.1-2

          #module                  load                    openmpi/gcc/1.7.2-1

          set                     path             /opt/petsc



          prepend-path            PETSC_DIR               $path

          #prepend-path           PETSC_ARCH

          prepend-path            PATH                    $path/bin

          prepend-path            LD_LIBRARY_PATH         $path/lib



          prepend-path -d " "     CPPFLAGS                -I$path/include

          prepend-path -d " "     LDFLAGS                 -L$path/lib
        5. Add the following to the .bashrc file for the module path:

          export MODULEPATH="$MODULEPATH:/home/nbeyond/Modules"
        6. Load the module as follows:
          [nbeyond@WinterSugar trunk.build.petsc]$ module load petsc_release
        7. Compilation of the OpenGeoSys trunk version as of this posting. Then I faced the following error during configuration.


          [nbeyond@WinterSugar trunk.build.petsc]$ cmake ~/svn/ogs/trunk/sources  -DOGS_FEM_PETSC=ON -DCMAKE_BUILD_TYPE=Release

          -- The C compiler identification is GNU 4.8.2

          -- The CXX compiler identification is GNU 4.8.2

          -- Check for working C compiler: /usr/bin/cc

          -- Check for working C compiler: /usr/bin/cc -- works

          -- Detecting C compiler ABI info

          -- Detecting C compiler ABI info - done

          -- Check for working CXX compiler: /usr/bin/c++

          -- Check for working CXX compiler: /usr/bin/c++ -- works

          -- Detecting CXX compiler ABI info

          -- Detecting CXX compiler ABI info - done

          -- Found PythonInterp: /usr/bin/python (found version "2.7.5")

          -- Found Subversion: /usr/bin/svn (found version "1.7.14")

          -- Found Git: /usr/bin/git (found version "1.8.3.1")

          -- Found Doxygen: /usr/bin/doxygen (found version "1.8.5")

          -- Could NOT find cppcheck (missing:  CPPCHECK_EXECUTABLE CPPCHECK_POSSIBLEERROR_ARG CPPCHECK_UNUSEDFUNC_ARG CPPCHECK_STYLE_ARG CPPCHECK_INCLUDEPATH_ARG CPPCHECK_QUIET_ARG)

          -- Number of processors: 24

          -- Configuring for PETSc

          -- Recognized PETSc install with single library for all packages

          -- Performing Test MULTIPASS_TEST_1_petsc_works_minimal

          -- Performing Test MULTIPASS_TEST_1_petsc_works_minimal - Failed

          -- Performing Test MULTIPASS_TEST_2_petsc_works_allincludes

          -- Performing Test MULTIPASS_TEST_2_petsc_works_allincludes - Failed

          -- Performing Test MULTIPASS_TEST_3_petsc_works_alllibraries

          -- Performing Test MULTIPASS_TEST_3_petsc_works_alllibraries - Failed

          -- Performing Test MULTIPASS_TEST_4_petsc_works_all

          -- Performing Test MULTIPASS_TEST_4_petsc_works_all - Success

          -- PETSc requires extra include paths and explicit linking to all dependencies. This probably means you have static libraries and something unexpected in PETSc headers.

          -- found version greater 3.3, version is 3.5.2

          -- Could NOT find MPI_C (missing:  MPI_C_LIBRARIES MPI_C_INCLUDE_PATH)

          -- Could NOT find MPI_CXX (missing:  MPI_CXX_LIBRARIES MPI_CXX_INCLUDE_PATH)

          CMake Error at CMakeLists.txt:149 (MESSAGE):

            Aborting: MPI implementation is not found!





          -- Configuring incomplete, errors occurred!

          [nbeyond@WinterSugar trunk.build.petsc]$
        Any comment or help?

  You received this message because you are subscribed to the Google Groups "ogs-devs" group.


After analysing the compilation error just below, I revised the code as follows:

PetscSynchronizedFlush(PETSC_COMM_WORLD, NULL);

This made the compiler happy, though I do not know whether such a small revision will cause any unintended errors.

···

On Wednesday, December 3, 2014 9:39:08 AM UTC+9, Chan-Hee Park wrote:

For the latest PETSc version,

  changing

    KSPSetOperators(lsolver, A, A,DIFFERENT_NONZERO_PATTERN);

  to

    KSPSetOperators(lsolver, A, A);



  could fix the problem.





  On 12/02/2014 06:16 AM, Chan-Hee Park wrote:
    I noticed that I didn't load mpi/openmpi-x86_64, so I loaded it with the following


      [nbeyond@WinterSugar trunk.build.petsc]$ module load mpi/openmpi-x86_64
    Then the configuration worked out, but I now faced the following compilation error.


      [nbeyond@WinterSugar trunk.build.petsc]$ cmake ~/svn/ogs/trunk/sources  -DOGS_FEM_PETSC=ON -DCMAKE_BUILD_TYPE=Release

      -- The C compiler identification is GNU 4.8.2

      -- The CXX compiler identification is GNU 4.8.2

      -- Check for working C compiler: /usr/bin/cc

      -- Check for working C compiler: /usr/bin/cc -- works

      -- Detecting C compiler ABI info

      -- Detecting C compiler ABI info - done

      -- Check for working CXX compiler: /usr/bin/c++

      -- Check for working CXX compiler: /usr/bin/c++ -- works

      -- Detecting CXX compiler ABI info

      -- Detecting CXX compiler ABI info - done

      -- Found PythonInterp: /usr/bin/python (found version "2.7.5")

      -- Found Subversion: /usr/bin/svn (found version "1.7.14")

      -- Found Git: /usr/bin/git (found version "1.8.3.1")

      -- Found Doxygen: /usr/bin/doxygen (found version "1.8.5")

      -- Could NOT find cppcheck (missing:  CPPCHECK_EXECUTABLE CPPCHECK_POSSIBLEERROR_ARG CPPCHECK_UNUSEDFUNC_ARG CPPCHECK_STYLE_ARG CPPCHECK_INCLUDEPATH_ARG CPPCHECK_QUIET_ARG)

      -- Number of processors: 24

      -- Configuring for PETSc

      -- Recognized PETSc install with single library for all packages

      -- Performing Test MULTIPASS_TEST_1_petsc_works_minimal

      -- Performing Test MULTIPASS_TEST_1_petsc_works_minimal - Failed

      -- Performing Test MULTIPASS_TEST_2_petsc_works_allincludes

      -- Performing Test MULTIPASS_TEST_2_petsc_works_allincludes - Failed

      -- Performing Test MULTIPASS_TEST_3_petsc_works_alllibraries

      -- Performing Test MULTIPASS_TEST_3_petsc_works_alllibraries - Failed

      -- Performing Test MULTIPASS_TEST_4_petsc_works_all

      -- Performing Test MULTIPASS_TEST_4_petsc_works_all - Success

      -- PETSc requires extra include paths and explicit linking to all dependencies. This probably means you have static libraries and something unexpected in PETSc headers.

      -- found version greater 3.3, version is 3.5.2

      -- Found MPI_C: /usr/lib64/openmpi/lib/libmpi.so 

      -- Found MPI_CXX: /usr/lib64/openmpi/lib/libmpi_cxx.so;/usr/lib64/openmpi/lib/libmpi.so

      -- Configuring done

      -- Generating done

      -- Build files have been written to: /home/nbeyond/workbench/trunk.build.petsc

      [nbeyond@WinterSugar trunk.build.petsc]$ make

      Scanning dependencies of target Base

      [  0%] Building CXX object Base/CMakeFiles/Base.dir/DateTools.cpp.o

      [  1%] Building CXX object Base/CMakeFiles/Base.dir/MemWatch.cpp.o

      [  2%] Building CXX object Base/CMakeFiles/Base.dir/StringTools.cpp.o

      [  2%] Building CXX object Base/CMakeFiles/Base.dir/binarySearch.cpp.o

      Linking CXX static library ../lib/libBase.a

      [  2%] Built target Base

      Scanning dependencies of target MathLib

      [  3%] Building CXX object MathLib/CMakeFiles/MathLib.dir/MathTools.cpp.o

      [  4%] Building CXX object MathLib/CMakeFiles/MathLib.dir/AnalyticalGeometry.cpp.o

      [  4%] Building CXX object MathLib/CMakeFiles/MathLib.dir/LinkedTriangle.cpp.o

      [  5%] Building CXX object MathLib/CMakeFiles/MathLib.dir/EarClippingTriangulation.cpp.o

      [  6%] Building CXX object MathLib/CMakeFiles/MathLib.dir/InterpolationAlgorithms/CubicSpline.cpp.o

      [  6%] Building CXX object MathLib/CMakeFiles/MathLib.dir/InterpolationAlgorithms/PiecewiseLinearInterpolation.cpp.o

      [  7%] Building CXX object MathLib/CMakeFiles/MathLib.dir/LinAlg/TriangularSolve.cpp.o

      [  8%] Building CXX object MathLib/CMakeFiles/MathLib.dir/PETSC/PETScLinearSolver.cpp.o

      /home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/PETScLinearSolver.cpp: In member function ‘void petsc_group::PETScLinearSolver::Config(PetscReal, PetscInt, KSPType, PCType)’:

      /home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/PETScLinearSolver.cpp:148:59: error: too many arguments to function ‘PetscErrorCode KSPSetOperators(KSP, Mat, Mat)’

          KSPSetOperators(lsolver, A, A,DIFFERENT_NONZERO_PATTERN);

                                                                 ^

      In file included from /home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/PETScLinearSolver.h:17:0,

                       from /home/nbeyond/svn/ogs/trunk/sources/MathLib/PETSC/PETScLinearSolver.cpp:7:

      /opt/petsc/include/petscksp.h:260:29: note: declared here

       PETSC_EXTERN PetscErrorCode KSPSetOperators(KSP,Mat,Mat);

                                   ^

      make[2]: *** [MathLib/CMakeFiles/MathLib.dir/PETSC/PETScLinearSolver.cpp.o] Error 1

      make[1]: *** [MathLib/CMakeFiles/MathLib.dir/all] Error 2

      make: *** [all] Error 2

      [nbeyond@WinterSugar trunk.build.petsc]$
    Any thought?

Thanks, Wenqing. That let me get further. But now I have another problem, as follows.

/home/nbeyond/svn/ogs/trunk/sources/FEM/matrix_class.h:298:15: warning: by ‘Math_Group::SymMatrix::multi’ [-Woverloaded-virtual]
virtual void multi(const SymMatrix& m1, const Matrix& m2, Matrix& m_result);
^
Linking CXX static library …/lib/libMSHGEOTOOLS.a
[ 98%] Built target MSHGEOTOOLS
Scanning dependencies of target ogs
[100%] Building CXX object OGS/CMakeFiles/ogs.dir/rf.cpp.o
/home/nbeyond/svn/ogs/trunk/sources/OGS/rf.cpp: In function ‘int main(int, char**)’:
/home/nbeyond/svn/ogs/trunk/sources/OGS/rf.cpp:201:41: error: too few arguments to function ‘PetscErrorCode PetscSynchronizedFlush(MPI_Comm, FILE*)’
PetscSynchronizedFlush(PETSC_COMM_WORLD);
^
In file included from /opt/petsc/include/petscis.h:7:0,
from /opt/petsc/include/petscvec.h:9,
from /opt/petsc/include/petscmat.h:6,
from /opt/petsc/include/petscpc.h:6,
from /opt/petsc/include/petscksp.h:6,
from /home/nbeyond/svn/ogs/trunk/sources/OGS/rf.cpp:68:
/opt/petsc/include/petscsys.h:1834:29: note: declared here
PETSC_EXTERN PetscErrorCode PetscSynchronizedFlush(MPI_Comm,FILE*);
^
make[2]: *** [OGS/CMakeFiles/ogs.dir/rf.cpp.o] Error 1
make[1]: *** [OGS/CMakeFiles/ogs.dir/all] Error 2
make: *** [all] Error 2
[nbeyond@WinterSugar trunk.build.petsc]$

I think I am very close now. Any further thought?

On Tuesday, December 2, 2014 5:55:36 PM UTC+9, Wenqing Wang wrote:

Now the compilation is okay, and I have tried to solve one benchmark problem using PETSc. I chose KueperProblem-PS for the test, from trunk/benchmarks/PETSc/KueperProblem-PS.

[nbeyond@WinterSugar KueperProblem-PS]$ ls
kueper.bc kueper.msh kueper_partitioned_nodes_3.msh
kueper.gli kueper.num kueper.pcs
kueper.ic kueper.out kueper.st
kueper.mesh kueper_partitioned_cfg3.msh kueper.tim
kueper.mfp kueper_partitioned_elems_3.msh
kueper.mmp kueper_partitioned.msh
[nbeyond@WinterSugar KueperProblem-PS]$ partmesh mm--
Task option (--metis2ogs or --ogs2metis) is not given. Stop now.
[nbeyond@WinterSugar KueperProblem-PS]$ partmesh --ogs2metis kueper
File name is: kueper
File path is: ./

***Total CPU time elapsed: 0.01s
[nbeyond@WinterSugar KueperProblem-PS]$ more kueper.mesh
2240
1 2 59 58
2 3 60 59
3 4 61 60
4 5 62 61
5 6 63 62
6 7 64 63
7 8 65 64
8 9 66 65
9 10 67 66
10 11 68 67
11 12 69 68
12 13 70 69
13 14 71 70
14 15 72 71
15 16 73 72
16 17 74 73
17 18 75 74
18 19 76 75
19 20 77 76
20 21 78 77
21 22 79 78
22 23 80 79
23 24 81 80
24 25 82 81
25 26 83 82
26 27 84 83
27 28 85 84
28 29 86 85
29 30 87 86
30 31 88 87
31 32 89 88
32 33 90 89
33 34 91 90
34 35 92 91
35 36 93 92
36 37 94 93
37 38 95 94
38 39 96 95
39 40 97 96
40 41 98 97
41 42 99 98
42 43 100 99
43 44 101 100
44 45 102 101
45 46 103 102
[nbeyond@WinterSugar KueperProblem-PS]$ partmesh --metis2ogs -np 4 -n kueper
File name is: kueper
File path is: ./

***Compute mesh topology

CPU time elapsed in constructing topology of grids: 0.02s

******************************************************************************

METIS 5.0 Copyright 1998-11, Regents of the University of Minnesota
size of idx_t: 32bits, real_t: 32bits, idx_t *: 64bits

Mesh Information ------------------------------------------------------------
Name: kueper.mesh, #Elements: 2240, #Nodes: 2337, #Parts: 4

Options ---------------------------------------------------------------------
ptype=kway, objtype=cut, ctype=shem, rtype=greedy, iptype=metisrb
dbglvl=0, ufactor=1.030, minconn=NO, contig=NO, nooutput=NO
seed=-1, niter=10, ncuts=1
gtype=dual, ncommon=1, niter=10, ncuts=1

Direct k-way Partitioning ---------------------------------------------------

  • Edgecut: 285.

Timing Information ----------------------------------------------------------
I/O: 0.001 sec
Partitioning: 0.002 sec (METIS time)
Reporting: 0.001 sec

Memory Information ----------------------------------------------------------
Max memory used: 0.457 MB
******************************************************************************


***Prepare subdomain mesh
Process partition: 0
Process partition: 1
Process partition: 2
Process partition: 3

***Total CPU time elapsed: 0.04s
[nbeyond@WinterSugar KueperProblem-PS]$

Then I ran ogs using mpirun as follows

[nbeyond@WinterSugar KueperProblem-PS]$ mpirun -np 4 ~/workbench/trunk.build.petsc/bin/ogs kueper

Use PETSc solverNumber of CPUs: 4, rank: 0

Use PETSc solverNumber of CPUs: 4, rank: 1

Use PETSc solverNumber of CPUs: 4, rank: 2

Use PETSc solverNumber of CPUs: 4, rank: 3
kueper


Data input:
kueper


Data input:

      ###################################################
      ##                                               ##
      ##               OpenGeoSys-Project              ##
      ##                                               ##
      ##  Helmholtz Center for Environmental Research  ##
      ##    UFZ Leipzig - Environmental Informatics    ##
      ##                  TU Dresden                   ##
      ##              University of Kiel               ##
      ##            University of Edinburgh            ##
      ##         University of Tuebingen (ZAG)         ##
      ##       Federal Institute for Geosciences       ##
      ##          and Natural Resources (BGR)          ##
      ##                                               ##
      ##       Version 5.5(WW)  Date 22.05.2014        ##
      ##                                               ##

kueper


Data input:
###################################################

      File name (without extension): kueper

Data input:
GEOLIB::readGLIFile open stream from file kueper.gli … done
read points from stream … GEOLIB::readGLIFile open stream from file kueper.gli … done
read points from stream … GEOLIB::readGLIFile open stream from file kueper.gli … done
read points from stream … ok, 8 points read
read polylines from stream … ok, 8 points read
read polylines from stream … ok, 5 polylines read
PCSRead … ok, 5 polylines read
PCSRead … tag #SURFACE not found or input stream error in GEOObjects
tag #SURFACE not found or input stream error in GEOObjects
ok, 8 points read
read polylines from stream … done, read 1 processes
MFPRead
GEOLIB::readGLIFile open stream from file kueper.gli … done
ok, 5 polylines read
PCSRead … read points from stream … tag #SURFACE not found or input stream error in GEOObjects
BCRead … done, read 1 processes
MFPRead
ok, 8 points read
done, read 1 processes
MFPRead
BCRead … read polylines from stream … done, read 3 boundary conditions
STRead … WARNING: in STRead: could not read source term
ok, 5 polylines read
done, read 3 boundary conditions
STRead … done, read 0 source terms
ICRead
BCRead … tag #SURFACE not found or input stream error in GEOObjects
WARNING: in STRead: could not read source term
PCSRead … done, read 0 source terms
ICRead
OUTRead
done, read 1 processes
MFPRead
OUTRead
done, read 3 boundary conditions
TIMRead
TIMRead
STRead … WARNING: in STRead: could not read source term
MMPRead … MMPRead … done, read 0 source terms
ICRead
BCRead …

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.


Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
OUTRead
done, read 3 boundary conditions
--/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
STRead … done, read 0 source terms
may enter capillary pressure specific parameters directly.
--/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
TIMRead
ICRead
WARNING: in STRead: could not read source term
OUTRead
--/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
NUMRead
-> Mass lumping selected for PS_GLOBAL

Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.

NUMRead
-> Mass lumping selected for PS_GLOBAL

Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.

MMPRead …

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
TIMRead
MMPRead … --/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.


Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
--/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
NUMRead
-> Mass lumping selected for PS_GLOBAL
--/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you

Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.
may enter capillary pressure specific parameters directly.

--/n

Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
NUMRead
-> Mass lumping selected for PS_GLOBAL

Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.

[1]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[1]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[1]PETSC ERROR: to get more information on the crash.
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[0]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[0]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[1]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[1]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[1]PETSC ERROR: #1 User provided function() line 0 in unknown file

MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.

-->Reading binary mesh file …[3]PETSC ERROR: ------------------------------------------------------------------------
[3]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[3]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[3]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[3]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[3]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[3]PETSC ERROR: to get more information on the crash.
[3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3]PETSC ERROR: Signal received
[3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[3]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[3]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[3]PETSC ERROR: #1 User provided function() line 0 in unknown file
[2]PETSC ERROR: ------------------------------------------------------------------------
[2]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[2]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[2]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[2]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[2]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[2]PETSC ERROR: to get more information on the crash.
[2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[2]PETSC ERROR: Signal received
[2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[2]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[2]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[2]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[2]PETSC ERROR: #1 User provided function() line 0 in unknown file

mpirun has exited due to process rank 0 with PID 20468 on
node WinterSugar exiting improperly. There are two reasons this could occur:

  1. this process did not call “init” before exiting, but others in
    the job did. This can cause a job to hang indefinitely while it waits
    for all processes to call “init”. By rule, if one process calls “init”,
    then ALL processes must call “init” prior to termination.

  2. this process called “init”, but exited without calling “finalize”.
    By rule, all processes that call “init” MUST call “finalize” prior to
    exiting or it will be considered an “abnormal termination”

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).

[WinterSugar:20467] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[WinterSugar:20467] Set MCA parameter “orte_base_help_aggregate” to 0 to see all help / error messages
[nbeyond@WinterSugar KueperProblem-PS]$

Now, I have two conditional questions after all these.

1. Is the OGS PETSc installation at least successful, so that the problem now lies within the scope of KueperProblem-PS?
2. If not, what is the problem?

On Wednesday, December 3, 2014 2:06:13 PM UTC+9, Chan-Hee Park wrote:

After analysing the compilation error just below, I have revised the code just as

PetscSynchronizedFlush(PETSC_COMM_WORLD, NULL);

Then, this made the compiler happy. I do not know if this will cause any unintended error because of a little revision.

PetscSynchronizedFlush() was also changed in the latest version. You can remove the call, or keep it as you have changed it.

For the input, you have to change the linear solver type. You can find some examples in benchmarks/PETSc.

···

On 12/03/2014 06:27 AM, Chan-Hee Park wrote:

···
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
OUTRead
done, read 3 boundary conditions
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
STRead ... done, read 0 source terms
may enter capillary pressure specific parameters directly.
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
TIMRead
ICRead
WARNING: in STRead: could not read source term
OUTRead
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
NUMRead
-> Mass lumping selected for PS_GLOBAL
--
Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.
--
NUMRead
-> Mass lumping selected for PS_GLOBAL
--
Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.
--
MMPRead ...
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
TIMRead
MMPRead ... --/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.

--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
NUMRead
-> Mass lumping selected for PS_GLOBAL
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
--
Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.
may enter capillary pressure specific parameters directly.
--
--/n
--
Adopting capillary pressure saturation parameters from the
relative permeability function for phase 0. Alternatively, you
may enter capillary pressure specific parameters directly.
--/ndone, read 4 medium properties
NUMRead
-> Mass lumping selected for PS_GLOBAL
--
Using old $NON_LINEAR_SOLVER keyword.
Eventually this will be obsolete. Consider switching to
$NON_LINEAR_ITERATIONS for better results and greater flexibility.
--
[1]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[1]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[1]PETSC ERROR: to get more information on the crash.
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[0]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[0]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[1]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[1]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[1]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
-->Reading binary mesh file ...[3]PETSC ERROR: ------------------------------------------------------------------------
[3]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[3]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[3]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[3]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[3]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[3]PETSC ERROR: to get more information on the crash.
[3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3]PETSC ERROR: Signal received
[3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[3]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[3]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[3]PETSC ERROR: #1 User provided function() line 0 in unknown file
[2]PETSC ERROR: ------------------------------------------------------------------------
[2]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[2]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[2]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[2]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[2]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[2]PETSC ERROR: to get more information on the crash.
[2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[2]PETSC ERROR: Signal received
[2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[2]PETSC ERROR: Petsc Release Version 3.5.2, unknown
[2]PETSC ERROR: /home/nbeyond/workbench/trunk.build.petsc/bin/ogs on a linux-fast named WinterSugar by nbeyond Wed Dec 3 14:16:25 2014
[2]PETSC ERROR: Configure options -PETSC_ARCH=linux-fast --download-f2cblaslapack=1 --download-metis --download-parmetis --download-superlu_dist --download-blacs --download-hypre -with-debugging=0 --prefix=/opt/petsc --with-mpi-dir=/usr/lib64/openmpi --with-c2html=0
[2]PETSC ERROR: #1 User provided function() line 0 in unknown file
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 20468 on
node WinterSugar exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[WinterSugar:20467] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[WinterSugar:20467] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[nbeyond@WinterSugar KueperProblem-PS]$

After all this, I have two related questions:

1. Is the OGS PETSc installation itself at least successful, so that the problem now lies within the scope of the KueperProblem-PS benchmark?
2. If not, what else could be going wrong?

On Wednesday, December 3, 2014 2:06:13 PM UTC+9, Chan-Hee Park wrote:

    After analysing the compilation error just below, I revised
    the code to

    PetscSynchronizedFlush(PETSC_COMM_WORLD, NULL);

    This made the compiler happy. I do not know whether this small
    revision will cause any unintended error.
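    For context on that one-line fix: PETSc 3.5 changed the signature of
    PetscSynchronizedFlush() to take an output stream as a second argument,
    so code written against older PETSc releases (which passed only the
    communicator) no longer compiles. A minimal sketch of the updated call,
    assuming a PETSc >= 3.5 installation (the program below is illustrative
    and not part of OGS):

    ```c
    /* Minimal sketch of the PetscSynchronizedFlush() call after the
       PETSc 3.5 API change: the second argument names the output stream.
       PETSC_STDOUT is the documented choice; passing NULL, as in the
       revision above, selects the default stream as well. */
    #include <petscsys.h>

    int main(int argc, char **argv)
    {
        PetscMPIInt rank;

        PetscInitialize(&argc, &argv, NULL, NULL);
        MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

        /* Buffered per-rank output, printed in rank order ... */
        PetscSynchronizedPrintf(PETSC_COMM_WORLD, "rank %d checking in\n",
                                (int)rank);

        /* ... and flushed collectively. Pre-3.5 PETSc took only the
           communicator here; 3.5+ requires the stream argument. */
        PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);

        PetscFinalize();
        return 0;
    }
    ```

    Since all ranks reach the flush, this change should not affect runtime
    behaviour; the segfault above is therefore unlikely to be caused by it.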

--
You received this message because you are subscribed to the Google Groups "ogs-devs" group.
To unsubscribe from this group and stop receiving emails from it, send an email to ogs-devs+unsubscribe@googlegroups.com <mailto:ogs-devs+unsubscribe@googlegroups.com>.
For more options, visit https://groups.google.com/d/optout.