Parallel version of MMG2D/MMGS

Hi Algiane,

On February 25th, you mentioned that there were no plans to parallelize MMG2D (Parallel version of mmg). Is that still the case? Does that statement also apply to MMGS?

Thank you,

Patrick Laurin

Hi Patrick,

Yes, I am sorry, but there are still no plans to parallelize mmg2d and mmgs.

Regards,

Algiane

Dear Algiane,

I am also looking forward to the parallel version of MMG3D. Could you please let us know once it is released?

Sincerely yours,
Hao

Hi Hao,

I will post a news item on the Mmg website: https://www.mmgtools.org/mmg-remesher-news and will try not to forget to notify you here too!

Regards,
Algiane

Thanks, Algiane. We hope to hear from you soon.

Best regards,
Hao

Hi Hao,

We have published the first public release of the ParMmg library. You can find details and related links in the latest news item on the MmgTools website (http://www.mmgtools.org/mmg-simplicial-remesher/parmmg-v1-2-0).

Regards,
Algiane

Dear Algiane,

Thanks so much for letting us know about your latest development. I have read through the tutorial: https://www.mmgtools.org/mmg-remesher-try-mmg/mmg-remesher-tutorials/parmmg/parmmg-isotropic-adaptation-to-a-solution
Here are some findings:

  1. The meshes before and after ParMmg are not partitioned; is that correct?
  2. I tried to call ParMmg from FreeFem; the command line can be:
    exec("mpirun -np 8 parmmg_O3 sphere -out sphere-out.mesh -mesh-size 200000 -niter 6 -nlayers 3 -hgradreq 3.0")

And I have some enquiries:

  1. I tried the above sample (https://www.mmgtools.org/mmg-remesher-try-mmg/mmg-remesher-tutorials/parmmg/parmmg-isotropic-adaptation-to-a-solution) on Ubuntu and it works well. However, when I switch to MacOS, it returns an error: error_MacOS (3.9 KB)
  2. For now, is it possible to use ParMmg to discretize and optimize an implicitly defined surface, as we did with MMG3D (https://www.mmgtools.org/mmg-remesher-try-mmg/mmg-remesher-tutorials/mmg-remesher-mmg3d/mmg-remesher-implicit-domain-meshing)?

Here I attach a mesh file: LvSet_1.mesh (1.3 MB) and a level-set solution file: LvSet_1.sol (70.6 KB)
Is it possible to use ParMmg to update the above mesh?
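
For reference, with the sequential mmg3d we currently produce this kind of implicit-domain discretization with a command along these lines (only a sketch based on the implicit-domain tutorial above; the exact options on our side may differ):

    # sketch only: option values to be adapted
    mmg3d_O3 LvSet_1.mesh -ls -sol LvSet_1.sol -out LvSet_1.o.mesh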

  3. Related to our previous discussion (https://forum.mmgtools.org/t/how-to-keep-the-interface-mesh-fixed/400):
    With ParMmg, would it be possible to keep the interface elements unchanged? Or is it possible for ParMmg to keep the original regional labels after updating the mesh?

Sincerely yours,
Hao

Dear Hao,
I will try to answer your questions in the same order.
Regarding your findings:

  1. At the moment, the command-line version of ParMmg only reads and returns a centralized mesh (the library version can also handle already-partitioned input meshes).
  2. This seems a legitimate way to call the command-line version of ParMmg from FreeFem, provided that you are running a sequential FreeFem script.

Regarding your questions:

  1. Your error seems to be related to filenames. Can you try adding the extension to the input mesh (i.e. “sphere.mesh”) and tell us if you get the same behaviour? We could then help you further if you tell us some details about your installation:

    • which version of MacOS and compiler you are using,
    • which ParMmg branch you are using (if different from master),
    • which version of Mmg it links against (if different from the automatically downloaded one).
  2. Level-set discretization is not yet supported in ParMmg, but it is on our roadmap for the next releases.

  3. Consequently, domain references (regional labels) can only be preserved without the level-set discretization. You can force the preservation of an interface by setting its triangles as required (see the excerpt below).
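
As an illustration of this last point (a minimal, purely illustrative excerpt, assuming the Medit .mesh format read by Mmg and ParMmg; indices and references are made up), interface triangles can be marked as required by listing their indices in a RequiredTriangles block of the input mesh:

    Triangles
    2
    1 2 3 10
    2 3 4 10

    RequiredTriangles
    2
    1 2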

Yours sincerely,
Luca

Dear Luca,

Thanks for your timely reply.

Regarding the findings:
2. For now, is it possible to call ParMmg from parallelized FreeFem scripts?

Regarding my questions:

  1. I added the extension as below:
    mpirun -np 4 parmmg_O3 ./sphere.mesh -sol ./sphere.sol -out sphere-out.mesh -mesh-size 200000 -niter 6 -nlayers 3 -hgradreq 3.0
    I get the same behaviour. In addition, my Mac has 4 cores and 8 threads. I tried -np 4 and -np 8.
    For -np 4, it returns Mac_error_np4 (3.4 KB)
    For -np 8, it returns Mac_error_np8 (1.1 KB)

Sincerely yours,
Hao

Dear Hao,

For the moment there is no user interface to adapt a distributed mesh with ParMmg from a parallel FreeFem script.
However, if you are running a parallel FreeFem script and you have the possibility to gather the global mesh on a single process, it should be possible to run ParMmg from the command line on that single process only, with the centralized mesh as input.
If you envisage this kind of usage, we can discuss it further.

Regarding your errors:

  • The problem on 8 processes is probably due to the handling of hyperthreading on your computer. You can force MPI to use more processes than the cores it sees with the --oversubscribe option:

    mpirun -np 8 --oversubscribe parmmg_O3 ./sphere.mesh -sol ./sphere.sol -out sphere-out.mesh -mesh-size 200000 -niter 6 -nlayers 3 -hgradreq 3.0

    This should let your program run, most probably producing the same error you get with 4 processes.

  • The problem on 4 processes (the segmentation fault) requires some more information for us to fix it, so I propose a step-by-step procedure to install a debug version of ParMmg and rerun the test case to produce debug information. The procedure consists of 1) installing a debug version of mmg; 2) installing a debug version of ParMmg built against that mmg version. Here it goes:

    1. Setup: create a new directory called TestPMMG (for example) in a suitable location, and store its path in a variable TestDir

      mkdir TestPMMG && cd TestPMMG
      TestDir=$(pwd)

      Store the compiler flags needed to use AddressSanitizer in a variable cmpopt

      cmpopt="-fsanitize=address -O1 -fno-omit-frame-pointer"

      Check that git is installed on your computer, otherwise install it with brew

      brew install git
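
      You will also need CMake and an MPI implementation; if they are missing, they can be installed with brew as well (assuming the standard Homebrew formulae cmake and open-mpi):

      # formula names assumed: cmake, open-mpi
      brew install cmake open-mpi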

    2. Install mmg: clone the repository, create a build directory, and check out the appropriate commit

      git clone https://github.com/MmgTools/mmg.git
      cd mmg && mkdir build && cd build
      git checkout 9bd9ac675c7fef52df46e2cabdd4e9119adad425

      configure (you should now be in the $TestDir/mmg/build folder)

      cmake -DCMAKE_BUILD_TYPE=debug \
      -DCMAKE_C_FLAGS="$cmpopt" \
      -DCMAKE_CXX_FLAGS="$cmpopt" \
      -DCMAKE_EXE_LINKER_FLAGS="$cmpopt" ..

      compile (from the same folder)

      make -j 8

    3. Install ParMmg: clone the repository and create a build directory

      cd $TestDir
      git clone https://github.com/MmgTools/ParMmg.git
      cd ParMmg && mkdir build && cd build

      configure (you should now be in the $TestDir/ParMmg/build folder)

      cmake -DCMAKE_BUILD_TYPE=debug \
      -DCMAKE_C_FLAGS="$cmpopt" \
      -DCMAKE_CXX_FLAGS="$cmpopt" \
      -DCMAKE_EXE_LINKER_FLAGS="$cmpopt" \
      -DDOWNLOAD_MMG=off \
      -DMMG_DIR="$TestDir/mmg" ..

      compile (from the same folder)

      make -j 8

    4. At this point, you should have a new debug version of the program called parmmg in the $TestDir/ParMmg/build/bin folder. You can now go to your tutorial folder and rerun the test with

      mpirun -np 4 $TestDir/ParMmg/build/bin/parmmg ./sphere.mesh -sol ./sphere.sol -out sphere-out.mesh -mesh-size 200000 -niter 6 -nlayers 3 -hgradreq 3.0

      (just remember to replace $TestDir with its absolute path if you are working from another shell)
      Can you please report the full output of this run to us?
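
      If it is easier, you can capture both the standard output and the standard error of that run in a single file by piping it through tee (the log filename below is just an arbitrary example):

      # arbitrary log name; 2>&1 merges stderr into stdout
      mpirun -np 4 $TestDir/ParMmg/build/bin/parmmg ./sphere.mesh -sol ./sphere.sol -out sphere-out.mesh -mesh-size 200000 -niter 6 -nlayers 3 -hgradreq 3.0 2>&1 | tee parmmg_debug.log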

Yours,
Luca

Dear Luca,

Thanks very much for your step-by-step instruction.
First, as for the 8-process case, yes, it is true that I get the same error message as in the 4-process case.

I followed your instructions to compile the debug versions of MMG and ParMmg.
Here is the full output of this run: report_MacOS_debug1 (4.4 KB)

Sincerely yours,
Hao

Dear Hao,
Thank you for your tests. It does seem to be just a problem with input filenames related to Catalina, so it would be useful if you could provide us with some final pieces of information:

  1. Get compiler information.
    • In your $TestDir/ParMmg/build folder, type:

      ccmake ..

      then type “t” to toggle advanced mode, copy the compiler path stored in the “CMAKE_C_COMPILER” variable, then exit with “q”.

    • In your terminal, run the path you copied, followed by “--version”, to get the complete version of the compiler you are using, and please forward it to us.

  2. Test if you can reproduce the problem on just 1 process (mpirun -np 1 …).
  3. Apply a patch to the code to produce more debug messages.
    • Download the attached file debug0.patch (2.7 KB) in your $TestDir/ParMmg folder.
    • In your $TestDir/ParMmg folder, apply the patch and recompile:

      git apply debug0.patch --verbose
      cd build
      make -j 8

    • Please rerun the test (just on 1 process, if the problem showed up on 1 process) and forward us the output you get.
    • (Optional) To restore your original code, you can go back to your $TestDir/ParMmg folder, revert the patch, and recompile:

      git apply -R debug0.patch
      cd build
      make -j 8

We hope to be able to fix the issue you are experiencing with these last pieces of information.

Yours,
Luca

Dear Luca,

Thanks very much for your efforts! Here are my testing results:

  1. Get compiler information:
    Here is the Complete_version_of_compiler (202 Bytes) I am using.

  2. Yes, I reproduce the error with 1 process. The command line is:
    mpirun -np 1 /Users/lihao/TestPMMG/ParMmg/build/bin/parmmg ./sphere.mesh -sol ./sphere.sol -out sphere-out.mesh -mesh-size 200000 -niter 6 -nlayers 3 -hgradreq 3.0
    The error message is: error_message_1_processors (4.5 KB)

  3. I recompiled with the patch file and ran with 1 process.
    Here is the error message: error_message_1_processor_recompile.rtf (2.8 KB)

Many thanks! And please let me know if I made any mistakes in the steps.

Sincerely yours,
Hao

Dear Hao,
It seems like the patch is not active for some reason (it should print some specific lines before the error message).
From your $TestDir/ParMmg dir, please check the output of the command

git status

It should give you a list of files modified by the patch. If you further type

git diff

the output of this command should be identical to the contents of the debug0.patch file. If that is not the case, please clean your repository with

git checkout -- .

and try to reapply the patch: make sure that the debug0.patch file is in your $TestDir/ParMmg folder (otherwise the patch won’t be applied), and reapply it from inside that same folder

git apply debug0.patch --verbose

This command, too, should tell you that two files were modified by the patch. If everything is OK at this point, recompile

cd build
make -j 8

and run your test to get more debug information in the output.

Hope it will work this time!
Yours,

Luca

Dear Luca,

I tried again. Here are the results, step by step:

git diff: git_diff_result (48.0 KB)

git apply debug0.patch --verbose: git_apply_patch_result (290 Bytes)

make -j8: make_results (1.1 KB)

testing result: error_message_2 (2.3 KB)

I guess that I need to recompile TestPMMG from the very beginning.

Sincerely yours,
Hao

Dear Hao,
Thanks, the patch seems to be applied. Compilation is OK too, but apparently you cannot link the executable because of a file permission problem.
A quick fix for that, from the $TestDir/ParMmg/build directory: just remove the old program using sudo

sudo rm bin/parmmg

then recompile with make -j 8. Linking should go smoothly this time!
Yours,
Luca

Dear Luca,

This time, it seems that I compiled successfully: make_result (599 Bytes)

And here is the testing output: error_message_3 (2.7 KB)

However, the output seems to be the same as before. I hope the message can help.

Sincerely yours,
Hao

Dear Hao,
The patch is effective this time, and it shows that ParMmg passes the correct filename to Mmg, but the filename then gets lost on your machine.
The problem doesn’t show up on our machines, so I propose that you apply a further patch to Mmg, debug_mmg.patch (976 Bytes), and recompile both Mmg and ParMmg:

cd $TestDir/mmg
git apply debug_mmg.patch --verbose
cd build; make -j 8
cd $TestDir/ParMmg/build; make -j 8
cd $TestDir

then try two more tests to help us solve it:

  1. Rerun ParMmg and post the output.
  2. Check whether the problem is also present with the sequential version of mmg3d, and post that output too:

    $TestDir/mmg/build/bin/mmg3d ./sphere.mesh -sol ./sphere.sol -out sphere-out.mesh

It is also worth checking whether explicitly specifying the “-in” option before the input filename circumvents your problem:

mpirun -np 1 $TestDir/ParMmg/build/bin/parmmg -in ./sphere.mesh -sol ./sphere.sol -out sphere-out.mesh -mesh-size 200000 -niter 6 -nlayers 3 -hgradreq 3.0

Looking forward to hearing from you,
Luca