
Friday, January 19, 2018

Research Note #13 - WRF-ARW, NetCDF 3.6.3 and ARWPost Installation

One thing I have been anxious to check since beginning the WRF-CHEM study is whether its output could be post-processed by ARWPost, one of the original post-processors for WRF-ARW. While ARWPost provides only limited diagnostics of the model output compared to UPP, it is definitely much simpler and more straightforward. Moreover, the failure of UPP to process the chemical fields of the WRF-CHEM output really urged me to check it out.

After the first try on FX-10, I found out that the tool is outdated: the latest version was released in 2011, and I could not install it with the NetCDF 4.4 library. I somehow managed to install it using the older NetCDF (version 3.6.3), but the tool failed to read WRF-CHEM outputs. Nevertheless, after switching from FX-10 to O-PACS, I wanted to try installing it once again.

Before installing the tool, I tried to find out the cause of the failure. These were my hypotheses:
  • The ARWPost source code uses old NetCDF functions, so the installation fails when the latest NetCDF libraries are used.
  • ARWPost can only read WRF-ARW output, not WRF-CHEM output, because of the differences in output fields between the two.
To test these hypotheses, I re-installed the WRF model, this time without the CHEM add-on (original WRF-ARW), in another directory. After the model compiled successfully, I tried to install ARWPost. For the first try I used the original NetCDF (version 4.4), and it failed like last time. Then I installed NetCDF 3.6.3 and used that library for the ARWPost installation. This time it worked and the compilation was successful. So ARWPost does indeed only work with the older NetCDF. Since NetCDF is supposed to be backward compatible, why ARWPost's old NetCDF calls cannot be built against the newer library still remains a mystery to me.
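For anyone repeating this, the build sequence looks roughly like the following (a sketch only; the paths are placeholders, and ARWPost uses the same configure/compile scripts as WRF):

$ export NETCDF=$HOME/libs/netcdf-3.6.3    # hypothetical path to the NetCDF 3.6.3 installation
$ export PATH=$NETCDF/bin:$PATH
$ cd ARWpost
$ ./clean -a
$ ./configure                              # pick the option matching the compiler (ifort/icc here)
$ ./compile
$ ls -l ARWpost.exe                        # the executable appears only if the build succeeded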

---------------------------

Update: ARWPost Installation
Since I moved to another supercomputer with a much newer compiler, I could not use the old compiler to compile netcdf 3.6.3. Therefore, I uploaded pre-compiled binaries of that version for future ARWPost installations.

Download link
Password: yj36ubxw

With the pre-compiled binaries it is no longer necessary to build and install netcdf 3.6.3. Just copy the directories, point the netcdf environment variable to them, and you are ready to install ARWPost.
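For example, assuming the binaries are unpacked under a directory like $HOME/libs (the path is just a placeholder; adjust it to your own layout):

$ cp -r netcdf-3.6.3 $HOME/libs/
$ export NETCDF=$HOME/libs/netcdf-3.6.3
$ export PATH=$NETCDF/bin:$PATH            # add both exports to ~/.bash_profile to make them permanent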

----------------------------

The next step was to find out whether ARWPost could read WRF-ARW output, WRF-CHEM output, or both. I ran WRF-ARW first and read the output with ARWPost, and it worked flawlessly. Next, reading the WRF-CHEM output:



!!!!!!!!!!!!!!!!
  ARWpost v3.1
!!!!!!!!!!!!!!!!

FOUND the following input files:
 ./testchem.nc

START PROCESSING DATA

   WARNING --- I do not recognize this data.
               OUTPUT FROM *             PROGRAM:WRF/CHEM V3.8.1 MODEL
               Will make an attempt to read it.


 Processing  time --- 2017-11-01_00:00:00
   Found the right date - continue
 WARNING: The following requested fields could not be found
            height
            pressure
            tk
            tc

 Processing  time --- 2017-11-01_01:00:00
   Found the right date - continue

 Processing  time --- 2017-11-01_02:00:00

   Found the right date - continue

...

Processing  time --- 2017-11-02_00:00:00
   Found the right date - continue

DONE Processing Data

CREATING .ctl file

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!  Successful completion of ARWpost  !
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

Surprisingly, it worked for both model outputs, and the NetCDF version difference had no effect at all. It did print a warning while reading the output, but overall the post-processing was successful. Best of all, it could read all the chemical fields in the output file. These are some of the fields from the output CTL file:

PM2_5_DRY     24  0  pm2.5 aerosol dry mass (ug m^-3)
PM10          24  0  pm10 dry mass (ug m^-3)
DMS_0          1  0  dms oceanic concentrations (nM/L)
PHOTR201      24  0  cl2 photolysis rate (min{-1})
PHOTR202      24  0  hocl photolysis rate (min{-1})
PHOTR203      24  0  fmcl photolysis rate (min{-1})
so2           24  0  SO2 mixing ratio (ppmv)
sulf          24  0  SULF mixing ratio (ppmv)
dms           24  0  DMS mixing ratio (ppmv)
msa           24  0  MSA mixing ratio (ppmv)
P25           24  0  other gocart primary pm25 (ug/kg-dryair)
BC1           24  0  Hydrophobic Black Carbon (ug/kg-dryair)
BC2           24  0  Hydrophilic Black Carbon (ug/kg-dryair)
OC1           24  0  Hydrophobic Black Carbon (ug/kg-dryair)
OC2           24  0  Hydrophilic Black Carbon (ug/kg-dryair)
DUST_1        24  0  dust size bin 1: 0.5um effective radius (ug/kg-dryair)
DUST_2        24  0  dust size bin 2: 1.4um effective radius (ug/kg-dryair)
DUST_3        24  0  dust size bin 3: 2.4um effective radius (ug/kg-dryair)
DUST_4        24  0  dust size bin 4: 4.5um effective radius (ug/kg-dryair)
DUST_5        24  0  dust size bin 5: 8.0um effective radius (ug/kg-dryair)
SEAS_1        24  0  sea-salt size bin 1: 0.3um effective radius (ug/kg-dryair)
SEAS_2        24  0  sea-salt size bin 2: 1.0um effective radius (ug/kg-dryair)
SEAS_3        24  0  sea-salt size bin 3: 3.2um effective radius (ug/kg-dryair)
SEAS_4        24  0  sea-salt size bin 4: 7.5um effective radius (ug/kg-dryair)
P10           24  0  other gocart primary pm10 (ug/kg-dryair)

That was really good, since ARWPost can also interpolate the sigma levels of the WRF output onto pressure levels, which will make my job much easier in the future. I then opened the result in GrADS to check the fields.
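For reference, that sigma-to-pressure interpolation is controlled by the &interp section of namelist.ARWpost. A minimal sketch is below (here interp_method = 1 should select user-defined levels, given in mb; the level list is only an example):

&interp
 interp_method = 1,
 interp_levels = 1000., 925., 850., 700., 500., 300., 200., 100.,
/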


The conclusion: while ARWPost only works with the older NetCDF libraries (unless you modify the source code), it can read the model output as long as the output uses the same dynamic solver (ARW). I still have no idea why it could not read the WRF-CHEM output on FX-10, though.

    

Wednesday, January 17, 2018

Research Note #12 - Installing WRF-CHEM on Oakforest-PACS

This is basically the same as a normal installation, except for a few modifications.

1. NetCDF Libraries
Because Oakforest-PACS (O-PACS) already provides the NetCDF libraries (C, Fortran and C++), we don't need to install them. We just have to load the modules into the environment.

Update (2018/12):
If the netcdf and mpi libraries are already installed in the system, ignore this step and go straight to step #3.

$ module load netcdf
$ module load netcdf-fortran
$ module load netcdf-cxx

Since the WRF installer needs all of them at once, it's better to copy the NetCDF files into a directory on the parallel storage, set the path to that directory and put it into the .bash_profile file. To find the files to copy, check the environment variables that contain the entry "netcdf". Merge the netcdf, netcdf-fortran and netcdf-cxx directories into one, so that we end up with the single netcdf directory the WRF installer expects.

$ env | grep netcdf

$ cp -r <original netcdf dir> <new dir>
$ cp -r <original netcdf-fortran dir> <new dir>
$ cp -r <original netcdf-cxx dir> <new dir>

Let's say the new netcdf directory is at /work/gi55/c24223/libs/netcdf

export PACSLIB=/work/gi55/c24223/libs
export NETCDF=$PACSLIB/netcdf
export PATH=$NETCDF/bin:$PATH
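To quickly verify that the merged directory contains both the C and Fortran parts, the helper scripts shipped with NetCDF can be used (a sketch; nc-config comes with NetCDF-C and nf-config with NetCDF-Fortran):

$ ls $NETCDF/lib $NETCDF/include
$ nc-config --version
$ nf-config --version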

2. MPI Library
O-PACS loads Intel MPI (impi) by default, so we also don't need to install MPICH. Just add the directory of the impi binaries (the one containing mpicc, mpiexec, etc.) to the environment variables and put it inside .bash_profile. The trick is basically the same as for the NetCDF libraries, but this time there is no need to copy any files, because impi has a single directory for all compilers.

$ env | grep impi

export mpi=/home/opt/local/cores/intel/impi/2018.1.163/bin64
export PATH=$mpi:$PATH

Now, reload the .bash_profile script to pick up the changes.
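For example (assuming the exports above were added to ~/.bash_profile):

$ source ~/.bash_profile
$ which mpicc mpiexec ncdump    # all three should now resolve to the paths set above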

3. Compiling WRF
It's better to compile WRF first, before installing the other libraries needed for WPS (zlib, libpng and JasPer). The main reason is that installing those WPS libraries requires setting some environment variables that may mess up the WRF compilation (LD_LIBRARY_PATH, CPPFLAGS and LDFLAGS).

Update (2018/12): 
If you have already set up the .bash_profile file with the shared-library variable (LD_LIBRARY_PATH), you don't need to delete or comment out that line. Just comment out CPPFLAGS and LDFLAGS. Don't forget to uncomment them (back to the original state) after the WRF installation has completed, otherwise the WPS installation will fail.
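As an illustration only, the relevant part of .bash_profile would look roughly like this while compiling WRF (the paths follow the setup above; uncomment the last two lines again before building WPS):

export PACSLIB=/work/gi55/c24223/libs
export NETCDF=$PACSLIB/netcdf
export PATH=$NETCDF/bin:$PATH
export LD_LIBRARY_PATH=$PACSLIB/grib2/lib:$NETCDF/lib:$LD_LIBRARY_PATH    # keep this line as it is
#export CPPFLAGS=-I$PACSLIB/grib2/include
#export LDFLAGS=-L$PACSLIB/grib2/lib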

$ export WRF_CHEM=1
$ export WRF_EM_CORE=1
$ export WRF_NMM_CORE=0
$ tar -xzvf WRFV3.8.1.TAR.gz
$ cd WRFV3/
$ tar -xzvf WRFV3-Chem-3.8.1.TAR.gz
$ ./clean -a
$ ./configure

Select the dmpar option with the Intel compiler (Xeon with AVX mods) and basic nesting, then compile em_real.

18. (serial)  19. (smpar)  20. (dmpar)  21. (dm+sm)   INTEL (ifort/icc): Xeon (SNB with AVX mods)

$ ./compile em_real >& compile.log &

Make sure the compilation is successful.

==========================================================================
build started:   Tue Jan 16 13:26:53 JST 2018
build completed: Tue Jan 16 14:11:22 JST 2018

--->                  Executables successfully built                  <---

-rwxr-x--- 1 c24223 gi55 65532338 Jan 16 14:11 main/ndown.exe
-rwxr-x--- 1 c24223 gi55 65463925 Jan 16 14:11 main/real.exe
-rwxr-x--- 1 c24223 gi55 64466197 Jan 16 14:11 main/tc.exe
-rwxr-x--- 1 c24223 gi55 75598185 Jan 16 14:10 main/wrf.exe

==========================================================================

4. Zlib, Libpng and JasPer libraries
Because these libraries will be compiled and installed with the Intel compilers, we should set the corresponding environment variables.

Update (2018/12):
If all the WPS libraries (zlib, libpng and jasper) are already installed or exist in the system, ignore this step and go straight to step #5.

$ export CC=icc
$ export FC=ifort
$ export CXX=icpc
$ export F77=ifort
$ export FCFLAGS=-axMIC-AVX512
$ export FFLAGS=-axMIC-AVX512

Install all libraries in $PACSLIB directory.

$ tar -xzvf zlib-1.2.11.tar.gz
$ tar -xzvf libpng-1.6.34.tar.gz
$ tar -xzvf jasper-1.900.1.tar.gz

The configuration and installation steps are almost the same for each library. Before installing zlib, set the environment variables for the linker and the C pre-processor.

$ export LDFLAGS=-L$PACSLIB/grib2/lib
$ export CPPFLAGS=-I$PACSLIB/grib2/include

Then install each library with the following steps (zlib first, then libpng and JasPer):

$ ./configure --prefix=$PACSLIB/grib2 
$ make
$ make install
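If you prefer to do all three in one go, a small shell loop works as well (a sketch; the directory names match the tarballs above, and the order matters since libpng needs zlib):

$ for pkg in zlib-1.2.11 libpng-1.6.34 jasper-1.900.1; do
>   cd $pkg && ./configure --prefix=$PACSLIB/grib2 && make && make install && cd ..
> done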

5. Compiling WPS
Before compiling WPS, log out and log back in to the shell. This is needed to return the LDFLAGS and CPPFLAGS environment variables to their original states, with the entries loaded by O-PACS.

Set environment variables for JasPer:

export JASPERLIB=$PACSLIB/grib2/lib
export JASPERINC=$PACSLIB/grib2/include

Then add the grib2 lib directory as well as the NetCDF lib directory to the existing LD_LIBRARY_PATH.

export LD_LIBRARY_PATH=/home/opt/local/cores/intel/impi/2018.1.163/intel64/lib: <many directories>:/work/gi55/c24223/libs/grib2/lib:/work/gi55/c24223/libs/netcdf/lib
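Equivalently, instead of spelling out the whole value, the two directories can simply be appended to whatever O-PACS already loaded:

export LD_LIBRARY_PATH=$PACSLIB/grib2/lib:$NETCDF/lib:$LD_LIBRARY_PATH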

Then configure and compile using the Intel compiler with the serial configuration (option #17):

$ tar -xzvf WPSV3.8.1.TAR.gz
$ cd WPS
$ ./clean -a
$ ./configure
$ ./compile

Check if the compilation successfully generates geogrid.exe, ungrib.exe and metgrid.exe.

lrwxrwxrwx 1 c24223 gi55 23 Jan 16 15:08 geogrid.exe -> geogrid/src/geogrid.exe
lrwxrwxrwx 1 c24223 gi55 23 Jan 16 15:09 metgrid.exe -> metgrid/src/metgrid.exe
lrwxrwxrwx 1 c24223 gi55 21 Jan 16 15:09 ungrib.exe -> ungrib/src/ungrib.exe

As a last check, verify the library dependencies of ungrib.exe and the other two executables.

$ ldd ungrib.exe
 linux-vdso.so.1 =>  (0x00007fff239f0000)
        libpng16.so.16 => /work/gi55/c24223/libs/grib2/lib/libpng16.so.16 (0x00007f3555e64000)
        libz.so.1 => /work/gi55/c24223/libs/grib2/lib/libz.so.1 (0x00007f3555c45000)
        libm.so.6 => /lib64/libm.so.6 (0x00007f3555943000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f3555727000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f3555366000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f3555150000)
        libdl.so.2 => /lib64/libdl.so.2 (0x00007f3554f4c000)
        libimf.so => /home/opt/local/cores/intel/compilers_and_libraries_2018.1.163/linux/compiler/lib/intel64/libimf.so (0x00007f35549be000)
        libsvml.so => /home/opt/local/cores/intel/compilers_and_libraries_2018.1.163/linux/compiler/lib/intel64/libsvml.so (0x00007f355330b000)
        libirng.so => /home/opt/local/cores/intel/compilers_and_libraries_2018.1.163/linux/compiler/lib/intel64/libirng.so (0x00007f3552f97000)
        libintlc.so.5 => /home/opt/local/cores/intel/compilers_and_libraries_2018.1.163/linux/compiler/lib/intel64/libintlc.so.5 (0x00007f3552d2a000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f35560b3000)

If every library is found (no "not found" entries), the compilation is successful.

Tuesday, January 16, 2018

Research Note #11 - Running WRF-CHEM on Oakforest-PACS

After so many compilation issues with the WRF-CHEM installation on FX-10 since the beginning of this year, I started to shift my attention to the Oakforest-PACS (O-PACS) supercomputer. Just reading the specs, O-PACS definitely looks much more promising than FX-10: newer processors, (many) more cores and threads, more memory, twice as many nodes as FX-10 and, best of all, Intel compilers! While I am not as used to them as I am to GCC, I have had a few experiences with the Intel compilers, especially ifort, and they are definitely way better than Fujitsu's own compilers in terms of compatibility with UNIX/Linux software development.

Another unexpected thing that convinced me to switch to O-PACS was an email I received last Friday. It said (in Japanese) that the FX-10 service will be terminated after March 2018. Well, that was like rubbing salt into the wound. I am done with FX-10.

And that's it. Early in the morning I started using O-PACS and installing WRF-CHEM, planning to try running it as a parallel job on the compute nodes of the supercomputer if the installation succeeded. Guess what? Not only did I succeed in installing the model, I also ran it successfully with MPI across multiple nodes, and all of that happened in just one day. While the installation itself was not perfectly smooth due to a library linker issue and run-time stack problems, the model ran well afterwards. I will post the installation details soon.

There are several things I would like to highlight about running WRF-CHEM on the O-PACS compute nodes:
  • The number of processes per node affects the model processing time more than the number of nodes used. Using fewer nodes with more processes per node can decrease the run time significantly compared to using more nodes with fewer processes per node.
  • The number of nodes strongly affects the token consumption. Using more nodes with fewer processes per node was far more expensive than using fewer nodes with more processes per node.

Just take a look at the picture above. Job 1595130 (third from the bottom) was submitted with 64 nodes and 8 processes per node, while job 1595143 (bottom-most) used (only) 16 nodes but with 32 processes per node. The total number of processes was the same for both jobs, 512 (64x8 and 16x32), and the elapsed times were almost the same as well, around 10 minutes. How about the tokens consumed? Were they similar? Well ... not so much.

It's obvious that job 1595130 (more nodes, fewer processes) consumed about four times more tokens than job 1595143 (fewer nodes, more processes). Or maybe that's because I used a different resource group? Well, we'll find out soon ...
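For reference, a job like 1595143 (16 nodes, 32 processes per node, 512 ranks in total) would be submitted with a pjsub script along these lines. This is only a sketch: the resource group, elapse limit and group name are placeholders, and the exact PJM options should be checked against the O-PACS user guide.

#!/bin/sh
#PJM -L rscgrp=regular-flat
#PJM -L node=16
#PJM --mpi proc=512
#PJM -L elapse=00:30:00
#PJM -g gi55
# Intel MPI launcher; PJM_MPI_PROC should hold the total process count requested above
mpiexec.hydra -n ${PJM_MPI_PROC} ./wrf.exe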

Nevertheless, starting from today I can run the model using the full capabilities of one of the fastest supercomputers in the world. Yay!!!

Finally, I can concentrate more on the upcoming exam on Feb 7.

Time to sleep; more work awaits tomorrow ...

Wednesday, January 10, 2018

Research Note #10 - Installing NetCDF on FX10

I got a very good lesson today. I've just realized that the NetCDF package I installed for WRF on FX10 a few months ago was in fact version 4.1.3 and not 4.4.1.1. The 4.1.3 package from the WRF online tutorial web page contains more source code, not only C but also Fortran and C++, whereas the 4.4.1.1 packages downloaded directly from the NetCDF website contain either the C or the Fortran code only. To install the NetCDF libraries for both C and Fortran from the latest packages, one should download and install NetCDF-C first, then install NetCDF-Fortran. It was quite confusing, since the NetCDF-C package's name doesn't contain a "C" in it (it is just called NetCDF).

The following are the steps to install the latest NetCDF-C (4.4.1.1) and NetCDF-Fortran (4.2) on the FX10 Oakleaf supercomputer of the University of Tokyo. The compilers used for these installations are the FX10 cross-compilers (frtpx, fccpx and FCCpx) on the login node. While some resources I found on the internet use quite complicated compiler options and optimizations, I used a very basic setup instead. So far, I have found no problems with the compilations.

1. Environment variables (bash)

$ export CC=fccpx
$ export CXX=FCCpx
$ export FC=frtpx
$ export F77=frtpx

2. Install NetCDF-C

Because this is cross-compiling, we need to add the "--host" option to the configure script. For FX10 the host name is "sparc64v-sparc-linux-gnu". On other machines it would typically be an x86_64 triplet (run "uname -m" to check the architecture).

./configure --prefix=$INSTALL_DIR/netcdf --disable-dap --disable-netcdf-4 --disable-shared --host=sparc64v-sparc-linux-gnu
make
make install

Since this is a cross-compilation, we cannot run "make check" on the FX10 login nodes. To check the installed binaries, we have to run them in the interactive mode of FX10.

3. Install NetCDF-fortran

This is basically the same as its C counterpart. One of the differences is the configure options: there are no "--disable-dap" and "--disable-netcdf-4" options for the Fortran version. One important point: we must set CPPFLAGS and LDFLAGS in the environment before running the configure script, otherwise the configuration will fail.

export CPPFLAGS=-I$INSTALL_DIR/netcdf/include
export LDFLAGS=-L$INSTALL_DIR/netcdf/lib

./configure --prefix=$INSTALL_DIR/netcdf --disable-shared --host=sparc64v-sparc-linux-gnu
make
make install
  
After the installation, check the contents of the netcdf directory, especially lib/ and include/, and make sure all the libraries are there. Also, don't forget to test "ncdump" in interactive mode to make sure it runs well.
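A minimal check could look like this (a sketch; the wrfinput file name is just an example, and the last command has to be executed from an interactive job since the binaries are cross-compiled):

$ ls $INSTALL_DIR/netcdf/lib        # expect libnetcdf.a and libnetcdff.a for this static build
$ ls $INSTALL_DIR/netcdf/include    # expect netcdf.h plus the Fortran .inc/.mod files
$ ncdump -h wrfinput_d01            # prints the header of a NetCDF file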