Wednesday, September 13, 2017

Research Note #8 - Running WRF Simulation (No Chem WPS)

To run the WRF model (in this case, v3.8.1), there are basically three main steps: pre-processing/WPS (geogrid, ungrib, metgrid), WRF simulation (real, wrf), and post-processing (UPP, grib2ctl, gribmap). This post assumes that:
  • Only a single/coarse domain is created for this simulation.
  • Meteorological simulation only, no chem WPS included.
  • The ARW core is used.
  • GrADS will be used to plot the output and UPP for post-processing (because ARWpost is .. ehmm .. troublesome)
  • All utilities (UPP, wgrib, and gribmap) have been installed and their paths are set
  • The WRF history interval is 1 hour because UPP can only process output intervals >= 1 hour (for now).


WPS --> Prepare data for WRF
-----------------------------
1. GEOGRID --> Set up the domain with geogrid
   - Edit namelist.wps (see the sketch after this step)
       &share  : wrf_core (core model)
                 max_dom (coarse = 1, nested > 1)
                 io_form_geogrid (output format, 2 for NC format)
       &geogrid: e_we, e_sn (number of grid points in the w-e and s-n directions)
                 dx, dy (grid resolution in meters)
                 map_proj (projection type) --> low latitude --> mercator
                 ref_lat, ref_lon (center coordinates of the domain)
                 truelat1 (true latitude) --> 30 for mercator
                 geog_data_path (path to the geographical data)
   - Run geogrid.exe --> make sure the 'Successful completion ...' message is shown.
   - Output file --> geo_em.dxx.nc in NC format (xx --> domain number id)
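
A minimal namelist.wps sketch for a single Mercator domain; all values (dates, domain size, coordinates, paths) are illustrative assumptions, not settings from this note:

    &share
     wrf_core = 'ARW',
     max_dom = 1,
     start_date = '2017-09-13_00:00:00',
     end_date   = '2017-09-14_00:00:00',
     interval_seconds = 21600,
     io_form_geogrid = 2,
    /

    &geogrid
     e_we = 100,
     e_sn = 100,
     dx = 10000,
     dy = 10000,
     map_proj  = 'mercator',
     ref_lat   = -2.0,
     ref_lon   = 118.0,
     truelat1  = 30.0,
     geog_data_res  = 'default',
     geog_data_path = '/home/data/WPS_GEOG/',
    /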
 
2. UNGRIB --> Extracts meteorological fields from the met data into an intermediate format
   - Download the met data files and put them in a directory
   - Make a symbolic link to the Vtable --> e.g. for GFS data --> ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable
   - Run link_grib.csh on the met data --> ./link_grib.csh /home/data/gfs/gfs.t00z (pass part of the filename, DON'T pass only the path)
   - Edit namelist.wps
       &share : start_date (start date of unpacking; ungrib is not tied to any domain, so only the 1st column is processed)
                end_date (idem)
                interval_seconds (frequency of the data in seconds: hourly --> 3600, 3-hourly --> 10800, 6-hourly --> 21600, and so on)
       &ungrib: out_format (WPS)
   - Run ungrib.exe --> make sure the 'Successful completion ...' message is shown.
   - Output files --> FILE:YYYY-MM-DD_HH (the whole step is summarized in the sketch below)
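
Putting the step together, a shell sketch of the ungrib workflow for GFS data; the WPS directory and the data path/prefix are hypothetical:

    cd ~/WPS                                         # WPS directory (illustrative)
    ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable  # Vtable matching the met data
    ./link_grib.csh /home/data/gfs/gfs.t00z          # filename prefix, not just the path
    ./ungrib.exe                                     # unpack into intermediate format
    ls FILE:*                                        # expect FILE:YYYY-MM-DD_HH files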
 
3. METGRID --> Horizontally interpolates the extracted met data onto the model domain
   - Edit namelist.wps (see the sketch after this step)
       &metgrid: io_form_metgrid (output format, 2 for NC format)
   - Run metgrid.exe --> make sure the 'Successful completion ...' message is shown.
   - Output files --> met_em.dxx.YYYY-MM-DD_hh:mm:ss.nc
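
A minimal &metgrid sketch; fg_name (the prefix of ungrib's intermediate files) is added here as an assumption, 'FILE' being the WPS default:

    &metgrid
     fg_name = 'FILE',
     io_form_metgrid = 2,
    /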

 
WRF --> Simulate/Run the Model
------------------------------
1. REAL --> Vertically interpolates the met_em data (METGRID's output), creates the boundary and initial condition files, and does some consistency checks.
   - cd to WRFVx/run
   - Make symbolic links to the met_em files
   - Edit namelist.input (see the sketch after this step)
       &time_control: run_* (simulation length, takes precedence over the end dates if set shorter than them)
                      start_* (start time of the simulation)
                      end_* (end time of the simulation, can be overridden by run_*)
                      interval_seconds (data frequency, must be the same as in namelist.wps)
                      history_interval (frequency of writing to the wrfout file in minutes, e.g. 60 --> hourly)
                      frames_per_outfile (how many output times are written to a single wrfout file, e.g. 24 --> one file holds 24 hourly frames, i.e. one day)
                      io_form* (should be 2 for NC)
       &domains     : time_step (time step for the model integration)
                      max_dom, e_we, e_sn, dx, dy, grid_id (must be the same as in namelist.wps)
                      num_metgrid_levels (number of vertical levels, data-dependent, 32 for current GFS; check the met_em files with ncdump -h)
   - Run real.exe --> check rsl.out.0000 for progress --> tail -f rsl.out.0000
   - For a multi-processor run --> mpirun -np N ./real.exe (N = number of processors, e.g. 4)
   - Output files --> wrfinput_dxx and wrfbdy_dxx
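
A minimal namelist.input sketch consistent with the hypothetical WPS settings above (one day of hourly output; every value is illustrative):

    &time_control
     run_days = 1, run_hours = 0, run_minutes = 0, run_seconds = 0,
     start_year = 2017, start_month = 09, start_day = 13, start_hour = 00,
     end_year   = 2017, end_month   = 09, end_day   = 14, end_hour   = 00,
     interval_seconds   = 21600,
     history_interval   = 60,
     frames_per_outfile = 24,
     io_form_history = 2, io_form_restart = 2,
     io_form_input   = 2, io_form_boundary = 2,
    /

    &domains
     time_step = 60,
     max_dom   = 1,
     e_we = 100, e_sn = 100,
     dx = 10000, dy = 10000,
     grid_id   = 1,
     num_metgrid_levels = 32,
    /

To confirm num_metgrid_levels for your data, check one of the met_em files (filename illustrative): ncdump -h met_em.d01.2017-09-13_00:00:00.nc | grep -i num_metgrid_levels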
 
2. WRF --> Generates the model forecast
   - Run wrf.exe --> check rsl.out.0000 for progress --> tail -f rsl.out.0000
   - For a multi-processor run --> mpirun -np N ./wrf.exe (N = number of processors, e.g. 4)
   - Output files --> wrfout_dxx_[initial time], one for each domain (see the example below)
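
For example, a background run on 4 processors (the processor count and log name are illustrative):

    mpirun -np 4 ./wrf.exe >& wrf.log &   # run in the background
    tail -f rsl.out.0000                  # watch the integration progress
    ls wrfout_d01_*                       # e.g. wrfout_d01_2017-09-13_00:00:00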
 
 
UPP (Unified Post Processor) --> Convert the wrfout NC file into GRIB data
-------------------------------------------------------------------------------------------
1. RUN_UNIPOST_FRAMES --> Converts a single wrfout NC file containing several forecast time frames into GRIB data
   - cd to the DOMAINS/postprd/ directory
   - Make a symbolic link to the wrfout data in the DOMAINS/wrfprd/ directory
   - Edit run_unipost_frames (see the sketch after this step)
       TOP_DIR (top directory of WRFV3 and UPP)
       DOMAINPATH (domain path)
       WRF_PATH (WRF path)
       UNIPOST_HOME (UPP path)
       POSTEXEC, SCRIPTS (paths of the UPP binaries and scripts)
       modelDataPath (wrfout file/symbolic link directory --> DOMAINS/wrfprd/)
       dyncore (WRF solver, should be 'ARW')
       informat (wrfout format, should be 'netcdf')
       outformat (UPP output format, should be 'grib')
       startdate (forecast initial time --> YYYYMMDDHH, should match namelist.input)
       fhr (forecast start hour, should match namelist.input)
       lastfhr (forecast end hour, should match namelist.input)
       incrementhr (history interval, should match namelist.input)
   - Run run_unipost_frames --> check the messages in case there are errors
   - Output files --> WRFPRS_dxx.hh, one for each history interval
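
A sketch of the assignments to edit inside run_unipost_frames for the one-day hourly run used above; all paths are hypothetical, and the exact variable names/capitalization can differ between UPP versions, so follow your copy of the script:

    export TOP_DIR=/home/user/models            # parent of WRFV3 and UPP (hypothetical)
    export DOMAINPATH=${TOP_DIR}/DOMAINS/test   # domain directory (hypothetical)
    export WRF_PATH=${TOP_DIR}/WRFV3
    export UNIPOST_HOME=${TOP_DIR}/UPP
    export POSTEXEC=${UNIPOST_HOME}/bin
    export SCRIPTS=${UNIPOST_HOME}/scripts
    export modelDataPath=${DOMAINPATH}/wrfprd   # where the wrfout links live
    export dyncore="ARW"
    export informat="netcdf"
    export outformat="grib"
    export startdate=2017091300                 # forecast initial time, YYYYMMDDHH
    export fhr=00                               # first forecast hour to post-process
    export lastfhr=24                           # last forecast hour
    export incrementhr=01                       # history interval in hours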
 

GRIB2CTL --> Creates control file for the GRIB data (UPP output) to be read by GrADS
--------------------------------------------------------------------------------------
   - Run the grib2ctl.pl script --> perl grib2ctl.pl -verf WRFPRS_d01.%f2 > test.ctl (for domain 01; every forecast hour goes into one control file named test.ctl)
   - Output file --> test.ctl; make sure the tdef line in the control file matches the forecast history interval (see the check below)
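
For the one-day hourly run assumed above, the tdef line should describe 25 hourly time steps (24 forecast hours plus the analysis); the initial time here is illustrative:

    grep tdef test.ctl
    # expected, roughly: tdef 25 linear 00Z13sep2017 1hr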
 
 
GRIBMAP --> Creates GRIB index file to be read by GrADS
--------------------------------------------------------
   - Run gribmap --> gribmap -i test.ctl
   - Output file --> WRFPRS_dxx.00.idx (one index file covering all forecast times)

Once the index file is created, you can open the data with GrADS (see the sketch below).
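
A minimal GrADS session to verify the data; q file lists the available variables, and TMPprs (temperature on pressure levels, as grib2ctl usually names it) is only an illustrative pick:

    grads -l
    ga-> open test.ctl
    ga-> q file
    ga-> d TMPprs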

--------------------------------------------------------------------------------------------------------

Some errors that could occur:

1. WRF crash

The simulation stops abruptly with messages such as the ones below:

=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   EXIT CODE: 139
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES

Possible causes:

  • Time step too large compared with the horizontal resolution. The recommended time step is 6*dx, with dx in km. For example: dx = 10000 m (10 km), so the time step should be 6*10 = 60 s (the time_step entry in &domains of namelist.input).

2. WRF freeze

The simulation stops abruptly without any messages. Possible causes:
  • Memory consumption is too large. Use fewer processors and set a bigger stack size, e.g. ulimit -s 20000, or ulimit -s unlimited (not recommended in some cases).