I want to build the WRFv4.1.1-CMAQv5.3.2 two-way model. When I follow Step 11 of the tutorial guidance, I find that only wrf.exe has been built in /WRF-4.1.1/main/; real.exe, tc.exe and ndown.exe don't exist.
I built the CMAQ BLD_CCTM_v532_pgi_twoway folder and copied it as cmaq into /model/WRF-4.1.1/. I use ioapi 3.2, installed earlier, which works well with the one-way model. If I did something wrong, please give me some advice on how to fix it.
Big thanks for your kind help. Here are my configure.wrf and compile.log files.
Could you please check the $(SOLVER)_real section in main/Makefile and see whether the compilation of real.exe and the other *.exe files has been commented out?
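For example, from the WRF top-level directory:

grep -n -A 12 '_real :' main/Makefile

will show that target and the dozen lines after it, so you can see whether the link steps are prefixed with '#'.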
Thanks for your reply. I checked the Makefile you pointed out and found that the relevant part was commented out.
A few days ago I found that the WRFv4.5.2-CMAQv5.4 twoway model documentation mentioned that WRFv4.1.1-CMAQv5.3.2 had bugs, so I followed that guide and tried to compile, but this time I encountered a different error.
The log file shows that some modules were compiled without producing .o files, something that did not happen in my previous compilations; normally each .F file is converted to a .f90 file and then compiled into a .o file.
For example, /CCTM/scripts/BLD_WRFv4.5.2_CCTM_v54_pgi/phys/module_cam_support.F produces neither a .f90 nor a .o file, which then causes an error message.
Meanwhile, the twoway files at /CMAQ_REPO/CCTM/src/twoway/ don’t appear to have been compiled either.
I don't know why the compilation selectively skips some modules required by subsequent compilation steps. If you have any suggestions, please let me know.
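In case it helps, this is how I checked which modules had stalled before the object stage (paths as in my build):

cd /CCTM/scripts/BLD_WRFv4.5.2_CCTM_v54_pgi/phys
ls -l module_cam_support.*

A fully compiled module leaves .F, .f90, .o and .mod files behind; the failing ones stop at (or even before) the .f90 stage.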
If you check wrf-cmaq_buildlog.txt, you will see a line with "-j 2" at the very beginning of the file. It means the compilation was done with two cores. In your case, the compilation failure was likely due to the two cores getting out of sync and breaking the file dependencies. You just need to retype "compile em_real >& mylog" again, and you might need to do it a few more times. If you want to avoid this, you can compile the code with one core (much slower) by typing "compile -j 1 em_real >& mylog".
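For reference, the two variants, typed from the WRF top-level directory:

./compile em_real >& mylog
./compile -j 1 em_real >& mylog

The first simply retries the two-core build; the second forces a single core and avoids the dependency race altogether.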
Thank you very much for your helpful advice. After adding "-j 1" to the compile command, my previous errors were resolved. Now I can find wrf.exe, real.exe and tc.exe in the /main/ folder, but there are still errors when compiling ndown.exe. It seems that /dyn_em/solve_em.f90 can't find the twoway module.
When I constructed the WRF-CMAQ coupled model, the *.exe files other than wrf.exe were not on my mind. This shortcoming has been rectified in WRF v4.4. If you need to use the WRF version you have, please download a fresh copy of the same WRF version and compile it without constructing the coupled model, to generate those other .exe files.
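For example (the GitHub tag and directory name below are just an illustration; any clean copy of the same version works):

git clone --branch v4.1.1 https://github.com/wrf-model/WRF.git WRF-4.1.1-standalone
cd WRF-4.1.1-standalone
./configure
./compile -j 1 em_real >& compile_standalone.log

real.exe, ndown.exe and tc.exe will then appear in main/ of that standalone copy.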
I tried compiling the twoway model using WRFv4.1.1 and CMAQv5.4 and found that all 4 *.exe files were successfully generated.
I had used standalone WRFv3.9.1.1 and WRFv4.5.2 before trying the twoway model and didn't experience the compilation errors I encountered this time. This could be due to my configuration mistakes or an unknown bug.
In addition, I have several questions about the runscript. If I want to simulate multiple nested domains, do I just need to change the Building WRF Namelist section in the runscript, as I would change namelist.input in the WRF model, with values for multiple domains separated by commas? And I notice there is an option for running WRF only or WRF-CMAQ. I don't understand the meaning of "w/o" in option 2 = run WRF-CMAQ coupled model w/o producing GRID and MET files.
Thank you very much for your help; your approach has resolved all of the problems I had encountered.
I found in the bldit_cctm.csh file that CMAQv5.4 requires WRFv4.4+ to build the WRF-CMAQ twoway model. When using WRFv4.1.1, the files located in /CCTM/scripts/BLD/cmaq/ did not compile automatically and remained only .F or .f90 files, so I then tried WRFv4.4 with CMAQv5.4.
When I ran bldit_cctm.csh, I found that ndown.exe still could not be generated. The log file showed that /dyn_em/solve_em.f90 does not recognize and CALL the subroutines located in /cmaq/twoway*, and the log looked like the one in our previous discussion.
Finally, I discovered what triggers this problem. It may not be the root cause, but the failure is related to it.
During the usual compilation, I had a problem linking the HDF5 library, so I modified LIB_EXTERNAL in configure.wrf to add -ldl after the HDF5 entries (this was related to the local server setup). After that change, I ran bldit_cctm.csh again, and that is when the error arose.
This time, after I modified configure.wrf, I ran
./clean -a
in the BLD folder, which deleted the .o and .mod files there. Then I ran bldit with the modified configure.wrf and found that all four .exe files were generated, with no error in solve_em.f90.
By the way, I'd like to ask two questions about the WRF-CMAQ runscript.
If I want to simulate multiple nested domains, do I just need to change the Building WRF Namelist section in the runscript, as I would change namelist.input in the WRF model, with values for multiple domains separated by commas?
And I notice there is an option for running WRF only or WRF-CMAQ. I don't understand the meaning of "w/o" in option 2 = run WRF-CMAQ coupled model w/o producing GRID and MET files.
I am glad to hear that you can now create those 4 exe files. Here are the answers to your questions:
For nested domains, you can't run WRF-CMAQ the way you run WRF alone, by providing nesting information in the WRF namelist separated by commas. You need to generate the corresponding WRF input for each domain, as well as IC/BC, and then run the nested domains one by one in sequential order.
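A rough sketch of that sequence (the log names are illustrative; the runscript is the one attached later in this thread):

# domain 1 (outermost): stage wrfinput_d01 / wrfbdy_d01 and the d01 CMAQ IC/BC
./run_cctm_WRFCMAQ.csh >& run_d01.log
# domain 2: build its IC/BC from the d01 output (e.g. with ICON/BCON),
# point the runscript at the d02 WRF inputs, then run again
./run_cctm_WRFCMAQ.csh >& run_d02.log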
When you run the offline CMAQ model, the flow is: run WRF, run MCIP, then run CMAQ. For the WRF-CMAQ coupled model, the MCIP step is gone (literally, it is absorbed into the coupled model calculation). When you run the coupled model with option 1 or 3, MCIP-like files are created for checking purposes. The values in those MCIP-like files are not exactly the same as in the corresponding MCIP files; they are off by one WRF time step. (So "w/o" in option 2 simply means "without": the coupled model runs, but those GRID and MET files are not produced.)
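Schematically, with placeholder script names:

./run_wrf.csh     # step 1: meteorology
./run_mcip.csh    # step 2: convert WRF output into CMAQ-ready GRID and MET files
./run_cctm.csh    # step 3: chemistry-transport

In the coupled model the middle step disappears; options 1 and 3 just write out MCIP-like files so you can check the meteorology being passed to CMAQ.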
Now I can run the WRF part of the coupled model and get wrfout files with the following setting in the runscript:
wrf_cmaq_option = 0
However, when I set it to 1 (run WRF only, producing MCIP-like GRID and MET files) or set direct_sw_feedback = T, the program displays an error:
ERROR: The option “direct_sw_feedback=T” and “wrf_cmaq_option==1” require CMAQ coupling
I followed wfy’s advice and changed my model version from WRFv4.4-CMAQv5.4 to WRFv4.3-CMAQv5.3.3, but the problem remained.
Here are my runscript, running log and build log. The BLD/cmaq folder looks like it was compiled properly, but the error suggests that cmaq was not recognized. run_cctm_WRFCMAQ.csh (52.0 KB) rsl.error.0000.txt (561 Bytes) wrf-cmaq_buildlog.log.txt (3.5 MB)
It seems to me that you are using a run script tailored for combinations of WRFv4.4+ and CMAQv5.3.3+ with your coupled model built from WRF 4.3 and CMAQ 5.3.3. In other words, WRF 4.3 does not recognize this particular option/parameter. If you have additional questions, please shoot me an email (wong.david-c@epa.gov).