CMAQ v5.3.2 ISAM issue

Hello all,

I am trying to run CMAQ ISAM with the newest version, v5.3.2. The error message says:

" ================================
|>--- TIME INTEGRATION ---<|
================================

 Processing Day/Time [YYYYDDD:HHMMSS]: 2016080:000000
   Which is Equivalent to (UTC): 0:00:00  Sunday,  March 20, 2016
   Time-Step Length (HHMMSS): 000500
             VDIFF completed...    5.0 seconds
            COUPLE completed...    0.0 seconds
              HADV completed...    4.8 seconds
              ZADV completed...    0.4 seconds

At line 112 of file o3totcol.f
Fortran runtime error: Unit number is negative and unit was not already opened with OPEN(NEWUNIT=...)

Error termination. Backtrace:
At line 112 of file o3totcol.f
Fortran runtime error: Unit number is negative and unit was not already opened with OPEN(NEWUNIT=...)"

The input settings are correct; I have run the same setup with CMAQ ISAM v5.3.1. Could you have a look and give some advice?

Thank you.

There are also other messages in the .out file:
"Error termination. Backtrace:
At line 112 of file o3totcol.f
Fortran runtime error: Unit number is negative and unit was not already opened with OPEN(NEWUNIT=...)

Error termination. Backtrace:
#0 0x2b4ea1c4a8b3 in data_transfer_init
at /home/yul18051/src/cmaq/5.2.1/spack/var/spack/stage/gcc-9.1.0-ypwshntn6jtaq2dvhaf5tnn6fw6ayzkj/spack-src/libgfortran/io/transfer.c:2806
#1 0x6ef280 in ???
#2 0x6f65a4 in ???
#3 0x6345cf in ???
#4 0x6287f4 in ???
#5 0x626ca6 in ???
#6 0x62707c in ???
#7 0x3c14a1ed5c in ???
#8 0x405abc in ???
#0 0x2b58f820d8b3 in data_transfer_init

mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

Process name: [[13454,1],63]
Exit code: 2

real 40.67
user 80.04
sys 42.57


** Runscript Detected an Error: CGRID file was not written. **
** This indicates that CMAQ was interrupted or an issue **
** exists with writing output. The runscript will now **
** abort rather than proceeding to subsequent days. **


"

In your run script, did you set OMIfile to OMI_1979_to_2019.dat?
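For reference, the OMI ozone-column file is passed to the model through the OMI environment variable in the CCTM run script. A rough sketch of the relevant lines is below; treat the variable names and paths as illustrative and check your own v5.3.2 script for the exact form.

    #> Ozone column data for the photolysis module (illustrative sketch;
    #> the exact variable names and paths may differ in your run script)
    set OMIpath = $BLD                     # directory containing the OMI data file
    set OMIfile = OMI_1979_to_2019.dat     # file distributed with v5.3.2; pointing to the
                                           # older OMI_1979_to_2017.dat may leave o3totcol.f
                                           # with nothing to open, producing the error above
    setenv OMI ${OMIpath}/${OMIfile}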

Thank you @whao. I was using the older file, OMI_1979_to_2017.dat, rather than OMI_1979_to_2019.dat, in my run script. I think this is the reason.

There are no CCTM_SA_* files in the output directory. I have set CTM_ISAM to Y. Is there anything else I needed to set when compiling the system?

Did you uncomment “set ISAM_CCTM” in the bldit script?
See the new ISAM tutorial for additional documentation on setting up ISAM.
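In case it helps others setting this up, here is a minimal sketch of the two places ISAM has to be switched on, based on the v5.3-series scripts and the ISAM tutorial; variable names and paths in your own scripts may differ slightly.

    #> 1) In bldit_cctm.csh, uncomment the ISAM flag before compiling
    #>    (the line ships commented out):
    set ISAM_CCTM                                  # compile CCTM with ISAM activated

    #> 2) In the CCTM run script, enable ISAM at run time and point it at
    #>    an ISAM control file (path below is a placeholder):
    setenv CTM_ISAM Y                              # turn on source apportionment
    setenv SA_IOLIST ${WORKDIR}/isam_control.txt   # tag definitions (tag names, regions, classes)

Without the bldit flag the executable is built without the ISAM code, which would explain seeing no CCTM_SA_* outputs even with CTM_ISAM set to Y.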

Thank you, Christian. My CMAQ v5.3.2 ISAM run now completes successfully.
Here is another question: I want to run CMAQ-ISAM with 49 variables in the CMAQ mask file. Is this too many? With the same system, a run with only one variable, or with 9 variables, finishes successfully, but when I try 49 variables it fails with the following error:

 >>--->> WARNING in subroutine OPEN3
 Could not open SA_CONC_1       :  Maximum number of files already opened.

 *** ERROR ABORT in subroutine OP_SA on PE 000
 Could not create SA_CONC_1        file

PM3EXIT: DTBUF 0:00:00 March 20, 2016
Date and time 0:00:00 March 20, 2016 (2016080:000000)

In addition to the 49 variables in the mask file, do you also have a fairly large number of different emission files?

The error message suggests that the total number of input and output files exceeds the I/O API parameter MXFILE3 defined in PARMS3.EXT. For I/O API code downloaded and compiled prior to October 2019, that parameter was set to 64, so if you indeed have 40+ emission files and are using an older version of I/O API, you likely were running up against that limit. In October 2019, this parameter was increased to 256.

If this is the case, you need to update the I/O API installation on your system to the latest version and recompile both the library and CMAQ. If your application requires more than 256 combined input and output files, and/or the number of variables in any file exceeds 2048, you will instead need to obtain and use the “large” version, as noted in the CMAQ User's Guide and the ISAM tutorial linked in my earlier post.
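If you want to confirm which limits your installed I/O API was built with before recompiling, one way is to grep the include file that CMAQ is compiled against; $IOAPI_INC below is a placeholder for the directory holding the PARMS3.EXT that your build picks up.

    #> Check the compiled-in limits of the I/O API installation used to build CMAQ
    grep 'MXFILE3' $IOAPI_INC/PARMS3.EXT    # 64 (pre-Oct 2019), 256 (current), 512 (large)
    grep 'MXVARS3' $IOAPI_INC/PARMS3.EXT    # 2048 (standard) or 16384 (large)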

Update to use a more recent I/O API version, which increased the maximum number of open I/O API files to 256 a little over a year ago: see https://www.cmascenter.org/ioapi/documentation/all_versions/html/NEWSTUFF.html#oct162019.
Or, if necessary, use I/O API-LARGE (see https://www.cmascenter.org/ioapi/documentation/all_versions/html/AVAIL.html; it can be downloaded from https://www.cmascenter.org/ioapi/download/ioapi-3.2-large.tar.gz), which supports up to 512 files and 16384 variables per file, at a not-inconsiderable memory-use and performance penalty.
Note that I/O API-LARGE should be kept carefully segregated from the “normal” I/O API, and that programs using it need to be completely re-compiled from scratch.

Thank you for the information, @cjcoats and @hogrefe.christian, but I am already using ioapi-3.2-large on my system.

This is puzzling, then.

Could you please look at the beginning portion of one of your CTM_LOG* files and confirm that MXVARS3 is being reported as 16384, not 2048?

For reference, could you please also post how many emission input files are specified in your run script and which tag classes are defined for the 49 tags?
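A quick way to pull the MXVARS3 value out of the logs is sketched below; the log location is a placeholder, so adjust it to wherever your run script writes the CTM_LOG files.

    #> The I/O API banner near the top of every CTM_LOG file reports the
    #> compiled-in limit, e.g. "PARMS3.EXT/PARAMETER::MXVARS3= 2048"
    grep 'MXVARS3' ${OUTDIR}/LOGS/CTM_LOG_*    # expect 16384 for an I/O API-LARGE build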

In my CTM_LOG* files, MXVARS3 is reported as 2048, but when I built my system I did use ioapi-3.2-large. Do I need to make any extra changes when installing the ioapi module?

In the CTM_LOG* file:

for conditions of use.

 ioapi-3.2: $Id: init3.F90 136 2019-10-16 13:57:49Z coats $               
 Version with PARMS3.EXT/PARAMETER::MXVARS3= 2048                         
 netCDF version 4.7.0 of Aug 30 2019 16:30:20 $  

Does this mean my system is not using ioapi-3.2-large but the normal ioapi-3.2 instead?

This means you have a mixed build: the actual library libioapi.a you are using is not an I/O API-LARGE build.

This was my mistake. I was using the Git version of I/O API-LARGE and neglected to replace PARMS3.EXT with PARMS3-LARGE.EXT, which I discovered after poking through the installed include files and checking the value of MXVARS3.
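For anyone else who ends up with the same mixed build, a sketch of the fix (with placeholder paths) would be:

    #> Adopt the "large" parameter settings in the I/O API source tree
    #> (/path/to/ioapi-3.2 is a placeholder for your checkout)
    cd /path/to/ioapi-3.2/ioapi
    cp PARMS3-LARGE.EXT PARMS3.EXT        # replace the standard include with the large one
    grep 'MXVARS3' PARMS3.EXT             # should now report 16384

    #> Then rebuild libioapi.a from scratch and recompile CMAQ against it,
    #> keeping this build separate from any standard I/O API installation.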
