Error creating BCON from Hemispheric CMAQ outputs

Dear all, I’m following this tutorial to create boundary conditions for a domain set over Europe: CMAQ/CMAQ_UG_tutorial_HCMAQ_IC_BC.md at master · USEPA/CMAQ · GitHub
I got to step 6 without problems, where I should regrid the files from the hemispheric output to the grid I need to simulate, but I’m running into this error:

—>> WARNING in subroutine RDTFLAG
Time step not available in file CTM_CONC_1 for variable NTR
M3WARN: DTBUF 0:00:00 Jan. 16, 2016 (2016016:000000)

 >>--->> WARNING in subroutine M3_BCOUT:INTERP3
 Time step not available for "NTR" from CTM_CONC_1
 M3WARN:  DTBUF 12:00:00  April 16, 2016(2016107:120000)

 *** ERROR ABORT in subroutine M3_BCOUT
 Could not read input CTM Conc file CTM_CONC_1
 Date and time  0:00:00   Jan. 16, 2016   (2016016:000000)

log.bc.txt (19.3 KB)
I have uploaded the entire log to this message. Does anyone have any suggestions?

Thanks in advance

Kind Regards

The log is clear enough: it claims that variable NTR for date&time 0:00:00Z Jan. 16, 2016 is not available in the CTM_CONC_1 file.

The log message for opening this file says that the starting date&time is 0:00:00 Jan. 16, 2016; however, the time step is 2196:00:00 hh:mm:ss (91.5 days) which says that this is surely a badly-constructed file…

What does the M3Tools command m3stat CTM_CONC_1 DEFAULT say about statistics for this date&time? …and what does the netCDF command ncdump -h $CTM_CONC_1 say about the variables-list in the file?

Thanks for your answer. The variable NTR is present in the file but has no values. The time format displayed is the original one from the file downloaded here: CMAQ/CMAQ_UG_tutorial_HCMAQ_IC_BC.md at master · USEPA/CMAQ · GitHub
while the MCIP outputs I’m trying to use are hourly for the whole year 2016. I’m uploading the output of ncdump -h $CTM_CONC_1, but I can’t use the m3stat command because I don’t have M3Tools on this machine.

Is it reasonable to assume that, although $CTM_CONC_1 has a time step of 2196:00:00 hh:mm:ss, I can use it with a full-year reference grid file?

Thanks. ncdump_conc.txt (22.2 KB)

In the ncdump -h output:

global attributes:
...
		:SDATE = 2016016 ;
		:STIME = 0 ;
		:TSTEP = 21960000 ;

says that these are not "MCIP outputs … hourly for the whole year 2016."
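For reference, here is a minimal sketch (plain Python; the helper name is my own, not an I/O API routine) of how the integer TSTEP attribute encodes HHMMSS, confirming that 21960000 means quarterly rather than hourly data:

```python
# I/O API stores TSTEP as an integer in HHMMSS form.
def decode_tstep(tstep: int) -> float:
    """Return the time step in hours (helper name is my own, not an I/O API call)."""
    hh, rest = divmod(tstep, 10000)
    mm, ss = divmod(rest, 100)
    return hh + mm / 60 + ss / 3600

hours = decode_tstep(21960000)   # TSTEP from the ncdump output above
print(hours, hours / 24)         # 2196.0 hours = 91.5 days: quarterly, not hourly
```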

BTW, you ought to have the M3Tools programs any time you’re dealing with I/O API files… they’re the basic utility-tool set for these files (as well as being a set of well-written sample codes).

The ncdump -h $CTM_CONC_1 you asked about is for the input file, which is a hemispheric CMAQ output file containing six time stamps (10/16/2015 12:00, 1/16/2016 0:00, 4/16/2016 12:00, 7/17/2016 0:00, 10/16/2016 12:00, and 1/16/2017 0:00). The first time stamp has been deleted because I have no output for year 2015.

The ncdump -h of the MCIP out is different and I’m uploading it now. ncdump_MCIP.txt (15.9 KB)

The

NCO = "netCDF Operators version 4.8.1 (Homepage = http://nco.sf.net, Code = http://github.com/nco/nco)"

makes me extremely dubious of the whole thing.

From the I/O API Home Page:

The Models-3 I/O API is a programming interface, not a data format !!

I/O API files are not synonymous with “netCDF files” !!
Instead, netCDF is one of four (and a half) distinct lower layers on which the data and metadata structures for I/O API files are currently available; additional lower layers have been incorporated at times in the past, and may very well be incorporated at various times in the future. Attempts to treat the I/O API as “just a data format” have consistently failed in all cases because the persons attempting to do so have not fully understood the I/O API data structures embedded in these files. (And generally they haven’t attempted to contact the I/O API author about these data structures, either.)

Attempts to treat the I/O API as a data format (e.g., using nco) are not authorized; historically, none of them has ever got everything right. Almost certainly something has been screwed up in the creation of this file, and that is the problem.

Could you therefore suggest an alternative way to solve the problem?
I have the original MCIP outputs (not edited with NCO) for each day of the year; before merging them into a single file I started from those, but I got a similar error. I’m uploading the log files:

  1. Log from run_bcon with original daily mcip out file log.bc_original_MCIP.txt (19.3 KB)
  2. m3stat log file for the input file (hemispheric CMAQ output) m3statlog.txt (8.6 KB)
  3. ncdump -h mcip original input file (used for 1.) ncdump_MCIP_original.txt (9.2 KB)

The canonical way to do this is to use repeated runs of M3Tools program m3cple, using the runs to keep adding on to a common output file, starting from the earliest file and continuing with the files in sequential order.

Note that this can be made easier by using File-Set File Lists for the inputs…
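A hedged sketch of that repeated-m3cple approach (file names are hypothetical, and running it requires M3Tools to be installed): each run reads one quarterly file and appends the regridded result onto the same OUTFILE, so the calls must go in time order, earliest file first.

```shell
# Sketch only: quarterly H-CMAQ file names below are hypothetical.
export OUTFILE=BCON_input_regridded.nc
for INFILE in conc_2015q4.nc conc_2016q1.nc conc_2016q2.nc \
              conc_2016q3.nc conc_2016q4.nc conc_2017q1.nc
do
    export INFILE
    # m3cple is interactive; a prepared "answers" file can script its prompts.
    m3cple < m3cple_answers.txt || echo "m3cple run for $INFILE requires M3Tools"
done
```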

Ok, so if I understand correctly, I can use m3cple to regrid the individual times I need for year 2016 from the hemispheric CMAQ output file to the final grid of my coarse domain, ending up with 4 boundary condition files (one for each season) to use for the CMAQ simulations of my domain.
Is this right?

The ncdump -h $CTM_CONC_1 you asked about is for the input file, which is a hemispheric CMAQ output file containing six time stamps (10/16/2015 12:00, 1/16/2016 0:00, 4/16/2016 12:00, 7/17/2016 0:00, 10/16/2016 12:00, and 1/16/2017 0:00). The first time stamp has been deleted because I have no output for year 2015.

You should not delete the first time step from the downloaded file before using it as input to BCON; it is there intentionally to allow a full year of BCON generation using quarterly mean H-CMAQ concentrations. Since the H-CMAQ concentrations are quarterly means, in the file they are represented at time steps roughly corresponding to the middle of each season (i.e. mid-January for winter, mid-April for spring, etc.). The target times and time step structure are obtained from the MCIP files (i.e. hourly in your case), and INTERP3 is called to interpolate the H-CMAQ quarterly values to the target times. By including an extra ‘fall’ value at the beginning of the file and an extra ‘spring’ value at the end, this file can be used to process an entire calendar year of boundary conditions if the MCIP files are available. For multiple years or years other than 2016, users need to follow the tutorial on how to time shift the H-CMAQ file.
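Conceptually (a minimal plain-Python sketch with hypothetical concentration values, not the actual INTERP3 code), the quarterly values are interpolated linearly in time, which is why every target hour must be bracketed by two available time steps, and why the extra fall value at the start of the file matters for January:

```python
from datetime import datetime

# Mid-season time stamps from the downloaded H-CMAQ file
times = [datetime(2015, 10, 16, 12),  # extra 'fall' value: brackets early January 2016
         datetime(2016, 1, 16, 0),
         datetime(2016, 4, 16, 12)]
vals = [10.0, 20.0, 30.0]             # hypothetical concentrations for one species

def interp(t):
    """Linear interpolation between bracketing time steps (sketch of INTERP3's idea)."""
    for (t0, v0), (t1, v1) in zip(zip(times, vals), zip(times[1:], vals[1:])):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return v0 + w * (v1 - v0)
    # Without the 2015 value, any early-January 2016 hour would land here,
    # analogous to the "Time step not available" warning from RDTFLAG.
    raise ValueError("time step not available")

print(interp(datetime(2016, 1, 1)))   # needs the Oct 2015 value to its left
```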

Please try rerunning BCON with the downloaded H-CMAQ file without removing the first time step. My guess is that using NCO to remove the first time step somehow messed up the TFLAG variable which then made the file unusable for INTERP3.

Thanks for your answer. I tried what you suggested, which was also my first attempt at the beginning. When I run bcon using the H-CMAQ file with all the time steps as in the original one, I get this error message:

—>> WARNING in subroutine RDTFLAG
Time step not available in file CTM_CONC_1 for variable NTR
M3WARN: DTBUF 0:00:00 Jan. 16, 2016 (2016016:000000)

 >>--->> WARNING in subroutine M3_BCOUT:INTERP3
 Time step not available for "NTR" from CTM_CONC_1
 M3WARN:  DTBUF 12:00:00  April 16, 2016(2016107:120000)

 *** ERROR ABORT in subroutine M3_BCOUT
 Could not read input CTM Conc file CTM_CONC_1
 Date and time  0:00:00   Jan. 16, 2016   (2016016:000000)

I’m attaching the full log file of the bcon run: log_bcon_full-file.txt (19.3 KB)

I believe that the issue is due to your using the v5.3.1 version of BCON. This older version (5.3.1) requires the regrid versus profile method to be done at compile time, rather than as a run time option. Please upgrade your CMAQ version to 5.3.2 and retry the tutorial.

Upgrading to CMAQv5.3.2 is always a good idea, but I don’t think we have changed the BCON code since 5.3.1.

Are you sure you’re using the original unmodified downloaded file in your new test? Could you please post the output from ncdump -v TFLAG CCTM_CONC_v53beta2_intel17.0_HEMIS_cb6r3m_ae7_kmtbr_m3dry_2016_quarterly_av.nc?

I should have asked about this earlier - which (if any) of the optional steps 2 and 3 in the tutorial did you work through?

I see from the BCON log file that the BCON is trying to read file “CCTM_CONC_v53beta2_gcc_HEMIS_cb05e51_ae6_aq_kmtbr_m3dry_2016_quarterly_av.nc” while the file we posted is “CCTM_CONC_v53beta2_intel17.0_HEMIS_cb6r3m_ae7_kmtbr_m3dry_2016_quarterly_av.nc” so it seems you went through the species remapping step.
If so, which tools did you use to do the remapping? Could you please post the outputs of ncdump -v TFLAG for both of these files?

Yes, following the tutorial I remapped the original input file to the final mechanism. Here is the ncdump of both files:

  1. original: original_TFLAG.txt (88.8 KB)
  2. remapped remapped_TFLAG.txt (28.2 KB)

For the remapping I used the combine_remapping script provided in the tutorial

Thanks for posting, this helps.

When you look at the TFLAG values in the two files, you’ll notice that some of them are missing in the file after you did the remapping. This means that an error occurred when you ran combine to perform the remapping.

When you ran combine to perform the remapping, how did you generate the required SpecDef file to go from cb6r3m_ae7 to cb05e51_ae6? In my recollection, we did not post a SpecDef file for this specific conversion. I am guessing that the SpecDef file contained some entries that led to errors in combine which then led to the missing time steps for some variables in the remapped file. What does the ‘combine’ log file tell you?

Yes, that’s right; there was no default SpecDef file in the tutorial directory, so I created it based on the template of the others. I reran the combine remapping and I’m attaching both the log of the script and the SpecDef file I created.

  1. SpecDef_cb05e51_ae6_aq_derived_from_cb6r3m_ae7_kmtbr.txt (6.4 KB)
  2. log.combine.remapping.txt (22.7 KB)

Thanks.

Something doesn’t quite add up - the SpecDef file you posted contains 70+ defined species but the combine log file shows only 61 species being written to an existing output file that has 69 variables. For example, NTR is defined in the SpecDef file but is not being written by ‘combine’ to the output file (and ‘combine’ would fail - and did fail as per @lizadams post below - because the various CB05 organic nitrate species listed in the NTR definition do not exist in the cb6r3m mechanism).

Here is what I’m thinking:

  • You probably modified the SpecDef file several times, and the version you posted was not the exact version used to run ‘combine’ generating the log file you posted

  • ‘combine’ opens the output file in read/write mode, so if you’re modifying SpecDef between different attempts, you should delete the previous ‘combine’ output file before running again. Otherwise, previously defined variables will remain in that file but may not have values associated with them. This is exactly what may have happened with NTR - your combine log file does not show it being written, but it was listed in the SpecDef file you posted which may be an earlier version of the file than the one ultimately used to successfully run ‘combine’

  • It also looks like you based your custom cb6r3m_ae7 to cb05e51_ae6 mapping on the postprocessing SpecDef file for cb05e51_ae6_aq. This is not correct - you need to generate the CMAQ model species listed in the GC_, AE_ and NR_ namelist files in the cb05e51_ae6_aq mechanism directory from the CMAQ model species listed in the GC_, AE_ and NR_ namelist files in the cb6r3m_ae7_kmtbr. NTR, ATOTI, ATOTK, etc. are not CMAQ model species, so if you generate BCON files with these species, CMAQ would not be able to use them
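As a purely illustrative example of the SpecDef format (species names and expressions are hypothetical; the real entries must be derived from the GC_/AE_/NR_ namelist files of the two mechanisms), each line maps one output model species to an expression in the input file's species:

```
! Illustrative SpecDef fragment only -- NOT a vetted cb6r3m-to-cb05e51 mapping.
! Column 1: output species, column 2: units, column 3: expression in input species.
NO2             ,ppmV           ,NO2[1]
ASO4I           ,ug/m3          ,ASO4I[1]
PAR             ,ppmV           ,PAR[1]
```

Every input species referenced on the right-hand side must actually exist in the input file, or combine fails with the "Requested variable not available" error shown below for NTROH.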

When I try to use the SpecDef_cb05e51_ae6_aq_derived_from_cb6r3m_ae7_kmtbr.txt

with the following definition:
NTR ,ppbV ,1000.0*(NTROH[1]+NTRALK[1]+NTRCN[1]+NTRCNOH[1]+NTRM[1]+NTRI[1]+NTRPX[1])

I am getting the following error:

 >>--->> WARNING in subroutine READ3
 Requested variable "NTROH" not available
 M3WARN:  DTBUF 12:00:00  Oct. 16, 2015 (2015289:120000)

ERROR Invalid syntax for field: NTROH[1]

ERROR Cannot read NTROH from INFILE1