MCIP and CCTM issue for CMAQ version 5.2.1

Good afternoon CMAQ users:

I am trying to run CCTM, and it shows the following error messages.

 XCENT_B:    -97.000000000000  XCENT3D (file):    -97.000000000000
 YCENT_B:     33.000000000000  YCENT3D (file):     40.000000000000
 XCELL_B:  12000.000000000000  XCELL3D (file):  12000.000000000000
 YCELL_B:  12000.000000000000  YCELL3D (file):  12000.000000000000


 *** ERROR ABORT in subroutine SubhFile_Cell on PE 000   
 File header inconsistent with GRID_CRO_2D
 Date&time specified as 0

However, my GRIDDESC shows
' '
'LamCon_40N_97W'
2 33.000 45.000 -97.000 -97.000 33.000
' '
'SE52BENCH'
'LamCon_40N_97W' -112000.000 406500.000 12000.000 12000.000 157 148 1
' '
which is the same as in my emission file. And when I check GRIDCRO2D, the global attributes are also the same as in my emission file. In fact, all of my MCIP outputs (GRIDBDY2D, GRIDCRO2D, GRIDDOT2D, METBDY3D, METCRO3D, METCRO2D, METDOT3D) have these global attributes. I don’t know why these error messages appear. Does anyone have an idea?

            :P_ALP = 33. ;
            :P_BET = 45. ;
            :P_GAM = -97. ;
            :XCENT = -97. ;
            :YCENT = 33. ;

Best
Huan

Hello Huan,
When you checked the GRIDCRO2D file, did you use the m3edhdr IOAPI tool?
It is a useful tool for editing header values in netCDF files in general. If you edit the YCENT value in the GRIDCRO2D file from 40 to 33 (the value CCTM expects, per YCENT_B in the error message) using that tool, you may be able to run CCTM without the error, but this is just a suggestion. If you still get this error, maybe you are using different GRIDCRO2D files for generating your emissions and for running CMAQ.
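
If m3edhdr’s interactive prompts are unfamiliar, the same header edit can also be done non-interactively with the NCO tool ncatted (a sketch, assuming NCO is installed; the file name is a placeholder for your own GRIDCRO2D file):

# Modify ("m") the global attribute YCENT to 33, stored as a double ("d")
ncatted -a YCENT,global,m,d,33. GRIDCRO2D_160713

Note that this edits the file in place, so keep a backup copy first.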

Tell us how it goes.
Greetings

Hi:

Thank you for your time!
I found that this error message comes from the default ocean file, which has different coordinates than my input. I used m3fake to create an ocean file with the same coordinates as my input but with all values set to 0, since my study area is inland. That error message is gone now.
Currently, the error messages are from INTERPX on PE 000. Actually, this PE 000 error always appears when I try to run CCTM. Do you have any idea how I can solve this?

Best
Huan
>>--->> WARNING in subroutine INIT_MET:INTERPX
Variable "PURB" not in file GRID_CRO_2D
M3WARN: DTBUF 0:00:00 July 13, 2016 (2016195:000000)

 *** ERROR ABORT in subroutine INIT_MET on PE 000        
  Error interpolating variable PURB from GRID_CRO_2D
 Date and time 0:00:00   July 13, 2016  (2016195:000000)

application called MPI_Abort(MPI_COMM_WORLD, 37196448) - process 0
0.261u 0.244s 0:05.09 9.8% 0+0k 0+96io 0pf+0w
date
Wed Sep 19 15:55:03 EDT 2018
if ( ! -e /scratch/brown/fang63/CMAQ-5.2.1/data/output_CCTM_v521_intel_160713/LOGS ) then
mv: No match.
setenv NEW_START false
set TODAYG = `date -ud "${TODAYG}+1days" +%Y-%m-%d`
date -ud 2016-07-13+1days +%Y-%m-%d
set TODAYJ = `date -ud "${TODAYG}" +%Y%j`
date -ud 2016-07-14 +%Y%j
end
while ( 2016196 <= 2016195 )
exit

Hi @fang63,
I had to create an ocean file for my domain, which is inland as well; however, what I did apparently wasn’t correct. Would you mind sharing the script you used to create your OCEAN file?

This is the script I used for that:

Thank you!

Hi Daniel(?):

Are you also planning to create an ocean file with zeros for the variables for an inland study area? Below is my script.

#!/bin/csh -f

#m3fake script to create a dummy ocean file
set APPL = 160713 #> Application Name (e.g. Gridname)
setenv GRIDDESC $CMAQ_HOME/data/met/mcip/$APPL/GRIDDESC
setenv GRID_NAME SE52BENCH
setenv OUTFILE $CMAQ_HOME/data/ocean/ocean_file.dummy.$GRID_NAME.ncf
/scratch/brown/fang63/CMAQ-5.2.1/lib/x86_64/intel/ioapi/Linux2_x86_64ifort/m3fake << EOF

2
$GRID_NAME
1
0
2
OPEN
UNKNOWN
OPEN
3
5

SURF
UNKNOWN
SURF
3
5

OUTFILE

EOF
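
One note on the script: the bare OUTFILE line near the end of the here-document is the logical file name, which m3fake resolves through the setenv OUTFILE above. Afterwards, you can confirm the dummy file picked up the right grid header (a quick check, assuming ncdump is on your PATH):

ncdump -h $OUTFILE | grep -E 'XCENT|YCENT|XORIG|YORIG|XCELL|YCELL'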

Thanks for sharing your script @fang63! Yes, I do also need to create an ocean file with zeros for the study area.

Quick question, what version of IOAPI are you using?

I am using version 3.2

Greetings,
We have corrected some errors in the sample run script we provide for creating a dummy ocean file. The updated tutorial can be found under DOCS/Tutorials on the CMAQ GitHub repository for CMAQv5.3b2:


Also, CMAQv5.3 now allows users to run CMAQ without an ocean input file. This is also explained in the tutorial. Hope this helps!

I am seeing an error related to MCIP output files when running CMAQv5.2.1:

[screenshot of the CCTM log: time step flag for MET_CRO_3D could not be read]

It is basically saying that the time step flag for MET_CRO_3D cannot be read?

Yes, but “time step flag [for 2014244:0000] for MET_CRO_3D cannot be read” in turn means “MET_CRO_3D does not contain DENS_J for 2014244:0000”.

Hi skunwar,
Have you figured out this error? I am having exactly the same one.

If I remember correctly, this problem went away when I moved the MCIP output files left over from a previous MCIP run out of the MCIP output folder!

On reconsideration: how many time steps does the MET_CRO_3D file have?

Note that you need 25 hourly input time steps (from 2014243:000000 through 2014244:000000) to interpolate met data for a 24-hour run duration (a “fencepost problem”).
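
If you are not sure, the time metadata is quick to check (a sketch; the file name is a placeholder for your own METCRO3D file, and it assumes the netCDF ncdump utility is available):

ncdump -h METCRO3D_140831 | grep -E 'SDATE|STIME|TSTEP'

The :SDATE and :STIME attributes give the starting date and time, :TSTEP gives the step (10000 means one hour), and the TSTEP dimension line ("TSTEP = UNLIMITED ; // (25 currently)") gives the number of records in the file.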

Hi @fang63 @ernesto.pinoc @cjcoats,

I’m having the same issue after running the CCTM script. I’m not sure how to run the m3edhdr tool, and I didn’t see XCENT3D = -97.0 in the GRID_CRO_2D file. Please see the error below:

 XCENT_B:    -78.650001525879  XCENT3D (file):    -97.000000000000
 YCENT_B:     40.000000000000  YCENT3D (file):     40.000000000000
 XCELL_B:  12000.000000000000  XCELL3D (file):  12000.000000000000
 YCELL_B:  12000.000000000000  YCELL3D (file):  12000.000000000000


 *** ERROR ABORT in subroutine SubhFile_Cell on PE 110
 File header inconsistent with GRID_CRO_2D

Could you please let me know how to solve this issue?

Thanks
Rasel

Hello @munshimdrasel ,

In this other thread where @tlspero helped you troubleshoot an MCIP issue, you mentioned that you had previously run CMAQ successfully for the entire U.S. but now you have rerun WRF for a smaller domain and reprocessed those new WRF fields through MCIP.

Are you trying to use these new MCIP files for the CMAQ run you are asking about here? If yes, did you also reprocess all other gridded input files (emissions, etc.) for this new domain? In your run script, did you update the grid name and GRIDDESC file to use the one generated by MCIP for your new domain rather than the one you had used in your CMAQ run for the entire U.S.?

@munshimdrasel

The error you received is similar, but it is not the same as what the original poster needed to solve. In your case, the differences in XCENT_B and XCENT3D suggest to me that you have two different flavors of Lambert conformal projections. The value in XCENT is the standard longitude for the projection (assuming you are using Lambert conformal), and you DO NOT want to try to fix this with M3EDHDR. (By contrast, the YCENT value is reasonably arbitrary as a reference intersection along XCENT, so that could be altered.)

If your MCIP output is in the XCENT_B projection, then you need to process whatever file is being compared with XCENT3D to use the same projection. And when I say “projection” (again, assuming Lambert conformal), you need to ensure that the standard longitude and the true latitude(s) are the same across all of your input datasets. These values can be viewed in the global attributes of each file using “ncdump -h <filename>”.
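
For example, a quick side-by-side check from csh might look like this (the file names are placeholders for your own inputs):

foreach f ( GRIDCRO2D_aqf5x METCRO3D_aqf5x emis_mole_all.ncf )
  echo $f
  ncdump -h $f | grep -E 'P_ALP|P_BET|P_GAM|XCENT|YCENT'
end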

Hope this helps.
–Tanya

@hogrefe.christian

Yes, I’m trying to use the new MCIP files for the CMAQ run. I had used the same wrfout file for the entire U.S. domain and it worked fine. After that, I made my domain smaller, changed the NCOLS and NROWS values, and reran WRF and MCIP. I also ran ICON and BCON based on the new smaller domain and its NCOLS and NROWS. I checked GRIDDESC and didn’t find any mismatch in my domain’s NCOLS and NROWS values. The only thing I didn’t do is change the emission files; I’m using the same emission files that I ran for the entire U.S. domain. As far as I know, CCTM will extract emission data for the subdomain, which is why I didn’t change the emission files.
Yes, in my run script I’ve updated the grid name and the GRIDDESC file.

Here’s my error:
 XCENT_B:    -88.000000000000  XCENT3D (file):    -97.000000000000
 YCENT_B:     35.000000000000  YCENT3D (file):     40.000000000000
 XCELL_B:  12000.000000000000  XCELL3D (file):  12000.000000000000
 YCELL_B:  12000.000000000000  YCELL3D (file):  12000.000000000000

 *** ERROR ABORT in subroutine SubhFile_Cell on PE 116
 File header inconsistent with GRID_CRO_2D

Here’s my GRIDDESC:
' '
'LamCon_40N_73W'
2 33.000 37.000 -88.000 -88.000 35.000
' '
'AQF5X'
'LamCon_40N_73W' -1182000.000 -582000.000 12000.000 12000.000 197 97 1
' '

I’ve also checked the GRID_CRO_2D file, and the NCOLS and NROWS values are the same:

:NCOLS = 197 ;
:NROWS = 97 ;
:NLAYS = 1 ;
:NVARS = 28 ;
:GDTYP = 2 ;
:P_ALP = 33. ;
:P_BET = 37. ;
:P_GAM = -88. ;
:XCENT = -88. ;
:YCENT = 35. ;
:XORIG = -1182000. ;
:YORIG = -582000. ;
:XCELL = 12000. ;
:YCELL = 12000. ;

@tlspero

I’ve checked my GRIDDESC and GRIDCRO2D files, MCIP scripts, and ICON and BCON files. It seems to me I’ve used the same latitude and longitude everywhere. I don’t know where to look for the XCENT3D and YCENT3D values or how these numbers came up. One thing I remember: these are the values from when I ran the whole U.S. domain, and now I’m using a smaller domain. I’ve changed the ICON, BCON, and MCIP files based on the smaller domain. I just didn’t change the emission files, which are for the whole U.S. domain. As far as I know, CCTM can extract emission data for the subdomain.

Here’s my GRIDCRO2D output

:XCENT = -88. ;
:YCENT = 35. ;

Here’s my GRIDDESC output:

' '
'LamCon_40N_73W'
2 33.000 37.000 -88.000 -88.000 35.000
' '
'AQF5X'
'LamCon_40N_73W' -1182000.000 -582000.000 12000.000 12000.000 197 97 1

Also, please see the attached MCIP and CCTM run scripts and the CCTM log file:

run_cctm_rasel.csh_sbatch.txt (27.3 KB) run_mcip_43.csh.txt (20.8 KB)
CTM_LOG_001.v52_gcc_AQF5X_20200601.txt (44.0 KB)

You need to recreate your emission files for your new domain. In general, CMAQ does not window gridded input files from a larger domain to a smaller domain. And in your case, it seems that the new domain uses a different projection (i.e., a different XCENT) than the domain for which you had previously prepared emissions, so windowing isn’t even possible.

If all you want to do is window from your 12 km domain covering the entire U.S. to a 12 km subdomain covering a smaller region, you do not have to rerun WRF, unless changing the projection to the one you used in your new WRF run is important to your application. If the projection stays the same and your new 12 km domain is contained within your original 12 km domain, you could use the m3tools utilities m3wndw and bcwndw to create meteorological and gridded emission files for the subdomain from your existing files. Alternatively, to create the meteorological fields for the subdomain, you could just rerun MCIP on your existing wrfout files with the appropriate settings for X0, Y0, NCOLS, and NROWS to define the new subdomain.
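
As a sketch of the m3wndw route (the logical and file names below are placeholders, and m3wndw will prompt interactively for the window, which you can specify by the subdomain grid name from your GRIDDESC):

#!/bin/csh -f
# Window a full-domain gridded file to a subdomain defined in GRIDDESC
setenv GRIDDESC /path/to/GRIDDESC        # must contain the subdomain grid
setenv INFILE   emis_mole_all_CONUS.ncf  # existing 12 km full-domain file
setenv OUTFILE  emis_mole_all_window.ncf # windowed output for the subdomain
m3wndw INFILE OUTFILE

For the MCIP route, setting BTRIM = -1 in the MCIP run script and supplying X0, Y0, NCOLS, and NROWS is the mechanism that defines the window explicitly.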

I need to qualify my answer above by saying that my statement “In general, CMAQ does not window gridded input files from a larger domain to a smaller domain” applies to CMAQv5.3+. While windowing capabilities exist for some files in some situations, not all situations are supported and it is therefore safer to generate windowed gridded files offline with a tool like m3wndw.

For older CMAQ versions such as the one you are using, windowing of the (single) gridded emission file by CMAQ probably is feasible, though I have no hands-on experience with this.

In any case, as noted above, windowing is only possible if the map projection is the same between the larger and smaller domain, if both domains have the same grid resolution, and if the smaller domain is a subset of the larger domain.