LUFRAC_CRO error in CMAQv5.3


I am trying to run CMAQv5.3 (after running MCIPv5.0) and am getting the following error:

" >>--->> WARNING in subroutine FLCHECK on PE 010
Inconsistent header data on input files
M3WARN: DTBUF 8:00:00 July 20, 2012 (2012202:080000)"

which is caused by:

" Checking header data for file: LUFRAC_CRO
Inconsistent values for NLAYS: 21 versus 49"

I know the 21 "layers" should be the 21 land-use types, but they conflict with my other meteorology files, which have 49 layers. I have IOAPI_CHECK_HEADERS turned off. Do you know how I can fix or avoid this error?
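For reference, the check that trips here compares NLAYS across input-file headers, even though a LUFRAC file's "layers" are land-use categories rather than vertical levels. A minimal Python sketch of that kind of consistency check with LUFRAC files exempted (illustrative only; the function and file names are assumptions, not CMAQ's actual code):

```python
# Illustrative sketch, not actual CMAQ/FLCHECK code: compare NLAYS across
# file headers, exempting land-use-fraction files whose "layers" are
# really land-use categories rather than vertical levels.

def check_nlays(headers, reference_nlays):
    """Return (name, nlays) pairs that conflict with the reference
    vertical-layer count, skipping LUFRAC-type files."""
    conflicts = []
    for name, nlays in headers.items():
        if "LUFRAC" in name:  # mirrors Fortran INDEX(FNAME,'LUFRAC') .NE. 0
            continue
        if nlays != reference_nlays:
            conflicts.append((name, nlays))
    return conflicts

headers = {
    "METCRO3D": 49,
    "METBDY3D": 49,
    "LUFRAC_CRO": 21,  # 21 land-use categories, not vertical layers
}
print(check_nlays(headers, 49))  # → [] : LUFRAC_CRO is exempted
```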

Thank you,

IMHO, this is a bug in grdcheck.F: the tests at lines 137-140 should also exclude LUFRAC files.

Hi @cjcoats,

Thanks for your reply. I edited grdcheck.F to skip LUFRAC_CRO on a couple of steps:

Line 259: added " & INDEX( FNAME, 'LUFRAC' ) .EQ. 0 .AND."
Line 138: added " & INDEX( FNAME, 'LUFRAC' ) .NE. 0 .OR."
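For anyone unfamiliar with Fortran's INDEX intrinsic: INDEX(FNAME, 'LUFRAC') returns the position of the substring in FNAME, or 0 if it is absent, so .NE. 0 means "the filename contains LUFRAC" and .EQ. 0 means it does not. A Python rendering of the two guards (purely illustrative):

```python
def contains_lufrac(fname):
    # Fortran: INDEX(FNAME, 'LUFRAC') .NE. 0  ->  substring is present
    return "LUFRAC" in fname

# Line-138-style guard: short-circuit (skip) the check for LUFRAC files
print(contains_lufrac("LUFRAC_CRO"))    # True  -> check skipped
# Line-259-style guard: apply the check only to non-LUFRAC files
print(not contains_lufrac("METCRO3D"))  # True  -> check applied
```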

In ${BLD} I did "make clean" and then "make", and it no longer fails on the LUFRAC_CRO header check.

But now it fails right after the header check, with no explanation. I've successfully run both the Bench_2016_12SE1 case and the 2016_CONUS simulations, so I know my libraries are built correctly. I rebuilt CCTM with Debug_CCTM turned on, but that doesn't provide much more information.

I've attached one of the logs, as well as the MPI output (slurm). This seems to imply there is an issue with the WVEL calculation, but I have CTM_WVEL set to N. Can anyone tell what's wrong?

Thank you,
CTM_LOG_000.v53_gcc_LA1km_2012emis_WRFv38_20120720.txt (72.3 KB)
slurm-8409684.out.txt (41.4 KB)

Edit: I misunderstood the post here. When I set CTM_WVEL to Y, which I thought would solve the problem, CMAQ failed immediately and did not write any of the CTM_* log files. Please see the attached slurm output file. Any suggestions?

Thank you,
slurm-8409686.out.txt (16.3 KB)

Sorry to keep spamming this thread. Once I quit all other running programs, CMAQ started running.

Hi Elyse,
Does that mean that you have resolved the issue?
Thanks, Liz

Hi Liz,
Yes, the issue was resolved.
Thank you!