This is the error message in your CTM_LOG_000 file:
"CTM_CONC_1" opened as NEW(READ-WRITE )
File name "/data/cmaq/data/output_CCTM_v55_gcc_Bench_2018_12NE3_cracmm2_stage/CCTM_CONC_v55_gcc_Bench_2018_12NE3_cracmm2_stage_20180701.nc"
File type GRDDED3
Execution ID "CMAQ_CCTMv55_sha=a1c596eaf4_root_20250625_170645_110883345"
Grid name "2018_12NE3"
Dimensions: 105 rows, 100 cols, 1 lays, 13 vbles
NetCDF ID: 2490368 opened as VOLATILE READWRITE
Starting date and time 2018182:000000 (0:00:00 July 1, 2018)
Timestep 010000 (1:00:00 hh:mm:ss)
Maximum current record number 0
Gas Chem species saved to CONC file:
*** ERROR ABORT in subroutine OPCONC on PE 000
Could not write O3 to CTM_CONC_1
PM3EXIT: DTBUF 0:00:00 July 1, 2018
Date and time 0:00:00 July 1, 2018 (2018182:000000)
In your bldit_cctm.csh, you have uncommented both the ParOpt and DistrEnv variables. Unless you are running on a distributed environment (multiple machines), you should comment out the DistrEnv variable.
set ParOpt #> uncomment to build a multiple processor (MPI) executable;
#> comment out for a single processor (serial) executable
#set DistrEnv #> uncomment to distribute environmental variables to multiple machines
#> comment out for a single processor (serial) executable (MPI only)
Please comment out the DistrEnv variable as shown above, re-run bldit_cctm.csh, and then try re-running CMAQ.
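For reference, the rebuild typically looks like this (a minimal sketch, assuming the gcc compiler keyword and that CMAQ_HOME is set by your config_cmaq.csh):
cd $CMAQ_HOME/CCTM/scripts
./bldit_cctm.csh gcc |& tee bldit_cctm.log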
If that does not solve the problem, then please verify that you built I/O API with the same MPI library that you are using to build CMAQ.
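One way to check is to inspect the dynamic libraries that your CCTM executable links against (the BLD path below is an example; substitute your actual build directory):
ldd $CMAQ_HOME/CCTM/scripts/BLD_CCTM_v55_gcc/CCTM_v55.exe | grep -i mpi
The MPI library that appears here should come from the same installation that was loaded when you built I/O API.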
Please share your environment.
If you are using environment modules, you can see which modules are loaded with the module list command.
Example:
module list
Output:
Currently Loaded Modules:
1) ioapi-3.2/gcc-13.3-classic 2) netcdf-4.5.3-classic/gcc-13.3-classic 3) gcc/13.3.0 4) openmpi_5.0.6/gcc_13.3.0
Alternatively, you can use `echo $LD_LIBRARY_PATH` to see which libraries are available when you run the bldit_cctm.csh script.
Example:
echo $LD_LIBRARY_PATH
Output:
/nas/sycamore/apps/openmpi/5.0.6_gcc-13.3.0/lib:/nas/sycamore/apps/gcc/13.3.0/lib:/nas/sycamore/apps/gcc/13.3.0/lib64:/work/users/l/i/lizadams/test_classic_libraries_sycamore/LIBRARIES_gcc_disable-dap/lib:/work/users/l/i/lizadams/test_classic_libraries_sycamore/LIBRARIES_gcc_disable-dap/ioapi-3.2/ioapi/fixed_src
Reviewing your run_cctm_Bench_2018_12NE3_cracmm2_stage_20180701.txt file, it appears that you have commented out the mpirun command, so you may be running in serial mode only, or this may be the serial version of your run script.
#> Executable call for single PE, uncomment to invoke
( /usr/bin/time -p $BLD/$EXEC --allow-run-as-root) |& tee buff_${EXECUTION_ID}.txt
#> Executable call for multi PE, configure for your system
# set MPI = /usr/local/intel/impi/3.2.2.006/bin64
# set MPIRUN = $MPI/mpirun
#( /usr/bin/time -p mpirun -np $NPROCS $BLD/$EXEC ) |& tee buff_${EXECUTION_ID}.txt
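For comparison, if you intended a parallel run, the multi-PE section would be uncommented and pointed at your MPI installation, roughly like this (a sketch only; the mpirun path is an assumption based on the OpenMPI install in your LD_LIBRARY_PATH above, and NPROCS is assumed to be set from NPCOL_NPROW earlier in the run script):
#> Executable call for multi PE, configure for your system
set MPI = /nas/sycamore/apps/openmpi/5.0.6_gcc-13.3.0/bin   #> assumed path, based on your LD_LIBRARY_PATH
set MPIRUN = $MPI/mpirun
( /usr/bin/time -p $MPIRUN -np $NPROCS $BLD/$EXEC ) |& tee buff_${EXECUTION_ID}.txt
The single-PE call above would then be commented out.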
Please also share the settings in your config_cmaq.csh.
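A quick way to capture the relevant settings is (a minimal sketch; the exact setenv variable names can differ slightly between CMAQ versions):
grep -E 'setenv (IOAPI|NETCDF|MPI)' config_cmaq.csh
That should show the I/O API, netCDF, and MPI include and library paths that bldit_cctm.csh used.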