Error in reading emission inputs for representative days

Hello,

I am using CMAQv5.5+ built with gcc 13.3.0, and I am trying to run the year 2018 over CONUS (spin-up starting from 2017-12-22). The chemical mechanism and deposition option are cb6r5_ae7_aq and m3dry. I got the relevant input files from the AWS data storage using the following command:

aws s3 cp --no-sign-request --region=us-east-1 --recursive s3://cmas-cmaq-modeling-platform-2018/2018_12US1

When I first ran the model, I got error messages about missing emission files for 2017-12-22, because the directories /emis/ptnonipm, /emis/othpt, and /emis/pt_oilgas do not contain inputs for every day. But based on a previous forum thread (netCDF error in reading file) and the User Guide for CMAQv5.5+ (section 6.9.1), I believe this can be bypassed: CMAQ can use the emission files available for representative days and gap-fill the remaining days according to the files in /emis/emis_dates.
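For reference, this is a quick way to see which daily files are actually present for a given sector (the local path is simply where I downloaded the bucket, and the file-name pattern follows the ptnonipm naming used in this platform):

  ls ${CMAQ_DATA}/2018_12US1/emis/ptnonipm/inln_mole_ptnonipm_*_12US1_cmaq_cb6_WR413_MYR_2017.nc4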

So I set all the environment variables for using representative dates to T in the run script:

  # Allow CMAQ to Use Point Source files with dates that do not
  # match the internal model date
  # To change default behaviour please see Users Guide for EMIS_SYM_DATE

  setenv EMIS_SYM_DATE T

  setenv STK_EM_SYM_DATE_001 T
  setenv STK_EM_SYM_DATE_002 T
  setenv STK_EM_SYM_DATE_003 T
  setenv STK_EM_SYM_DATE_004 T
  setenv STK_EM_SYM_DATE_005 T
  setenv STK_EM_SYM_DATE_006 T
  setenv STK_EM_SYM_DATE_007 T
  setenv STK_EM_SYM_DATE_008 T

But the ERROR ABORT for failing to read the emission files persisted:

*** ERROR ABORT in subroutine retrieve_stack_d on PE 003
     Could not extract STK_EMIS_003     file
 PM3EXIT:  DTBUF 0:00:00   Dec. 22, 2017
     Date and time 0:00:00   Dec. 22, 2017  (2017356:000000)

I was wondering whether the model can read the emis_dates files at all, because their directory (${CMAQ_DATA}/2018_12US1/emis/emis_dates) is not specified anywhere in the run script. For diagnosis, I am attaching my run script, the log file, and one of the CTM_LOG files.

CTM_LOG_003.v55_gcc_2018_12US1_cb6r5_ae7_aq_m3dry_20171222.txt (32.3 KB)
run_cctm.CONUS.csh (39.2 KB)
run_cctm.log.txt (20.4 KB)

I’d appreciate suggestions on solving this issue and getting the model running.

Thanks,
Xinyue

From reading the CTM_LOG file that you uploaded, it seems that the STK_EMIS_003 file is simply not there:

    STK_EMIS_003    :/home/xinyueh/CMAQv5.5+/data/2018_12US1/emis/othpt/inln_mole_othpt_20171222_12US1_cmaq_cb6_WR413_MYR_2017.nc4
     
     >>--->> WARNING in subroutine OPEN3
     File not available.
     
     Could not open STK_EMIS_003 file
     
     >>--->> WARNING in subroutine DESC3
     Invalid file name argument "STK_EMIS_003    "

The other emission files were opened successfully without any warning / errors.

Could you try the following command?

ls /home/xinyueh/CMAQv5.5+/data/2018_12US1/emis/othpt/inln_mole_othpt_20171222_12US1_cmaq_cb6_WR413_MYR_2017.nc4

You are right, the STK_EMIS_003 file is missing from the emission input directories. Actually, the STK_EMIS_001 and STK_EMIS_008 files are also missing; I only bypassed those errors by creating symbolic links:

  ln -s inln_mole_ptnonipm_20171210_12US1_cmaq_cb6_WR413_MYR_2017.nc4 inln_mole_ptnonipm_20171222_12US1_cmaq_cb6_WR413_MYR_2017.nc4
  ln -s inln_mole_pt_oilgas_20171210_12US1_cmaq_cb6_WR413_MYR_2017.nc4 inln_mole_pt_oilgas_20171222_12US1_cmaq_cb6_WR413_MYR_2017.nc4

But I am reluctant to create symbolic links for all the missing files because 1) a large number of them are not available from the AWS bucket (e.g., there are only 64 daily ptnonipm files for 2018), and 2) according to the User Guide, CMAQ has supported reading point-stream emission inputs on a representative-day basis since version 5.4 (see CMAQ/DOCS/Users_Guide/CMAQ_UG_ch06_model_configuration_options.md at eb4ab13a807a841d57179bc8cc8e934e72394a3c · USEPA/CMAQ · GitHub). Since I also have the emis_dates files (visible in the AWS S3 Explorer for the bucket), I wonder whether I can modify my run script to point the model at these files, so that emission inputs for only the representative days would be sufficient.
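In case it helps, this is a quick way to check whether my current run script references the emis_dates files at all (using the attached script name):

  grep -n "emis_dates" run_cctm.CONUS.csh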

Did you try using the dates specified in this file: smk_merge_dates_201712_for2018spinup.txt

I am not entirely clear on what each column in this file represents, but it does seem to list the representative days. I would like CMAQ to grab the emission inputs based on the dates in this file. Do you have suggestions on how I can achieve that? Thanks.

There are several example run scripts included in the bucket; for example, a run script for CMAQ v5.5 using the cracmm2 mechanism is here:

https://cmas-cmaq-modeling-platform-2018.s3.amazonaws.com/2018_12US1/CMAQ_v55_cracmm2_scripts/run_cctm_2018_12US1_v55_Base_STAGE_EM_CRACMM2_two_week_16x8.csh

The corresponding S3 download script in that same directory will download two weeks of data, enough to run the following dates as specified in the run script:

#> Set Start and End Days for looping
 setenv NEW_START TRUE             #> Set to FALSE for model restart
 set START_DATE = "2017-12-22"     #> beginning date (Dec 22, 2017)
 set END_DATE   = "2018-01-02"     #> ending date    (Jan 2, 2018)
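Inside the daily run loop, that same script first derives the per-day date strings that the representative-day lookup shown next relies on. A minimal sketch of that step, following the pattern used in the standard CMAQ run scripts (variable names such as TODAYG may differ slightly in the script you use):

   #> Derive date strings for the current simulation day
   #> (TODAYG holds the Gregorian date of the loop day, e.g. 2017-12-22)
   set YYYYMMDD = `date -ud "${TODAYG}" +%Y%m%d`   #> e.g. 20171222
   set YYYYMM   = `date -ud "${TODAYG}" +%Y%m`     #> e.g. 201712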

The run script contains the following section of code, which automatically reads the smk_merge_dates_${YYYYMM}.txt files so that the emissions data available for the representative dates gets used.

   #> Determine Representative Emission Days
  set EMDATES = $INPDIR/emis/emis_dates/smk_merge_dates_${YYYYMM}.txt
  set intable = `grep "^${YYYYMMDD}" $EMDATES`
  set Date     = `echo $intable[1] | cut -d, -f1`
  set aveday_N = `echo $intable[2] | cut -d, -f1`
  set aveday_Y = `echo $intable[3] | cut -d, -f1`
  set mwdss_N  = `echo $intable[4] | cut -d, -f1`
  set mwdss_Y  = `echo $intable[5] | cut -d, -f1`
  set week_N   = `echo $intable[6] | cut -d, -f1`
  set week_Y   = `echo $intable[7] | cut -d, -f1`
  set all      = `echo $intable[8] | cut -d, -f1`
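Farther down in the script, those variables are substituted into the point-source stack file paths in place of the model date, so only the representative-day files need to exist on disk. A minimal sketch of that pattern, using the othpt file naming from your log (which representative-date column a given sector actually uses is defined in the run script, so mwdss_Y here is only illustrative):

   #> Illustrative only: build the othpt in-line point file name from a
   #> representative date (mwdss_Y) rather than the model date YYYYMMDD
   setenv STK_EMIS_003 ${INPDIR}/emis/othpt/inln_mole_othpt_${mwdss_Y}_12US1_cmaq_cb6_WR413_MYR_2017.nc4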

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.