Hi everyone,
I am comparing CMAQ model outputs with ARM SGP measurements of O₃ (ozone) concentrations from April 22 to May 22, 2016. My analysis shows CMAQ values are consistently much higher than ARM, often in the 10⁴ ppbv range, while ARM measures around 10¹ to 10² ppbv. Both datasets are hourly averaged and time-synchronized.
Questions:
- Could this be due to vertical layer differences in CMAQ vs. ARM measurement height?
- Is this a common issue with boundary conditions or chemical mechanisms in CMAQ?
- Should I consider vertical averaging instead of using just the first layer?
I appreciate any insights or suggestions!
Hello @azurebullet,
This looks like a unit issue to me. Could it be that in your post-processing of CMAQ results, you applied the ppmV-to-ppbV conversion factor of 1E+03 twice?
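As a quick back-of-the-envelope illustration (the 0.03 ppmV starting value is hypothetical, but typical for surface O₃):

```python
# Hypothetical surface O3 mixing ratio as CMAQ writes it (ppmV)
o3_ppm = 0.03

o3_ppb = o3_ppm * 1.0e3       # correct single conversion: 30.0 ppbV
o3_double = o3_ppb * 1.0e3    # accidental second conversion: 30000.0

print(o3_ppb, o3_double)      # 30.0 30000.0 -- the 10^4 range you describe
```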
I have checked the CCTM_CONC.nc files; the unit is ppmv, and I convert them to ppbv. The original values are all at the same level.
Is CCTM_CONC.nc the original model output file you are generating in your simulations? Which model version are you using? Have you looked at your initial conditions (either from ICON or using a CGRID file)? What are the concentrations in the first few hours of your simulation?
I have checked the ICON file; the unit is also ppmv, but I may need to check CGRID as well. This applies to all time steps in my simulation. I will let you know the unit in the CGRID file.
No need to check the CGRID file. Instead, look at a spatial plot of the O3 field in your ICON file (using VERDI, ncview, or any visualization tool you’re familiar with). Do you see O3 ppm values in the 0.01 - 0.1 range, or the 10 - 100 range?
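If you prefer a scriptable check over a GUI, a minimal xarray sketch could look like this (the file name is a placeholder for your own ICON file):

```python
import xarray as xr

# Placeholder path -- point this at your own ICON file
ds = xr.open_dataset("ICON_CMAQ_20160422.nc")

# First time step, lowest model layer (IOAPI dimensions: TSTEP, LAY, ROW, COL)
o3 = ds["O3"].isel(TSTEP=0, LAY=0)
print(float(o3.min()), float(o3.max()))  # values are in ppmV
```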
I have checked the values with VERDI. They are in the 0.01 - 0.1 range.
Thank you. This reconfirms my hypothesis that the orange points in your plot were affected by a unit conversion issue and are inadvertently in ppt rather than ppb.
Specifically, if the orange points you’re showing in the plot were derived from the CCTM_CONC files, if April 22, 2016 was the first day of the simulation in which you initialized the model with the ICON file you visualized, and if the ICON values in that file were indeed 0.01 - 0.1 ppm, then the very first orange dot should be between 10 and 100 ppb (0.01 ppm x 1E+03 to 0.1 ppm x 1E+03) rather than ~30,000 ppb. Such a value between 10 and 100 ppb would put it in line with the observations.
Could you please outline in detail the steps you took to go from the CCTM_CONC output files to the orange points in the plot? Which tools and scripts did you use, and how were unit conversions handled in these steps?
I use the monetio package to retrieve the data and save it as a CSV file. The plot uses a log scale. The raw ozone values in my CSV file are in the 1 to 100 range.
O.k., thanks. I am not familiar with NOAA’s monetio python package, but when looking at its function used to read CMAQ files, the default behavior in the call to open_dataset appears to be to convert CMAQ values from ppm to ppb:
def open_dataset(fname, earth_radius=6370000, convert_to_ppb=True, drop_duplicates=False, **kwargs):
    """Method to open CMAQ IOAPI netcdf files.

    Parameters
    ----------
    fname : string or list
        fname is the path to the file or files. It will accept hot keys in
        strings as well.
    earth_radius : float
        The earth radius used for the map projection
    convert_to_ppb : boolean
        If true the units of the gas species will be converted to ppbV
So, very likely the model values being written out by monetio to the csv file you’re then visualizing are already in ppb, and you should not multiply them by 1000 before plotting them.
Please double check your monetio workflow and the monetio documentation to make sure you can fully trace the reading of observed and modeled data, their matching in space and time, any unit conversions, and the generation of the output file.
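For example, a minimal sketch based on the signature above (placeholder file name, and assuming the usual monetio import path):

```python
from monetio import cmaq

# convert_to_ppb defaults to True, so gas-phase species come back in ppbV
ds = cmaq.open_dataset("CCTM_CONC_20160422.nc")  # placeholder path

o3 = ds["O3"]
print(o3.attrs.get("units"))  # should now report ppbV, not ppmV
# Do NOT multiply o3 by 1000 again before writing the CSV or plotting.
```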
Ok, thanks for your clarification. I never thought of this step. Thank you so much!
Just for completeness, it seems the actual conversion of CMAQ values from ppm to ppb performed by monetio (if convert_to_ppb is kept at its default value of True) happens in lines 80 - 85 of cmaq.py:
if convert_to_ppb:
    for i in dset.variables:
        if "units" in dset[i].attrs:
            if "ppmV" in dset[i].attrs["units"]:
                dset[i] *= 1000.0
                dset[i].attrs["units"] = "ppbV"
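In other words, if your own post-processing then applied another factor of 1000 on top of this, that would fully explain the 10⁴ values. A hypothetical sketch of the buggy versus fixed step (file and column names are made up):

```python
import pandas as pd

df = pd.read_csv("cmaq_o3_sgp.csv")  # hypothetical CSV written via monetio

# Buggy: monetio output is already ppbV, so this yields ppt-scale numbers
# df["o3_plot"] = df["O3"] * 1000.0

# Fixed: use the values as-is
df["o3_plot"] = df["O3"]
```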
So does that mean the values are actually ppbv, and the ppmv unit in CCTM_CONC is not right?
No, the ppmV unit attributes shown in the CCTM_CONC file created by CMAQ are absolutely correct. I am sure that if you open CCTM_CONC in VERDI, you will see O3 values somewhere between 0.01 ppmV and 0.1 ppmV (10 - 100 ppbV) for Layer 1.
The issue is that monetio applies this unit conversion from ppmV to ppbV, which you were not aware of. When using tools like monetio (which is NOT a tool developed by the CMAQ team or actively supported by the CMAS Center) for model evaluation, you need to make sure you understand how they process the data.
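If you'd like to double-check the Layer 1 values without VERDI, a quick netCDF4 sketch (placeholder file name) would be:

```python
import netCDF4

nc = netCDF4.Dataset("CCTM_CONC_20160422.nc")  # placeholder path
o3 = nc.variables["O3"]

print(o3.units)  # ppmV, as written by CMAQ
print(o3[0, 0, :, :].min(), o3[0, 0, :, :].max())  # first hour, Layer 1
nc.close()
```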
You are right. I am sorry I misunderstood that. I will correct my script and pay closer attention when using other tools from now on.