ERROR ABORT in subroutine HRBEIS

Hi, I am running biogenic emissions processing with SMOKE v3.6. I am trying domains 2 and 3; domain 2 succeeds, but domain 3 fails.
I checked domain 3’s output files, and I am getting an error from tmpbeis3.

File name “/home/jeeho/mcip_bus/d03_2016054/GRIDCRO2D_d03_2016054”
File type GRDDED3
Execution ID “mcip”
Grid name “GRIDOUT_03_CROSS”
Dimensions: 76 rows, 67 cols, 1 lays, 31 vbles
NetCDF ID: 9 opened as READONLY
Time-independent data.
Value for INITIAL_RUN: Y returning TRUE
Processing Tuesday Feb. 23, 2016
at time 0:00:00

 *** ERROR ABORT in subroutine HRBEIS
 LAI=       Inf out of range at (C,R)=  4, 35
 Date and time  0:00:00   Feb. 23, 2016   (2016054:000000)

MCIP and the Spatial Allocator also ran successfully, and domain 2 completed without problems.
I do not know what the problem is or what to do next, and I am also wondering what HRBEIS does.

It would be a great help if anyone can help me.


Can you provide the two domain descriptions? Both the one that works and the one that doesn’t?

Then we can probably help further.

Also, SMOKE v3.6 is a pretty old version. Is there a reason you are using that one?


The error message is saying that LAI (leaf area index) is infinite at grid cell (4, 35). LAI is an input to BEIS. I believe that variable is present in your WRF output and simply passed through by MCIP. Verify that LAI is infinite in your MCIP file (GRIDCRO2D). Then, check your WRF output, and verify that it is infinite there.
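As an aside (my addition, not part of the original reply): once you have the LAI field in memory, a short scan like the sketch below will locate the offending cells. The helper name and the hard-coded toy grid are illustrative; in practice you would read the LAI variable out of GRIDCRO2D with a netCDF reader and pass it in.

```python
import math

def find_bad_lai(lai, max_lai=10.0):
    """Return (col, row) pairs (1-based, in the (C,R) order HRBEIS
    reports) where LAI is non-finite or outside a plausible range."""
    bad = []
    for r, row in enumerate(lai, start=1):
        for c, val in enumerate(row, start=1):
            if not math.isfinite(val) or val < 0.0 or val > max_lai:
                bad.append((c, r))
    return bad

# Toy 2-row grid with an Inf at column 4 of row 2 -- mirrors the
# "LAI = Inf out of range at (C,R) = 4, 35" style of message.
grid = [
    [0.5, 1.2, 3.0, 2.1],
    [0.7, 0.9, 1.1, float("inf")],
]
print(find_bad_lai(grid))  # -> [(4, 2)]
```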

If it is, post your WRF version and options you are using, and someone here may be able to help. Alternatively, you could try the WRF user forum.


Thank you very much for your reply.
We’re using the old version because we don’t have time to upgrade.
How can I send you the data? By email?

Thank you for your advice.
I’ll follow your advice.

I think you should be able to paste the relevant rows from the GRIDDESC file into the forum here.

I see!
For domain 02 (the successful case):
set X0 = 3
set Y0 = 3
set NCOLS = 67
set NROWS = 79

For domain 03 (the failed case):
set X0 = 3
set Y0 = 3
set NCOLS = 67
set NROWS = 76

For reference, here is the WRF namelist.wps:
parent_id = 1, 1, 2,
parent_grid_ratio = 1, 3, 3,
i_parent_start = 1, 56, 37,
j_parent_start = 1, 44, 18,
e_we = 124, 73, 73,
e_sn = 131, 85, 82,
geog_data_res = '10m', '2m', '30s', '3s',
dx = 27000,
dy = 27000,
map_proj = 'lambert',
ref_lat = 38.0,
ref_lon = 126.0,
truelat1 = 30.0,
truelat2 = 60.0,
stand_lon = 126.0,
geog_data_path = '/home/data/geog/'
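As a cross-check (my addition, not part of the original post), and assuming MCIP's X0/Y0 mark the 1-based lower-left corner of the output window on the WRF cross-point grid (which has e_we − 1 columns and e_sn − 1 rows), both windows fit inside their parent WRF grids, so the windowing itself looks geometrically consistent:

```python
def window_fits(x0, y0, ncols, nrows, e_we, e_sn):
    """Check that a 1-based MCIP window [x0 .. x0+ncols-1] by
    [y0 .. y0+nrows-1] lies inside the WRF cross-point grid,
    which has (e_we - 1) columns and (e_sn - 1) rows."""
    return (x0 + ncols - 1 <= e_we - 1) and (y0 + nrows - 1 <= e_sn - 1)

# Values from the posted GRIDDESC settings and namelist.wps:
print(window_fits(3, 3, 67, 79, e_we=73, e_sn=85))  # d02 -> True
print(window_fits(3, 3, 67, 76, e_we=73, e_sn=82))  # d03 -> True
```

Since both the working and failing domains pass this check, the window geometry is unlikely to be the culprit on its own.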


I finally solved this problem.

The cause was an input file: the srg.txt file generated by the Spatial Allocator (SA).
Removing the coordinate lines from that file before creating the netCDF file resolved the problem.

Thank you so much for your answers.

@eyth.alison @cgnolte @bbaek
I’m running the NEI2017 platform and am getting a similar error: *** ERROR ABORT in subroutine HRBEIS; LAI= 308.10 out of range at (C,R)= 96, 1. I checked the METCRO2D file, and the maximum value of LAI is 6.48 (not even close to 308.10). I tried running “beis” for other dates, but I got the very same error.
I also did what Jeeho suggested, but it didn’t work for me.
Does anybody know what the problem is and how to solve it?
I have also attached the log file of the “tmpbeis” program.

tmpbeis3_beis_2019_20170201_12US1.txt (8.9 KB)

We are wondering if the 308.10 might be related to a met value – perhaps 308.1 is a temperature value?
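As a quick aside (mine, not the posters'): read as Kelvin, 308.10 corresponds to a physically plausible near-surface temperature, which fits the guess that a temperature field was read where LAI was expected:

```python
def kelvin_to_celsius(t_k):
    """Convert a temperature from Kelvin to degrees Celsius."""
    return t_k - 273.15

# 308.10 -- the "LAI" value from the error message -- read as Kelvin:
print(round(kelvin_to_celsius(308.10), 2))  # -> 34.95 (degrees C)
```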

Is this based on met data you generated? Has that data been quality assured looking for invalid / unreasonable values?

Please give more details about the case you are running, including where you got the input data, what steps you performed, and what compiler and version you are using. Did you run the Spatial Allocator?

It could be the temperature data. However, I checked the met data, and I am sure there is no problem with the MCIP outputs.

Sure. I’m using the NEI2017 setup provided by USEPA. So, I’m working on the precompiled version of SMOKE.
I’m trying to run “beis” for 12US1 domain.
I used WRF3.8 and MCIP4.3 to make the met data.
No, I did not run Spatial Allocator since the surrogate files are already provided for 12US1 domain.

My colleague suggests there might be a problem where the precompiled binary is not executing correctly on your system’s architecture. Try compiling SMOKE on your system, or using CMAQ with the option to run BEIS online. You can write out diagnostic files for QA/QC purposes.

Thank you for your response.
As you mentioned, I am currently running “beis” inline in CMAQ to work around the problem. I ran the NEI2008, NEI2011, and NEI2014 platforms before, with different resolutions, domains, and years, and I had no problems, so this result seems strange to me.
I’ll try compiling SMOKE on my system to see what happens.
Thank you again.

If other meteorological data are available, you might also try that to see whether the problem is with the executable or the data.

I wonder if it would be worth running a “debug/traceback/check everything” version of hrbeis:

  • Pick a debug I/O API compile-type BIN (e.g., Linux2_x86_64dbg).
  • setenv BIN to that type.
  • Build netCDF and the I/O API for that BIN.
  • cd <SMOKE *src* directory>; make dir; make
  • Fix the SMOKE ASSIGNS file to use this BIN instead of the existing hard-coded one.

With everything built from scratch that way, you’re guaranteed a consistent build (which should have happened anyway, but…), and you’ll get messages if the problem is something out-of-bounds.

CMAS Center has posted one month of 2017 met data (July) for 12US1 here in case you want to test with that:

Instructions for downloading are here:

@eyth.alison @cjcoats @cgnolte
I tried using CMAQ with the option to run BEIS online, and the error occurred again. I found that the problem was with the “B3GRD” file: the LAI_ISOPS variable had odd values. So I used the “B360FAC” and “BELD4” files from the 2016 platform instead, and the problem is solved.
It seems one of the inputs to the “normbeis” program (or maybe both) has a problem in the NEI2017 platform.

Glad that you got it to work. Were you using a posted input from the 2017 platform or one that you made?

For 2017, the land use was upgraded to use new data.