Hi, for some unknown reason the inference scenario gets stuck at step 103. I have tried multiple runs without any luck.
Here is the code:
import math
import sys

import requests

# URL, test_id, controller, obs, and get_observations are defined earlier in the script.
Nsim = math.ceil(24 * 14 * 3600 / 3600)  # 14 days of one-hour steps
forecast_points = ["TDryBul"]
for i in range(Nsim):
    # Retrieve the 24-hour dry-bulb temperature forecast at hourly intervals.
    forecast = requests.put(
        f"{URL}/forecast/{test_id}",
        json={"point_names": forecast_points, "horizon": 24 * 3600, "interval": 3600},
    ).json()["payload"]
    # Compute the control input and advance the emulator one step.
    u = controller.compute_control(obs)
    res = requests.post(f"{URL}/advance/{test_id}", json=u).json()
    print(
        f"\rSimulated step {i+1} of {Nsim} and Temp: {res['payload']['reaTZon_y'] - 273.15}",
        file=sys.stdout, end="", flush=True,
    )
    obs = get_observations(res)
    if len(obs) == 0:
        break
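If the hang is in the HTTP call itself, an explicit request timeout would make the stalled step raise an exception instead of blocking silently. A minimal diagnostic sketch, assuming the same URL, test_id, and control input u as in the loop above (the 120-second timeout is an arbitrary choice):

import requests

# Diagnostic variant of the advance call above: an explicit timeout makes a
# stalled request raise requests.exceptions.Timeout instead of hanging forever.
try:
    res = requests.post(f"{URL}/advance/{test_id}", json=u, timeout=120).json()
except requests.exceptions.Timeout:
    print("/advance did not respond within 120 s - likely a stalled request or a server issue.")
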
What is the "inference scenario", and which emulator are you running?
Posted by: haraldwalnum @ Sept. 9, 2024, 6:04 a.m.

Adrenalin1 - 'peak_heat_day'
The code is not advancing, nor is it returning an error. It just stays at step 103.
Posted by: RakeshJarupula @ Sept. 9, 2024, 6:23 a.m.

Are you running locally or on the service?
Posted by: haraldwalnum @ Sept. 9, 2024, 6:34 a.m.

I am using the service, with URL = 'https://api.boptest.net'.
Posted by: RakeshJarupula @ Sept. 9, 2024, 6:39 a.m.

Have you tested it locally?
Posted by: haraldwalnum @ Sept. 9, 2024, 7:11 a.m.

Not yet. But the inference has to be run through the service, right?
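In case a local test is useful, here is a rough sketch of pointing the same script at a local BOPTEST deployment instead of the hosted service. It assumes the Adrenalin1 test case is installed in the local deployment, that the deployment exposes the same REST API as api.boptest.net (including a select endpoint that returns a 'testid' field), and that it is reachable on port 80; the forecast/advance loop is otherwise unchanged.

import requests

# Hypothetical local deployment (e.g. started with docker compose from the
# BOPTEST repository); the port depends on how the deployment is configured.
URL = "http://127.0.0.1:80"

# Select the test case to obtain a test_id, set the scenario, then reuse the
# same forecast/advance loop as in the original script.
test_id = requests.post(f"{URL}/testcases/Adrenalin1/select").json()["testid"]
requests.put(f"{URL}/scenario/{test_id}", json={"time_period": "peak_heat_day"})
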
Posted by: RakeshJarupula @ Sept. 9, 2024, 7:12 a.m.

Does it stop at the same step every time?
It looks as if there may be some server issues. We will look into it.
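If it does turn out to be a transient server issue, a possible client-side workaround is to retry a stalled advance call with a timeout and a short backoff. A rough sketch, assuming the same URL, test_id, and control input u as above; the retry count, timeout, and backoff values are arbitrary choices:

import sys
import time

import requests

def advance_with_retry(url, test_id, u, retries=3, timeout=120):
    """POST to /advance, retrying on timeouts and connection errors."""
    for attempt in range(retries):
        try:
            r = requests.post(f"{url}/advance/{test_id}", json=u, timeout=timeout)
            r.raise_for_status()
            return r.json()["payload"]
        except (requests.exceptions.Timeout, requests.exceptions.ConnectionError) as exc:
            print(f"\nAdvance attempt {attempt + 1} failed: {exc}", file=sys.stderr)
            time.sleep(10 * (attempt + 1))  # simple linear backoff between retries
    raise RuntimeError(f"/advance/{test_id} did not succeed after {retries} attempts")

# Usage inside the simulation loop, replacing the direct requests.post call:
# payload = advance_with_retry(URL, test_id, u)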