Commit 96c2ed52 authored by Federico Sismondi's avatar Federico Sismondi

Fix for slicing the data before pushing for cases where we have too many records

parent 8f626f88
@@ -430,9 +430,17 @@ if __name__ == "__main__":
    with open(DEFAULT_MERGED_DATE_FILEPATH, "r") as f:
        json_data = json.load(f)
    if len(json_data) > 1000:
        logging.info("Splitting file in several requests as data is too big: {} records".format(len(json_data)))
        slice_ceil = 0
        while slice_ceil < len(json_data):
            slice_floor = slice_ceil
            slice_ceil += 1000
            write_weather_observed_entity_to_historical_db(json_data[slice_floor:slice_ceil])
            logging.info("Pushed records {} to {}, from {} records".format(slice_floor, slice_ceil, len(json_data)))
    else:
        write_weather_observed_entity_to_historical_db(json_data)
    logging.info("Historical data pushed to IoT Platform historical component from {}".format(COLLECTION_DIR))
# this has never been tested, see postman collection for example on getting token
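The batching pattern in the diff above can be isolated into a small standalone sketch. The names `split_into_batches` and `BATCH_SIZE` are illustrative, not from the project; the loop mirrors the `slice_floor`/`slice_ceil` logic, where the final slice is simply shorter when the record count is not a multiple of the batch size.

```python
BATCH_SIZE = 1000  # illustrative constant; the commit uses a literal 1000

def split_into_batches(records, batch_size=BATCH_SIZE):
    """Yield successive slices of `records`, each at most `batch_size` long."""
    slice_ceil = 0
    while slice_ceil < len(records):
        slice_floor = slice_ceil
        slice_ceil += batch_size
        # Slicing past the end of a list is safe in Python:
        # the last batch is just shorter.
        yield records[slice_floor:slice_ceil]

# Example: 2500 records split into batches of 1000, 1000 and 500.
batches = list(split_into_batches(list(range(2500))))
print([len(b) for b in batches])  # -> [1000, 1000, 500]
```

Note the `:` in `records[slice_floor:slice_ceil]`; using a comma instead (`records[slice_floor, slice_ceil]`) raises a `TypeError` on a list, which is the kind of slicing mistake this commit fixes.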