I am trying to pull fully processed data from the Adobe Analytics API. Here is the code snippet I used:
from sitecat_py.pandas_api import SiteCatPandas

# sc_pd is an authenticated SiteCatPandas instance created earlier
report_description = {
    "reportSuiteID": "**********",
    "dateFrom": "2016-08-22",
    "dateTo": "2016-08-28",
    "dateGranularity": "day",
    "metrics": [{"id": "orders"}, {"id": "revenue"}],
    "elements": [{"id": "product", "classification": "Base Item ID", "selected": item_list}]
}
df = sc_pd.read_sc_api(report_description)
where 'item_list' is a list containing 250 Base Item IDs.
This code usually runs fine, but occasionally the queue check count reaches 20 and the following error is thrown in the console:
queue check 1
queue check 2
queue check 3
queue check 4
queue check 5
queue check 6
queue check 7
queue check 8
queue check 9
queue check 10
queue check 11
queue check 12
queue check 13
queue check 14
queue check 15
queue check 16
queue check 17
queue check 18
queue check 19
queue check 20
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "sitecat_py\pandas_api.py", line 65, in read_sc_api
jdata = self.omni.get_report(**kwargs)
File "sitecat_py\python_api.py", line 173, in get_report
return self.make_queued_report_request(method, **kwargs)
File "sitecat_py\python_api.py", line 82, in make_queued_report_request
raise Exception('max_queue_checks reached!!')
Exception: max_queue_checks reached!!
I tried reducing the number of items in the list to 100, but I sometimes still get this error even with the reduced list (it worked the first time, then failed with the same error on a second run with the same item list).
Is there a safe limit on the number of items for which I can pull data without triggering this error? It seems to be independent of the number of rows.
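One workaround I can think of is to split item_list into smaller chunks and concatenate the partial results (a sketch; chunk_items is a hypothetical helper, and I'm assuming read_sc_api returns a DataFrame as in the snippet above):

```python
def chunk_items(items, size):
    # Yield successive `size`-sized slices of `items`.
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Hypothetical usage: pull each chunk separately and concatenate.
# import pandas as pd
# frames = []
# for chunk in chunk_items(item_list, 50):
#     desc = dict(report_description)
#     desc["elements"] = [{"id": "product",
#                          "classification": "Base Item ID",
#                          "selected": chunk}]
#     frames.append(sc_pd.read_sc_api(desc))
# df = pd.concat(frames, ignore_index=True)
```

Smaller chunks mean each queued report is cheaper and less likely to exhaust the queue checks, at the cost of more API calls.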
Thanks in advance
I posted the same question on another forum as well, and this answer proved helpful:
Changing the value of 'max_queue_checks' in 'pandas_api.py' to a higher value lets the program keep checking whether the data has been pulled for a longer duration; you can raise it as your requirements dictate.
However, don't make the value very large, as that lets inefficient data-fetch requests run for a long duration, which slows down the service.
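To illustrate, the queue-polling logic boils down to a loop like the one below (a generic sketch, not sitecat_py's actual code; `check_ready` stands in for the API's report-status check). Raising `max_queue_checks` simply allows more iterations; adding exponential backoff between checks makes longer waits cheaper on the service:

```python
import time

def wait_until_ready(check_ready, max_queue_checks=40, base_delay=1.0):
    """Poll `check_ready()` until it returns True, sleeping with
    exponential backoff between attempts (capped at 30 seconds).

    Raises RuntimeError if the report is still queued after
    `max_queue_checks` attempts, mirroring the error in the traceback.
    Returns the number of checks it took.
    """
    delay = base_delay
    for attempt in range(1, max_queue_checks + 1):
        if check_ready():
            return attempt
        time.sleep(min(delay, 30.0))
        delay *= 2  # back off: wait twice as long before the next check
    raise RuntimeError('max_queue_checks reached')
```

With backoff, 40 checks can cover many minutes of queue time without hammering the API, which is usually preferable to a huge check count at a fixed interval.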