Sending GraphQL project-items requests for a very large org becomes very costly, to the point where queries stop going through at all because of rate limits. We need to refactor and fine-tune how we make these requests so that their cost is minimal and the datasource remains functional.
There are existing cases where a user can run a query only a couple of times before hitting the rate limit and then has to wait an hour for the limit to reset. More details on GitHub's rate limits here
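For reference, the remaining budget can be checked from the same API. A minimal sketch (assuming the shurcooL/githubv4 client and a token in `GITHUB_TOKEN`; not the datasource's actual code) that reads the `rateLimit` object GitHub exposes to every GraphQL query:

```go
package main

import (
	"context"
	"fmt"
	"os"

	"github.com/shurcooL/githubv4"
	"golang.org/x/oauth2"
)

func main() {
	ctx := context.Background()
	src := oauth2.StaticTokenSource(&oauth2.Token{AccessToken: os.Getenv("GITHUB_TOKEN")})
	client := githubv4.NewClient(oauth2.NewClient(ctx, src))

	// rateLimit reports the hourly point budget (5,000 for most tokens),
	// the cost of this query, what's left, and when the window resets.
	var q struct {
		RateLimit struct {
			Limit     githubv4.Int
			Cost      githubv4.Int
			Remaining githubv4.Int
			ResetAt   githubv4.DateTime
		}
	}
	if err := client.Query(ctx, &q, nil); err != nil {
		panic(err)
	}
	fmt.Printf("remaining %d/%d points, resets at %s\n",
		q.RateLimit.Remaining, q.RateLimit.Limit, q.RateLimit.ResetAt.Format("15:04:05"))
}
```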
e.g. for the grafana org, the cost of a query is around 100 points multiplied by the number of pages. We have to run the query once per page because GitHub only lets us fetch batches of at most 100 items, so covering all items means re-running the query repeatedly.
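To make the cost structure concrete, here's a hedged sketch of the pagination loop in question. The field names follow GitHub's public ProjectV2 schema, but the function, its parameters, and the surrounding types are illustrative, not the datasource's actual implementation. Each loop iteration is a separate ~100-point query, so total cost grows linearly with the item count:

```go
// fetchAllItemIDs pages through every item in an org project, 100 at a
// time. Each iteration is its own GraphQL query, which is why total
// cost is roughly (item count / 100) * ~100 points.
func fetchAllItemIDs(ctx context.Context, client *githubv4.Client, org string, project int) ([]githubv4.ID, error) {
	var q struct {
		Organization struct {
			ProjectV2 struct {
				Items struct {
					PageInfo struct {
						HasNextPage githubv4.Boolean
						EndCursor   githubv4.String
					}
					Nodes []struct {
						ID githubv4.ID
					}
				} `graphql:"items(first: 100, after: $cursor)"` // 100 is GitHub's max page size
			} `graphql:"projectV2(number: $number)"`
		} `graphql:"organization(login: $login)"`
	}
	variables := map[string]interface{}{
		"login":  githubv4.String(org),
		"number": githubv4.Int(project),
		"cursor": (*githubv4.String)(nil), // nil cursor fetches the first page
	}
	var ids []githubv4.ID
	for {
		if err := client.Query(ctx, &q, variables); err != nil {
			return nil, err
		}
		for _, n := range q.Organization.ProjectV2.Items.Nodes {
			ids = append(ids, n.ID)
		}
		if !q.Organization.ProjectV2.Items.PageInfo.HasNextPage {
			break
		}
		variables["cursor"] = githubv4.NewString(q.Organization.ProjectV2.Items.PageInfo.EndCursor)
	}
	return ids, nil
}
```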
We need to go through all the items of a project to return the related ones, so the cost can't really be lowered with our existing implementation. Maybe we could split searching for all items into separate searches for issues, PRs, or drafts, although I don't think the cost would change, since the search would still run across all items; see the sketch below.
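For what that split would look like: ProjectV2 items do expose a `type` field (ISSUE, PULL_REQUEST, DRAFT_ISSUE), but as far as I can tell the `items` connection has no server-side type filter, so a per-type search would still page through everything and discard the rest client-side. A hypothetical sketch, reusing the query shape above:

```go
// item mirrors a fetched project item plus its type; the values match
// GitHub's ProjectV2ItemType enum. Filtering like this happens after
// the fetch, so it doesn't reduce the pages (or points) consumed.
type item struct {
	ID   githubv4.ID
	Type githubv4.String // "ISSUE", "PULL_REQUEST", or "DRAFT_ISSUE"
}

func onlyIssues(items []item) []item {
	var out []item
	for _, it := range items {
		if it.Type == "ISSUE" {
			out = append(out, it)
		}
	}
	return out
}
```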