
Support long running queries in timestream #256

Open
sarahzinger opened this issue Nov 7, 2023 · 5 comments
Comments

@sarahzinger
Member

Discussed in https://github.com/grafana/timestream-datasource/discussions/248

Originally posted by dleonard-nasuni October 6, 2023
I am using the timestream-datasource, and have a query that takes a particularly large amount of time. I occasionally receive a 504 server error when I hover over the panel that has the query that is timing out, and each instance of the timeout is right around the minute mark. The logs in /var/log/grafana/grafana.log seem to indicate that this timeout is on the timestream-datasource plugin side due to cancelled context. Is there a way to configure this timeout?


Here are the /var/log/grafana/grafana.log logs:

```
logger=context userId=4 orgId=1 uname=<redacted> t=2023-10-06T20:05:45.450493766Z level=error msg="Internal server error" error="[plugin.downstreamError] failed to query data: Failed to query data: rpc error: code = Canceled desc = context canceled" remote_addr=<redacted> traceID=

logger=context userId=4 orgId=1 uname=<redacted> t=2023-10-06T20:05:45.450601507Z level=error msg="Request Completed" method=POST path=/api/ds/query status=500 remote_addr=<redacted> time_ms=60003 duration=1m0.003610261s size=116 referer="https://<redacted>" handler=/api/ds/query
```
@sarahzinger
Member Author

@dleonard-nasuni just a heads up: we're moving from discussions to issues so we can more easily track user feedback, so I'm converting your discussion into an issue. I'm curious, are you still having trouble with Timestream?

@dleonard-nasuni

Hi @sarahzinger, thank you for opening this as an issue. I am still experiencing timeouts around the 1-minute mark for some data-intensive panels. After trying to trace where timeouts are picked up in the code, I tried setting a timeout in the jsonData section of my datasource configuration in timestream.yaml, but queries still timed out after 1 minute even with that field set to 30 seconds.
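For reference, a provisioned datasource of the kind described above might look like the following sketch (field names other than `jsonData` itself are assumptions based on typical Timestream plugin configuration); the `timeout` field is the one tried here, and, as reported, it did not take effect for query execution:

```yaml
# Sketch of a Grafana datasource provisioning file (assumed layout).
# jsonData.timeout is the setting tried above, which did not appear
# to be honored for long-running Timestream queries.
apiVersion: 1
datasources:
  - name: Timestream
    type: grafana-timestream-datasource
    jsonData:
      authType: default
      defaultRegion: us-east-1
      timeout: 30
```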

@iwysiu
Contributor

iwysiu commented Nov 8, 2023

I took a look, and this sounds like a long-running-query issue. Other datasources (CloudWatch Logs, Athena, and Redshift) handle this by having the frontend poll the backend periodically for updates, rather than holding a single request open until it hits the proxy timeout. We can and probably should do the same here.

Implementing that would be a fairly large feature/refactor, so I'm moving this into our backlog for now and our team will prioritize it against other work.
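The polling pattern described above can be sketched roughly as follows. This is an illustrative sketch only, not the actual plugin code: the `startQuery`/`getStatus` names and the mock backend are hypothetical stand-ins for a backend that starts a Timestream query and reports its status.

```typescript
// Sketch of the frontend polling pattern for long-running queries:
// start the query, then poll for status in short requests instead of
// holding one HTTP request open past the ~60s proxy timeout.

type QueryStatus = { state: 'running' | 'done'; rows?: number[] };

// Hypothetical mock backend: the query "finishes" after N status polls.
function makeMockBackend(pollsUntilDone: number) {
  let polls = 0;
  return {
    // Kicks off the query and returns an id to poll with.
    startQuery: async (): Promise<string> => 'query-123',
    // Reports whether the query identified by `id` has finished.
    getStatus: async (_id: string): Promise<QueryStatus> => {
      polls += 1;
      return polls >= pollsUntilDone
        ? { state: 'done', rows: [1, 2, 3] }
        : { state: 'running' };
    },
  };
}

// Poll until the backend reports the query finished, then return rows.
async function runLongQuery(
  backend: ReturnType<typeof makeMockBackend>,
  intervalMs = 10,
): Promise<number[]> {
  const id = await backend.startQuery();
  for (;;) {
    const status = await backend.getStatus(id);
    if (status.state === 'done') return status.rows ?? [];
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}

runLongQuery(makeMockBackend(3)).then((rows) =>
  console.log(`rows: ${rows.join(',')}`),
);
```

Each poll is a short round trip, so no single request ever approaches the gateway timeout, regardless of how long the underlying query runs.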

@iwysiu iwysiu changed the title Configure grafana-server <-> timestream-datasource timeout Support long running queries in timestream Nov 8, 2023
@iwysiu iwysiu moved this from Incoming to Backlog in AWS Datasources Nov 8, 2023
@iwysiu iwysiu added type/feature-request New feature or request and removed type/bug labels Nov 13, 2023
@kevinwcyu kevinwcyu moved this from Backlog to Feature requests in AWS Datasources Nov 17, 2023
@kevinwcyu kevinwcyu moved this from Icebox to Backlog in AWS Datasources Nov 17, 2023

This issue has been automatically marked as stale because it has not had activity in the last year. It will be closed in 30 days if no further activity occurs. Please feel free to leave a comment if you believe the issue is still relevant. Thank you for your contributions!

@github-actions github-actions bot added the stale label Nov 13, 2024
@dleonard-nasuni

Commenting to avoid the issue being closed.

@github-actions github-actions bot removed the stale label Nov 14, 2024