chore: add llama-cpp-server to embed llama-server directly #2254

Triggered via pull request: May 13, 2024 21:22
Status: Cancelled
Total duration: 4m 24s

autofix-rust.yml

on: pull_request

Annotations

2 errors and 4 warnings
Errors:
autofix: Canceling since a higher priority waiting request for 'TabbyML/tabby/.github/workflows/autofix-rust.yml@refs/pull/2112/merge-add-crate-llama-cpp-server' exists
autofix: The operation was canceled.
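
This cancellation is the message GitHub Actions emits when a newer run in the same concurrency group supersedes an in-flight one. A minimal sketch of a workflow-level concurrency block that produces this behavior (illustrative only; the actual settings in autofix-rust.yml may differ):

concurrency:
  # One group per workflow and ref, so a newer run for the same PR cancels this one
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true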
Warnings:
autofix: The `set-output` command is deprecated and will be disabled soon. Please upgrade to using Environment Files. For more information see: https://github.blog/changelog/2022-10-11-github-actions-deprecating-save-state-and-set-output-commands/ (reported 4 times)
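
The warning points at uses of the deprecated `::set-output` workflow command; the replacement is to append key=value pairs to the file named by $GITHUB_OUTPUT. A minimal before/after sketch (the step id and output name are made up for illustration, not taken from autofix-rust.yml):

steps:
  # Deprecated: triggers the warning above
  # - run: echo "::set-output name=sha_short::$(git rev-parse --short HEAD)"

  # Replacement: write the output to the $GITHUB_OUTPUT environment file
  - id: vars
    run: echo "sha_short=$(git rev-parse --short HEAD)" >> "$GITHUB_OUTPUT"

  # Downstream steps read the output exactly as before
  - run: echo "short sha is ${{ steps.vars.outputs.sha_short }}"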