The log levels defined on llama.cpp and LlamaSharp side were not aligned anymore (issue #995) #997
Conversation
The best way I can think of to handle this is to have a thread-static field which stores the last log level used. I'll add a comment inline to show what I mean.
    /// <summary>
    /// Continue log level is equivalent to None in the way it is used in llama.cpp.
    /// </summary>
    Continue = 5,
}

internal static class LLamaLogLevelExtensions
internal static class LLamaLogLevelExtensions
{
    [ThreadStatic] private static LogLevel _previous;

    public static LogLevel ToLogLevel(this LLamaLogLevel llama)
    {
        _previous = (llama) switch
        {
            LLamaLogLevel.None => LogLevel.None,
            LLamaLogLevel.Debug => LogLevel.Debug,
            LLamaLogLevel.Info => LogLevel.Information,
            LLamaLogLevel.Warning => LogLevel.Warning,
            LLamaLogLevel.Error => LogLevel.Error,
            LLamaLogLevel.Continue => _previous,
            _ => throw new ArgumentOutOfRangeException(nameof(llama), llama, null)
        };
        return _previous;
    }
}
(Note that I haven't tested this, but hopefully it illustrates what I mean)
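(For illustration, a minimal usage sketch of how this mapping would be consumed, assuming the Microsoft.Extensions.Logging LogLevel type and the LLamaLogLevel enum from this PR; the NativeLogForwarder class and the message sequence below are hypothetical, not part of the actual change:)

using Microsoft.Extensions.Logging;

internal static class NativeLogForwarder
{
    // Hypothetical handler for a native logging callback: a Continue message
    // is forwarded at whatever level the previous message on this thread used.
    public static void OnNativeLog(ILogger logger, LLamaLogLevel level, string message)
    {
        logger.Log(level.ToLogLevel(), "{Message}", message);
    }
}

// Example sequence on a single thread:
//   Info     -> LogLevel.Information (_previous becomes Information)
//   Continue -> LogLevel.Information again (continuation of the previous line)
//   Error    -> LogLevel.Error (_previous becomes Error)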
Got it, good idea
Looks good to me, thanks for fixing this! Just waiting for the tests to run and then I'll merge it :)
Hmm, I don't get the CI error when running locally.
I'll re-run it, unfortunately our CI is flaky sometimes.
Looks like it worked this time.
Following issue #995:
After this PR on the llama.cpp repo, the log levels defined on the llama.cpp and LlamaSharp sides were no longer aligned. This misalignment made model weight loading fail when a custom logger is used.
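(For context, a rough sketch of the alignment in question; the ggml_log_level values are quoted from my reading of llama.cpp's ggml.h after the upstream change, and the C# enum mirrors the fragment visible in this PR, so treat the exact numbers as an assumption rather than the actual source:)

// llama.cpp side (enum ggml_log_level in ggml.h), after the upstream reordering:
//   GGML_LOG_LEVEL_NONE  = 0
//   GGML_LOG_LEVEL_DEBUG = 1
//   GGML_LOG_LEVEL_INFO  = 2
//   GGML_LOG_LEVEL_WARN  = 3
//   GGML_LOG_LEVEL_ERROR = 4
//   GGML_LOG_LEVEL_CONT  = 5   (continuation of the previous message)

// LLamaSharp side, mirrored so values passed through the native logging
// callback keep the same meaning:
public enum LLamaLogLevel
{
    None = 0,
    Debug = 1,
    Info = 2,
    Warning = 3,
    Error = 4,
    Continue = 5,
}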
Help
I am just wondering if it is correct to consider the CONT log level as equivalent to NONE.