[GPT] enable historical signals fetch #1089
Conversation
def get_version(self):
    # later on, identify by its specs
    # return f"{self.gpt_model}-{self.source}-{self.indicator}-{self.period}-{self.GLOBAL_VERSION}"
    return "0.0.0"
I don't understand why the version should contain all this data instead of just a number (here GLOBAL_VERSION).
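To illustrate the two versioning schemes being discussed, here is a minimal standalone sketch (class and attribute values are hypothetical, modeled on the commented-out line in the diff): identifying a signal by the full evaluator configuration versus a plain version number.

```python
# Hypothetical sketch contrasting spec-based versioning with a plain
# version number; names mirror the attributes seen in the diff.
class GPTEvaluator:
    GLOBAL_VERSION = "0.0.0"

    def __init__(self, gpt_model, source, indicator, period):
        self.gpt_model = gpt_model
        self.source = source
        self.indicator = indicator
        self.period = period

    def get_version_by_specs(self):
        # identifies stored signals by the full evaluator configuration,
        # so any spec change invalidates previously stored signals
        return f"{self.gpt_model}-{self.source}-{self.indicator}-{self.period}-{self.GLOBAL_VERSION}"

    def get_version(self):
        # plain version number, as suggested in the review
        return self.GLOBAL_VERSION


evaluator = GPTEvaluator("gpt-3.5-turbo", "close", "EMA", 14)
print(evaluator.get_version_by_specs())  # gpt-3.5-turbo-close-EMA-14-0.0.0
print(evaluator.get_version())           # 0.0.0
```

The spec-based variant makes two evaluators with different indicators or periods map to different stored-signal keys, which would explain keeping the specs in the identifier.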
Evaluator/TA/ai_evaluator/ai.py
Outdated
self.max_confidence_threshold = self.UI.user_input(
    "max_confidence_threshold", enums.UserInputTypes.INT,
    self.max_confidence_threshold, inputs, min_val=0, max_val=100,
    title="Maximum confidence threshold: % confidence value starting from which to return 1 or -1."
)
Shouldn't it be a min confidence value?
return self._get_signal_from_stored_signals(exchange, symbol, time_frame, version, candle_open_time)
if self.use_stored_signals_only():
    return await self._fetch_signal_from_stored_signals(exchange, symbol, time_frame, version, candle_open_time)
return await self._get_signal_from_gpt(messages, model, max_tokens, n, stop, temperature)
👍
def _get_open_candle_timestamp(self, time_frame: commons_enums.TimeFrames, base_timestamp: float):
    tf_seconds = commons_enums.TimeFramesMinutes[time_frame] * commons_constants.MINUTE_TO_SECONDS
    return base_timestamp - (base_timestamp % tf_seconds)
Don't we already have this code somewhere else?
added in commons
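The flooring logic in `_get_open_candle_timestamp` can be sketched standalone (a simplified reimplementation for illustration; the real enums and constants live in `octobot_commons`): it subtracts the remainder of the timestamp modulo the time frame length, yielding the open time of the candle the timestamp falls into.

```python
# Standalone sketch of the candle-open-time flooring; the time frame
# table is a hypothetical stand-in for commons_enums.TimeFramesMinutes.
MINUTE_TO_SECONDS = 60
TIME_FRAMES_MINUTES = {"1m": 1, "1h": 60, "1d": 1440}

def get_open_candle_timestamp(time_frame: str, base_timestamp: float) -> float:
    """Floor a timestamp to the open time of its candle."""
    tf_seconds = TIME_FRAMES_MINUTES[time_frame] * MINUTE_TO_SECONDS
    # drop the seconds elapsed since the candle opened
    return base_timestamp - (base_timestamp % tf_seconds)

# 2021-01-01 00:37:11 UTC floors to 2021-01-01 00:00:00 UTC on the 1h time frame
print(get_open_candle_timestamp("1h", 1609461431.0))  # 1609459200.0
```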
@@ -55,7 +55,7 @@ def __init__(self, tentacles_setup_config):
     self.indicator = None
     self.source = None
     self.period = None
-    self.min_confidence_threshold = 0
+    self.min_confidence_threshold = 100
This is the default value?
yes, we don't want to round to 1 below 100% confidence by default
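The effect of that default can be sketched as follows (a hypothetical, simplified version of the thresholding under discussion, not the actual evaluator code): with `min_confidence_threshold` at 100, a prediction is only rounded to a full +1/-1 signal when the model reports full confidence, so rounding is effectively disabled by default.

```python
# Hypothetical sketch of the confidence thresholding being discussed.
def resolve_signal(prediction: float, confidence: float,
                   min_confidence_threshold: float = 100) -> float:
    # round to a full signal only at or above the confidence threshold
    if confidence >= min_confidence_threshold:
        return 1.0 if prediction > 0 else -1.0
    return prediction

print(resolve_signal(0.4, 80))      # 0.4: below the default threshold of 100
print(resolve_signal(0.4, 80, 75))  # 1.0: rounded once the threshold is lowered to 75
```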
I'm not sure I understand. I haven't seen any 100% confidence value sent by GPT.
requires Drakkar-Software/OctoBot#2450