[BUG] Timing data contains laps with incorrect duplicate lap times #404
Comments
I was thinking if we can check the … I also noticed that the lap "Time" (start time) is, I think, in GMT, while the race control messages are maybe in race local time. I think the race control message times should be converted to GMT?
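If the race control message timestamps really are in local time (that is only the assumption in the comment above, not a confirmed fact), the conversion itself would be simple with pandas. A minimal sketch, assuming the messages sit in a DataFrame with a timezone-naive 'Time' column and that the circuit's timezone is known (both the column layout and the timezone are illustrative assumptions, not FastF1's actual API):

import pandas as pd

# Hypothetical example data: timezone-naive timestamps assumed to be in
# race-local time (America/Toronto for the Canadian GP)
messages = pd.DataFrame({
    'Time': pd.to_datetime(['2023-06-17 16:05:00', '2023-06-17 16:12:30']),
    'Message': ['RED FLAG', 'TRACK CLEAR'],
})

# attach the assumed local timezone, then convert to UTC/GMT
messages['Time'] = (
    messages['Time']
    .dt.tz_localize('America/Toronto')
    .dt.tz_convert('UTC')
)
print(messages)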
import fastf1
session = fastf1.get_session(2023, 'Canada', 'Q')
session.load(telemetry=False)
ver = session.laps.pick_driver('VER')
ver_df = ver.loc[:, ('LapNumber', 'Time', 'LapTime')]
#create a column with the difference between this lap's 'Time' and the previous lap's 'Time'
ver_df['sub_time'] = ver_df['Time'].diff()
#create a boolean column to check if 'sub_time' equals 'LapTime'
ver_df['bool_check'] = ver_df['sub_time'] == ver_df['LapTime']
#create a boolean column to check if 'LapTime' equals the 'LapTime' of the previous row
ver_df['bool_previous_lap'] = ver_df['LapTime'] == ver_df['LapTime'].shift(1)
#if "bool_check" False and "bool_previous_lap" True, then set "LapTime" to None
ver_df['LapTime'] = ver_df['LapTime'].mask((ver_df['bool_check'] == False) & (ver_df['bool_previous_lap'] == True), None)
#remove the temporary columns that were used to flag the duplicates
ver_df.drop(['sub_time', 'bool_check', 'bool_previous_lap'], axis=1, inplace=True)

I tried to do something like this, but obviously you can correct me if this could lead to ignoring "useful" laps. This piece of code only adds some temporary columns to check two conditions:

1. whether the difference between this lap's 'Time' and the previous lap's 'Time' matches the 'LapTime'
2. whether the 'LapTime' equals the previous lap's 'LapTime'

So, if the first condition is False and the second condition is True, we can set that lap's 'LapTime' to None. Finally, the temporary columns are removed from the dataframe.
@d-tomasino this seems to work, although I'm not entirely happy with a solution like this because it just assumes that any lap time matching these criteria is incorrect. F1 drivers surprisingly often set two successive laps with exactly the same time (this can actually happen multiple times per race). So this would need some more extensive testing on multiple sessions where it is manually verified that the removed laps were correctly detected. Additionally, your first check is in theory already implemented in the API parser. It should warn the user about "timing integrity errors", but apparently it is not triggered here. Before fixing this, we should figure out why that warning is not shown, because there has to be something else going on.
@theOehrly thanks for the reply! You're right, it's understandable that two consecutive laps with exactly the same time could happen not so rarely. However, in that case (as far as I understood), the difference in "Time" between the two adjacent rows should match the "LapTime" value, which is why, for two consecutive real laps, the first condition should report True and the lap time would be left untouched. In any case, as soon as I can, I will try to first take a look at the "timing integrity errors" warning that is not shown, so that we can try to solve everything step by step.
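One quick way to check whether the "timing integrity errors" warning is emitted at all would be to raise the logging verbosity before loading the session. A minimal sketch; the logger name 'fastf1' is an assumption based on the package name:

import logging
import fastf1

# show all log output from the fastf1 package (assumed logger name)
logging.basicConfig()
logging.getLogger('fastf1').setLevel(logging.DEBUG)

session = fastf1.get_session(2023, 'Canada', 'Q')
session.load(telemetry=False)  # watch the log output for integrity warnings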
Describe the issue:
For the Qualifying of the 2023 Canadian GP, the timing data for some drivers contains laps that have the exact same lap time as a previous lap.
For example: Perez' first two laps, Verstappen's last two laps
Reference: https://www.fia.com/sites/default/files/2023_09_can_f1_q0_timing_qualifyingsessionlaptimes_v01.pdf
Edit after first investigation:
The laps that have incorrect lap times (and sector 3 times) are laps during which the session was red-flagged. The lap time and sector 3 time of the previous lap is then received again from the API. I.e. the incorrectly duplicated data already exists in the source data.
Expected Behaviour
FastF1 should detect that these values are incorrect and ignore them.
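A possible detection heuristic (only a sketch of one option, not the actual fix) would be to flag a lap whose 'LapTime' repeats the previous lap's value while the difference between the two laps' 'Time' values does not add up to that lap time:

import pandas as pd

def drop_suspect_duplicate_laptimes(laps: pd.DataFrame) -> pd.DataFrame:
    """Set 'LapTime' to NaT where it duplicates the previous lap's value
    but is inconsistent with the difference of the 'Time' column.

    Sketch only: real laps can legitimately share the same lap time,
    so this would need validation against multiple sessions.
    """
    laps = laps.copy()
    time_diff_matches = laps['Time'].diff() == laps['LapTime']
    repeats_previous = laps['LapTime'] == laps['LapTime'].shift(1)
    suspect = repeats_previous & ~time_diff_matches
    laps.loc[suspect, 'LapTime'] = pd.NaT
    return laps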
Reproduce the code example:
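The original code example is not included above; a minimal reproduction following the discussion in the comments (driver abbreviation and column names taken from there) could look like this:

import fastf1

session = fastf1.get_session(2023, 'Canada', 'Q')
session.load(telemetry=False)

ver = session.laps.pick_driver('VER')
# the last two laps show the exact same 'LapTime' even though the
# difference between their 'Time' values does not match it
print(ver[['LapNumber', 'Time', 'LapTime']].tail())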
Error message: