🎛️ ControlHelper Rewrite #1052
Conversation
As per issue e-mission#994, fixed the `Download JSON Dump`. The previous date-picker formatted the input as 'dd MMM yyyy', so I used luxon to format the object. This is a bit hacky - going to rewrite the `ControlHelper` service and remove `moment()` altogether.
`js/services.js` has been rewritten so that it no longer relies on the legacy `moment.js` library, and instead uses `luxon`.
It seems that moment.js and luxon handle the `endOf('day')` conversion slightly differently.
Update: Whereas luxon, oddly, is not changing the date at all. My guess is that the issue lies within the local time - it seems luxon uses that for the `endOf()` method. I'm going to go skim through both of these libraries, and see if I can find the code itself. If anyone has sample data of overnight travel (e.g., spanning two days), I would really appreciate borrowing it for testing! The timestamps seem to be pretty finicky, so I'd like to test my code with as many types of data as possible.
See comments on [the PR Draft](e-mission#1052). Luxon and Moment.js seem to handle the `endOf('day')` slightly differently. The remedy for this was to simulate the old Unix Integer conversion for the end date, but use the contemporary one for the start date. This adjustment fixed the download, and worked for the multiple timelines tested.
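To illustrate the start/end conversions discussed above, here is a minimal sketch in plain JavaScript. It deliberately avoids luxon and the local timezone by working in UTC; the function names are illustrative, not the actual controlHelper API.

```javascript
// "Contemporary" conversion: first instant of the selected day, as Unix seconds.
function startOfDayUnix(isoDate) {
  return Math.floor(Date.parse(isoDate + 'T00:00:00Z') / 1000);
}

// End-of-day conversion: last instant of the selected day, so a range
// [start, end] spanning two days still covers the whole final day.
function endOfDayUnix(isoDate) {
  return Math.floor(Date.parse(isoDate + 'T23:59:59.999Z') / 1000);
}
```

For example, a dump for 2023-11-24 would span `startOfDayUnix('2023-11-24')` through `endOfDayUnix('2023-11-24')`, which are exactly 86399 seconds apart.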
Adjusted `js/services.js` such that it only uses the constants exported from `js/plugins/logger.ts`. Removed the member function `writeFile` from the controlHelper, as it was not being used.
In 2e9173b, I got the "startTime" and "endTime" values mixed. To clarify: the startTime must be processed through the old conversion method `getUnixNum()`, and the endTime is fetched by running luxon's `.toUnixInteger()`. The other way around does not work; I'll update the comment in my next commit.
After reading up on the …
controlHelper no longer utilizes Angular services. The internal functions were separated into their own exports. File write is confirmed to be working; I still need to write tests and verify that all other functions are working as intended.
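As a before/after sketch of this kind of refactor (names hypothetical, not the actual controlHelper functions): logic that used to live inside an AngularJS service factory becomes plain module exports that callers and tests can import directly.

```javascript
// Before (AngularJS): logic locked inside a service factory, reachable
// only through dependency injection:
// angular.module('emission.services').factory('ControlHelper', function($window) {
//   return { getMyData: function(ts) { /* ... */ } };
// });

// After: plain functions exported directly, so tests can import exactly
// what they need without bootstrapping Angular. This helper is illustrative.
function formatDumpFileName(userEmail, dateString) {
  return 'dump_' + userEmail + '_' + dateString + '.json';
}

module.exports = { formatDumpFileName };
```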
Update: The code is not committed, but I've got a variation of the JSONDump written that utilizes the Social Sharing plugin, similar to how the QR Code is shared. Below are a few of my notes on the current status of this:
DownloadWorking.mp4
After further digging, it seems that the socialSharing library doesn't have a means to stop the extra "message" file from being downloaded. Furthermore, I haven't made much progress on the android side of things. Before continuing down this path further, I think it's worth asking for other folks' opinions! As I see it, we've got two possible ways of moving forward:
Any thoughts @shankari @JGreenlee ? I think this boils down to a question of "Is the extra flexibility worth complicating the share process?" Would my time be better spent pivoting to other parts of the rewrite, or is it worth changing this while I'm already updating it? For the time being, I'll keep working on setting up socialSharing on Android!
@the-bay-kay I don't have a strong preference wrt functionality. To my knowledge, the only people who use this button are:
So I don't think it is a huge deal for them to configure the default mail app. But it would obviously be better if that wasn't required. So I would say that we spend a time-bounded effort on this. If you can get the android code to work in (say) one day, let us clean it up. If not, let us revisit.
Changed all variables that do not mutate to const.
Success!! Got it working on Android. First thing tomorrow I'll go ahead and test on a few other versions before committing and setting up the PR for review.
See [PR 1052](e-mission#1052). Using the socialShare library, as we do in the OpCode components, allows iOS users to share through apps other than the default mail app.
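A hedged sketch of what sharing the dump via the social-sharing plugin might look like. The option names follow cordova-plugin-x-socialsharing's `shareWithOptions()`; the wrapper itself and the message text are illustrative, not the PR's actual code. The plugin object is passed in so the Cordova global (`window.plugins.socialsharing`) can be mocked in tests.

```javascript
// Share the dump file through the device's native share sheet.
function shareDumpFile(fileUrl, sharer) {
  return new Promise((resolve, reject) => {
    sharer.shareWithOptions(
      {
        message: 'JSON dump of my e-mission data', // body text for mail apps
        files: [fileUrl],                          // attach the dump itself
      },
      resolve,
      reject
    );
  });
}
```

In the app, `sharer` would be `window.plugins.socialsharing`; in a unit test it can be a tiny fake that records the options it was called with.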
Force-pushed 22d365a to 46832c7
A quick note on the force-push: I accidentally added code from master, and wanted to make sure this branch stayed separate when merging. I don't believe the commit would've changed much (it was just a prior update I did), but I figured it's better to keep them separate for now.
Here's an email of the JSONDump working on both Android and iOS!! I tested on a couple different versions, so this should be good to go! android_working.mp4 iOS.Working.mp4
Great work! Before I review this, can you change the target of this PR to some other branch, then change it back? (#1040 is merged but my commits are still showing up in this PR. I've discovered that toggling the branch forces Github to update the diff.)
Updated! Good to know you can clean the git logs like that, I'll go ahead and do the same thing for my next PR! |
The code looks clean - I just have a few substitutions I recommend.
I am unsure how, or even if, we would go about testing `controlHelper`.
I don't think we can test the email/share menu with unit tests. We might be able to write a test validating the contents of the dump file. But this would require us to mock the retrieval of entries, return some fake entries, and mock the filesystem write.
If you break `getMyData` into smaller functions you can probably validate the dump without worrying about mocking the filesystem. Function A creates the dump and function B writes the dump to the filesystem; function A calls function B. You would test only function A.
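One way to realize this split, with hypothetical names (the real controlHelper functions differ): the dump-building logic is a pure function that needs no mocks, and the write step takes its writer as a parameter so the filesystem never has to be touched in tests.

```javascript
// "Function A": builds the dump content. Pure, so it is unit-testable
// with no mocking at all.
function createDumpContent(entries) {
  return JSON.stringify({ phone_data: entries });
}

// "Function B": writes the dump. The writer is injected, so a test can
// pass a recording stub instead of the Cordova file plugin.
function writeDumpFile(fileName, entries, writeFn) {
  const content = createDumpContent(entries);
  return writeFn(fileName, content); // only this line touches the filesystem
}
```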
Per discussion in the PR, made adjustments to debug logs and function names.
By extracting the `writeFile` and `shareData` methods, we can now write unit tests for controlHelper; specifically, we are able to test the `writeFile` portion, as the `shareData` method relies on external plugins that are difficult to mock.
Quick update on this! I split apart the two functions in commit 92cd6fb, and they seem (upon manual testing) to be functioning fine! My current issue, however, is with how …
I still need to investigate further why exactly this issue is happening! As I discussed with Abby, some of the issues with the i18next utils stem from this branch being somewhat out of date compared to the e-mission-translate repository. Could this be part of the issue? The only function that we need to test is the …
See @Abby-Wheelis's PR #1063 for ideas on how to address both of these. (1. You can define …
I'm pretty stumped on how to write tests for this. The issue lies in the … My understanding of the … I'll go ahead and clean up the current PR, resolve the conflicts, and push what I have! I've manually tested on iOS & Android, with a few emulators and OPCodes, and everything seems to run smoothly.
In manual testing, I've run into another issue with this rewrite. When loading the JSON files, all seems to be well when loading files without … If anyone has a …
Update: the UserCache data doesn't seem to be the issue. After pre-processing the JSON dump and filtering out the userCache data, the timeline loads, with the following error. Interestingly, the …
Interestingly, this issue doesn't occur when the file is loaded into the Android emulator. Here is a side-by-side of the same timelines loaded on both iOS and android - on iOS, the aforementioned error is thrown, and there are no detected modes. On android, however, the file loads without any issue. Here is a copy of the timeline file (download) - I would be curious to see if other folks are having this same android/ios discrepancy! UPDATE: It seems that all of the current OBJDumps load fine on my Linux/Android setup... I'm beginning to suspect there's an issue with my current …
Cleaned up branch for review. Android was having issues with the new fileIO - since we are not mocking these functions, we can switch back for now.
Found a fix for the `resolveLocalFileSystemURL` method of fileIO that works with Android and iOS. Reverting back to this method and updating, to make this consistent with `uploadService.ts`. This reverts commit dbe76d1.
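For reference, a hedged sketch of wrapping the callback-style Cordova file API mentioned above in Promises. The resolver and directory entry are passed in as parameters so tests can supply mocks instead of the real `window.resolveLocalFileSystemURL` global; the wrapper names are illustrative, not the actual fileIO code.

```javascript
// Promisify resolveLocalFileSystemURL(url, successCb, errorCb).
function resolveFileSystemUrl(url, resolver) {
  return new Promise((resolve, reject) => resolver(url, resolve, reject));
}

// Promisify dirEntry.getFile(name, options, successCb, errorCb),
// creating the file if it does not already exist.
function getOrCreateFile(dirEntry, fileName) {
  return new Promise((resolve, reject) =>
    dirEntry.getFile(fileName, { create: true, exclusive: false }, resolve, reject)
  );
}
```

In the app, `resolver` would be `window.resolveLocalFileSystemURL` and `dirEntry` the DirectoryEntry it resolves to; the same wrappers should then behave identically on Android and iOS.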
Not exactly! When loading this data (using the …
When loading my personal data, the script shows that 9 "usercache" entries have been loaded:
When running the … I'm pretty stumped on this bug, so I'm putting this into review for a second set of eyes - I'm not terribly familiar with the server, so I may start by combing through the server errors and trying to understand what's missing.
OK -- peculiarly, I loaded a timeline for Nov 24 from my live OPCode, and it seems to have loaded without any issue...? Looking into this further. EDIT: On my open survey opcode, Nov 25th and 23rd have the aforementioned bug, but Nov 24th doesn't -- on initial inspection, I cannot find any discernible difference between these days (beyond the superficial difference in trips). Notably, this timeline has the …
After some discussion in a meeting today, others and I determined this error may be similar to the one documented in issue #936. Below is a full copy of the errors found when running the intake on a faulty timeline file.
(emission) $ ./e-mission-py.bash bin/debug/intake_single_user.py -e nrelop_dev-emulator-program_katieBroken
storage not configured, falling back to sample, default configuration
URL not formatted, defaulting to "Stage_database"
Connecting to database URL localhost
analysis.debug.conf.json not configured, falling back to sample, default configuration
google maps key not configured, falling back to nominatim
nominatim not configured either, place decoding must happen on the client
overpass not configured, falling back to default overleaf.de
transit stops query not configured, falling back to default
[ Zen of Python Text ]
analysis.trip_model.conf.json not configured, falling back to sample, default configuration
expectations.conf.json not configured, falling back to sample, default configuration
ERROR:root:habitica not configured, game functions not supported
Traceback (most recent call last):
File "/Users/jrischpa/Documents/e-mission-server/emission/net/ext_service/habitica/proxy.py", line 22, in <module>
key_file = open('conf/net/ext_service/habitica.json')
FileNotFoundError: [Errno 2] No such file or directory: 'conf/net/ext_service/habitica.json'
2023-11-28T13:04:51.247143-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: moving to long term**********
New entry count == 0 and skip_if_no_new_data = False, continuing
2023-11-28T13:04:51.266319-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: updating incoming user inputs**********
2023-11-28T13:04:51.366549-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: filter accuracy if needed**********
2023-11-28T13:04:51.372261-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: segmenting into trips**********
2023-11-28T13:04:52.045295-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: segmenting into sections**********
Sectioning failed for user d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c
Traceback (most recent call last):
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/segmentation/section_segmentation.py", line 52, in segment_current_sections
segment_trip_into_sections(user_id, trip_entry, trip_entry.data.source)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/segmentation/section_segmentation.py", line 66, in segment_trip_into_sections
distance_from_place = _get_distance_from_start_place_to_end(trip_entry)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/segmentation/section_segmentation.py", line 200, in _get_distance_from_start_place_to_end
start_place = esda.get_object(esda.RAW_PLACE_KEY, start_place_id)
File "/Users/jrischpa/Documents/e-mission-server/emission/storage/decorations/analysis_timeseries_queries.py", line 48, in get_object
return get_entry(key, object_id).data
AttributeError: 'NoneType' object has no attribute 'data'
2023-11-28T13:04:52.059446-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: smoothing sections**********
Marking smoothing as failed
Traceback (most recent call last):
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 3802, in get_loc
return self._engine.get_loc(casted_key)
File "pandas/_libs/index.pyx", line 138, in pandas._libs.index.IndexEngine.get_loc
File "pandas/_libs/index.pyx", line 165, in pandas._libs.index.IndexEngine.get_loc
File "pandas/_libs/hashtable_class_helper.pxi", line 5745, in pandas._libs.hashtable.PyObjectHashTable.get_item
File "pandas/_libs/hashtable_class_helper.pxi", line 5753, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 'filter'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/location_smoothing.py", line 116, in filter_current_sections
filter_jumps(user_id, section.get_id())
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/location_smoothing.py", line 142, in filter_jumps
is_ios = section_points_df["filter"].dropna().unique().tolist() == ["distance"]
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/pandas/core/frame.py", line 3807, in __getitem__
indexer = self.columns.get_loc(key)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 3804, in get_loc
raise KeyError(key) from err
KeyError: 'filter'
2023-11-28T13:04:52.077582-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: cleaning and resampling timeline**********
Cleaning and resampling failed for user d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c
Traceback (most recent call last):
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/clean_and_resample.py", line 88, in clean_and_resample
last_raw_place = save_cleaned_segments_for_ts(user_id, time_query.startTs, time_query.endTs)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/clean_and_resample.py", line 114, in save_cleaned_segments_for_ts
return save_cleaned_segments_for_timeline(user_id, tl)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/clean_and_resample.py", line 140, in save_cleaned_segments_for_timeline
(last_cleaned_place, filtered_tl) = create_and_link_timeline(tl, user_id, trip_map)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/clean_and_resample.py", line 951, in create_and_link_timeline
curr_cleaned_start_place = get_filtered_place(tl.first_place())
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/clean_and_resample.py", line 246, in get_filtered_place
_copy_non_excluded(old_data=raw_place.data,
AttributeError: 'NoneType' object has no attribute 'data'
2023-11-28T13:04:52.628696-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: inferring transportation mode**********
Error while inferring modes, timestamp is unchanged
Traceback (most recent call last):
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/connectionpool.py", line 714, in urlopen
httplib_response = self._make_request(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/connectionpool.py", line 403, in _make_request
self._validate_conn(conn)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/connectionpool.py", line 1053, in _validate_conn
conn.connect()
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/connection.py", line 419, in connect
self.sock = ssl_wrap_socket(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/util/ssl_.py", line 449, in ssl_wrap_socket
ssl_sock = _ssl_wrap_socket_impl(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/ssl.py", line 501, in wrap_socket
return self.sslsocket_class._create(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/ssl.py", line 1074, in _create
self.do_handshake()
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/ssl.py", line 1343, in do_handshake
self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1129)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/adapters.py", line 489, in send
resp = conn.urlopen(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/connectionpool.py", line 798, in urlopen
retries = retries.increment(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/util/retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='lz4.overpass-api.de', port=443): Max retries exceeded with url: /api/interpreter (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1129)')))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 24, in predict_mode
mip.runPredictionPipeline(user_id, time_query)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 55, in runPredictionPipeline
self.predictedProb = self.predictModesStep()
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 74, in predictModesStep
predictedProb.append(get_prediction(i, section_entry))
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 113, in get_prediction
return get_motorized_prediction(i, section_entry)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 140, in get_motorized_prediction
predicted_transit_mode = _get_transit_prediction(i, section_entry)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 175, in _get_transit_prediction
start_transit_stops = enetm.get_stops_near(section_entry.data.start_loc, start_radius)
File "/Users/jrischpa/Documents/e-mission-server/emission/net/ext_service/transit_matching/match_stops.py", line 124, in get_stops_near
stops = get_public_transit_stops(lat - bbox_delta, lon - bbox_delta, lat + bbox_delta, lon + bbox_delta)
File "/Users/jrischpa/Documents/e-mission-server/emission/net/ext_service/transit_matching/match_stops.py", line 81, in get_public_transit_stops
call_return = make_request_and_catch(overpass_query)
File "/Users/jrischpa/Documents/e-mission-server/emission/net/ext_service/transit_matching/match_stops.py", line 30, in make_request_and_catch
response = requests.post(url + "api/interpreter", data=overpass_query)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/api.py", line 115, in post
return request("post", url, data=data, json=json, **kwargs)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/api.py", line 59, in request
return session.request(method=method, url=url, **kwargs)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/sessions.py", line 587, in request
resp = self.send(prep, **send_kwargs)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/sessions.py", line 701, in send
r = adapter.send(request, **kwargs)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/adapters.py", line 563, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='lz4.overpass-api.de', port=443): Max retries exceeded with url: /api/interpreter (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1129)')))
2023-11-28T13:04:53.186132-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: inferring labels**********
2023-11-28T13:04:53.196621-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: populating expectations**********
2023-11-28T13:04:53.205540-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: creating confirmed objects **********
Error while creating confirmed objects, timestamp is unchanged
Traceback (most recent call last):
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/userinput/matcher.py", line 96, in create_confirmed_objects
confirmed_tl = create_and_link_timeline(ts, timeline, last_confirmed_place)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/userinput/matcher.py", line 124, in create_and_link_timeline
curr_confirmed_start_place = create_confirmed_entry(ts,
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/userinput/matcher.py", line 226, in create_confirmed_entry
confirmed_object_data = copy.copy(tce["data"])
TypeError: 'NoneType' object is not subscriptable
2023-11-28T13:04:53.226488-08:00**********UUID d1f18eb8-8aed-4bcc-8b34-b6b589c78b1c: creating composite objects **********
For reference, here's the output from running the intake pipeline on another JSON dump from my open survey, one that loads without errors.
(emission) $ ./e-mission-py.bash bin/debug/intake_single_user.py -e nrelop_dev-emulator-program_katieWorking
storage not configured, falling back to sample, default configuration
URL not formatted, defaulting to "Stage_database"
Connecting to database URL localhost
analysis.debug.conf.json not configured, falling back to sample, default configuration
google maps key not configured, falling back to nominatim
nominatim not configured either, place decoding must happen on the client
overpass not configured, falling back to default overleaf.de
transit stops query not configured, falling back to default
[ Zen of Python ]
analysis.trip_model.conf.json not configured, falling back to sample, default configuration
expectations.conf.json not configured, falling back to sample, default configuration
ERROR:root:habitica not configured, game functions not supported
Traceback (most recent call last):
File "/Users/jrischpa/Documents/e-mission-server/emission/net/ext_service/habitica/proxy.py", line 22, in <module>
key_file = open('conf/net/ext_service/habitica.json')
FileNotFoundError: [Errno 2] No such file or directory: 'conf/net/ext_service/habitica.json'
2023-11-28T13:43:48.249504-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: moving to long term**********
Got error None while saving entry AttrDict({'_id': ObjectId('65665f072bec2a55ca5f625c'), 'metadata': {'time_zone': 'America/Los_Angeles', 'plugin': 'none', 'write_ts': 1700876988.3018951, 'platform': 'ios', 'read_ts': 0, 'key': 'statemachine/transition', 'type': 'message'}, 'data': {'currState': 'STATE_ONGOING_TRIP', 'transition': None, 'ts': 1700876988.301783}, 'user_id': UUID('eada86fb-40ec-4cd1-8ca8-7862db147718')}) -> None
New entry count == 4991 and skip_if_no_new_data = False, continuing
2023-11-28T13:43:53.339275-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: updating incoming user inputs**********
2023-11-28T13:43:53.360571-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: filter accuracy if needed**********
2023-11-28T13:43:53.365240-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: segmenting into trips**********
2023-11-28T13:43:55.445780-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: segmenting into sections**********
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/segmentation/section_segmentation_methods/flip_flop_detection.py:59: RuntimeWarning: invalid value encountered in scalar divide
sm.update({"trip_pct": (curr_section_time * 100)/total_trip_time})
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/segmentation/section_segmentation_methods/flip_flop_detection.py:59: RuntimeWarning: invalid value encountered in scalar divide
sm.update({"trip_pct": (curr_section_time * 100)/total_trip_time})
2023-11-28T13:43:56.780488-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: smoothing sections**********
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/speed_outlier_detection.py:27: FutureWarning: The default value of numeric_only in DataFrame.quantile is deprecated. In a future version, it will default to False. Select only valid columns or specify the value of numeric_only to silence this warning.
quartile_vals = df_to_use.quantile([0.25, 0.75]).speed
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/speed_outlier_detection.py:27: FutureWarning: The default value of numeric_only in DataFrame.quantile is deprecated. In a future version, it will default to False. Select only valid columns or specify the value of numeric_only to silence this warning.
quartile_vals = df_to_use.quantile([0.25, 0.75]).speed
Caught error index -1 is out of bounds for axis 0 with size 0 while processing section, skipping...
Traceback (most recent call last):
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/location_smoothing.py", line 195, in get_points_to_filter
filtering_algo.filter(with_speeds_df)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/jump_smoothing.py", line 240, in filter
self.find_segments()
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/jump_smoothing.py", line 127, in find_segments
segmentation_points = self.get_segmentation_points_ios()
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/jump_smoothing.py", line 155, in get_segmentation_points_ios
jump_to = self.with_speeds_df[(self.with_speeds_df.index < jump) & (
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 5320, in __getitem__
return getitem(key)
IndexError: index -1 is out of bounds for axis 0 with size 0
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/speed_outlier_detection.py:27: FutureWarning: The default value of numeric_only in DataFrame.quantile is deprecated. In a future version, it will default to False. Select only valid columns or specify the value of numeric_only to silence this warning.
quartile_vals = df_to_use.quantile([0.25, 0.75]).speed
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/speed_outlier_detection.py:27: FutureWarning: The default value of numeric_only in DataFrame.quantile is deprecated. In a future version, it will default to False. Select only valid columns or specify the value of numeric_only to silence this warning.
quartile_vals = df_to_use.quantile([0.25, 0.75]).speed
Caught error index -1 is out of bounds for axis 0 with size 0 while processing section, skipping...
Traceback (most recent call last):
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/location_smoothing.py", line 195, in get_points_to_filter
filtering_algo.filter(with_speeds_df)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/jump_smoothing.py", line 240, in filter
self.find_segments()
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/jump_smoothing.py", line 127, in find_segments
segmentation_points = self.get_segmentation_points_ios()
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/jump_smoothing.py", line 155, in get_segmentation_points_ios
jump_to = self.with_speeds_df[(self.with_speeds_df.index < jump) & (
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 5320, in __getitem__
return getitem(key)
IndexError: index -1 is out of bounds for axis 0 with size 0
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/speed_outlier_detection.py:27: FutureWarning: The default value of numeric_only in DataFrame.quantile is deprecated. In a future version, it will default to False. Select only valid columns or specify the value of numeric_only to silence this warning.
quartile_vals = df_to_use.quantile([0.25, 0.75]).speed
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/speed_outlier_detection.py:27: FutureWarning: The default value of numeric_only in DataFrame.quantile is deprecated. In a future version, it will default to False. Select only valid columns or specify the value of numeric_only to silence this warning.
quartile_vals = df_to_use.quantile([0.25, 0.75]).speed
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/speed_outlier_detection.py:27: FutureWarning: The default value of numeric_only in DataFrame.quantile is deprecated. In a future version, it will default to False. Select only valid columns or specify the value of numeric_only to silence this warning.
quartile_vals = df_to_use.quantile([0.25, 0.75]).speed
Caught error index -1 is out of bounds for axis 0 with size 0 while processing section, skipping...
Traceback (most recent call last):
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/location_smoothing.py", line 195, in get_points_to_filter
filtering_algo.filter(with_speeds_df)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/jump_smoothing.py", line 240, in filter
self.find_segments()
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/jump_smoothing.py", line 127, in find_segments
segmentation_points = self.get_segmentation_points_ios()
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/jump_smoothing.py", line 155, in get_segmentation_points_ios
jump_to = self.with_speeds_df[(self.with_speeds_df.index < jump) & (
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 5320, in __getitem__
return getitem(key)
IndexError: index -1 is out of bounds for axis 0 with size 0
2023-11-28T13:43:57.438092-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: cleaning and resampling timeline**********
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/speed_outlier_detection.py:27: FutureWarning: The default value of numeric_only in DataFrame.quantile is deprecated. In a future version, it will default to False. Select only valid columns or specify the value of numeric_only to silence this warning.
quartile_vals = df_to_use.quantile([0.25, 0.75]).speed
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/speed_outlier_detection.py:27: FutureWarning: The default value of numeric_only in DataFrame.quantile is deprecated. In a future version, it will default to False. Select only valid columns or specify the value of numeric_only to silence this warning.
quartile_vals = df_to_use.quantile([0.25, 0.75]).speed
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/speed_outlier_detection.py:27: FutureWarning: The default value of numeric_only in DataFrame.quantile is deprecated. In a future version, it will default to False. Select only valid columns or specify the value of numeric_only to silence this warning.
quartile_vals = df_to_use.quantile([0.25, 0.75]).speed
/Users/jrischpa/Documents/e-mission-server/emission/analysis/intake/cleaning/cleaning_methods/speed_outlier_detection.py:27: FutureWarning: The default value of numeric_only in DataFrame.quantile is deprecated. In a future version, it will default to False. Select only valid columns or specify the value of numeric_only to silence this warning.
quartile_vals = df_to_use.quantile([0.25, 0.75]).speed
2023-11-28T13:43:58.963839-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: inferring transportation mode**********
Error while inferring modes, timestamp is unchanged
Traceback (most recent call last):
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/connectionpool.py", line 714, in urlopen
httplib_response = self._make_request(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/connectionpool.py", line 403, in _make_request
self._validate_conn(conn)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/connectionpool.py", line 1053, in _validate_conn
conn.connect()
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/connection.py", line 419, in connect
self.sock = ssl_wrap_socket(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/util/ssl_.py", line 449, in ssl_wrap_socket
ssl_sock = _ssl_wrap_socket_impl(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/ssl.py", line 501, in wrap_socket
return self.sslsocket_class._create(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/ssl.py", line 1074, in _create
self.do_handshake()
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/ssl.py", line 1343, in do_handshake
self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1129)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/adapters.py", line 489, in send
resp = conn.urlopen(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/connectionpool.py", line 798, in urlopen
retries = retries.increment(
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/urllib3/util/retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='lz4.overpass-api.de', port=443): Max retries exceeded with url: /api/interpreter (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1129)')))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 24, in predict_mode
mip.runPredictionPipeline(user_id, time_query)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 55, in runPredictionPipeline
self.predictedProb = self.predictModesStep()
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 74, in predictModesStep
predictedProb.append(get_prediction(i, section_entry))
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 113, in get_prediction
return get_motorized_prediction(i, section_entry)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 140, in get_motorized_prediction
predicted_transit_mode = _get_transit_prediction(i, section_entry)
File "/Users/jrischpa/Documents/e-mission-server/emission/analysis/classification/inference/mode/rule_engine.py", line 175, in _get_transit_prediction
start_transit_stops = enetm.get_stops_near(section_entry.data.start_loc, start_radius)
File "/Users/jrischpa/Documents/e-mission-server/emission/net/ext_service/transit_matching/match_stops.py", line 124, in get_stops_near
stops = get_public_transit_stops(lat - bbox_delta, lon - bbox_delta, lat + bbox_delta, lon + bbox_delta)
File "/Users/jrischpa/Documents/e-mission-server/emission/net/ext_service/transit_matching/match_stops.py", line 81, in get_public_transit_stops
call_return = make_request_and_catch(overpass_query)
File "/Users/jrischpa/Documents/e-mission-server/emission/net/ext_service/transit_matching/match_stops.py", line 30, in make_request_and_catch
response = requests.post(url + "api/interpreter", data=overpass_query)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/api.py", line 115, in post
return request("post", url, data=data, json=json, **kwargs)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/api.py", line 59, in request
return session.request(method=method, url=url, **kwargs)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/sessions.py", line 587, in request
resp = self.send(prep, **send_kwargs)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/sessions.py", line 701, in send
r = adapter.send(request, **kwargs)
File "/Users/jrischpa/miniconda-23.1.0/envs/emission/lib/python3.9/site-packages/requests/adapters.py", line 563, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='lz4.overpass-api.de', port=443): Max retries exceeded with url: /api/interpreter (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1129)')))
2023-11-28T13:43:59.500719-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: inferring labels**********
2023-11-28T13:43:59.551722-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: populating expectations**********
2023-11-28T13:43:59.573080-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: creating confirmed objects **********
While getting section summary, section length = 0. This should never happen, but let's not crash if it does
While getting section summary, section length = 0. This should never happen, but let's not crash if it does
While getting section summary, section length = 0. This should never happen, but let's not crash if it does
While getting section summary, section length = 0. This should never happen, but let's not crash if it does
2023-11-28T13:43:59.978388-08:00**********UUID eada86fb-40ec-4cd1-8ca8-7862db147718: creating composite objects **********
@shankari What do we expect to happen when we dump, and then intake, a day that begins with a trip (meaning the trip's start place was created on the previous day)? Does the dump need to include that place? Because if not, the intake has no start place to link the first trip to.
My current theory is that this is why some days exhibit this issue and some don't. If the day started with a place (meaning you were staying there at midnight), that place gets included in the dump. When the first trip of the day gets processed, it has a start place to attach to.
Or, alternatively, would we expect the cleaned and confirmed objects to be re-created anyway when we re-run the intake pipeline?
If we reset the pipeline (delete all analysis objects) and re-run the intake, cleaned and confirmed objects will be re-created.
I checked the code, and we don't specify the `time_key`, so we get all entries that were written that day. But to answer the original question, we will not return the first place for the day since it would have been created while processing the data for the last trip on the previous day.
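To make that retrieval behavior concrete, here is a minimal sketch of selecting entries "written that day" by their write timestamp. The `Entry` shape and the `entriesWrittenBetween` helper are illustrative assumptions for this sketch, not the actual server code:

```typescript
// Illustrative entry shape: e-mission entries carry a write timestamp in
// their metadata; this interface is a simplified stand-in for the real one.
interface Entry {
  metadata: { write_ts: number };
}

// Hypothetical helper: return every entry whose write_ts falls within
// [startTs, endTs] -- i.e. everything written during that day, regardless
// of which time_key it would otherwise be queried by.
function entriesWrittenBetween(entries: Entry[], startTs: number, endTs: number): Entry[] {
  return entries.filter((e) => e.metadata.write_ts >= startTs && e.metadata.write_ts <= endTs);
}

const entries: Entry[] = [
  { metadata: { write_ts: 100 } },
  { metadata: { write_ts: 200 } },
  { metadata: { write_ts: 300 } },
];
console.log(entriesWrittenBetween(entries, 150, 250).length); // 1
```

Under this model, a place created while processing the previous day's data falls outside the day's `[startTs, endTs]` window and is therefore absent from the dump.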
I don't think @the-bay-kay was resetting the pipeline as part of this process, so there should have been no new timeline data to process and the pipeline shouldn't have done anything to the timeline. Yet it threw an error in the section segmentation code. If there should have been nothing new, why was it even re-segmenting at all?
Also, I don't see any reason to think this is related to the migration. @the-bay-kay can confirm whether the same thing happens on master. If we can confirm that the issue affecting Nov 23 and 25 is not a regression, I suggest we open a separate issue for it and merge this in the meantime.
I'm going to have to look carefully at the data and potentially try to re-run it locally to debug further. So yes, if this happens on master, I am fine with investigating this issue later.
Can @the-bay-kay first confirm that this is not a regression? I am happy to merge fairly soon after that.
Can confirm both! I just double checked, and loading this timeline onto a fresh clone of master has the same issue. If anyone is interested in getting a copy of my opcode or these timelines for testing, LMK and I can send them over!
I am merging this in the interests of unblocking the final set of changes for the rewrite. However, although I see some expanded mocks, I don't actually see any new tests here. I would suggest that you add tests similar to the ones proposed in the discussion starting at
#1097 (comment)
I have also added some notes on polishing, both large and small, for future fixes.
@@ -15,4 +15,4 @@
 <div id="appRoot" class="fill-container" style="width: 100vw; height: 100vh;"></div>
 </body>
 <script src="dist/bundle.js"></script>
-</html>
+</html>
extraneous whitespace
// window['cordova'].file.cacheDirectory is not guaranteed to free up memory,
// so it's good practice to remove the file right after it's used!
const localClearData = function () {
  return new Promise<void>(function (resolve, reject) {
Woo! Good addition to the existing functionality!
const localWriteFile = function (result: ServerResponse<any>) {
  const resultList = result.phone_data;
  return new Promise<void>(function (resolve, reject) {
Can this be combined with the similar function that saves `loggerDB` and `userCacheDB`? Maybe create a new `fileWriterService`?
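A shared writer along those lines might look like the following sketch. `writeTempFile` and the callback-style `saveBlob` stand-in are hypothetical names invented for illustration; a real `fileWriterService` would wrap cordova's file plugin instead of the stub shown here:

```typescript
// Hypothetical stand-in for a callback-style file API (e.g. cordova's file
// plugin); a real service would call the plugin here instead.
function saveBlob(
  name: string,
  contents: string,
  onSuccess: (path: string) => void,
  onError: (err: Error) => void,
) {
  if (name.length === 0) return onError(new Error('empty file name'));
  onSuccess(`/tmp/${name}`);
}

// Sketch of a shared fileWriterService entry point: one promisified writer
// that the loggerDB dump, userCacheDB dump, and JSON-dump callers could reuse.
function writeTempFile(name: string, contents: string): Promise<string> {
  return new Promise<string>((resolve, reject) => {
    saveBlob(name, contents, resolve, reject);
  });
}
```

Each caller would then just do `writeTempFile('dump.json', json).then(...)` and share one error-handling path.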
- end_confirmed_place: ConfirmedPlace;
+ end_confirmed_place: ServerData<ConfirmedPlace>;
dumb typescript question for my edification: what does this change in aid of?
See Issue 994 for context!
In order to fix the 'Download JSON Dump' feature, we need to rework the ControlHelper service. The goal is to separate this service into its own file (e.g., `controlHelper.ts`). In the process of converting this from an Angular service to a TS component, I'll be using luxon instead of moment.js, as discussed in the issue.