Tons of errors with "FileNotFoundError: [Errno 2] No such file or directory" #49
Some recent updates broke the notebooks. I'll look at them shortly. In the meantime, you can export to your external node (laptop) and then import into the destination workspace.
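For reference, a minimal sketch of that laptop-based workaround, assuming mlflow-export-import is pip-installed on the laptop. The export call mirrors the one quoted later in this thread; the module paths and the `import_experiments` counterpart are assumptions (they have moved between versions of the repo), so check the README for your installed version.

```python
import os
# Assumed module paths; consult the repo README for your version.
from mlflow_export_import.bulk.export_experiments import export_experiments
from mlflow_export_import.bulk.import_experiments import import_experiments

# Step 1: point MLflow at the *source* workspace via the standard env vars.
os.environ["MLFLOW_TRACKING_URI"] = "databricks"
os.environ["DATABRICKS_HOST"] = "https://source-workspace.cloud.databricks.com"  # hypothetical host
os.environ["DATABRICKS_TOKEN"] = "<source-pat>"
export_experiments("all", "/tmp/mlflow-export", True, notebook_formats="DBC", use_threads=True)

# Step 2: re-point at the *destination* workspace and import from the same
# local directory. In practice, run the two steps as separate processes so
# the credentials are re-read cleanly.
os.environ["DATABRICKS_HOST"] = "https://dest-workspace.cloud.databricks.com"  # hypothetical host
os.environ["DATABRICKS_TOKEN"] = "<dest-pat>"
import_experiments("/tmp/mlflow-export")
```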
Thanks Andre!
Checked in the fix. Refresh your notebooks from GitHub too.
@amesar Thanks for the quick fix! A few follow-ups:
There are lots of errors while creating models, and also this final error:
Good catch. I'll look into it.
…dModelsIterator to account for paging in list_* methods, noted in Issue #49
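To illustrate the paging issue that commit addresses: MLflow's list/search APIs return at most one page of results, so any "fetch everything" helper has to follow page tokens. A hedged sketch of the pattern (not the repo's actual iterator), using MlflowClient.search_registered_models:

```python
from mlflow.tracking import MlflowClient

def iter_registered_models(client: MlflowClient, max_results: int = 100):
    """Yield every registered model, following page tokens until exhausted."""
    page = client.search_registered_models(max_results=max_results)
    while True:
        yield from page          # PagedList is iterable like a list
        if not page.token:       # falsy token means no further pages
            break
        page = client.search_registered_models(
            max_results=max_results, page_token=page.token
        )

# Usage:
# for model in iter_registered_models(MlflowClient()):
#     print(model.name)
```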
I checked in a fix under issue #51. Try it again.
Thanks for the quick fix! Do you know what this error is about, please?
Not quite clear what the issue is. To access Databricks MLflow externally, see https://github.com/amesar/mlflow-resources/blob/master/MLflow_FAQ.md#how-do-i-access-databricks-mlflow-from-outside-databricks
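For context, a minimal sketch of what that FAQ describes: MLflow resolves a "databricks" tracking URI through the standard Databricks credentials (environment variables or a ~/.databrickscfg profile). The host and experiment name below are placeholders.

```python
import os
import mlflow

# Option 1: environment variables plus the generic "databricks" URI.
os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"  # placeholder
os.environ["DATABRICKS_TOKEN"] = "<personal-access-token>"
mlflow.set_tracking_uri("databricks")

# Option 2: a named profile from ~/.databrickscfg.
# mlflow.set_tracking_uri("databricks://my-profile")

# Subsequent calls now hit the remote workspace, e.g.:
print(mlflow.get_experiment_by_name("/Shared/some-experiment"))  # placeholder name
```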
I ran everything inside the Databricks cluster and notebook. Not sure why it is trying to connect to "https://oregon.cloud.databricks.com/api/2.0/workspace/mkdirs".
https://docs.databricks.com/dev-tools/api/latest/workspace.html#mkdirs is a legitimate endpoint. The tool needs to create the target folder if it doesn't exist.
Yes, the endpoint is legitimate, but shouldn't the request go to our Databricks host instead of https://oregon.cloud.databricks.com?
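One hedged way to debug this, not specific to mlflow-export-import: the wrong host usually comes from whichever credentials the REST client resolves first, so dumping the usual sources (the DATABRICKS_HOST env var, then the profiles in ~/.databrickscfg) can show where oregon.cloud.databricks.com was picked up.

```python
import configparser
import os
from pathlib import Path

# Check the environment variable first.
print("DATABRICKS_HOST =", os.environ.get("DATABRICKS_HOST"))

# Then check every profile in ~/.databrickscfg, if the file exists.
cfg_path = Path.home() / ".databrickscfg"
if cfg_path.exists():
    cfg = configparser.ConfigParser()
    cfg.read(cfg_path)
    print("[DEFAULT] host =", cfg.defaults().get("host"))
    for section in cfg.sections():
        print(f"[{section}] host =", cfg[section].get("host"))
```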
Hi @amesar, why is this closed, please?
Best to send me a zip file of your exported directory: andre at databricks.
I installed the tool in a Databricks notebook with

```
%pip install git+https://github.com/amesar/mlflow-export-import/#egg=mlflow-export-import
```

and ran:

```python
export_experiments("all", "dbfs:/mnt/databricks-common-assets/ml_platform/e2_migration/mlflow_export/", True, notebook_formats="DBC", use_threads=True)
```

However, it ran into many "No such file or directory" errors, which is rather odd since fs.mkdirs(notebook_dir) was called beforehand. Is there some problem with DBFS or S3 mounts that could have caused this?
```
Exporting run 111: f2754468f148478a8e8553a3884f32ec
Wrote 3241 bytes.
Wrote 8605 bytes.
ERROR: run_id: f2754468f148478a8e8553a3884f32ec Exception: [Errno 2] No such file or directory: 'dbfs:/mnt/databricks-common-assets/ml_platform/e2_migration/mlflow_export/1036469/f2754468f148478a8e8553a3884f32ec/artifacts/notebooks/manifest.json'
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-8ebc49b2-997d-4303-aa17-8b98b91cded1/lib/python3.8/site-packages/mlflow_export_import/run/export_run.py", line 76, in export_run
    self.export_notebook(output_dir, notebook, run.data.tags, fs)
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-8ebc49b2-997d-4303-aa17-8b98b91cded1/lib/python3.8/site-packages/mlflow_export_import/run/export_run.py", line 96, in export_notebook
    with open(path, "w") as f:
FileNotFoundError: [Errno 2] No such file or directory: 'dbfs:/mnt/databricks-common-assets/ml_platform/e2_migration/mlflow_export/1036469/f2754468f148478a8e8553a3884f32ec/artifacts/notebooks/manifest.json'
```
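As a hedged illustration of why the traceback fails (this is the failure mode, not necessarily the repo's eventual fix): Python's built-in open() treats "dbfs:/..." as a literal local path, which doesn't exist on the driver's filesystem. On a Databricks cluster, DBFS is exposed locally through the /dbfs FUSE mount, so rewriting the scheme lets plain file I/O work. The path below is a made-up example.

```python
import os

def to_local_dbfs_path(path: str) -> str:
    """Map a dbfs:/ URI onto the /dbfs FUSE mount available on Databricks clusters."""
    return "/dbfs/" + path[len("dbfs:/"):] if path.startswith("dbfs:/") else path

target = to_local_dbfs_path("dbfs:/tmp/mlflow_export_demo/manifest.json")  # hypothetical path
os.makedirs(os.path.dirname(target), exist_ok=True)  # open() will not create parent dirs
with open(target, "w") as f:
    f.write("{}")
```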