Python files created locally are converted to notebooks #195
Comments
Thanks for reporting this issue - seems like a bug that we should fix. Would the following work for you:
Hello @gbrueckl. What you propose would be ideal. It would allow users to create Python workspace files locally and synchronise them with Databricks.
OK, I think I have a fix ready, but just to be sure - this only applies to new items created locally first. Just published!
Thanks for the super quick fix! I think it's fine if it only applies to new items.
If your issue is fixed now, please close this ticket. Thanks!
Hey! Sorry to reopen this, but there is still a small issue. While new workspace Python files are correctly stored as files, they are not passed with their extension in the file name. Would it be possible, for workspace Python files, to keep the extension in the name?
Just published!
Great, thanks!
Hello! Really like this package, thanks!
When creating a Python workspace file locally in VS Code (with a ".py" extension), Databricks Power Tools sees it as a [PYTHON] notebook (with a .py extension) instead of as a [FILE].
When synchronising with Databricks, it also converts this Python file into a Python notebook, automatically adding the
# Databricks notebook source
header. Is there a way to keep ".py" files as files and not convert them to notebooks?
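The conversion described above hinges on the `# Databricks notebook source` marker that Databricks prepends to notebooks exported in source format. A minimal sketch (hypothetical helper, not the extension's actual code) of how a sync tool could classify a local .py file by that header rather than by extension alone:

```python
from pathlib import Path

# Marker that Databricks prepends to notebooks exported as source files.
NOTEBOOK_HEADER = "# Databricks notebook source"

def is_databricks_notebook(path: Path) -> bool:
    """Treat a .py file as a notebook only if its first line is the magic header."""
    try:
        first_line = path.read_text(encoding="utf-8").splitlines()[0]
    except (OSError, IndexError):
        # Unreadable or empty files are plain workspace files.
        return False
    return first_line.strip() == NOTEBOOK_HEADER
```

With a check like this, a freshly created local .py file without the header would be published as a [FILE], while a genuine notebook export would still round-trip as a [PYTHON] notebook.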
Config:
exportFormats = {"Scala": ".scala","Python": ".py","SQL": ".sql","R": ".r"}
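The exportFormats mapping above pairs each workspace language with a local file extension. A small sketch (hypothetical helper names, assuming the mapping shown in the config) of how that table could be used in both directions - picking an extension when downloading, and recovering the language when publishing a local file back:

```python
from typing import Optional

# Mirrors the exportFormats config shown above (assumed mapping).
EXPORT_FORMATS = {"Scala": ".scala", "Python": ".py", "SQL": ".sql", "R": ".r"}

def extension_for(language: str) -> str:
    """Extension to use for a downloaded workspace item; empty if unmapped."""
    return EXPORT_FORMATS.get(language, "")

def language_for(extension: str) -> Optional[str]:
    """Reverse lookup used when publishing a local file back to the workspace."""
    for lang, ext in EXPORT_FORMATS.items():
        if ext == extension.lower():
            return lang
    return None
```

Keeping the extension in the workspace file name (as requested above) would make this reverse lookup unambiguous on the next sync.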