Python decorators are nice and allow for some beautiful code.
PySpark programmers often write DataFrame transformations that change the schema of a DataFrame, for example by:
- adding columns
- removing columns
- making other changes to the schema
This pull request proposes adding some decorators to the quinn public interface. Decorators could provide PySpark programmers with a really nice programming experience.
What would be the ideal decorator end state for PySpark programmers? Or are decorators of limited use for common PySpark programming patterns?
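As a starting point for discussion, here is a minimal sketch of what one such decorator could look like. The decorator name (`adds_columns`) and its behavior are hypothetical, not an existing quinn API; it simply checks that a transformation actually added the columns it claims to add.

```python
import functools

from pyspark.sql import DataFrame
from pyspark.sql import functions as F


def adds_columns(*col_names):
    """Hypothetical decorator: assert that the wrapped transformation adds `col_names`."""

    def decorator(func):
        @functools.wraps(func)
        def wrapper(df: DataFrame, *args, **kwargs) -> DataFrame:
            result = func(df, *args, **kwargs)
            # Fail loudly if the transformation did not add the promised columns
            missing = set(col_names) - set(result.columns)
            if missing:
                raise ValueError(f"{func.__name__} was expected to add columns: {missing}")
            return result

        return wrapper

    return decorator


@adds_columns("greeting")
def with_greeting(df: DataFrame) -> DataFrame:
    return df.withColumn("greeting", F.lit("hello"))
```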
I like the idea of using decorators to run validations on a returned DataFrame. It allows programmers to clearly define expected output properties without having to modify their existing functions. Transformations like adding/removing columns or changing the schema tend to be part of the function logic and I don't think decorators make as much sense for that use case. Decorators could allow PySpark programmers to abstract validations away from core transformation logic.
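To illustrate that idea, here is a minimal sketch of a validation decorator, assuming a hypothetical `validates_output` helper (not part of quinn today). It runs an arbitrary validation callable against the DataFrame a transformation returns, keeping the validation separate from the transformation logic.

```python
import functools

from pyspark.sql import DataFrame


def validates_output(validation):
    """Hypothetical decorator: apply `validation` to the returned DataFrame."""

    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs) -> DataFrame:
            result = func(*args, **kwargs)
            validation(result)  # raises if the output does not meet expectations
            return result

        return wrapper

    return decorator


def has_no_nulls_in(col_name):
    """Example validation: fail if `col_name` contains any nulls."""

    def check(df: DataFrame) -> None:
        if df.filter(df[col_name].isNull()).count() > 0:
            raise ValueError(f"Column {col_name} contains null values")

    return check


@validates_output(has_no_nulls_in("user_id"))
def clean_users(df: DataFrame) -> DataFrame:
    return df.dropna(subset=["user_id"])
```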