As noted on the repository, Spark does not support JSON types; for this reason, the BQ connector converts the JSON record into a String. Spark 3.4.0 introduced a new method, `to`, which allows us to cast the DataFrame into a target schema. However, the casting fails and forces us to use the `fromJson` function, which would essentially mean having to store a `StructType` separately for each column that needs to be parsed — in other words, hardcoding.
Is there any other way to do this? Can we somehow determine whether a column is of the JSON type in BigQuery (the metadata does not seem to be available on the read DataFrame) and, if so, retrieve only the relevant schema portion for that column and convert it into a `StructType`?
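One possible workaround, sketched below under assumptions: since the read DataFrame lacks the metadata, fetch the table schema separately with the `google-cloud-bigquery` client, collect the names of JSON-typed fields, and then infer each column's `StructType` from a sample value with `schema_of_json` instead of hardcoding it. The table id and column contents here are placeholders, and this infers the schema from a single sampled row, so it only works if that row is representative.

```python
def json_column_names(schema):
    """Return the names of fields whose BigQuery type is JSON.

    `schema` is the list of SchemaField objects returned by
    google.cloud.bigquery.Client().get_table(...).schema.
    """
    return [f.name for f in schema if f.field_type == "JSON"]


def parse_json_columns(df, schema):
    """Replace each JSON column (read as STRING by the connector) with a
    parsed struct, inferring the StructType from one sample value."""
    from pyspark.sql import functions as F

    for name in json_column_names(schema):
        # Take one non-null value and let Spark infer its schema,
        # avoiding a hardcoded StructType per column.
        sample = df.filter(F.col(name).isNotNull()).select(name).first()[0]
        df = df.withColumn(
            name, F.from_json(F.col(name), F.schema_of_json(F.lit(sample)))
        )
    return df


# Usage sketch (placeholder table id):
#   from google.cloud import bigquery
#   table = bigquery.Client().get_table("project.dataset.table")
#   df = parse_json_columns(df, table.schema)
```

The caveat is the extra read per column to fetch the sample, and that `schema_of_json` only sees one value, so optional keys absent from the sample are lost; it trades correctness guarantees for not maintaining the schemas by hand.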