[SPARK-22487][SQL][followup] still keep spark.sql.hive.version #19719
Conversation
Add back the test cases?
reverted
Test build #83706 has finished for PR 19719 at commit
Test build #83708 has finished for PR 19719 at commit
Test build #83705 has finished for PR 19719 at commit
    conf += resultSet.getString(1) -> resultSet.getString(2)
  }

  assert(conf.get("spark.sql.hive.version") === Some("1.2.1"))
If we do not set it explicitly, this will not be returned.
A default value will be returned, won't it?
Nope, I think. Try it.
In Hive, "SET" returns all changed properties while "SET -v" returns all properties. In Spark, "SET" queries only the key-value pairs that are explicitly set in the SQLConf of the sparkSession.
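The distinction described above can be sketched with a plain Scala map. This is a toy model, not Spark's actual SQLConf; all names and default values here are illustrative only:

```scala
// Toy model of the "SET" vs "SET -v" semantics described above.
// Not Spark's real SQLConf; names and defaults are made up for illustration.
object SetSemanticsSketch {
  // Registered properties with their default values.
  val defaults: Map[String, String] = Map(
    "spark.sql.hive.version"       -> "1.2.1",
    "spark.sql.shuffle.partitions" -> "200")

  // Entries the user has explicitly set in this session.
  private val userSet = scala.collection.mutable.Map[String, String]()

  def set(key: String, value: String): Unit = userSet(key) = value

  // "SET": only the key-value pairs explicitly set in the session conf.
  def setCommand(): Map[String, String] = userSet.toMap

  // "SET -v": every registered property with its effective value,
  // explicit settings winning over defaults.
  def setVCommand(): Map[String, String] = defaults ++ userSet
}
```

In this toy model, after `set("spark.sql.shuffle.partitions", "10")`, `setCommand()` returns only the changed entry, while `setVCommand()` still lists spark.sql.hive.version with its default.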
@@ -66,6 +66,12 @@ private[spark] object HiveUtils extends Logging {
    .stringConf
    .createWithDefault(builtinHiveVersion)

  // A deprecated config which is only used to provide a default value, in case some existing
  // applications depend on this config, e.g. Spark SQL ODBC driver.
  val HIVE_EXECUTION_VERSION = buildConf("spark.sql.hive.version")
a doc may be needed
This config is kind of a fake one; I think comments are enough for it.
or maybe a more meaningful name
@@ -66,6 +66,12 @@ private[spark] object HiveUtils extends Logging {
    .stringConf
    .createWithDefault(builtinHiveVersion)

  // A deprecated config which is only used to provide a default value, in case some existing
  // applications depend on this config, e.g. Spark SQL ODBC driver.
I guess the JDBC driver does not depend on it. It used to only return 1.2.1; I can't imagine that being a meaningful value, and I think the JDBC server may work fine with other Hive versions too.
It's ODBC. Actually it's something out of Spark's control, so we must be careful and avoid behavior changes: conf.get("spark.sql.hive.version") should still return 1.2.1 instead of null.
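The default-fallback behavior being argued for here can be sketched as follows. These are assumed names, not Spark's real ConfigEntry/RuntimeConfig API:

```scala
// Sketch of why registering spark.sql.hive.version with a default keeps
// external clients (e.g. an ODBC driver) working: reads fall back to the
// registered default instead of returning null. Names are illustrative.
final case class ConfEntry(key: String, default: String)

class SessionConfSketch(registered: Seq[ConfEntry]) {
  private val defaults = registered.map(e => e.key -> e.default).toMap
  private val settings = scala.collection.mutable.Map[String, String]()

  def set(key: String, value: String): Unit = settings(key) = value

  // Explicit setting wins; otherwise the registered default is returned;
  // null comes back only for keys that were never registered at all.
  def get(key: String): String =
    settings.getOrElse(key, defaults.getOrElse(key, null))
}
```

With this sketch, `new SessionConfSketch(Seq(ConfEntry("spark.sql.hive.version", "1.2.1"))).get("spark.sql.hive.version")` yields "1.2.1" even though the user never set it.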
If the Thrift Server only works with 1.2.1, I guess we don't need an isolated classloader for its Hive client like SparkSQLCLIDriver.
Making it an AlternativeConfig may be better.
Test build #83712 has finished for PR 19719 at commit
@@ -55,6 +55,7 @@ private[hive] object SparkSQLEnv extends Logging {
      metadataHive.setOut(new PrintStream(System.out, true, "UTF-8"))
      metadataHive.setInfo(new PrintStream(System.err, true, "UTF-8"))
      metadataHive.setError(new PrintStream(System.err, true, "UTF-8"))
      sparkSession.conf.set(HiveUtils.HIVE_EXECUTION_VERSION, HiveUtils.builtinHiveVersion)
HiveUtils.builtinHiveVersion => HiveUtils.HIVE_EXECUTION_VERSION?
    }
  }

  test("Checks Hive version via SET") {
it might be a session-level conf
Test build #83713 has finished for PR 19719 at commit
Test build #83723 has finished for PR 19719 at commit
Test build #83732 has finished for PR 19719 at commit
Test build #83733 has finished for PR 19719 at commit
@@ -66,6 +66,14 @@ private[spark] object HiveUtils extends Logging {
    .stringConf
    .createWithDefault(builtinHiveVersion)

  // A fake config which is only here for backward compatibility reasons. This config has no affect
affect -> effect
@@ -66,6 +66,14 @@ private[spark] object HiveUtils extends Logging {
    .stringConf
    .createWithDefault(builtinHiveVersion)

  // A fake config which is only here for backward compatibility reasons. This config has no affect
  // to Spark, just for reporting the builtin Hive version of Spark to existing applications that
  // already reply on this config.
reply -> rely
Do something like the other deprecated conf
LGTM except a few comments
LGTM
Looking at
Test build #83748 has finished for PR 19719 at commit
Thanks! Merged to master.
What changes were proposed in this pull request?
A followup of #19712; adds back spark.sql.hive.version, so that if users try to read this config, they can still get a default value instead of null.

How was this patch tested?
N/A