Guys,
Here's what I did -
- Forked the package and made adjustments in HBaseRelation.scala so that, when used with Google Bigtable, the class does not call namespace methods such as getNamespaceDescriptor, createNamespace, etc. Check this
- Compiled my version of the package without the big fat jar; it's now barely 600 KB
- Bundled this package into my code, where I'm using SHC for DataFrame write operations in HBase format
- If I run the job I see issues like -

```
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/client/TableDescriptor
  at org.apache.spark.sql.execution.datasources.hbase.DefaultSource.createRelation(HBaseRelation.scala:61)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.client.TableDescriptor
```
If I compile SHC locally and want to use it in my Spark job without obstacles like the above, what should I do? Where am I making a mistake?
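For context, a `NoClassDefFoundError` for `org.apache.hadoop.hbase.client.TableDescriptor` usually means the hbase-client jar is not on the job's runtime classpath (a likely side effect of building the slim 600 KB jar with HBase marked as provided). A small probe like the following (`ClasspathProbe` is a hypothetical helper written for this illustration, not part of SHC) can confirm from inside the driver which classes are actually visible:

```java
// Minimal sketch: probe the runtime classpath for the class the stack
// trace reports as missing. Class.forName throws ClassNotFoundException
// when the containing jar (here, hbase-client) is absent.
public class ClasspathProbe {
    static boolean isPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Sanity check: a JDK class is always present.
        System.out.println("java.lang.String present: "
            + isPresent("java.lang.String"));
        // On a JVM without hbase-client on the classpath this prints false,
        // matching the NoClassDefFoundError seen in the Spark job.
        System.out.println("TableDescriptor present: "
            + isPresent("org.apache.hadoop.hbase.client.TableDescriptor"));
    }
}
```

If the probe reports the class missing inside your job, supplying the HBase client jars alongside your slim SHC build (for example via spark-submit's `--jars` option) is the usual fix; which exact jars are needed depends on your HBase/Bigtable client versions.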