scala - spark-submit --packages is not working on my cluster, what could be the reason?
I am trying to run a Spark sample that reads from a PostgreSQL database in a Spark application. I passed the driver on the spark-submit command line with --packages org.postgresql:postgresql:9.3-1101.jdbc41.jar, but I am still getting a ClassNotFoundException. Can anyone please help in solving this issue?
It would be more helpful if you could give a code snippet and explain the steps of how you are building the jar and running it on the cluster. Also, what is the mode of execution (client/cluster)? The possible reasons for a ClassNotFoundException can be specific to how you are making the spark-submit call.
The following code worked for me. You can give it a try.
I created the below Scala object file inside a Scala Maven project in Eclipse. Code:
    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._
    import org.apache.spark.sql._
    import org.apache.spark.sql.SQLContext
    import org.apache.spark.SparkConf

    object SparkPGSqlConnect {
      case class Projects(id: Int, name: String, address: String)

      def main(args: Array[String]) {
        val conf = new SparkConf().setMaster("local[*]").setAppName("PostgreSqlConnection")
        //val conf = new SparkConf().setMaster("yarn-cluster").setAppName("PostgreSqlConnection")
        val sc = new SparkContext(conf)
        val sqlContext = new org.apache.spark.sql.SQLContext(sc)
        import sqlContext.implicits._

        // Spark 1.x JDBC read; sqlContext.load is deprecated in later versions
        val jdbcDF = sqlContext.load("jdbc", Map(
          "url" -> "jdbc:postgresql:tempdb?user=******&password=******",
          "dbtable" -> "employee"))
        jdbcDF.show(false)
      }
    }
After that, I tested the above code locally in Eclipse first to verify that it works fine. Then, I used Maven to build the jar.
And I ran the below commands from my Mac terminal.

In local mode:

    ./spark-submit --class SparkPGSqlConnect --master "local[*]" --driver-class-path postgresql-42.0.0.jre6.jar ~/spgtestclient.jar
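If you would rather let Spark fetch the driver than point at a local jar, --packages can do that instead of --driver-class-path. Note that --packages takes a Maven coordinate (group:artifact:version), not a jar filename, which is a common cause of the ClassNotFoundException in the original question. A sketch, assuming the class and application jar names from the example above and that the 42.0.0.jre6 driver version is published on Maven Central:

```shell
# --packages takes a Maven coordinate, NOT a path ending in .jar.
# Spark resolves the artifact from Maven Central and adds it to both
# the driver and executor classpaths.
./spark-submit \
  --class SparkPGSqlConnect \
  --master "local[*]" \
  --packages org.postgresql:postgresql:42.0.0.jre6 \
  ~/spgtestclient.jar
```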
In cluster mode:

    ./spark-submit --class SparkPGSqlConnect --master yarn --deploy-mode cluster --conf spark.executor.memory=200mb --conf spark.executor.cores=2 --conf "spark.driver.extraClassPath=postgresql-42.0.0.jre6.jar" ~/spgtestcluster.jar
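One caveat in cluster mode: spark.driver.extraClassPath only affects the driver, while the executors also need the JDBC driver on their classpath when they open database connections. A simpler option is --jars, which uploads the local jar and puts it on both driver and executor classpaths. A sketch, assuming the driver jar sits in the current directory:

```shell
# --jars ships the local driver jar to the cluster and adds it to the
# driver and executor classpaths, so no extraClassPath conf is needed.
./spark-submit \
  --class SparkPGSqlConnect \
  --master yarn \
  --deploy-mode cluster \
  --jars postgresql-42.0.0.jre6.jar \
  ~/spgtestcluster.jar
```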
Some of the jars that needed to be added explicitly were: postgresql-42.0.0.jre6.jar (this was needed because I was getting an "Exception in thread "main" java.sql.SQLException: No suitable driver" error when trying spark-submit from the Mac terminal).
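Coming back to the original question: the --packages argument in the question has a .jar suffix tacked onto the Maven coordinate, so Spark cannot resolve the artifact and the class is never put on the classpath. A sketch of the fix; I believe the published version string for that driver on Maven Central is 9.3-1101-jdbc41 (with a hyphen, no .jar), but please verify against the repository:

```shell
# Wrong (from the question): coordinate has a .jar suffix, so resolution fails:
#   --packages org.postgresql:postgresql:9.3-1101.jdbc41.jar
# Right: plain group:artifact:version coordinate
# (<your-app>.jar is a placeholder for your application jar):
./spark-submit \
  --class SparkPGSqlConnect \
  --packages org.postgresql:postgresql:9.3-1101-jdbc41 \
  <your-app>.jar
```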