How can I connect to a Netezza database from a Spark SQLContext?


I have a Spark instance, and I'm trying to connect to an existing Netezza data warehouse appliance to retrieve data.

I'm using Spark SQL's SQLContext, and according to the Spark SQL programming guide this should be achievable with the read method. I've determined that I need to provide the JDBC driver using the --jars flag, rather than SPARK_CLASSPATH as in the documentation. The operation looks like:

    # pyspark
    df = sqlContext.read.format('jdbc').options( ... ).load()

    // spark-shell
    val df = sqlContext.read.format("jdbc").options( ... ).load()
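For reference, I launch the shells with the driver on the classpath roughly like this (the jar name and path are just placeholders for wherever the Netezza JDBC driver jar lives):

    pyspark --jars nzjdbc.jar
    spark-shell --jars nzjdbc.jar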

I can find documentation on connecting to Netezza using JDBC, but not on how to correctly pass the username and password. What 'options' do I need to pass here?

In pyspark:

    df = sqlContext.read.format('jdbc').options(
        url='jdbc:netezza://server1:5480/database',
        user='kirk',
        password='****',
        dbtable='schema.mytable',
        driver='org.netezza.driver').load()

And in spark-shell:

    val df = sqlContext.read.format("jdbc").options(Map(
        "url" -> "jdbc:netezza://server1:5480/database",
        "user" -> "kirk",
        "password" -> "****",
        "dbtable" -> "schema.mytable",
        "driver" -> "org.netezza.driver")).load()

Note that Netezza likes things in caps. I don't know if that's necessary, but it doesn't hurt.
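If you'd rather not mix the credentials into options, roughly the same thing can be written with the jdbc convenience method in pyspark. This is just a sketch, assuming a Spark version whose read.jdbc accepts 'user', 'password', and 'driver' keys in its properties dict:

    # properties holds the JDBC connection arguments; connection details as above
    df = sqlContext.read.jdbc(
        url='jdbc:netezza://server1:5480/database',
        table='schema.mytable',
        properties={'user': 'kirk',
                    'password': '****',
                    'driver': 'org.netezza.driver'})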

