Spark 1.5.0 spark.app.id warning


I updated my CDH cluster to use Spark 1.5.0. When I submit a Spark application, the system shows a warning about spark.app.id:

Using default name DAGScheduler for source because spark.app.id is not set.

I searched for spark.app.id but could not find any documentation on it. From the links I did read, I thought it was used for REST API calls.

I didn't see this warning in Spark 1.4. Can someone explain it and show me how to set it?

It's not used for the REST API; rather, it's for monitoring purposes, e.g. when you want to check the YARN logs for your application:

yarn logs -applicationId <spark.app.id>

It's true that this specific property is still not documented. I think it was added to standardize application identification within the Hadoop ecosystem.

I suggest setting 'spark.app.id' in your app:

conf.set("spark.app.id", <app-id>) // assuming you already have a SparkConf defined, of course

Nevertheless, it remains only a warning and won't affect your application itself.
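As a fuller sketch, assuming a standard Spark application written in Scala, you can set the property on the SparkConf before creating the SparkContext. The app name, master, and the id format below are placeholders, not anything Spark mandates:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MyApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("my-app")        // placeholder application name
      .setMaster("yarn-client")    // placeholder master for a CDH/YARN cluster
      // Setting spark.app.id explicitly silences the warning; the id format is your choice
      .set("spark.app.id", "my-app-" + System.currentTimeMillis())

    val sc = new SparkContext(conf)
    // ... your job ...
    sc.stop()
  }
}
```

Alternatively, you can set it at submit time without touching the code, e.g. `spark-submit --conf spark.app.id=<app-id> ...`.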
