2016-10-09

My use case is as follows: I need to be able to call Java methods from Python code, and I am running into "'JavaPackage' object is not callable".

From pyspark this seems very simple.

I start pyspark like this:

    ./pyspark --driver-class-path /path/to/app.jar

and from the pyspark shell I do this:

    x = sc._jvm.com.abc.def.App
    x.getMessage()
    u'Hello'
    x.getMessage()
    u'Hello'

This works fine.

However, when working with Spark Job Server:

I am using the WordCountSparkJob.py example that ships with it.

    from sparkjobserver.api import SparkJob, build_problems
    from py4j.java_gateway import JavaGateway, java_import

    class WordCountSparkJob(SparkJob):

        def validate(self, context, runtime, config):
            if config.get('input.strings', None):
                return config.get('input.strings')
            else:
                return build_problems(['config input.strings not found'])

        def run_job(self, context, runtime, data):
            x = context._jvm.com.abc.def.App
            return x.getMessage()

My python.conf looks like this:

    spark {
      jobserver {
        jobdao = spark.jobserver.io.JobSqlDAO
      }

      context-settings {
        python {
          paths = [
            "/home/xxx/SPARK/spark-1.6.0-bin-hadoop2.6/python",
            "/home/xxx/.local/lib/python2.7/site-packages/pyhocon",
            "/home/xxx/SPARK/spark-1.6.0-bin-hadoop2.6/python/lib/pyspark.zip",
            "/home/xxx/SPARK/spark-1.6.0-bin-hadoop2.6/python/lib/py4j-0.9-src.zip",
            "/home/xxx/gitrepos/spark-jobserver/job-server-python/src/python/dist/spark_jobserver_python-NO_ENV-py2.7.egg"
          ]
        }
        dependent-jar-uris = ["file:///path/to/app.jar"]
      }
      home = /home/path/to/spark
    }
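As a side note, every entry in paths has to be a real path with no stray whitespace; a copy/paste artifact in a single entry is enough to keep an egg or zip off sys.path. A minimal sketch (my own helper, not part of spark-jobserver) to sanity-check the configured entries:

```python
import os

def find_bad_paths(paths):
    """Return entries that look suspicious: containing whitespace
    (often a copy/paste artifact) or not present on disk."""
    return [p for p in paths
            if p != p.strip() or " " in p or not os.path.exists(p)]

# Example with a deliberately broken entry:
print(find_bad_paths([os.getcwd(), "/no/such/dir",
                      "/home/xxx/src/python /dist/app.egg"]))
```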

I get the following error:

[2016-10-08 23:03:46,214] ERROR jobserver.python.PythonJob []   [akka://JobServer/user/context-supervisor/py-context] - From Python: Error while calling 'run_job'TypeError("'JavaPackage' object is not callable",) 
[2016-10-08 23:03:46,226] ERROR jobserver.python.PythonJob [] [akka://JobServer/user/context-supervisor/py-context] - Python job failed with error code 4 
[2016-10-08 23:03:46,228] ERROR .jobserver.JobManagerActor []  [akka://JobServer/user/context-supervisor/py-context] - Got Throwable 
    java.lang.Exception: Python job failed with error code 4 
    at spark.jobserver.python.PythonJob$$anonfun$1.apply(PythonJob.scala:85) 
    at scala.util.Try$.apply(Try.scala:161) 
    at spark.jobserver.python.PythonJob.runJob(PythonJob.scala:62) 
    at spark.jobserver.python.PythonJob.runJob(PythonJob.scala:13) 
    at  spark.jobserver.JobManagerActor$$anonfun$getJobFuture$4.apply(JobManagerActor.scala:288) 
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) 
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) 
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
at java.lang.Thread.run(Thread.java:745) 
[2016-10-08 23:03:46,232] ERROR .jobserver.JobManagerActor [] [akka://JobServer/user/context-supervisor/py-context] - Exception from job 942727f0-dd81-445d-bc64-bd18880eb4bc: 
java.lang.Exception: Python job failed with error code 4 
at spark.jobserver.python.PythonJob$$anonfun$1.apply(PythonJob.scala:85) 
at scala.util.Try$.apply(Try.scala:161) 
at spark.jobserver.python.PythonJob.runJob(PythonJob.scala:62) 
at spark.jobserver.python.PythonJob.runJob(PythonJob.scala:13) 
at spark.jobserver.JobManagerActor$$anonfun$getJobFuture$4.apply(JobManagerActor.scala:288) 
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) 
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) 
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
at java.lang.Thread.run(Thread.java:745) 
[2016-10-08 23:03:46,232] INFO k.jobserver.JobStatusActor [] [akka://JobServer/user/context-supervisor/py-context/$a] - Job 942727f0-dd81-445d-bc64-bd18880eb4bc finished with an error 
[2016-10-08 23:03:46,233] INFO r$RemoteDeadLetterActorRef [] [akka://JobServer/deadLetters] - Message [spark.jobserver.CommonMessages$JobErroredOut] from Actor[akka://JobServer/user/context-supervisor/py-context/$a#1919442151] to Actor[akka://JobServer/deadLetters] was not delivered. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'. 

In the python.conf file I have the app.jar as an entry in dependent-jar-uris. Am I missing something here?


- From Python: Error while calling 'run_job' TypeError("'JavaPackage' object is not callable",) – codemugal


I am getting the same error. Would appreciate any hints. – Alex

Answer


The error "'JavaPackage' object is not callable" most likely means that PySpark cannot see your jar, or the class inside it.
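One way to confirm this from inside the job: when py4j cannot resolve a dotted name like com.abc.def.App to a real class on the JVM classpath, it hands back a JavaPackage placeholder instead of a JavaClass, and calling the placeholder is exactly what raises this TypeError. A sketch of an explicit check (my own helper, reusing the class name from the question for illustration):

```python
def jvm_class_visible(obj):
    """py4j returns a JavaPackage placeholder for any dotted name it
    cannot resolve to a real Java class; calling that placeholder raises
    TypeError("'JavaPackage' object is not callable"). This reports
    whether the name actually resolved to something callable."""
    return type(obj).__name__ != "JavaPackage"

# Hypothetical usage inside run_job, mirroring the question's code:
# x = context._jvm.com.abc.def.App
# if not jvm_class_visible(x):
#     raise RuntimeError("com.abc.def.App not found on the JVM classpath; "
#                        "check dependent-jar-uris / driver classpath")
# return x.getMessage()
```

This turns the opaque TypeError into a message that points directly at the classpath, which is usually where the problem is with job server deployments.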