
Apache Hive jobs fail - Container failed, exitCode=-1000. Could not obtain block

I have just installed Hadoop using the Hortonworks Data Platform. I have three machines running CentOS 7. One of the three runs, among other services, the Ambari server, the NameNode, and HiveServer2. The other two run only the clients of these services.

Every time I try to run Hive queries that require MapReduce jobs, they fail. In every job, all TaskAttempts fail with a BlockMissingException, and the diagnostics are set to "[Container failed, exitCode=-1000. Could not obtain block ...".

For example:

hive> select count(*) from pgc; 
Query ID = root_20160510184153_51d881b2-fbb5-47d3-8a06-9d62f51950e1 
Total jobs = 1 
Launching Job 1 out of 1 


Status: Running (Executing on YARN cluster with App id application_1462904248344_0007) 

-------------------------------------------------------------------------------- 
     VERTICES  STATUS TOTAL COMPLETED RUNNING PENDING FAILED KILLED 
-------------------------------------------------------------------------------- 
Map 1     FAILED  9   0  0  9  14  0 
Reducer 2    KILLED  1   0  0  1  0  0 
-------------------------------------------------------------------------------- 
VERTICES: 00/02 [>>--------------------------] 0% ELAPSED TIME: 80.05 s 
-------------------------------------------------------------------------------- 
Status: Failed 
Vertex failed, vertexName=Map 1, vertexId=vertex_1462904248344_0007_1_00, diagnostics=[Task failed, taskId=task_1462904248344_0007_1_00_000001, diagnostics=[TaskAttempt 0 failed, info=[Container container_e49_1462904248344_0007_02_000003 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
     at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945) 
     at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604) 
     at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844) 
     at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896) 
     at java.io.DataInputStream.read(DataInputStream.java:100) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119) 
     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366) 
     at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267) 
     at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:422) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:745) 

]], TaskAttempt 1 failed, info=[Container container_e49_1462904248344_0007_02_000009 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
     at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945) 
     at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604) 
     at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844) 
     at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896) 
     at java.io.DataInputStream.read(DataInputStream.java:100) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119) 
     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366) 
     at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267) 
     at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:422) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:745) 

]], TaskAttempt 2 failed, info=[Container container_e49_1462904248344_0007_02_000013 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar 
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar 
     at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945) 
     at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604) 
     at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844) 
     at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896) 
     at java.io.DataInputStream.read(DataInputStream.java:100) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119) 
     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366) 
     at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267) 
     at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:422) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:745) 

]], TaskAttempt 3 failed, info=[Container container_e49_1462904248344_0007_02_000018 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
     at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945) 
     at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604) 
     at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844) 
     at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896) 
     at java.io.DataInputStream.read(DataInputStream.java:100) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119) 
     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366) 
     at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267) 
     at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:422) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:745) 

]]], Task failed, taskId=task_1462904248344_0007_1_00_000003, diagnostics=[TaskAttempt 0 failed, info=[Container container_e49_1462904248344_0007_02_000005 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
     at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945) 
     at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604) 
     at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844) 
     at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896) 
     at java.io.DataInputStream.read(DataInputStream.java:100) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119) 
     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366) 
     at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267) 
     at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:422) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:745) 

]], TaskAttempt 1 failed, info=[Container container_e49_1462904248344_0007_02_000008 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
     at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945) 
     at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604) 
     at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844) 
     at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896) 
     at java.io.DataInputStream.read(DataInputStream.java:100) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119) 
     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366) 
     at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267) 
     at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:422) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:745) 

]], TaskAttempt 2 failed, info=[Container container_e49_1462904248344_0007_02_000014 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar 
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073759911_19135 file=/tmp/hive/root/_tez_session_dir/fe7f8921-1363-410f-9bcf-1ef2285fe369/hive-hcatalog-core.jar 
     at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945) 
     at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604) 
     at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844) 
     at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896) 
     at java.io.DataInputStream.read(DataInputStream.java:100) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119) 
     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366) 
     at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267) 
     at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:422) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:745) 

]], TaskAttempt 3 failed, info=[Container container_e49_1462904248344_0007_02_000017 finished with diagnostics set to [Container failed, exitCode=-1000. Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1214017999-10.130.3.52-1459431677581:blk_1073742638_1816 file=/user/root/.hiveJars/hive-exec-1.2.1000.2.4.0.0-169-1a2ff40e61734ceb11f35cefe7900422e9064d327d2021994ebadff4e8c631f5.jar 
     at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:945) 
     at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:604) 
     at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:844) 
     at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:896) 
     at java.io.DataInputStream.read(DataInputStream.java:100) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59) 
     at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119) 
     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366) 
     at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:267) 
     at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:63) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:361) 
     at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:422) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:358) 
     at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:62) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:745) 

]]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:2 killedTasks:7, Vertex vertex_1462904248344_0007_1_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE] 
... 

Has anyone seen this problem before? Thanks in advance.

Answer


In a Linux terminal, su to the hdfs user and run hadoop dfsadmin -report to confirm whether the block is actually missing or corrupted.
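A minimal sketch of those checks (assuming the HDFS superuser is named "hdfs", the HDP default, and that the Hadoop client binaries are on the PATH; note that the newer hdfs command is preferred over the deprecated hadoop dfsadmin form):

# Switch to the HDFS superuser (account name assumed to be "hdfs")
su - hdfs

# Summarize cluster health: look for dead DataNodes, under-replicated
# blocks, and low remaining capacity
hdfs dfsadmin -report

# Check the directory named in the error directly; -files -blocks -locations
# prints each block and where its replicas are supposed to live, so a
# missing block like blk_1073742638 shows up explicitly
hdfs fsck /user/root/.hiveJars -files -blocks -locations

If fsck reports the file as healthy, the block exists and the problem is more likely that the containers cannot reach the DataNodes holding it (firewall, /etc/hosts, or DataNode hostname resolution between the three machines).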

Judging from the logs, I think you are running the query as the root user, so try enabling impersonation for the root user. Log in to the Ambari UI, then go to HDFS -> Configs -> Advanced -> Custom core-site.

Update or add the following properties:

hadoop.proxyuser.root.groups = *

hadoop.proxyuser.root.hosts = *
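If you edit core-site.xml by hand rather than through Ambari, the same two settings look roughly like this (a sketch only; on an HDP cluster Ambari manages this file, so setting them in the UI is preferred):

<!-- core-site.xml: allow the root user to impersonate other users
     from any host and for any group. The wildcards are maximally
     permissive; tighten them to specific hosts/groups in production. -->
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>

After changing the properties, restart HDFS (and the dependent services Ambari flags) so the new core-site settings take effect.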
