
mysql to sqoop - Connection refused: java.net.ConnectException

I am trying to import data from a MySQL database on my localhost into HDFS using Sqoop, and whenever I try to run the command it gives me a Connection refused exception:

sqoop import --connect jdbc:mysql://localhost/Sqoop --username root --password 123@ajith --table mysql_sqoop --m 1

But it fails with a Connection refused error.

Everything is installed on my local machine.

Here is the full error output:

Warning: /usr/lib/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/lib/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/05/04 14:44:50 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/05/04 14:44:50 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/05/04 14:44:50 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/05/04 14:44:50 INFO tool.CodeGenTool: Beginning code generation
16/05/04 14:44:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `mysql_sqoop` AS t LIMIT 1
16/05/04 14:44:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `mysql_sqoop` AS t LIMIT 1
16/05/04 14:44:51 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-hduser_/compile/21bf2271e2878039d8e7c32486f8b7b7/mysql_sqoop.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/05/04 14:44:52 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hduser_/compile/21bf2271e2878039d8e7c32486f8b7b7/mysql_sqoop.jar
16/05/04 14:44:52 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/05/04 14:44:52 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/05/04 14:44:52 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/05/04 14:44:52 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/05/04 14:44:52 INFO mapreduce.ImportJobBase: Beginning import of mysql_sqoop
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Hbase/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/05/04 14:44:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/05/04 14:44:52 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/05/04 14:44:53 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/05/04 14:44:53 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/05/04 14:44:53 ERROR tool.ImportTool: Encountered IOException running import job: java.net.ConnectException: Call From ajith-HP-ENVY-17-Notebook-PC/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
    at org.apache.hadoop.ipc.Client.call(Client.java:1479)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:744)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
    at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
    at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    ... 40 more

Answer

Could you please try this?

sqoop import --connect jdbc:mysql://localhost:3306/sqoop --username root --password 123@ajith --table mysql_sqoop --m 1

I assume you need to specify port 3306 explicitly.
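
As a quick sanity check (a minimal sketch, assuming the mysql command-line client is installed and using the credentials from the question), you can confirm that MySQL is actually listening on that port before rerunning Sqoop:

mysql -h 127.0.0.1 -P 3306 -u root -p -e "SHOW DATABASES;"

If this also fails with Connection refused, the problem is on the MySQL side rather than with Sqoop.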


I tried it, but the same error happens again. Please see the error: ERROR tool.ImportTool: Encountered IOException running import job: java.net.ConnectException: Call From ajith-HP-ENVY-17-Notebook-PC/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused –


Can you run this successfully? hadoop fs -ls hdfs://localhost:9000/ It looks like a connection problem to the HDFS NameNode. If not, use the '--hadoop-home' parameter to specify an explicit address. – cdhit


When I run hadoop fs -ls hdfs://localhost:9000/ this is the error coming -> 16/05/05 10:23:00 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable ls: Call From ajith-HP-ENVY-17-Notebook-PC/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused –
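
For context: the localhost:9000 address that both commands are trying to reach is the HDFS NameNode RPC endpoint, which clients read from fs.defaultFS in core-site.xml. A minimal pseudo-distributed configuration (a sketch, matching the port shown in the error message) looks roughly like this:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

Connection refused on this port usually just means no NameNode process is listening there, which the answer below addresses.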


In addition to all the attempts above, check whether the Hadoop daemons are running using the jps command, as shown below; a Connection refused error can occur when the daemon services have not been started.
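
For example (the process names below are what a typical pseudo-distributed single-node setup shows; the exact list depends on your installation):

jps

Besides Jps itself, you would expect to see NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager. If NameNode is not listed, the Connection refused on localhost:9000 is exactly the error you would get.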

To start all services, run start-all.sh (note: it is deprecated, but it will start all services). If you want to start the services separately, use the corresponding scripts, for example the ones below.
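
These are the standard start scripts shipped with Hadoop 2.x (assuming $HADOOP_HOME/sbin is on your PATH):

start-dfs.sh    # starts NameNode, DataNode and SecondaryNameNode
start-yarn.sh   # starts ResourceManager and NodeManager

Once the daemons are up, rerun jps to confirm, then retry the Sqoop import.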

Regards, Vinoth Kannan P
