
Pentaho: Unable to write data from Pentaho to BigQuery

I am using Starschema's JDBC driver to connect Pentaho to BigQuery. I can successfully fetch data from BigQuery into Pentaho. However, I cannot write data from Pentaho to BigQuery: inserting rows into BigQuery throws an exception indicating that the operation is not supported. How do I solve this?

Error message:

2017/10/30 14:27:43 - Table output 2.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Because of an error, this step can't continue: 
2017/10/30 14:27:43 - Table output 2.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : org.pentaho.di.core.exception.KettleException: 
2017/10/30 14:27:43 - Table output 2.0 - Error inserting row into table [TableID] with values: [A], [I], [G], [1], [2016-02-18], [11], [2016-02-18-12.00.00.123456], [GG], [CB], [132], [null], [null], [null] 
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 - Error inserting/updating row 
2017/10/30 14:27:43 - Table output 2.0 - executeUpdate() 
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:385) 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:125) 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62) 
2017/10/30 14:27:43 - Table output 2.0 - at java.lang.Thread.run(Unknown Source) 
2017/10/30 14:27:43 - Table output 2.0 - Caused by: org.pentaho.di.core.exception.KettleDatabaseException: 
2017/10/30 14:27:43 - Table output 2.0 - Error inserting/updating row 
2017/10/30 14:27:43 - Table output 2.0 - executeUpdate() 
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1321) 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:262) 
2017/10/30 14:27:43 - Table output 2.0 - ... 3 more 
2017/10/30 14:27:43 - Table output 2.0 - Caused by: net.starschema.clouddb.jdbc.BQSQLFeatureNotSupportedException: executeUpdate() 
2017/10/30 14:27:43 - Table output 2.0 - at net.starschema.clouddb.jdbc.BQPreparedStatement.executeUpdate(BQPreparedStatement.java:317) 
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1288) 
2017/10/30 14:27:43 - Table output 2.0 - ... 4 more 
2017/10/30 14:27:43 - BigQuery_rwa-tooling - Statement canceled! 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Something went wrong while trying to stop the transformation: org.pentaho.di.core.exception.KettleDatabaseException: 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Error cancelling statement 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - cancel() 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException: 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Error cancelling statement 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - cancel() 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelStatement(Database.java:750) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelQuery(Database.java:732) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.steps.tableinput.TableInput.stopRunning(TableInput.java:299) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.Trans.stopAll(Trans.java:1889) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.step.BaseStep.stopAll(BaseStep.java:2915) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:139) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at java.lang.Thread.run(Unknown Source) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Caused by: net.starschema.clouddb.jdbc.BQSQLFeatureNotSupportedException: cancel() 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at net.starschema.clouddb.jdbc.BQStatementRoot.cancel(BQStatementRoot.java:113) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelStatement(Database.java:744) 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ... 7 more 
2017/10/30 14:27:43 - Table output 2.0 - Signaling 'output done' to 0 output rowsets. 
2017/10/30 14:27:43 - BigQuery_prID - No commit possible on database connection [BigQuery_prID] 

Answer


It looks like you are trying to do this via legacy SQL, which has no support for DML statements (INSERT/UPDATE/DELETE).

Standard SQL does support DML, but those statements are largely intended for bulk table manipulation rather than row-oriented inserts; ingesting data through individual DML INSERT statements is not recommended. For more information, see the quotas in the DML reference documentation.
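
For illustration, here is a minimal sketch of such a standard-SQL DML INSERT issued through Google's official google-cloud-bigquery Java client rather than the Starschema JDBC driver (whose executeUpdate() fails, as the log above shows). The dataset, table, and column names (my_dataset, my_table, col_a, col_b) are placeholders:

    import com.google.cloud.bigquery.BigQuery;
    import com.google.cloud.bigquery.BigQueryOptions;
    import com.google.cloud.bigquery.QueryJobConfiguration;

    public class DmlInsertExample {
        public static void main(String[] args) throws InterruptedException {
            BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

            // Standard-SQL DML INSERT; dataset/table/column names are hypothetical.
            // Each such statement counts against the DML quotas, so this pattern
            // is unsuitable for row-by-row ingestion.
            QueryJobConfiguration config = QueryJobConfiguration
                .newBuilder("INSERT INTO `my_dataset.my_table` (col_a, col_b) VALUES ('A', 132)")
                .setUseLegacySql(false) // DML requires standard SQL
                .build();

            bigquery.query(config); // runs the statement as a query job
        }
    }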

You can ingest data either through BigQuery's streaming API or through bulk ingestion via a load job. However, since these mechanisms sit outside the query language, you may need something other than a JDBC driver.
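
As a sketch of the streaming route, using the same official Java client and the same placeholder table and column names: each row goes through tabledata.insertAll rather than a SQL statement, so the DML limits do not apply.

    import com.google.cloud.bigquery.BigQuery;
    import com.google.cloud.bigquery.BigQueryOptions;
    import com.google.cloud.bigquery.InsertAllRequest;
    import com.google.cloud.bigquery.InsertAllResponse;
    import com.google.cloud.bigquery.TableId;

    import java.util.HashMap;
    import java.util.Map;

    public class StreamingInsertExample {
        public static void main(String[] args) {
            BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

            // One row, expressed as column-name -> value; names are placeholders.
            Map<String, Object> row = new HashMap<>();
            row.put("col_a", "A");
            row.put("col_b", 132);

            // Streams the row directly into the table, bypassing the query
            // language (and its DML quotas) entirely.
            InsertAllResponse response = bigquery.insertAll(
                InsertAllRequest.newBuilder(TableId.of("my_dataset", "my_table"))
                    .addRow(row)
                    .build());

            if (response.hasErrors()) {
                // Per-row errors are keyed by the row's index in the request.
                response.getInsertErrors().forEach((index, errors) ->
                    System.err.println("Row " + index + " failed: " + errors));
            }
        }
    }

For larger volumes, a load job (for example, staging the rows to a file and loading it with a LoadJobConfiguration) is the bulk alternative.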