Serving a retrained Inception model with TensorFlow Serving

I am trying to serve my retrained Inception model following this guide (you can also see this guide, which explains the retraining step). I modified retrain.py to export my model as follows:
... # Same as in the original script:
# Set up the pre-trained graph.
maybe_download_and_extract()
graph, bottleneck_tensor, jpeg_data_tensor, resized_image_tensor = (
    create_inception_graph())

... # Same as in the original script:
# Add the new layer that we'll be training.
(train_step, cross_entropy, bottleneck_input, ground_truth_input,
 final_tensor) = add_final_training_ops(len(image_lists.keys()),
                                        FLAGS.final_tensor_name,
                                        bottleneck_tensor)

... # Added at the end of the original script:
# Export the trained model as a SessionBundle
# (uses: from tensorflow.contrib.session_bundle import exporter;
#  `sess` is the session created earlier in retrain.py's main()).
with graph.as_default():
  export_path = sys.argv[-1]
  print('Exporting trained model to', export_path)
  saver = tf.train.Saver(sharded=True)
  model_exporter = exporter.Exporter(saver)
  signature = exporter.classification_signature(
      input_tensor=jpeg_data_tensor, scores_tensor=final_tensor)
  model_exporter.init(sess.graph.as_graph_def(),
                      default_graph_signature=signature)
  model_exporter.export(export_path, tf.constant(FLAGS.export_version), sess)
  print('Done exporting!')

if __name__ == '__main__':
  tf.app.run()
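For reference, before calling model_exporter.export I also print the names of the two tensors that go into the signature. This is a small sanity check of my own (not part of retrain.py or the guide) to confirm that both tensors belong to the graph/session being exported:

# Debugging aid (my own addition, not in retrain.py): verify that the tensors
# wired into the classification signature live in the exported graph.
assert jpeg_data_tensor.graph is sess.graph
assert final_tensor.graph is sess.graph
print('signature input tensor :', jpeg_data_tensor.name)   # e.g. 'DecodeJpeg/contents:0'
print('signature scores tensor:', final_tensor.name)       # FLAGS.final_tensor_name + ':0'

Both assertions pass and the printed names look sensible to me.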
After exporting my model, I started the server by running:
/serving/bazel-bin/tensorflow_serving/example/inception_inference --port=9000 EXPORT_DIR &> inception_log &
The server log file (inception_log) contains:
I tensorflow_serving/core/basic_manager.cc:190] Using InlineExecutor for BasicManager.
I tensorflow_serving/example/inception_inference.cc:384] Waiting for models to be loaded...
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] File-system polling found servable version {name: default version: 1} at path /tf_files/scope/export/00000001
I external/org_tensorflow/tensorflow/contrib/session_bundle/session_bundle.cc:129] Attempting to load a SessionBundle from: /tf_files/scope/export/00000001
I tensorflow_serving/example/inception_inference.cc:384] Waiting for models to be loaded...
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] File-system polling found servable version {name: default version: 1} at path /tf_files/scope/export/00000001
I external/org_tensorflow/tensorflow/contrib/session_bundle/session_bundle.cc:106] Running restore op for SessionBundle
I external/org_tensorflow/tensorflow/contrib/session_bundle/session_bundle.cc:203] Done loading SessionBundle
I tensorflow_serving/example/inception_inference.cc:350] Running...
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] File-system polling found servable version {name: default version: 1} at path /tf_files/scope/export/00000001
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] File-system polling found servable version {name: default version: 1} at path /tf_files/scope/export/00000001
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] File-system polling found servable version {name: default version: 1} at path /tf_files/scope/export/00000001
...
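The "Done loading SessionBundle" line suggests the export is being picked up. Just to be sure the expected files were written, I also checked the export directory from Python (a quick ad-hoc check of my own; an export.meta file plus export-* variable shards is what the session_bundle loader looks for, as far as I understand):

import glob
import os

export_dir = '/tf_files/scope/export/00000001'  # path reported in the server log
print(os.path.exists(os.path.join(export_dir, 'export.meta')))  # MetaGraphDef
print(glob.glob(os.path.join(export_dir, 'export-*')))          # sharded variables

Both the MetaGraphDef and the variable shards are present.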
Finally, I run the client and get the following error:
/serving/bazel-bin/tensorflow_serving/example/inception_client --server=localhost:9000 --image=TEST_IMG
D0805 09:10:46.208704633 200 ev_posix.c:101] Using polling engine: poll
Traceback (most recent call last):
File "/serving/bazel-bin/tensorflow_serving/example/inception_client.runfiles/tensorflow_serving/example/inception_client.py", line 53, in <module>
tf.app.run()
File "/serving/bazel-bin/tensorflow_serving/example/inception_client.runfiles/external/org_tensorflow/tensorflow/python/platform/app.py", line 30, in run
sys.exit(main(sys.argv))
File "/serving/bazel-bin/tensorflow_serving/example/inception_client.runfiles/tensorflow_serving/example/inception_client.py", line 48, in main
result = stub.Classify(request, 10.0) # 10 secs timeout
File "/usr/local/lib/python2.7/dist-packages/grpc/beta/_client_adaptations.py", line 300, in __call__
self._request_serializer, self._response_deserializer)
File "/usr/local/lib/python2.7/dist-packages/grpc/beta/_client_adaptations.py", line 198, in _blocking_unary_unary
raise _abortion_error(rpc_error_call)
grpc.framework.interfaces.face.face.AbortionError: AbortionError(code=StatusCode.INTERNAL, details="FetchOutputs node : not found")
E0805 09:10:47.129263239 200 chttp2_transport.c:1810] close_transport: {"created":"@1470388247.129230608","description":"FD shutdown","file":"src/core/lib/iomgr/ev_poll_posix.c","file_line":427}
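For context, the client does not do anything unusual; as far as I can tell from the traceback and the example source, it just sends the raw JPEG bytes and calls Classify, roughly like this (reconstructed sketch; the stub, message, and field names are from the inception_inference example as I recall them and may differ in other versions):

from grpc.beta import implementations
from tensorflow_serving.example import inception_inference_pb2

host, port = 'localhost', 9000
channel = implementations.insecure_channel(host, port)
stub = inception_inference_pb2.beta_create_InceptionService_stub(channel)

with open('TEST_IMG', 'rb') as f:         # TEST_IMG is the image passed on the command line
  request = inception_inference_pb2.InceptionRequest()
  request.jpeg_encoded = f.read()         # raw JPEG bytes, matching the signature's input tensor
  result = stub.Classify(request, 10.0)   # 10 secs timeout; this is the call that aborts
  print(result)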
Any advice or guidance on this would be greatly appreciated.
Any luck here? Facing a similar issue. – kampta
Same problem here! Any feedback/guidance would be greatly appreciated! –