
I export a model for text classification using a TF Hub module and then use predictor.from_saved_model() to infer a prediction for a single string example from the model. I have seen a few examples with a similar idea, but none of them worked for the case where a TF Hub module is used to build the features. Here is what I do:

    import tensorflow as tf
    import tensorflow_hub as hub
    from tensorflow.contrib import predictor

    train_input_fn = tf.estimator.inputs.pandas_input_fn(
        train_df, train_df['label_ids'], num_epochs=None, shuffle=True)

    # Prediction on the whole training set.
    predict_train_input_fn = tf.estimator.inputs.pandas_input_fn(
        train_df, train_df['label_ids'], shuffle=False)

    embedded_text_feature_column = hub.text_embedding_column(
        key='sentence',
        module_spec='https://tfhub.dev/google/nnlm-de-dim128/1')

    # Estimator
    estimator = tf.estimator.DNNClassifier(
        hidden_units=[500, 100],
        feature_columns=[embedded_text_feature_column],
        n_classes=num_of_class,
        optimizer=tf.train.AdagradOptimizer(learning_rate=0.003))

    # Training
    estimator.train(input_fn=train_input_fn, steps=1000)

    # Evaluation on the training set
    train_eval_result = estimator.evaluate(input_fn=predict_train_input_fn)

    print('Training set accuracy: {accuracy}'.format(**train_eval_result))

    feature_spec = tf.feature_column.make_parse_example_spec([embedded_text_feature_column])
    serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)
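    # (My understanding: make_parse_example_spec on a hub.text_embedding_column
    # yields a FixedLenFeature of dtype tf.string for the 'sentence' key, so
    # the exported model expects serialized tf.train.Example protos as input.)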

    export_dir_base = self.cfg['model_path']
    servable_model_path = estimator.export_savedmodel(export_dir_base, serving_input_receiver_fn)

    # Example message for inference
    message = "Was ist denn los"
    saved_model_predictor = predictor.from_saved_model(export_dir=servable_model_path)
    content_tf_list = tf.train.BytesList(value=[str.encode(message)])
    example = tf.train.Example(
            features=tf.train.Features(
                feature={
                    'sentence': tf.train.Feature(
                        bytes_list=content_tf_list
                    )
                }
            )
        )

    with tf.python_io.TFRecordWriter('the_message.tfrecords') as writer:
        writer.write(example.SerializeToString())

    reader = tf.TFRecordReader()
    data_path = 'the_message.tfrecords'
    filename_queue = tf.train.string_input_producer([data_path], num_epochs=1)
    _, serialized_example = reader.read(filename_queue)
    output_dict = saved_model_predictor({'inputs': [serialized_example]})

And the output:

Traceback (most recent call last):
  File "/Users/dimitrs/component-pythia/src/pythia.py", line 321, in _train
    model = algo.generate_model(samples, generation_id)
  File "/Users/dimitrs/component-pythia/src/algorithm_layer/algorithm.py", line 56, in generate_model
    model = self._process_training(samples, generation)
  File "/Users/dimitrs/component-pythia/src/algorithm_layer/tf_hub_classifier.py", line 91, in _process_training
    output_dict = saved_model_predictor({'inputs': [serialized_example]})
  File "/Users/dimitrs/anaconda3/envs/pythia/lib/python3.6/site-packages/tensorflow/contrib/predictor/predictor.py", line 77, in __call__
    return self._session.run(fetches=self.fetch_tensors, feed_dict=feed_dict)
  File "/Users/dimitrs/anaconda3/envs/pythia/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 900, in run
    run_metadata_ptr)
  File "/Users/dimitrs/anaconda3/envs/pythia/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1135, in _run
    feed_dict_tensor, options, run_metadata)
  File "/Users/dimitrs/anaconda3/envs/pythia/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1316, in _do_run
    run_metadata)
  File "/Users/dimitrs/anaconda3/envs/pythia/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1335, in _do_call
    raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.InternalError: Unable to get element as bytes.

Isn't serialized_example the right input here, as suggested by serving_input_receiver_fn?
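My current guess: serialized_example is still a graph tensor (the output of tf.TFRecordReader), not actual bytes, so the session behind the predictor cannot feed it. If build_parsing_serving_input_receiver_fn really just expects serialized tf.train.Example protos, the detour through TFRecordWriter/TFRecordReader should be unnecessary and feeding the serialized bytes directly ought to work. A minimal sketch of what I mean, untested and using the same names as above:

    # Sketch: feed the serialized tf.train.Example bytes straight into the
    # predictor instead of a tensor produced by tf.TFRecordReader.
    serialized_bytes = example.SerializeToString()  # plain Python bytes
    output_dict = saved_model_predictor({'inputs': [serialized_bytes]})
    print(output_dict)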

