TensorFlow Java API Set Placeholder For Categorical Columns
Solution 1:
Since you're using tf.estimator.export.build_parsing_serving_input_receiver_fn, the exported saved model you've created expects a serialized tf.Example protocol buffer as input.
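(For context, the Python-side export presumably looked roughly like the sketch below; the two feature columns, the toy DNNClassifier, and the one-step training run are assumptions standing in for your real model, but the parsing serving-input receiver is the relevant part.)

import tensorflow as tf

# Minimal sketch (assumed columns/model): export a SavedModel whose serving
# signature parses serialized tf.Example protos.
attr1 = tf.feature_column.categorical_column_with_hash_bucket("Attribute1", 10)
attr2 = tf.feature_column.numeric_column("Attribute2")

estimator = tf.estimator.DNNClassifier(
    hidden_units=[8],
    feature_columns=[tf.feature_column.indicator_column(attr1), attr2])

def input_fn():
    features = {"Attribute1": tf.constant(["A12", "A11"]),
                "Attribute2": tf.constant([12.0, 48.0])}
    return features, tf.constant([0, 1])

estimator.train(input_fn, steps=1)

# build_parsing_serving_input_receiver_fn adds a DT_STRING placeholder named
# "input_example_tensor" that expects serialized tf.Example protos, which is
# exactly what the Java client below feeds via example.toByteArray().
feature_spec = tf.feature_column.make_parse_example_spec([attr1, attr2])
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
    feature_spec)
estimator.export_savedmodel("./export", serving_input_fn)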
You can use the tf.Example protocol buffer in Java (maven, javadoc), using something like this:
import com.google.protobuf.ByteString;
import java.util.Arrays;
import org.tensorflow.*;
import org.tensorflow.example.*;
public class Main {

  // Returns a Feature containing a BytesList, where each element of the list
  // is the UTF-8 encoded bytes of the Java string.
  public static Feature feature(String... strings) {
    BytesList.Builder b = BytesList.newBuilder();
    for (String s : strings) {
      b.addValue(ByteString.copyFromUtf8(s));
    }
    return Feature.newBuilder().setBytesList(b).build();
  }

  // Returns a Feature containing a FloatList with the given values.
  public static Feature feature(float... values) {
    FloatList.Builder b = FloatList.newBuilder();
    for (float v : values) {
      b.addValue(v);
    }
    return Feature.newBuilder().setFloatList(b).build();
  }

  public static void main(String[] args) throws Exception {
    // Build the tf.Example proto, one Feature per column the model expects.
    Features features =
        Features.newBuilder()
            .putFeature("Attribute1", feature("A12"))
            .putFeature("Attribute2", feature(12))
            .putFeature("Attribute3", feature("A32"))
            .putFeature("Attribute4", feature("A40"))
            .putFeature("Attribute5", feature(7472))
            .putFeature("Attribute6", feature("A65"))
            .putFeature("Attribute7", feature("A71"))
            .putFeature("Attribute8", feature(1))
            .putFeature("Attribute9", feature("A92"))
            .putFeature("Attribute10", feature("A101"))
            .putFeature("Attribute11", feature(2))
            .putFeature("Attribute12", feature("A121"))
            .putFeature("Attribute13", feature(24))
            .putFeature("Attribute14", feature("A143"))
            .putFeature("Attribute15", feature("A151"))
            .putFeature("Attribute16", feature(1))
            .putFeature("Attribute17", feature("A171"))
            .putFeature("Attribute18", feature(1))
            .putFeature("Attribute19", feature("A191"))
            .putFeature("Attribute20", feature("A201"))
            .build();
    Example example = Example.newBuilder().setFeatures(features).build();

    // Path to the exported SavedModel (the timestamped export directory).
    String pfad = System.getProperty("user.dir") + "\\1511523781";
    try (SavedModelBundle model = SavedModelBundle.load(pfad, "serve")) {
      Session session = model.session();
      // Tensor names taken from the SignatureDef (see saved_model_cli below).
      final String xName = "input_example_tensor";
      final String scoresName = "dnn/head/predictions/probabilities:0";
      // Feed a batch of one serialized Example and fetch the probabilities.
      try (Tensor<String> inputBatch = Tensors.create(new byte[][] {example.toByteArray()});
          Tensor<Float> output =
              session
                  .runner()
                  .feed(xName, inputBatch)
                  .fetch(scoresName)
                  .run()
                  .get(0)
                  .expect(Float.class)) {
        System.out.println(Arrays.deepToString(output.copyTo(new float[1][2])));
      }
    }
  }
}
Much of the boilerplate here is to construct the protocol buffer example. Alternatively, you could use something other than build_parsing_serving_input_receiver_fn to set up the exported model to accept input in a different format.
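One option, sketched below with tf.estimator.export.build_raw_serving_input_receiver_fn, exports a signature that is fed one tensor per feature instead of a serialized Example; the placeholder names/shapes and the trained estimator variable are assumptions carried over from the earlier sketch:

import tensorflow as tf

# Sketch: export a second SavedModel that takes raw per-feature tensors
# instead of serialized tf.Example protos. `estimator` is the trained
# estimator from the earlier sketch (an assumption for illustration).
raw_serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn({
    "Attribute1": tf.placeholder(tf.string, shape=[None], name="Attribute1"),
    "Attribute2": tf.placeholder(tf.float32, shape=[None], name="Attribute2"),
})
estimator.export_savedmodel("./export_raw", raw_serving_input_fn)

With such an export, the Java client would feed a String tensor and a Float tensor by those names directly, rather than building an Example proto.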
Side note: You can use the saved_model_cli command-line tool included with the TensorFlow Python installation to inspect the saved model. For example, something like:
saved_model_cli show \
    --dir ./export/1511523781 \
    --tag_set serve \
    --signature_def predict
will show something like:
The given SavedModel SignatureDef contains the following input(s):
inputs['examples'] tensor_info:
    dtype: DT_STRING
    shape: (-1)
    name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
...
outputs['probabilities'] tensor_info:
    dtype: DT_FLOAT
    shape: (-1, 2)
    name: dnn/head/predictions/probabilities:0
This suggests that the saved model takes a single input, a batch of DT_STRING elements, and that the output probabilities are a batch of 2-dimensional float vectors.
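Before wiring up the Java client, it can also help to sanity-check the export from Python. One way is tf.contrib.predictor (a sketch assuming TF 1.x; the export path and the two populated attributes are placeholders, and the remaining attributes would mirror the Java code above):

import tensorflow as tf
from tensorflow.contrib import predictor

# Sketch: query the exported model from Python using the same 'predict'
# signature inspected above. Path and feature values are assumptions.
predict_fn = predictor.from_saved_model("./export/1511523781",
                                        signature_def_key="predict")

example = tf.train.Example(features=tf.train.Features(feature={
    "Attribute1": tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[b"A12"])),
    "Attribute2": tf.train.Feature(
        float_list=tf.train.FloatList(value=[12.0])),
    # ... remaining attributes, mirroring the Java example above ...
}))

# 'examples' and 'probabilities' are the names reported by saved_model_cli.
print(predict_fn({"examples": [example.SerializeToString()]})["probabilities"])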
Hope that helps.