Add metadata to TensorFlow Serving API call

























Is it possible to add metadata to a TensorFlow Serving servable, such that this metadata is also populated in the response from the servable?



If I have a servable with the file structure:



```
my_servable/
    1541778457/
        variables/
        saved_model.pb
```


A prediction request against this servable currently returns, for example:



```
outputs {
  key: "classes"
  value {
    dtype: DT_STRING
    tensor_shape {
      dim {
        size: 8
      }
    }
    string_val: "a"
    string_val: "b"
    string_val: "c"
    string_val: "d"
    string_val: "e"
    string_val: "f"
    string_val: "g"
    string_val: "h"
  }
}
outputs {
  key: "scores"
  value {
    dtype: DT_FLOAT
    tensor_shape {
      dim {
        size: 1
      }
      dim {
        size: 8
      }
    }
    float_val: 1.212528104588273e-06
    float_val: 5.094948463124638e-08
    float_val: 0.0009737954242154956
    float_val: 0.9988483190536499
    float_val: 3.245145592245535e-07
    float_val: 0.00010837535955943167
    float_val: 4.101086960872635e-05
    float_val: 2.676981057447847e-05
  }
}
model_spec {
  name: "my_model"
  version {
    value: 1541778457
  }
  signature_name: "prediction"
}
```



If I have something like a git hash or unique identifier for the code that generated this servable, such as f6ca434910504532a0d50dfd12f22d4c, is it possible to get this data back in the response to the client?



Ideally something like:



```
outputs {
  key: "classes"
  value {
    dtype: DT_STRING
    tensor_shape {
      dim {
        size: 8
      }
    }
    string_val: "a"
    string_val: "b"
    string_val: "c"
    string_val: "d"
    string_val: "e"
    string_val: "f"
    string_val: "g"
    string_val: "h"
  }
}
outputs {
  key: "scores"
  value {
    dtype: DT_FLOAT
    tensor_shape {
      dim {
        size: 1
      }
      dim {
        size: 8
      }
    }
    float_val: 1.212528104588273e-06
    float_val: 5.094948463124638e-08
    float_val: 0.0009737954242154956
    float_val: 0.9988483190536499
    float_val: 3.245145592245535e-07
    float_val: 0.00010837535955943167
    float_val: 4.101086960872635e-05
    float_val: 2.676981057447847e-05
  }
}
model_spec {
  name: "my_model"
  version {
    value: 1541778457
  }
  hash {
    value: "f6ca434910504532a0d50dfd12f22d4c"
  }
  signature_name: "prediction"
}
```



I tried renaming the version directory from 1541778457 to the hash, but this gave:



```
W tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:268] No versions of servable default found under base path
```










          1 Answer
I suppose you could approach this problem in a couple of ways. If you want the idea of changing the folder name to work, remember that the folder name in this case describes your model version, which I think must be an integer. I would therefore assume that you would need to convert your hash to an integer representation and then convert it back when you receive it.
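
For completeness, a minimal sketch of that round trip (my assumption: the 32-character hash shown is 128 bits, and since TF Serving parses version directory names as 64-bit integers, only a truncated prefix can be encoded this way; the 15-character cutoff below is illustrative):

```
# Round-trip a truncated hex hash through an integer servable version.
# A signed 64-bit version can hold at most 15 hex characters (60 bits).
hash_hex = "f6ca434910504532a0d50dfd12f22d4c"

version = int(hash_hex[:15], 16)       # use as the directory name: my_servable/<version>/
recovered = format(version, "015x")    # convert back on the client side

assert recovered == hash_hex[:15]
```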



A better solution, in my opinion, would be to change your model so that it includes a constant tensor containing your hash, and to add that tensor to the model's signature_def. In Python that would look something like:



```
# create a constant tensor holding your hash
hash_tensor = tf.constant("f6ca434910504532a0d50dfd12f22d4c", dtype=tf.string, name="hash")

# build tensor info for it
hash_info = tf.saved_model.utils.build_tensor_info(hash_tensor)

# add hash_info to the outputs of your signature_def

# then you should be able to receive that data in your response
```
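
To make that concrete, here is a minimal sketch of exporting a SavedModel whose prediction signature carries the hash as an extra output, using the TF 1.x tf.saved_model API referenced above. The input/output tensors, shapes, and export path are illustrative assumptions, not taken from the question, and the hash appears as an additional `outputs` entry in the PredictResponse rather than under `model_spec`:

```
import tensorflow as tf  # TF 1.x style, matching tf.saved_model.utils above

# --- illustrative model graph; substitute your real tensors ---
inputs = tf.placeholder(tf.float32, shape=[None, 4], name="inputs")
scores = tf.nn.softmax(tf.layers.dense(inputs, 8), name="scores")
classes = tf.constant(list("abcdefgh"), name="classes")

# constant tensor carrying the build/git hash
hash_tensor = tf.constant("f6ca434910504532a0d50dfd12f22d4c",
                          dtype=tf.string, name="hash")

signature = tf.saved_model.signature_def_utils.build_signature_def(
    inputs={"inputs": tf.saved_model.utils.build_tensor_info(inputs)},
    outputs={
        "classes": tf.saved_model.utils.build_tensor_info(classes),
        "scores": tf.saved_model.utils.build_tensor_info(scores),
        # extra output holding the hash metadata
        "hash": tf.saved_model.utils.build_tensor_info(hash_tensor),
    },
    method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME,
)

# export directory must not already exist
builder = tf.saved_model.builder.SavedModelBuilder("my_servable/1541778457")
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={"prediction": signature},
    )
builder.save()
```

On the client side the value should then come back as `response.outputs["hash"].string_val[0]` in every prediction response, alongside `classes` and `scores`.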




