OS Platform and Distribution: Windows 11 and Ubuntu 22.04
TensorFlow.js installed from: npm
tfjs-node version: 4.18.0
I have a SavedModel made by exporting this model using Python with TensorFlow 2.16.1. The model works fine when loaded back into Python; however, when I try to load it in Node using `tfjs.node.loadSavedModel()`, I get the following errors:
E tensorflow/core/grappler/optimizers/meta_optimizer.cc:903] tfg_optimizer{} failed: NOT_FOUND: Op type not registered 'DisableCopyOnRead' in binary running on LAPTOP. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
While importing FunctionDef: __inference__traced_save_11547
when importing GraphDef to MLIR module in GrapplerHook
E tensorflow/core/framework/node_def_util.cc:630] NodeDef mentions attribute debug_name which is not in the op definition: Op<name=VarHandleOp; signature= -> resource:resource; attr=container:string,default=""; attr=shared_name:string,default=""; attr=dtype:type; attr=shape:shape; attr=allowed_devices:list(string),default=[]; is_stateful=true> This may be expected if your graph generating binary is newer than this binary. Unknown attributes will be ignored. NodeDef: {{node Variable}}
I tensorflow/cc/saved_model/loader.cc:212] Running initialization op on SavedModel bundle at path: ./models/opennsfw2
E tensorflow/core/grappler/optimizers/meta_optimizer.cc:903] tfg_optimizer{} failed: NOT_FOUND: Op type not registered 'DisableCopyOnRead' in binary running on LAPTOP. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
While importing FunctionDef: __inference__traced_save_11547
when importing GraphDef to MLIR module in GrapplerHook
I tensorflow/cc/saved_model/loader.cc:301] SavedModel load for tags { serve }; Status: success: OK. Took 1504673 microseconds.
C:\dev\Projects\nsfw_test\node_modules\@tensorflow\tfjs-node\dist\nodejs_kernel_backend.js:435
return this.binding.loadSavedModel(path, tags);
^
Error: Failed to load SavedModel: Converting GraphDef to Graph has failed with an error: 'Op type not registered 'DisableCopyOnRead' in binary running on LAPTOP. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.' The binary trying to i
at NodeJSKernelBackend.loadSavedModelMetaGraph (C:\dev\Projects\nsfw_test\node_modules\@tensorflow\tfjs-node\dist\nodejs_kernel_backend.js:435:29)
at Object.<anonymous> (C:\dev\Projects\nsfw_test\node_modules\@tensorflow\tfjs-node\dist\saved_model.js:448:45)
at step (C:\dev\Projects\nsfw_test\node_modules\@tensorflow\tfjs-node\dist\saved_model.js:49:23)
at Object.next (C:\dev\Projects\nsfw_test\node_modules\@tensorflow\tfjs-node\dist\saved_model.js:30:53)
at fulfilled (C:\dev\Projects\nsfw_test\node_modules\@tensorflow\tfjs-node\dist\saved_model.js:21:58)
I'm not sure what the problem is, or whether it is strictly a tfjs issue, to be honest. Is the `DisableCopyOnRead` op not supported by tfjs-node? Is there a workaround, perhaps by exporting the model some other way? I also tried exporting it in Python with TensorFlow 2.14.0, the earliest version allowed by the package containing the model, but that had no effect.
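One possible workaround, assuming the root cause is that the TensorFlow runtime bundled with tfjs-node predates the `DisableCopyOnRead` op: instead of loading the SavedModel through the native loader, convert it to the TF.js graph-model format with the `tensorflowjs` Python package's converter and load the result with `tf.loadGraphModel()`. This is a sketch, not a confirmed fix; the output path below is a placeholder for this project's layout.

```shell
# Assumes the tensorflowjs pip package is installed: pip install tensorflowjs
# Input path matches the SavedModel directory from the logs above;
# the output directory name is a placeholder.
tensorflowjs_converter \
  --input_format=tf_saved_model \
  --output_format=tfjs_graph_model \
  ./models/opennsfw2 \
  ./models/opennsfw2_tfjs
```

If the conversion succeeds, the model could then be loaded in Node with something like `await tf.loadGraphModel('file://./models/opennsfw2_tfjs/model.json')`, which bypasses the native SavedModel loader (and therefore the op registry of the bundled TensorFlow binary) entirely.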
I apologize for the delayed response, and thank you for bringing this issue to our attention. If possible, could you please share a GitHub repo with the converted TensorFlow.js models and complete steps to replicate this behavior, so we can investigate the issue further on our end?