🐛 Bug
When converting a ScriptModule to ONNX, the export crashes with the following exception:
```
Traceback (most recent call last):
  File "/home/liron/envs/detectron/lib/python3.6/site-packages/torch/onnx/utils.py", line 382, in _export
    fixed_batch_size=fixed_batch_size)
  File "/home/liron/envs/detectron/lib/python3.6/site-packages/torch/onnx/utils.py", line 262, in _model_to_graph
    fixed_batch_size=fixed_batch_size)
  File "/home/liron/envs/detectron/lib/python3.6/site-packages/torch/onnx/utils.py", line 132, in _optimize_graph
    graph = torch._C._jit_pass_onnx(graph, operator_export_type)
  File "/home/liron/envs/detectron/lib/python3.6/site-packages/torch/onnx/__init__.py", line 174, in _run_symbolic_function
    return utils._run_symbolic_function(*args, **kwargs)
  File "/home/liron/envs/detectron/lib/python3.6/site-packages/torch/onnx/utils.py", line 619, in _run_symbolic_function
    return op_fn(g, *inputs, **attrs)
  File "/home/liron/envs/detectron/lib/python3.6/site-packages/torch/onnx/symbolic_helper.py", line 124, in wrapper
    return fn(g, *args)
  File "/home/liron/envs/detectron/lib/python3.6/site-packages/torch/onnx/symbolic_opset9.py", line 862, in batch_norm
    if len(input_sizes) == 2:
TypeError: object of type 'NoneType' has no len() (occurred when translating batch_norm)
```
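For context, the `TypeError` itself is the plain Python failure mode of calling `len()` on `None`: per the traceback, `input_sizes` in `symbolic_opset9.batch_norm` comes back as `None` when no shape information is available for the input, and the code calls `len()` on it unguarded. A minimal sketch of the failure and of a guard that would avoid it (`rank_or_none` is a hypothetical name for illustration, not the actual PyTorch code):

```python
def rank_or_none(sizes):
    # `sizes` mimics the value batch_norm reads from the graph input's
    # type: either a list of dimensions, or None when shape information
    # was not propagated (as happens here with the loaded ScriptModule).
    if sizes is None:
        return None  # guard first: len(None) raises TypeError
    return len(sizes)

rank_or_none([1, 3, 224, 224])  # rank of a typical NCHW image tensor
rank_or_none(None)              # no shape info: returns None, no crash
```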
To Reproduce
Load the attached TorchScript model and try to convert it to ONNX:
```python
def convert(self):
    loaded = torch.jit.load(self._torch_script_path)
    # loaded.load_state_dict(self._model_state)
    dummy_input = torch.randn(1, 3, 224, 224)
    target = loaded(dummy_input)
    torch.onnx.export(loaded, dummy_input, self._out_onnx_path, verbose=True,
                      operator_export_type=torch.onnx.OperatorExportTypes.ONNX,
                      example_outputs=target)
```
cc @suo @houseroad @spandantiwari @lara-hdr @BowenBao @neginraoof