🐛 Describe the bug
`matmul` triggers an INTERNAL ASSERT FAILED when one argument has an integer dtype and the other requires grad:
```python
import torch

input_tensor = torch.randint(-8, 2, [11, 0, 4], dtype=torch.int8)
other_tensor = torch.rand([4], dtype=torch.float32)

input = input_tensor.clone()
other = other_tensor.clone()
res1 = torch.matmul(input, other)  # passes normally

input = input_tensor.clone()
other = other_tensor.clone().requires_grad_()
res2 = torch.matmul(input, other)
# RuntimeError: isDifferentiableType(variable.scalar_type()) INTERNAL ASSERT FAILED at "/Users/distiller/project/pytorch/torch/csrc/autograd/functions/utils.h":65, please report a bug to PyTorch.
```
In addition, `mm` has the same issue:
```python
import torch

input_tensor = torch.randint(0, 2, [0, 3], dtype=torch.uint8)
mat2_tensor = torch.rand([3, 2], dtype=torch.float32)

input = input_tensor.clone()
mat2 = mat2_tensor.clone()
res1 = torch.mm(input, mat2)  # passes normally

input = input_tensor.clone()
mat2 = mat2_tensor.clone().requires_grad_()
res2 = torch.mm(input, mat2)
# RuntimeError: isDifferentiableType(variable.scalar_type()) INTERNAL ASSERT FAILED at "/Users/distiller/project/pytorch/torch/csrc/autograd/functions/utils.h":65, please report a bug to PyTorch.
```
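As a hypothetical workaround (not part of the original report), casting the integer tensor to a floating dtype before the matmul keeps non-differentiable dtypes out of the autograd graph, so the internal assert is not hit:

```python
import torch

# Same shapes/dtypes as the first repro above.
input_tensor = torch.randint(-8, 2, [11, 0, 4], dtype=torch.int8)
other = torch.rand([4], dtype=torch.float32).requires_grad_()

# Cast the int8 tensor to float32 before matmul; the result then
# participates in autograd normally.
res = torch.matmul(input_tensor.to(torch.float32), other)
print(res.shape)        # torch.Size([11, 0])
print(res.requires_grad)  # True
```

This sidesteps the crash but changes the input dtype, so it is only a stopgap until the mixed-dtype case raises a proper error (or is supported) upstream.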
Versions
PyTorch: 1.11.0
cc @ezyang @albanD @zou3519 @gqchen @pearu @nikitaved @soulitzer @lezcano @Varal7