Closed
Description
I'm running the code below and get the error shown in the traceback. Commenting out y = y.transpose(0, 1) makes the error go away. Is this expected?
import torch
import torch.nn
from torch.autograd import Variable
import numpy as np
x = Variable(torch.from_numpy(np.random.randn(4, 6, 5)).float(), requires_grad=True)
x = torch.transpose(x, 1, 2)
conv = torch.nn.Conv1d(in_channels=5, out_channels=10, kernel_size=2)
y = conv(x)
y = y.transpose(0, 1)  # this line causes the error
y = y.view(-1)
result = torch.sum(y)
result.backward()
Traceback (most recent call last):
  File "spam.py", line 13, in <module>
    y = y.view(-1)
  File "/home/wkentaro/anaconda2/lib/python2.7/site-packages/torch/autograd/variable.py", line 469, in view
    return View(*sizes)(self)
  File "/home/wkentaro/anaconda2/lib/python2.7/site-packages/torch/autograd/_functions/tensor.py", line 96, in forward
    result = i.view(*self.sizes)
RuntimeError: input is not contiguous at /home/wkentaro/Projects/pytorch/src/pytorch/torch/lib/TH/generic/THTensor.c:231
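For context: transpose() returns a view over the same storage with swapped strides, so the result is not contiguous in memory, and view() requires a contiguous tensor. A minimal sketch of the usual workaround (using the modern tensor API rather than Variable) is to call .contiguous() before .view():

```python
import torch

x = torch.randn(4, 6, 5, requires_grad=True)
xt = x.transpose(1, 2)  # shape (4, 5, 6); a non-contiguous view of x
conv = torch.nn.Conv1d(in_channels=5, out_channels=10, kernel_size=2)
y = conv(xt)            # shape (4, 10, 5)
y = y.transpose(0, 1)   # shape (10, 4, 5); non-contiguous again
y = y.contiguous().view(-1)  # contiguous() copies into contiguous memory, so view() succeeds
torch.sum(y).backward()
```

In newer PyTorch versions, y.reshape(-1) does the copy automatically when needed, so it can replace the contiguous().view() pair.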
In my actual code, I'm using .transpose() and .view() to apply cross_entropy to a (n_batch, channel, height, width) tensor by reshaping it to (n_batch * height * width, channel).
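For that use case, a sketch of the reshape with the contiguity issue handled (the variable names and sizes here are illustrative, not from the original code) could look like:

```python
import torch
import torch.nn.functional as F

n_batch, channel, height, width = 2, 3, 4, 5
logits = torch.randn(n_batch, channel, height, width)
target = torch.randint(0, channel, (n_batch, height, width))

# Move the channel dimension last, force a contiguous copy, then
# flatten to (n_batch * height * width, channel) for cross_entropy.
flat_logits = logits.permute(0, 2, 3, 1).contiguous().view(-1, channel)
flat_target = target.view(-1)
loss = F.cross_entropy(flat_logits, flat_target)
```

The permute() call is non-contiguous for the same reason transpose() is, so the .contiguous() before .view() is what avoids the RuntimeError above.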