ValueError: Input dimension mis-match. (input[2].shape[0] = 2080, input[3].shape[0] = 32) #43
Open
@johnny12150

Description

I disabled the custom GPU optimization, following the instructions in the README ("Executing on CPU"):

https://github.com/hidasib/GRU4Rec#executing-on-cpu
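
For reference, the only other change on my side is forcing Theano onto the CPU with the standard device flag; a minimal sketch of that setup (the flag values below are my own configuration, not necessarily what the README prescribes):

    # Force Theano onto the CPU via the standard THEANO_FLAGS environment
    # variable; it must be set before theano is imported. These values are
    # my setup, not necessarily the README's recommended configuration.
    import os
    os.environ.setdefault('THEANO_FLAGS', 'device=cpu,floatX=float32')

    import theano
    print(theano.config.device)  # expected to print 'cpu'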

However, training then fails with the error below.

  File "./models/theano/gru4rec\model\gru4rec.py", line 617, in fit
    cost = train_function(in_idx, y, len(iters), reset.reshape(len(reset), 1))
  File "A:\env\sess\lib\site-packages\theano\compile\function_module.py", line 917, in __call__
    storage_map=getattr(self.fn, 'storage_map', None))
  File "A:\env\sess\lib\site-packages\theano\gof\link.py", line 325, in raise_with_op
    reraise(exc_type, exc_value, exc_trace)
  File "A:\env\sess\lib\site-packages\six.py", line 702, in reraise
    raise value.with_traceback(tb)
  File "A:\env\sess\lib\site-packages\theano\compile\function_module.py", line 903, in __call__
    self.fn() if output_subset is None else\
ValueError: Input dimension mis-match. (input[2].shape[0] = 2080, input[3].shape[0] = 32)
Apply node that caused the error: Elemwise{Composite{(i0 + Switch(i1, i2, i3))}}[(0, 2)](TensorConstant{(1,) of 1e-24}, Elemwise{gt,no_inplace}.0, Sum{axis=[0], acc_dtype=float64}.0, Sum{axis=[1], acc_dtype=float64}.0)
Toposort index: 61
Inputs types: [TensorType(float64, (True,)), TensorType(bool, (True,)), TensorType(float64, vector), TensorType(float64, vector)]
Inputs shapes: [(1,), (1,), (2080,), (32,)]
Inputs strides: [(8,), (1,), (8,), (8,)]
Inputs values: [array([1.e-24]), array([False]), 'not shown', 'not shown']
Outputs clients: [[InplaceDimShuffle{x,0}(Elemwise{Composite{(i0 + Switch(i1, i2, i3))}}[(0, 2)].0), InplaceDimShuffle{0,x}(Elemwise{Composite{(i0 + Switch(i1, i2, i3))}}[(0, 2)].0), Elemwise{Log}[(0, 0)](Elemwise{Composite{(i0 + Switch(i1, i2, i3))}}[(0, 2)].0)]]

HINT: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done with by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'.
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

Process finished with exit code 1
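
For what it's worth, the node that fails is an elementwise op whose inputs are a length-2080 vector (Sum over axis 0, presumably of a (32, 2080) score matrix, i.e. the batch of 32 plus what I assume are 2048 sampled negatives) and a length-32 vector (Sum over axis 1). A minimal sketch that reproduces the same Theano error from mismatched, non-broadcastable shapes (illustration only, not the actual GRU4Rec loss code):

    # Illustration of the failing pattern: an elementwise Theano op whose two
    # vector inputs have different, non-broadcastable lengths.
    # (Sketch only; the real shapes come from GRU4Rec's loss graph.)
    import numpy as np
    import theano
    import theano.tensor as T

    a = T.dvector('a')  # stands in for the length-2080 Sum{axis=[0]} input
    b = T.dvector('b')  # stands in for the length-32  Sum{axis=[1]} input
    f = theano.function([a, b], 1e-24 + a + b)

    f(np.zeros(2080), np.zeros(32))
    # raises ValueError: Input dimension mis-match.

I am not sure which of the two sums is supposed to be broadcast here, so I am just reporting the shapes as the trace shows them.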
