Severe BUG when initializing parameters!!! · Issue #11 · gordicaleksa/pytorch-GAT · GitHub
Severe BUG when initializing parameters!!! #11
Open
@RongfanLi98

Description


Whenever you use nn.Parameter(torch.Tensor(...)), the tensor is allocated from uninitialized memory, so it can contain NaN values and cause training to fail. In case someone skips the nn.init.xavier_uniform_ call, the right way to initialize a parameter is nn.Parameter(torch.rand(...)) or nn.Parameter(torch.randn(...)). For example, see GAT.py.
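A minimal sketch of the point above (tensor shapes are arbitrary, just for illustration): torch.Tensor(...) returns uninitialized memory, so its contents are undefined unless you explicitly initialize it, whereas torch.rand / torch.randn / nn.init.xavier_uniform_ always produce well-defined values.

```python
import torch
import torch.nn as nn

# torch.Tensor(rows, cols) allocates UNINITIALIZED memory: the values are
# whatever bytes happen to be there and can include NaN or inf.
uninit = nn.Parameter(torch.Tensor(8, 8))  # contents undefined -- unsafe as-is

# Safe option 1: sample from a known distribution directly.
w_uniform = nn.Parameter(torch.rand(8, 8))   # U[0, 1)
w_normal = nn.Parameter(torch.randn(8, 8))   # N(0, 1)

# Safe option 2: allocate, then explicitly initialize (e.g. Xavier/Glorot,
# as the repo's code does with nn.init.xavier_uniform_).
w_xavier = nn.Parameter(torch.empty(8, 8))
nn.init.xavier_uniform_(w_xavier)

# The explicitly initialized parameters are guaranteed NaN-free.
print(torch.isnan(w_uniform).any().item())  # False
print(torch.isnan(w_xavier).any().item())   # False
```

Either safe option works; the key is that the uninitialized tensor is only acceptable if an in-place init like xavier_uniform_ is guaranteed to run before the parameter is used.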

See also: https://discuss.pytorch.org/t/nn-parameter-contains-nan-when-initializing/44559
