8000 googlenet.py for MXNet-master by jinboci · Pull Request #1565 · dmlc/gluon-cv · GitHub


Merged
merged 97 commits into from
Jan 15, 2021
Conversation

jinboci
@jinboci jinboci commented Dec 9, 2020

googlenet for MXNet-master
Passed the tests for googlenet.py in test_imagenet_models() in test_model_zoo.py on my local server.

Changes required because mxnet-master's implementation differs from 1.7:

  • nn.HybridSequential on mxnet-master no longer accepts the prefix parameter
  • name_scope is no longer defined in the HybridSequential, HybridBlock, and Block classes on mxnet-master (block.py)
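The two API differences above can be illustrated with plain-Python stand-ins (these are hypothetical simplifications for illustration, not the real mxnet classes):

```python
# Stand-in for the mxnet 1.7 Gluon API: blocks accepted a `prefix`
# argument used for parameter naming.
class HybridSequential17:
    def __init__(self, prefix=None):
        self.prefix = prefix

# Stand-in for the mxnet-master Gluon API: no `prefix` parameter
# (and no name_scope()), so names are derived automatically.
class HybridSequentialMaster:
    def __init__(self):
        pass

# Code written for 1.7 works against the old-style constructor:
old_net = HybridSequential17(prefix="features_")

# The same call pattern fails against the master-style API:
try:
    new_net = HybridSequentialMaster(prefix="features_")
except TypeError:
    print("mxnet-master style: `prefix` is no longer accepted")
```

For googlenet.py this means dropping `prefix=...` arguments from `nn.HybridSequential(...)` calls and removing the surrounding `with self.name_scope():` blocks.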

jinboci and others added 30 commits August 7, 2020 06:27
* fix both document and a bug for RandomCrop

`RandomCrop` pads first and then crops, which is not what the documentation, or even the CIFAR tutorials, describe.

Furthermore, an error occurs with the default `pad=None`:
```
>>> for i in train_data:break
... 
multiprocessing.pool.RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/usr/lib/python3.8/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "/home/neutron/.local/lib/python3.8/site-packages/mxnet/gluon/data/dataloader.py", line 450, in _worker_fn
    batch = batchify_fn([_worker_dataset[i] for i in samples])
  File "/home/neutron/.local/lib/python3.8/site-packages/mxnet/gluon/data/dataloader.py", line 450, in <listcomp>
    batch = batchify_fn([_worker_dataset[i] for i in samples])
  File "/home/neutron/.local/lib/python3.8/site-packages/mxnet/gluon/data/dataset.py", line 219, in __getitem__
    return self._fn(*item)
  File "/home/neutron/.local/lib/python3.8/site-packages/mxnet/gluon/data/dataset.py", line 230, in __call__
    return (self._fn(x),) + args
  File "/home/neutron/.local/lib/python3.8/site-packages/mxnet/gluon/block.py", line 693, in __call__
    out = self.forward(*args)
  File "/home/neutron/.local/lib/python3.8/site-packages/mxnet/gluon/nn/basic_layers.py", line 55, in forward
    x = block(x)
  File "/home/neutron/.local/lib/python3.8/site-packages/mxnet/gluon/block.py", line 693, in __call__
    out = self.forward(*args)
  File "/home/neutron/.local/lib/python3.8/site-packages/gluoncv/data/transforms/block.py", line 75, in forward
    return image.random_crop(nd.array(x_pad), *self._args)[0]
UnboundLocalError: local variable 'x_pad' referenced before assignment
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/neutron/.local/lib/python3.8/site-packages/mxnet/gluon/data/dataloader.py", line 505, in __next__
    batch = pickle.loads(ret.get(self._timeout))
  File "/usr/lib/python3.8/multiprocessing/pool.py", line 771, in get
    raise self._value
UnboundLocalError: local variable 'x_pad' referenced before assignment
```



This PR intends to fix both the documentation and the bug that prevented `pad` from being optional.
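The traceback boils down to a conditional-assignment bug; a minimal plain-Python sketch (a hypothetical simplification of `RandomCrop.forward`, not the real gluoncv code) looks like this:

```python
def forward_buggy(x, pad=None):
    # x_pad is only assigned inside the `if pad` branch...
    if pad:
        x_pad = x + pad
    # ...but is used unconditionally, so pad=None raises
    # UnboundLocalError: local variable 'x_pad' referenced before assignment
    return x_pad

def forward_fixed(x, pad=None):
    # Fix: fall back to the unpadded input when no padding is requested.
    x_pad = x + pad if pad else x
    return x_pad
```

Here `forward_fixed(3)` returns 3, while `forward_buggy(3)` raises the same `UnboundLocalError` shown in the traceback above.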

* make pylint happy.

* happy-2

* remove monkey patch

I think a monkey patch may run faster than the redundant switch in the forward step.
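The performance point can be sketched in plain Python (hypothetical classes, not the actual gluon-cv transform): instead of re-testing a condition on every forward call, bind the chosen implementation once at construction time.

```python
class CropBranchEveryCall:
    """Tests `self.pad` on every forward pass."""
    def __init__(self, pad=None):
        self.pad = pad

    def forward(self, x):
        if self.pad:        # evaluated on each call
            return x + self.pad
        return x

class CropBindOnce:
    """Chooses the implementation once, at construction time."""
    def __init__(self, pad=None):
        if pad:
            self.forward = lambda x: x + pad
        else:
            self.forward = lambda x: x

# Both behave identically; only the per-call branching differs.
a = CropBranchEveryCall(pad=2)
b = CropBindOnce(pad=2)
print(a.forward(3), b.forward(3))  # prints: 5 5
```

Patching (or binding) the method once avoids the dead branch on the hot path, at the cost of being slightly less obvious to readers.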

* make checkers happy
* bump up to 0.9 pre

* fix nightly build mxnet
This reverts commit d2daaa3.
@zhreshold zhreshold marked this pull request as ready for review January 15, 2021 01:11
@zhreshold zhreshold merged commit 31a2eb7 into dmlc:numpy Jan 15, 2021
3 participants