Add support for torch exported model by agunapal · Pull Request #2812 · pytorch/serve · GitHub

Add support for torch exported model #2812


Closed
wants to merge 3 commits

Conversation

agunapal (Collaborator)

Description

This PR adds support for:

  • Loading a torch.export model
  • An example with ResNet 18
  • A pytest for torch.export

Fixes #(issue)

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • New feature (non-breaking change which adds functionality)
  • This change requires a documentation update

Feature/Issue validation/testing

Please describe the Unit or Integration tests that you ran to verify your changes and relevant result summary. Provide instructions so it can be reproduced.
Please also list any relevant details for your test configuration.

================================================================================== test session starts ===================================================================================
platform linux -- Python 3.8.18, pytest-7.3.1, pluggy-1.0.0 -- /home/ubuntu/anaconda3/envs/torchserve/bin/python
cachedir: .pytest_cache
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/ubuntu/serve/test/pytest/.hypothesis/examples')
rootdir: /home/ubuntu/serve
plugins: mock-3.10.0, anyio-3.6.1, cov-4.1.0, hypothesis-6.54.3
collected 1 item                                                                                                                                                                         

test_torch_export.py::test_execute_script_in_custom_directory PASSED                                                                                                               [100%]

==================================================================================== warnings summary ====================================================================================
test_torch_export.py:3
  /home/ubuntu/serve/test/pytest/test_torch_export.py:3: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    from pkg_resources import packaging

../../../anaconda3/envs/torchserve/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
  /home/ubuntu/anaconda3/envs/torchserve/lib/python3.8/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('mpl_toolkits')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../../anaconda3/envs/torchserve/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
../../../anaconda3/envs/torchserve/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
  /home/ubuntu/anaconda3/envs/torchserve/lib/python3.8/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('ruamel')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../../anaconda3/envs/torchserve/lib/python3.8/site-packages/torchvision/transforms/_functional_pil.py:242
  /home/ubuntu/anaconda3/envs/torchserve/lib/python3.8/site-packages/torchvision/transforms/_functional_pil.py:242: DeprecationWarning: BILINEAR is deprecated and will be removed in Pillow 10 (2023-07-01). Use Resampling.BILINEAR instead.
    interpolation: int = Image.BILINEAR,

../../../anaconda3/envs/torchserve/lib/python3.8/site-packages/torchvision/transforms/_functional_pil.py:288
  /home/ubuntu/anaconda3/envs/torchserve/lib/python3.8/site-packages/torchvision/transforms/_functional_pil.py:288: DeprecationWarning: NEAREST is deprecated and will be removed in Pillow 10 (2023-07-01). Use Resampling.NEAREST or Dither.NONE instead.
    interpolation: int = Image.NEAREST,

../../../anaconda3/envs/torchserve/lib/python3.8/site-packages/torchvision/transforms/_functional_pil.py:304
  /home/ubuntu/anaconda3/envs/torchserve/lib/python3.8/site-packages/torchvision/transforms/_functional_pil.py:304: DeprecationWarning: NEAREST is deprecated and will be removed in Pillow 10 (2023-07-01). Use Resampling.NEAREST or Dither.NONE instead.
    interpolation: int = Image.NEAREST,

../../../anaconda3/envs/torchserve/lib/python3.8/site-packages/torchvision/transforms/_functional_pil.py:321
  /home/ubuntu/anaconda3/envs/torchserve/lib/python3.8/site-packages/torchvision/transforms/_functional_pil.py:321: DeprecationWarning: BICUBIC is deprecated and will be removed in Pillow 10 (2023-07-01). Use Resampling.BICUBIC instead.
    interpolation: int = Image.BICUBIC,

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
============================================================================= 1 passed, 8 warnings in 8.73s ============================================================================

Checklist:

  • Did you have fun?
  • Have you added tests that prove your fix is effective or that this feature works?
  • Has code been commented, particularly in hard-to-understand areas?
  • Have you made corresponding changes to the documentation?

@agunapal agunapal changed the title (WIP)Add support for torch exported model Add support for torch exported model Nov 28, 2023
@agunapal agunapal requested review from msaroufim and mreso November 28, 2023 20:44
@msaroufim (Member) left a comment:
Right now there is no reason to just export a model; you want to combine this with an AOT Inductor tutorial so people can get ahead-of-time compilation and avoid cold starts.

@@ -0,0 +1,62 @@
# TorchServe inference with torch export model
msaroufim (Member):

There were some export-related details on the main README as well; could you consolidate?


- `PyTorch >= 2.1.0`

Change directory to the root of `serve`
msaroufim (Member):
`cd serve` is simpler

`torch.export.export()` takes a `torch.nn.Module` or a method along with sample inputs, and captures the computation graph into a `torch.export.ExportedProgram`

`torch.export` differs from `torch.compile` in a few ways
- JIT vs AOT
msaroufim (Member):

This needs a longer elaboration.

### Create model archive

```
torch-model-archiver --model-name res18-pt2 --handler image_classifier --version 1.0 --serialized-file resnet18.pt2 --extra-files ./examples/image_classifier/index_to_name.json
```
msaroufim (Member):
You need to update the archiver code docstrings as well.


msaroufim (Member):

You need to combine AOT Inductor with this story.

@agunapal (Collaborator, Author):

torch.export.export by itself only creates an FX graph representation of the model and serializes it, so at inference the model is executed in eager mode. Hence, even though this works, I'm closing this PR as it doesn't make use of torch.compile.

@agunapal agunapal closed this Nov 29, 2023