feat: reserve interface for other torch devices by Tohrusky · Pull Request #27 · Tohrusky/Final2x-core · GitHub

feat: reserve interface for other torch devices #27


Merged
merged 2 commits into Tohrusky:main on Nov 19, 2024

Conversation

@Tohrusky (Owner) commented Nov 19, 2024

Summary by Sourcery

Add support for 'directml' device and refactor device initialization to improve flexibility in selecting torch devices. Update the project version to 3.0.2.

New Features:

  • Introduce support for 'directml' device in the device selection process.

Enhancements:

  • Refactor device initialization to use a new utility function 'get_device' for determining the appropriate torch device (see the sketch after this summary).

Chores:

  • Update project version from 3.0.1 to 3.0.2 in pyproject.toml.
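
A minimal sketch of what such a get_device helper can look like, assuming the branch set visible in the review excerpts further down (cuda/mps/xpu/directml) and a string fallback; the exact branches and the fallback value in Final2x_core/util/device.py may differ:

from typing import Union

import torch


def get_device(device: str) -> Union[torch.device, str]:
    # Hedged sketch, not the merged implementation.
    if device.startswith("cuda"):
        return torch.device(device)
    elif device.startswith("mps"):
        return torch.device("mps")
    elif device.startswith("xpu"):
        return torch.device("xpu")
    elif device.startswith("directml"):
        # Optional dependency, imported only when a DirectML device is requested.
        import torch_directml
        return torch_directml.device()
    else:
        # Unknown device strings fall back to automatic selection.
        print(f"Unknown device: {device}, use auto instead.")
        return "auto"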

sourcery-ai bot commented Nov 19, 2024

Reviewer's Guide by Sourcery

This PR refactors device handling in the Final2x_core library by introducing a dedicated device utility function and updating the supported device list. The changes centralize device initialization logic and add support for DirectML devices.

Class diagram for updated device handling in Final2x_core

classDiagram
    class CCRestoration {
        -SRBaseModel _SR_class
        +CCRestoration(SRConfig config)
        +process(np.ndarray img) np.ndarray
    }
    class SRConfig {
        +String device
        +String pretrained_model_name
        +String gh_proxy
    }
    class AutoModel {
        +from_pretrained(String pretrained_model_name, Boolean fp16, Union<torch.device, String> device, String gh_proxy)
    }
    class device {
        +get_device(String device) Union<torch.device, String>
    }
    CCRestoration --> SRConfig
    CCRestoration --> AutoModel
    AutoModel --> device
    note for device "New utility function for device handling"
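
Reading the diagram as code, the wiring could look roughly like the sketch below; the import path for AutoModel, the fp16 default, and the delegation inside process() are assumptions based on the diagram, not the exact contents of Final2x_core/SRclass.py:

import numpy as np
from ccrestoration import AutoModel  # assumed import path

from Final2x_core.util.device import get_device


class CCRestoration:
    def __init__(self, config) -> None:
        # Device initialization happens directly in __init__,
        # delegating the string -> device mapping to get_device().
        self._SR_class = AutoModel.from_pretrained(
            pretrained_model_name=config.pretrained_model_name,
            fp16=True,  # assumed default; may be driven by config instead
            device=get_device(config.device),
            gh_proxy=config.gh_proxy,
        )

    def process(self, img: np.ndarray) -> np.ndarray:
        # Assumed delegation to the underlying SR model.
        return self._SR_class.process(img)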

File-Level Changes

Introduced a new device handling utility function (Final2x_core/util/device.py)
  • Created a new utility function get_device() to centralize device initialization logic
  • Added support for DirectML devices
  • Implemented fallback to default device for unknown device types

Refactored SR model initialization process (Final2x_core/SRclass.py)
  • Removed separate _init_SR_model method
  • Integrated device initialization directly into __init__
  • Updated device handling to use the new get_device utility function

Updated supported device list configuration (Final2x_core/config.py, pyproject.toml)
  • Added 'directml' to the list of supported devices
  • Removed 'xla' and 'meta' from supported devices
  • Updated version number to 3.0.2
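
For the configuration change, a hypothetical sketch of the device whitelist in Final2x_core/config.py follows; the constant name, the entries other than 'directml', 'xla' and 'meta', and the validation function are placeholders rather than the file's actual structure:

# Hypothetical sketch only; names and validation style are placeholders.
SUPPORTED_DEVICES = (
    "auto",
    "cpu",
    "cuda",
    "mps",
    "xpu",
    "directml",  # newly supported in this PR
    # 'xla' and 'meta' were removed from the supported list
)


def check_device(device: str) -> str:
    # Accept exact names and indexed forms such as "cuda:0".
    if not any(device == d or device.startswith(f"{d}:") for d in SUPPORTED_DEVICES):
        raise ValueError(f"Unsupported device: {device!r}; supported: {SUPPORTED_DEVICES}")
    return device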


@sourcery-ai sourcery-ai bot left a comment


Hey @Tohrusky - I've reviewed your changes and they look great!

Here's what I looked at during the review
  • 🟡 General issues: 2 issues found
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟢 Complexity: all looks good
  • 🟢 Documentation: all looks good


elif device.startswith("xpu"):
return torch.device("xpu")
else:
print(f"Unknown device: {device}, use auto instead.")

suggestion: Consider using logger.warning instead of print for unknown device fallback

Using the logger would be more consistent with the rest of the codebase and provide better error tracking. Also consider raising a ValueError instead of falling back silently.

        logger.warning(f"Unknown device: {device}, use auto instead.")

elif device.startswith("mps"):
return torch.device("mps")
elif device.startswith("directml"):
import torch_directml

suggestion: Add explicit error handling for optional directml import

Consider wrapping this in a try-except block to provide a more informative error message if torch_directml is not installed.

Suggested change
-        import torch_directml
+        try:
+            import torch_directml
+        except ImportError:
+            raise ImportError("torch_directml is not installed. Please install it to use DirectML device.")

@Tohrusky Tohrusky merged commit 0da0246 into Tohrusky:main Nov 19, 2024
8 checks passed