spell check for opacus/opacus documentation by zycalice · Pull Request #480 · pytorch/opacus · GitHub

spell check for opacus/opacus documentation #480


Closed
wants to merge 1 commit into from
1 change: 1 addition & 0 deletions .gitignore
@@ -111,3 +111,4 @@ website/static/js/*
!website/static/js/mathjax.js
!website/static/js/code_block_buttons.js
website/static/_sphinx-sources/
+ spelling_wordlist.txt
4 changes: 2 additions & 2 deletions opacus/privacy_engine.py
@@ -321,7 +321,7 @@ def make_private(
grad_sample_mode: str = "hooks",
) -> Tuple[GradSampleModule, DPOptimizer, DataLoader]:
"""
- Add privacy-related responsibilites to the main PyTorch training objects:
+ Add privacy-related responsibilities to the main PyTorch training objects:
model, optimizer, and the data loader.

All of the returned objects act just like their non-private counterparts
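
For reference, a minimal sketch of how ``make_private`` is typically called (not part of this diff; ``model``, ``optimizer``, and ``data_loader`` are placeholder names for ordinary PyTorch training objects):

    from opacus import PrivacyEngine

    privacy_engine = PrivacyEngine()
    # Wraps the training objects with their privacy-aware counterparts
    model, optimizer, data_loader = privacy_engine.make_private(
        module=model,
        optimizer=optimizer,
        data_loader=data_loader,
        noise_multiplier=1.0,  # noise std relative to the clipping norm
        max_grad_norm=1.0,     # per-sample gradient clipping threshold
    )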
@@ -552,7 +552,7 @@ def save_checkpoint(
torch_save_kwargs: Optional[Dict[str, Any]] = None,
):
"""
- Saves the state_dict of module, optimzer, and accountant at path.
+ Saves the state_dict of module, optimizer, and accountant at path.
Args:
path: Path to save the state dict objects.
module: GradSampleModule to save; wrapped module's state_dict is saved.
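
A hedged usage sketch for ``save_checkpoint`` (the path and object names are placeholders; exact keyword arguments may vary between Opacus versions):

    # Persists module, optimizer, and accountant state in one file
    privacy_engine.save_checkpoint(
        path="dp_checkpoint.pt",
        module=model,         # GradSampleModule returned by make_private
        optimizer=optimizer,  # DPOptimizer returned by make_private
    )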
12 changes: 6 additions & 6 deletions opacus/utils/batch_memory_manager.py
@@ -29,7 +29,7 @@ class BatchSplittingSampler(Sampler[List[int]]):
Samples according to the underlying instance of ``Sampler``, but splits
the index sequences into smaller chunks.

- Used to split large logical batches into physocal batches of a smaller size,
+ Used to split large logical batches into physical batches of a smaller size,
while coordinating with DPOptimizer when the logical batch has ended.
"""

@@ -117,16 +117,16 @@ class BatchMemoryManager(object):

Allows setting hard limit on the physical batch size as a just one line code change.
Can be used both for simulating large logical batches with limited memory and for
- safeguarding against occasinal large batches produced by
+ safeguarding against occasional large batches produced by
:class:`~opacus.utils.uniform_sampler.UniformWithReplacementSampler`.

Note that it doesn't modify the input DataLoader, you'd need to use new DataLoader
returned by the context manager.

BatchSplittingSampler will split large logical batches into smaller sub-batches with
certain maximum size.
- On every step optimzer will check if the batch was the last physical batch comprising
- a logical one, and will change behaviour accordignly.
+ On every step optimizer will check if the batch was the last physical batch comprising
+ a logical one, and will change behaviour accordingly.

If it was not the last, ``optimizer.step()`` will only clip per sample gradients and
sum them into ``p.summed_grad``. ``optimizer.zero_grad()`` will clear ``p.grad_sample``,
@@ -136,8 +136,8 @@ class BatchMemoryManager(object):
``optimizer.step()`` and ``optimizer.zero_grad()`` will behave normally.

Example:
- >>> # Assuming you've initialized you objects and passed them to PrivacyEngine.
- >>> # For this example we assume data_loader is initalized with batch_size=4
+ >>> # Assuming you've initialized your objects and passed them to PrivacyEngine.
+ >>> # For this example we assume data_loader is initialized with batch_size=4
>>> model, optimizer, data_loader = _init_private_training()
>>> criterion = nn.CrossEntropyLoss()
>>> with BatchMemoryManager(
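
The example above is truncated here; a plausible completion, written as a sketch rather than the verbatim docstring, looks like this:

    >>> with BatchMemoryManager(
    ...     data_loader=data_loader, max_physical_batch_size=2, optimizer=optimizer
    ... ) as memory_safe_data_loader:
    ...     for data, target in memory_safe_data_loader:
    ...         optimizer.zero_grad()
    ...         loss = criterion(model(data), target)
    ...         loss.backward()
    ...         optimizer.step()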
8 changes: 4 additions & 4 deletions opacus/validators/module_validator.py
@@ -42,7 +42,7 @@ def validate(
) -> List[UnsupportedModuleError]:
"""
Validate module and sub_modules by running registered custom validators.
- Returns or raises excpetions depending on ``strict`` flag.
+ Returns or raises exceptions depending on ``strict`` flag.

Args:
module: The root module to validate.
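
A short usage sketch (not part of this diff; ``model`` is a placeholder module):

    from opacus.validators import ModuleValidator

    # With strict=False, returns a list of UnsupportedModuleError objects;
    # with strict=True, raises as soon as an unsupported module is found
    errors = ModuleValidator.validate(model, strict=False)
    if errors:
        print(errors)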
@@ -98,14 +98,14 @@ def fix(cls, module: nn.Module, **kwargs) -> nn.Module:
module = clone_module(module)
# iterate over all sub_modules
# We have to get sub_module names in a list first as we will be
- # changing the modules inside the the loop.
+ # changing the modules inside the loop.
sub_module_names = [name for name, _ in trainable_modules(module)]
for sub_module_name in sub_module_names:
# get sub_module
sub_module = get_submodule(module, sub_module_name)
# if sub_module has a registered fixer
if type(sub_module) in ModuleValidator.FIXERS:
- # get a repalcement for sub_module
+ # get a replacement for sub_module
sub_module_fixer = ModuleValidator.FIXERS[type(sub_module)]
new_sub_module = sub_module_fixer(sub_module, **kwargs)
# move new_sub_module to the same device as that of sub_module
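
In practice ``fix`` is usually called once before training to swap unsupported layers for supported replacements; a hedged one-line sketch:

    # Returns a cloned module with registered fixers applied,
    # e.g. BatchNorm layers replaced by GroupNorm
    model = ModuleValidator.fix(model)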
@@ -150,7 +150,7 @@ def fix_and_validate(cls, module: nn.Module, **kwargs) -> nn.Module:
Fix the module and sub_modules first, and then run validation.

Args:
- module: The root module to be fixed and validted
+ module: The root module to be fixed and validated
**kwargs: Arbitrary keyword arguments.

Returns:
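
And the combined helper, again as a hedged sketch:

    # Apply all registered fixers, then validate strictly
    # (raises if unsupported modules remain)
    model = ModuleValidator.fix_and_validate(model)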