Update vllm requirement from <=0.6.3 to <=0.8.2 by dependabot[bot] · Pull Request #5 · CJReinforce/PURE · GitHub

Update vllm requirement from <=0.6.3 to <=0.8.2 #5


Closed
wants to merge 1 commit

Conversation

dependabot[bot]
Contributor
@dependabot dependabot bot commented on behalf of github Mar 30, 2025

Updates the requirements on vllm to permit the latest version.

Release notes

Sourced from vllm's releases.

v0.8.2

This release contains an important bug fix for the V1 engine's memory usage. We highly recommend upgrading!

Highlights

  • Revert "Use uv python for docker rather than ppa:deadsnakes/ppa (#13569)" (#15377)
  • Remove openvino support in favor of external plugin (#15339)

V1 Engine

  • Fix V1 Engine crash while handling requests with duplicate request id (#15043)
  • Support FP8 KV Cache (#14570, #15191)
  • Add flag to disable cascade attention (#15243)
  • Scheduler Refactoring: Add Scheduler Interface (#15250)
  • Structured Output
    • Add disable-any-whitespace option support for xgrammar (#15316)
    • guidance backend for structured output + auto fallback mode (#14779)
  • Spec Decode
    • Enable spec decode for top-p & top-k sampling (#15063)
    • Use better defaults for N-gram (#15358)
    • Update target_logits in place for rejection sampling (#15427)
  • AMD
    • Enable Triton(ROCm) Attention backend for Nvidia GPUs (#14071)
  • TPU
    • Support V1 Sampler for ragged attention (#14227)
    • Tensor parallel MP support (#15059)
    • MHA Pallas backend (#15288)

Features

  • Integrate fastsafetensors loader for loading model weights (#10647)
  • Add guidance backend for structured output (#14589)

Others

  • Add Kubernetes deployment guide with CPUs (#14865)
  • Support reset prefix cache by specified device (#15003)
  • Support tool calling and reasoning parser (#14511)
  • Support --disable-uvicorn-access-log parameters (#14754)
  • Support Tele-FLM Model (#15023)
  • Add pipeline parallel support to TransformersModel (#12832)
  • Enable CUDA graph support for llama 3.2 vision (#14917)

What's Changed

... (truncated)

Commits
  • 25f560a [V1][Spec Decode] Update target_logits in place for rejection sampling (#15427)
  • a09ad90 [V1] guidance backend for structured output + auto fallback mode (#14779)
  • 10b34e3 [Bugfix] Fixed the issue of not being able to input video and image simultane...
  • b5269db Revert "Fix non-contiguous input passed to Marlin kernel (#15319)" (#15398)
  • 6db9457 [Misc] Remove LoRA log (#15388)
  • 97cfa65 Add pipeline parallel support to TransformersModel (#12832)
  • 911c8eb [Minor][Spec Decode] Remove compiled_softmax (#15416)
  • ebcebee [V1][Spec Decode] Enable spec decode for top-p & top-k sampling (#15063)
  • f533b58 [ROCm][Kernel] MoE weights padding (#14454)
  • 8279201 [Build] Cython compilation support fix (#14296)
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
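The ignore commands above correspond to ignore conditions in the repository's Dependabot configuration. A minimal sketch of a `dependabot.yml` that would suppress future major-version PRs for vllm (values here are illustrative, not taken from this repository):

```yaml
# .github/dependabot.yml (illustrative sketch)
version: 2
updates:
  - package-ecosystem: "pip"   # matches the pip/requirements ecosystem used here
    directory: "/"             # location of the requirements file
    schedule:
      interval: "weekly"
    ignore:
      - dependency-name: "vllm"
        # skip only major-version bumps; minor/patch PRs still get opened
        update-types: ["version-update:semver-major"]
```

Commenting `@dependabot ignore this major version` on a PR has the same effect as adding such an ignore condition by hand.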

Updates the requirements on [vllm](https://github.com/vllm-project/vllm) to permit the latest version.
- [Release notes](https://github.com/vllm-project/vllm/releases)
- [Changelog](https://github.com/vllm-project/vllm/blob/main/RELEASE.md)
- [Commits](vllm-project/vllm@v0.1.0...v0.8.2)
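As a quick sanity check on the specifier change, the third-party `packaging` library (the same machinery pip uses for version constraints) confirms that v0.8.2 is excluded by the old `<=0.6.3` bound but permitted by the new `<=0.8.2` one:

```python
# Sketch using the `packaging` library to compare the old and new specifiers.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

old = SpecifierSet("<=0.6.3")  # constraint before this PR
new = SpecifierSet("<=0.8.2")  # constraint after this PR
latest = Version("0.8.2")

print(latest in old)  # False: 0.8.2 > 0.6.3, so the old pin blocks it
print(latest in new)  # True: the upper bound is inclusive
```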

---
updated-dependencies:
- dependency-name: vllm
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update python code) labels Mar 30, 2025
@CJReinforce CJReinforce deleted the dependabot/pip/vllm-lte-0.8.2 branch March 30, 2025 11:31
Contributor Author
dependabot bot commented on behalf of github Mar 30, 2025

OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting @dependabot ignore this major version or @dependabot ignore this minor version. You can also ignore all major, minor, or patch releases for a dependency by adding an ignore condition with the desired update_types to your config file.

If you change your mind, just re-open this PR and I'll resolve any conflicts on it.

Labels
dependencies (Pull requests that update a dependency file)
python (Pull requests that update python code)
1 participant