Insights: willccbb/verifiers
Overview
- 0 Merged pull requests
- 1 Open pull request
- 3 Closed issues
- 11 New issues
There hasn’t been any commit activity on willccbb/verifiers in the last week.
1 Pull request opened by 1 person
- Add design patterns / architecture overview to docs (#102, opened Jun 23, 2025)
3 Issues closed by 3 people
- Online drop of non-signal samples (#107, closed Jun 26, 2025)
- OOM (#98, closed Jun 26, 2025)
- flash attention error afaik (#96, closed Jun 23, 2025)
11 Issues opened by 9 people
- Question regarding the effective batch size (#112, opened Jun 28, 2025)
- vf-vllm parameter issue with vllm (#111, opened Jun 27, 2025)
- Add a parameter for max tokens per turn (#110, opened Jun 26, 2025)
- vLLM batching questions (#109, opened Jun 26, 2025)
- Crashing at the end of dataset (#108, opened Jun 25, 2025)
- Running Qwen 2.5-72B-Instruct with LoRA (#106, opened Jun 25, 2025)
- --enforce-eager default parameter (#105, opened Jun 24, 2025)
- Possible to train on a single GPU? (#104, opened Jun 24, 2025)
- Evaluation Timeout & Error Message Handling during Training (#103, opened Jun 23, 2025)
- RuntimeError: Cannot re-initialize CUDA in forked subprocess. (#101, opened Jun 23, 2025)
- sampling_args not defined in smola_tool_env.py (#100, opened Jun 22, 2025)
1 Unresolved conversation
Conversations sometimes continue on older items that are not yet closed. Below is a list of all Issues and Pull Requests with unresolved conversations.
- add: RAG tool (simple BM25) (#61, commented on Jun 25, 2025, 1 new comment)