shuf: panic due to capacity overflow when allocating a vector #1420
Can you provide the file? I should have time this weekend to figure out what's wrong (and deal with some of the other issues that have stacked up...).
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
This is not tied to a specific operating system. I am on Ubuntu 20.04, GNU coreutils v8.30. GNU shuf:
uutils shuf:
This is happening because we are creating a vector of all integers in the given interval before shuffling. (This test case is using the "shuffle numbers in a given range" mode, but the same thing will happen with a sufficiently large file because all lines are read into memory before shuffling.)
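For context, here is a minimal Rust sketch of the failure mode described above, assuming the `rand` crate. The function names are hypothetical and this is not the actual uutils code: the first function materializes the whole interval the way the comment describes, which is what triggers the capacity-overflow panic for a huge range; the second shows one way to avoid the up-front allocation when only a fixed number of outputs is needed, though it samples with replacement and so is not a drop-in replacement for shuf's semantics.

```rust
// Minimal sketch (not the actual uutils implementation) of why a range
// mode like `shuf -i LO-HI` can panic with "capacity overflow": building
// every value in the interval requires a Vec whose requested capacity can
// exceed what Vec is allowed to allocate.

use rand::seq::SliceRandom; // assumes the `rand` crate is available
use rand::Rng;

/// Hypothetical "materialize everything" approach. For an interval like
/// 0..=u64::MAX this asks Vec for an enormous capacity and panics with
/// "capacity overflow" before any shuffling happens.
fn naive_range_shuffle(lo: u64, hi: u64) -> Vec<u64> {
    let mut values: Vec<u64> = (lo..=hi).collect(); // allocation happens here
    values.shuffle(&mut rand::thread_rng());
    values
}

/// Hypothetical alternative when only `count` outputs are needed
/// (e.g. `shuf -n COUNT -i LO-HI`): sample directly from the range
/// instead of materializing it. Memory use is O(count), not O(hi - lo).
/// Note: this samples with replacement, unlike shuf's default behavior,
/// so it only illustrates avoiding the huge allocation.
fn sampled_range(lo: u64, hi: u64, count: usize) -> Vec<u64> {
    let mut rng = rand::thread_rng();
    (0..count).map(|_| rng.gen_range(lo..=hi)).collect()
}

fn main() {
    // Small range: the naive approach is fine.
    println!("{:?}", naive_range_shuffle(1, 10));

    // Huge range: the naive approach would panic with "capacity overflow",
    // while sampling stays cheap.
    println!("{:?}", sampled_range(0, u64::MAX, 5));
}
```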
I used shuf on a large txt file (about 1 GB) and it threw a capacity overflow. Details: