Fix Double Error On Windows #911
Conversation
@elalish @pca006132 I might have misinterpreted the outputs, but it looks to me like the error is simply a stack overflow, and we would just need to move that allocation to the heap. I started trying to fix it, but I wasn't sure about my interpretation of these errors, so I wanted to get your opinion first.
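A minimal sketch of the stack-versus-heap distinction I mean, assuming a default ~1 MB thread stack on Windows/MSVC; the sizes and function names are made up for illustration:

```cpp
#include <vector>

// Roughly 1.6 MB of locals: this can overflow the default ~1 MB stack
// that MSVC gives a thread.
void StackHeavy() {
  double coords[200000] = {};  // lives in this function's stack frame
  coords[0] = 1.0;
}

// The same storage backed by the heap: the vector object itself is only a
// few pointers, and its 200000 doubles are heap-allocated.
void HeapBacked() {
  std::vector<double> coords(200000, 0.0);
  coords[0] = 1.0;
}
```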
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files

@@            Coverage Diff             @@
##           master     #911      +/-   ##
==========================================
- Coverage   91.84%   89.76%    -2.08%
==========================================
  Files          37       66      +29
  Lines        4976     9813    +4837
  Branches        0     1055    +1055
==========================================
+ Hits         4570     8809    +4239
- Misses        406     1004     +598

☔ View full report in Codecov by Sentry.
Ah, a stack overflow - I suppose that could make sense, since it's only happening on our largest input files and only after switching to double precision. But what is being allocated on the stack? The vector should just be a pointer to its heap allocation. I wonder if this is some idiosyncrasy of the way initializer lists are handled in our tests? By all means, if you have an idea, feel free to try it out here.
So, I think it's possible that when we allocate the larger polygon inputs, the entire vector's contents are initially materialized on the stack, since we're declaring it in one go (I'm not sure if that's actually the case). Also, since this input is so large, why don't we write it all to a file and read it back, adding the points one by one?
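As a rough illustration of what I suspect is happening (the point type and values are placeholders, not our actual test data):

```cpp
#include <vector>

struct Vec2 { double x, y; };

std::vector<Vec2> BigTestPolygon() {
  // The braced list is first materialized as the temporary backing array
  // of a std::initializer_list, which the vector then copies into its
  // heap storage. With tens of thousands of points, that temporary may
  // take up enough of the enclosing stack frame to overflow it,
  // especially in unoptimized builds.
  return {{0.0, 0.0}, {1.0, 0.0}, {1.0, 1.0} /* , ...many more points */};
}
```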
Yeah, that's probably the right approach. Our current system was designed only to be very easy to copy-paste from our stdout spew into a new test, not so much for extremely large polygons.
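Something along these lines might be all we need (just a sketch; the file format, point type, and function name here are hypothetical):

```cpp
#include <fstream>
#include <string>
#include <vector>

struct Vec2 { double x, y; };

// Append points one at a time so the only large allocation is the
// vector's heap buffer; nothing big ever lives in a stack frame.
std::vector<Vec2> ReadPolygonFile(const std::string& path) {
  std::vector<Vec2> poly;
  std::ifstream in(path);
  Vec2 p;
  while (in >> p.x >> p.y) poly.push_back(p);
  return poly;
}
```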
Alright, I had started some work on it, but I see that @pca006132 is already much further along than I am. If there's something I can help out with, ping me. Till then, I'll go back to my prior tasks.
Closing in favor of #913. |
Fixes #910, hopefully.
I had an idea while experimenting on Windows and just wanted to check whether it's correct.