Description
I'm doing a bunch of batch processing for my entire dataset. I can't give the exact code, but it's a bit like this:
```python
import gc

import pynapple as nap

for path in npz_paths:
    print(f"Processing {path}")
    units = nap.load_file(str(path))  # TsGroup
    # dict of TsGroup, one entry per neuron
    peths = nap.compute_perievent(units, onsets, (-1, 1))
    for i, peth in peths.items():
        peth.save(f"{cell_name}.npz")  # cell_name is built per unit (details omitted)
    del units, peths, peth
    gc.collect()
```
(This is, by the way, a dirty workaround to unpack neuron-indexed perievents, as mentioned in #380.)
My code usually stops running without issuing any warnings or errors after hundreds of iterations. As you can already tell from the title and the code, I suspected memory exhaustion to be the cause. However, adding `del` and `gc.collect()` didn't solve the issue.
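To check whether the process memory actually grows across iterations, I've been considering logging the resident set size each loop. A minimal sketch, assuming `psutil` is installed and reusing the same `npz_paths`/`onsets` names as above (the logging itself is not part of my real pipeline):

```python
import gc

import psutil
import pynapple as nap

process = psutil.Process()  # current process

for path in npz_paths:
    rss_before = process.memory_info().rss
    units = nap.load_file(str(path))
    peths = nap.compute_perievent(units, onsets, (-1, 1))
    # ... save the perievents as in the loop above ...
    del units, peths
    gc.collect()
    rss_after = process.memory_info().rss
    print(f"{path}: RSS {rss_after / 1e6:.0f} MB "
          f"(+{(rss_after - rss_before) / 1e6:.1f} MB this iteration)")
```

If the RSS keeps climbing even after `gc.collect()`, that would point at memory held outside of what the garbage collector can free.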
I would appreciate any feedback! My current workaround is to partition my dataset into several segments and run them separately.
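For reference, one way to make that workaround less manual is to run each file in its own worker process, so the OS reclaims everything when the worker exits. A rough sketch, assuming the files are independent and using `multiprocessing` with `maxtasksperchild=1` (the `process_one` helper and the output naming scheme are hypothetical, not my actual code):

```python
from multiprocessing import Pool

import pynapple as nap


def process_one(path):
    """Load one file, compute its perievents, and save them (hypothetical helper)."""
    units = nap.load_file(str(path))
    peths = nap.compute_perievent(units, onsets, (-1, 1))
    for i, peth in peths.items():
        # naming scheme is illustrative only
        peth.save(f"{path.stem}_unit{i}.npz")
    return path


if __name__ == "__main__":
    # maxtasksperchild=1 restarts the worker after every file, so any memory
    # the worker accumulated is released back to the OS.
    with Pool(processes=2, maxtasksperchild=1) as pool:
        for done in pool.imap_unordered(process_one, npz_paths):
            print(f"Finished {done}")
```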