I just spent hours trying to find out why my function didn't get cached when using `@memory.cache`. It turns out the reason was that the function returned an object for which `pickle` threw an exception when trying to write it to disk.

This exception is silently swallowed in `_store_backends.py`:
```python
def dump_item(self, path, item, verbose=1):
    """Dump an item in the store at the path given as a list of
    strings."""
    try:
        item_path = os.path.join(self.location, *path)
        if not self._item_exists(item_path):
            self.create_location(item_path)
        filename = os.path.join(item_path, 'output.pkl')

        if verbose > 10:
            print('Persisting in %s' % item_path)

        def write_func(to_write, dest_filename):
            with self._open_item(dest_filename, "wb") as f:
                numpy_pickle.dump(to_write, f,
                                  compress=self.compress)

        self._concurrency_safe_write(item, filename, write_func)
    except:  # noqa: E722
        " Race condition in the creation of the directory "
```
It would be great if at least a log message were generated, along the lines of "function XY could not be cached due to an unpicklable function result", even if it is only logged at a high verbosity level.
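Something in that spirit could be as simple as emitting a warning from the `except` clause instead of discarding the error. Below is a rough, self-contained sketch of the idea, not joblib's actual code; the helper name, message wording, and paths are hypothetical:

```python
# Sketch of the requested behaviour: report the pickling failure instead of
# swallowing it. Any exception type is caught here, mirroring the bare except
# in dump_item above.
import pickle
import warnings

def dump_or_warn(item, filename, func_name):
    """Try to persist `item` to `filename`; warn instead of failing silently."""
    try:
        with open(filename, "wb") as f:
            pickle.dump(item, f)
    except Exception as exc:
        warnings.warn(
            "Function %s could not be cached because its result could not be "
            "pickled: %r" % (func_name, exc)
        )

# The unpicklable lambda triggers the warning instead of a silent no-op.
dump_or_warn(lambda y: y + 1, "/tmp/output.pkl", "make_adder")
```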