From: Bruno Haible
Subject: Re: Simplified Python caching
Date: Mon, 15 Apr 2024 20:24:53 +0200
Hi Collin,
> I was looking at the documentation for the Python standard library
> yesterday and discovered the 'lru_cache()' function [1]. ...
>
>     if 'makefile-unconditional' not in self.cache:
>         # do work and then save it
>         self.cache['makefile-unconditional'] = result
>     return self.cache['makefile-unconditional']
>
> Using '@lru_cache(maxsize=None)' would let Python deal with all of
> this caching for us.
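
For context, here is a minimal sketch of the two styles side by side.
The class and method names below are illustrative stand-ins, not the
actual gnulib-tool.py code:

    from functools import lru_cache

    class GLModule:
        # Illustrative stand-in for the real class; names are made up.

        def __init__(self):
            self.cache = {}

        # Current style: explicit dictionary lookup, visible on the instance.
        def getMakefileUnconditional(self):
            if 'makefile-unconditional' not in self.cache:
                result = 'expensive result'   # placeholder for the real work
                self.cache['makefile-unconditional'] = result
            return self.cache['makefile-unconditional']

        # Proposed style: the decorator memoizes the return value, keyed on self.
        @lru_cache(maxsize=None)
        def getMakefileUnconditionalCached(self):
            return 'expensive result'         # placeholder for the real work
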
It's better not to do this, and to keep the code as-is.
Two reasons:
* As I already mentioned, it's good to keep the code understandable,
without going "all-in" on all possible features that the programming
language offers.
I'm all for simplifications that keep us at the same level of
abstraction (such as using 'open' instead of 'codecs.open'). But here,
we would be starting to use the meta-level ("higher-order functions").
This can be useful in some cases, e.g. Java's JUnit makes good use of it
in order to avoid repetitive code. But it adds an extra, steep piece of
learning curve. For saving 9 × 3 lines of code? Not worth it.
* It would introduce hidden global state, which would constrain future
development. Namely, if in the future we needed a GLModule to become
mutable (not sure this would be a good decision, but anyway), then these
caches behind the scenes would become a big surprise. It's better to keep
the state visible in plain sight, not hidden; the sketch below illustrates
the pitfall.
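
To make that concrete, here is a contrived sketch (not gnulib code): the
cache is keyed on the instance but lives on the function object, so a
mutated object keeps returning the stale value until the global
cache_clear() is called:

    from functools import lru_cache

    class Module:
        def __init__(self, name):
            self.name = name

        @lru_cache(maxsize=None)          # the cache lives on the function, not on the instance
        def description(self):
            return 'Module ' + self.name  # stands in for expensive work

    m = Module('foo')
    print(m.description())                # 'Module foo' -- computed and cached
    m.name = 'bar'                        # mutate the object ...
    print(m.description())                # still 'Module foo' -- the hidden cache wins
    Module.description.cache_clear()      # the only reset, and it is global to all instances
    print(m.description())                # now 'Module bar'

In addition, that hidden cache keeps a reference to every instance it has
seen, so those instances are never garbage-collected while the class is
alive.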
Bruno