---
title: "Quick Python: Memoization"
date: 2020-03-30T17:31:55-04:00
draft: false
tags: ["Python"]
medium_enabled: true
---
There is often a trade-off between CPU time and memory usage. In this post, I will show how the [`lru_cache`](https://docs.python.org/3/library/functools.html#functools.lru_cache) decorator can cache the results of a function call for quicker future lookups.
```python
from functools import lru_cache
@lru_cache(maxsize=2**7)
def fib(n):
    if n == 1:
        return 0
    if n == 2:
        return 1
    return fib(n - 1) + fib(n - 2)
```
In the code above, `maxsize` caps how many results are stored. Setting it to `None` removes the upper bound entirely. The documentation recommends setting it to a power of two for best performance.
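
You can peek at how the cache is being used through the `cache_info()` method that `lru_cache` attaches to the decorated function. Here is a small sketch using the `fib` function above; the call to `fib(20)` is just for illustration.

```python
# Compute a Fibonacci number, then inspect the cache
fib(20)
print(fib.cache_info())
# With a fresh cache this prints:
# CacheInfo(hits=17, misses=20, maxsize=128, currsize=20)

# cache_clear() empties the cache if you ever need to reset it
fib.cache_clear()
```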
Do note, though, that `lru_cache` does not make the lines inside the function run any faster. It only stores the results of previous calls, keyed by their arguments, in a dictionary, so repeated calls with the same arguments skip the computation entirely.
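
To see what that means in practice, here is a small sketch where the first call pays the full cost and the second is answered from the cache. The `slow_square` function and the one-second sleep are purely illustrative.

```python
import time
from functools import lru_cache

@lru_cache(maxsize=None)
def slow_square(n):
    # Simulate an expensive computation
    time.sleep(1)
    return n * n

start = time.perf_counter()
slow_square(4)          # runs the body: about 1 second
first = time.perf_counter() - start

start = time.perf_counter()
slow_square(4)          # served from the cache: near-instant
second = time.perf_counter() - start

print(f"first call:  {first:.3f}s")
print(f"second call: {second:.6f}s")
```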