Get all keys in Redis database with python
Disclaimer: this page mirrors a popular StackOverflow question and its answers under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/22255589/
Asked by tscizzle
There is a post about a Redis command to get all available keys, but I would like to do it with Python.
Any way to do this?
Accepted answer by Patrick Collins
Use scan_iter()
scan_iter() is superior to keys() for large numbers of keys because it gives you an iterator you can use rather than trying to load all the keys into memory.
I had 1B records in my redis and I could never get enough memory to return all the keys at once.
SCANNING KEYS ONE-BY-ONE
Here is a python snippet using scan_iter() to get all keys from the store matching a pattern and delete them one-by-one:
import redis
r = redis.StrictRedis(host='localhost', port=6379, db=0)
for key in r.scan_iter("user:*"):
    # delete the key
    r.delete(key)
SCANNING IN BATCHES
If you have a very large list of keys to scan - for example, more than 100k keys - it will be more efficient to scan them in batches, like this:
import redis
from itertools import zip_longest  # izip_longest on Python 2

r = redis.StrictRedis(host='localhost', port=6379, db=0)

# iterate a list in batches of size n
def batcher(iterable, n):
    args = [iter(iterable)] * n
    return zip_longest(*args)

# in batches of 500 delete keys matching user:*
for keybatch in batcher(r.scan_iter('user:*'), 500):
    # zip_longest pads the final batch with None; drop the padding
    r.delete(*filter(None, keybatch))
I benchmarked this script and found that using a batch size of 500 was 5 times faster than scanning keys one-by-one. I tested different batch sizes (3, 50, 500, 1000, 5000) and found that a batch size of 500 seemed to be optimal.
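The padding behavior of the batching helper is worth seeing in isolation. Here is a minimal sketch (plain Python, no Redis connection needed, with hypothetical sample keys) showing how zip_longest pads the final batch with None, which is why the padding must be dropped before passing a batch to delete:

```python
from itertools import zip_longest

def batcher(iterable, n):
    # replicate the same iterator n times so zip_longest pulls
    # n consecutive items into each batch tuple
    args = [iter(iterable)] * n
    return zip_longest(*args)

# hypothetical sample keys, 7 items batched in threes
keys = ["user:%d" % i for i in range(7)]
batches = [list(b) for b in batcher(keys, 3)]
# the final batch is padded with None up to size 3:
# batches[-1] == ["user:6", None, None]
cleaned = [[k for k in b if k is not None] for b in batches]
```

Passing the raw final batch to r.delete() would try to delete a key for each None, so filtering the padding (as the batch example above does) matters.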
Note that whether you use the scan_iter() or keys() method, the operation is not atomic and could fail part way through.
DEFINITELY AVOID USING XARGS ON THE COMMAND-LINE
I do not recommend this example, which I found repeated elsewhere. It will fail for unicode keys and is incredibly slow for even moderate numbers of keys:
redis-cli --raw keys "user:*" | xargs redis-cli del
In this example xargs creates a new redis-cli process for every key! That's bad.
I benchmarked this approach as 4 times slower than the first python example (which deleted every key one-by-one) and 20 times slower than deleting in batches of 500.
Answered by fedorqui 'SO stop harming'
Yes, use keys() from the StrictRedis module:
>>> import redis
>>> r = redis.StrictRedis(host=YOUR_HOST, port=YOUR_PORT, db=YOUR_DB)
>>> r.keys()
Giving a null pattern will fetch all of them. As per the page linked:
keys(pattern='*')
Returns a list of keys matching pattern
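The pattern accepted here is a shell-style glob (*, ?, [abc]). As a rough illustration of that matching behavior, Python's fnmatch module implements similar glob semantics — this is only an approximation (Redis uses its own matcher, so edge cases may differ), and the sample keys are hypothetical:

```python
from fnmatch import fnmatchcase

# hypothetical sample keys
sample = ["user:1", "user:42", "session:9", "user:admin"]

# 'user:*' matches any key with the 'user:' prefix
matched = [k for k in sample if fnmatchcase(k, "user:*")]

# 'user:?' matches exactly one trailing character
single = [k for k in sample if fnmatchcase(k, "user:?")]
```

So r.keys('user:*') would return every key prefixed with user:, while r.keys() with the default '*' pattern returns everything.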
Answered by Black_Rider
import redis
r = redis.Redis("localhost", 6379)
for key in r.scan_iter():
    print(key)
using the Pyredis library
Available since 2.8.0.
Time complexity: O(1) for every call. O(N) for a complete iteration, including enough command calls for the cursor to return back to 0. N is the number of elements inside the collection.
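That cursor contract can also be driven by hand: call SCAN with a cursor (starting at 0), collect the keys from each reply, and repeat until the server returns cursor 0. Below is a sketch written against a generic scan callable so it can be exercised without a live server; redis-py's r.scan(cursor, match=...) returns the same (cursor, keys) shape, and fake_scan here is just a stand-in stub:

```python
def scan_all(scan, match=None):
    """Collect every key by following the SCAN cursor until it wraps to 0."""
    cursor, keys = 0, []
    while True:
        cursor, batch = scan(cursor, match)
        keys.extend(batch)
        if cursor == 0:
            return keys

# stub standing in for a server's SCAN: two pages, then cursor 0
pages = {0: (17, ["a", "b"]), 17: (0, ["c"])}
def fake_scan(cursor, match=None):
    return pages[cursor]

all_keys = scan_all(fake_scan)
```

With a real connection you would pass something like lambda c, m: r.scan(c, match=m) — scan_iter() wraps exactly this loop for you.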

