So for data that has already been written, how can we analyze it and locate the bigkeys that need optimizing? We can use the --bigkeys option of the official Redis client, redis-cli, to find out how the big keys are distributed.
shell> redis-cli -h 127.0.0.1 -p 18708 -a xxxx --bigkeys -i 0.01
[00.00%] Biggest string found so far 'urlcount:www.guprocessorSuccessMid' with 1 bytes
[00.01%] Biggest string found so far 'TestDomain:www:config:scheduler' with 3847 bytes
[00.03%] Biggest string found so far 'TestDomain:www:config:scheduler' with 211306 bytes
[00.88%] Biggest set found so far 'specialTestJobSet:www' with 20 members
[01.69%] Biggest list found so far 'TestDomain:www:urlList' with 9762 items
[07.13%] Biggest list found so far 'TestDomain:bx:urlList' with 457676 items
[07.39%] Biggest set found so far 'specialTestJobSet:www' with 100 members
[13.99%] Biggest string found so far 'TestDomain:wwwe:config:scheduler' with 540731 bytes
[18.74%] Biggest set found so far 'TestJobSet' with 300 members
[58.09%] Biggest string found so far 'TestDomain:wwwrt:config:scheduler' with 739024 bytes
[64.19%] Biggest string found so far 'TestDomain:bx:config:scheduler' with 1335468 bytes
-------- summary -------
Sampled 62522 keys in the keyspace!
Total key length in bytes is 2471471 (avg len 39.53)
Biggest list found 'TestDomain:bx:urlList' has 457676 items
Biggest string found 'TestDomain:bx:config:scheduler' has 1335468 bytes
Biggest set found 'TestJobSet' has 300 members
208 lists with 2408539 items (00.33% of keys, avg size 11579.51)
0 hashs with 0 fields (00.00% of keys, avg size 0.00)
62283 strings with 32642667 bytes (99.62% of keys, avg size 524.10)
0 streams with 0 entries (00.00% of keys, avg size 0.00)
31 sets with 1354 members (00.05% of keys, avg size 43.68)
0 zsets with 0 members (00.00% of keys, avg size 0.00)
From this output we can see, for each data type, which key is the longest or has the most members, as well as each type's share of all keys in the instance and its average size or member count.
In fact, the way this command works is that redis-cli issues the SCAN command to iterate over every key in the instance, and then, depending on each key's type, runs STRLEN, HLEN, LLEN, SCARD, or ZCARD to obtain the length of String values and the member counts of the collection types (Hash, List, Set, ZSet).
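If you want a list of the largest keys rather than only the single biggest key per type, the same idea can be scripted directly with redis-cli. The script below is only a rough, illustrative sketch of that approach, not the real --bigkeys implementation; the connection parameters are copied from the example above and the top-20 limit is an arbitrary choice.

# Rough sketch: enumerate keys with SCAN, then measure each key by type.
RC="redis-cli -h 127.0.0.1 -p 18708 -a xxxx"
$RC --scan | while IFS= read -r key; do
    type=$($RC type "$key")
    case "$type" in
        string) size=$($RC strlen "$key") ;;   # byte length of the value
        hash)   size=$($RC hlen   "$key") ;;   # number of fields
        list)   size=$($RC llen   "$key") ;;   # number of items
        set)    size=$($RC scard  "$key") ;;   # number of members
        zset)   size=$($RC zcard  "$key") ;;   # number of members
        *)      continue ;;                    # skip other types
    esac
    echo "$size $type $key"
done | sort -rn | head -20                      # 20 largest keys by this metric

Note that, like --bigkeys itself, this measures String keys in bytes but collection keys by element count, so the two kinds of results are not directly comparable.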
Note that when using --bigkeys to gather big-key statistics, keep the following points in mind: