I’m using Redis as the broker for Celery. Lately, Redis memory consumption has been high, and the big mystery to me is that if I multiply the average size of each entry by the number of entries, the data should account for only a few megabytes. And yet, the instance has around 380 MB of allocated memory.
One thing we think might be a factor is that we are constantly revoking tasks (~100 tasks/hour). On the other hand, the fragmentation ratio is around 0.95, which is below 1, so fragmentation doesn't seem to be the culprit.
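For context, the revokes look roughly like this (a sketch; the app name, broker URL and task ID are placeholders, not our real code):

    from celery import Celery

    # Placeholder broker URL and app name, not our real config.
    app = Celery("tasks", broker="redis://localhost:6379/0")

    # Revoke by task ID: this broadcasts a revoke message to every worker
    # through the control exchange, and each worker keeps the revoked ID
    # in memory so it can skip the task if it arrives later.
    app.control.revoke("1da49f3e-608f-401e-9335-9cca050e9dbe")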
Below are the outputs of redis-cli --bigkeys and INFO memory.
What could be consuming that much memory on this Redis instance, if not the stored keys and values?
[00.00%] Biggest string found so far '"celery-task-meta-1da49f3e-608f-401e-9335-9cca050e9dbe"' with 277 bytes
[00.00%] Biggest string found so far '"celery-task-meta-2ddc2fe7-516d-45a0-b1d2-3aa3fdaaf9a7"' with 4910 bytes
[02.03%] Biggest set found so far '"_kombu.binding.celeryev"' with 70 members
[05.03%] Biggest hash found so far '"unacked"' with 201 fields
[51.11%] Biggest string found so far '"celery-task-meta-a3b9787c-62ad-49c4-ae90-e109d2592c92"' with 5498 bytes
[92.64%] Biggest zset found so far '"unacked_index"' with 201 members
[92.64%] Biggest string found so far '"celery-task-meta-3d374276-61b3-4ed5-908e-3e5a4406490b"' with 6033 bytes
-------- summary -------
Sampled 1033 keys in the keyspace!
Total key length in bytes is 54576 (avg len 52.83)
Biggest hash found '"unacked"' has 201 fields
Biggest string found '"celery-task-meta-3d374276-61b3-4ed5-908e-3e5a4406490b"' has 6033 bytes
Biggest set found '"_kombu.binding.celeryev"' has 70 members
Biggest zset found '"unacked_index"' has 201 members
0 lists with 0 items (00.00% of keys, avg size 0.00)
1 hashs with 201 fields (00.10% of keys, avg size 201.00)
1028 strings with 965720 bytes (99.52% of keys, avg size 939.42)
0 streams with 0 entries (00.00% of keys, avg size 0.00)
3 sets with 73 members (00.29% of keys, avg size 24.33)
1 zsets with 201 members (00.10% of keys, avg size 201.00)
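Doing the math on the summary above (a rough back-of-the-envelope check that ignores per-key allocator overhead):

    # Totals taken from the --bigkeys summary above.
    string_bytes = 965720   # payload of the 1028 string values
    key_name_bytes = 54576  # combined length of all key names

    # The single "unacked" hash, the "unacked_index" zset and the 3 sets
    # are tiny by comparison, so the raw data comes to roughly:
    print((string_bytes + key_name_bytes) / 1024 / 1024)  # ~0.97 MB, vs ~369 MB used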
> info memory
# Memory
used_memory:387115568
used_memory_human:369.18M
used_memory_rss:366198784
used_memory_rss_human:349.23M
used_memory_peak:403778136
used_memory_peak_human:385.07M
used_memory_peak_perc:95.87%
used_memory_overhead:7753176
used_memory_startup:6425024
used_memory_dataset:379362392
used_memory_dataset_perc:99.65%
allocator_allocated:388137472
allocator_active:389877760
allocator_resident:400334848
used_memory_lua:36864
used_memory_lua_human:36.00K
used_memory_scripts:1080
used_memory_scripts_human:1.05K
number_of_cached_scripts:3
maxmemory:402653184
maxmemory_human:384.00M
maxmemory_policy:volatile-lru
allocator_frag_ratio:1.00
allocator_frag_bytes:1740288
allocator_rss_ratio:1.03
allocator_rss_bytes:10457088
rss_overhead_ratio:0.91
rss_overhead_bytes:-34136064
mem_fragmentation_ratio:0.95
mem_fragmentation_bytes:-20875768
mem_not_counted_for_evict:0
mem_replication_backlog:1048576
mem_clients_slaves:0
mem_clients_normal:143520
mem_aof_buffer:0
mem_allocator:jemalloc-5.1.0
active_defrag_running:0
lazyfree_pending_objects:0
lazyfreed_objects:0
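For completeness, this is roughly how I cross-checked the per-key sizes (a sketch using redis-py; the connection settings are placeholders):

    import redis

    # Placeholder connection settings, not our real host/port.
    r = redis.Redis(host="localhost", port=6379, db=0)

    total = 0
    for key in r.scan_iter(count=1000):
        # MEMORY USAGE reports the bytes attributed to the key name, value
        # and internal overhead (requires Redis >= 4.0); it returns None
        # for keys that expire mid-scan.
        total += r.memory_usage(key) or 0

    print(f"memory accounted for by keys: {total / 1024 / 1024:.2f} MB")

This also lands in the low single-digit megabytes, nowhere near the ~369 MB reported by used_memory.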