If you work with Redis and a fairly big dataset, it's pretty likely that at some point you'll fill up all your RAM. This has already happened to me more than once with hashtagify.me; the easy fix, if the growth is slow enough, is to just add more memory.
When that doesn't work anymore, the next move – while waiting for Redis Cluster – is quite likely going to be some manual sharding technique. When you go that route, you'll probably need to split your dataset among multiple instances.
This is exactly what just happened to me, and I didn't find any ready-made recipe or tool to easily split my rather large dataset. So I ended up writing a small node.js script that doesn't use much memory; you can find it here: https://github.com/hashtagify/redis-migrate-keys
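The reason such a script can stay low-memory is that Redis lets you iterate the keyspace incrementally with SCAN instead of pulling every key at once with KEYS, moving each matching key with DUMP/RESTORE. The sketch below shows that batching pattern; the tiny in-memory stub only simulates the relevant Redis commands (its `scan`, `dump`, and `restore` mirror the command names but it is not a real client, and this is not the code from the linked repository):

```javascript
// Stub standing in for a Redis connection, just enough to
// demonstrate cursor-based iteration over a keyspace.
function makeStubRedis(data) {
  const keys = Object.keys(data); // snapshot, like SCAN's guarantee
  return {
    // SCAN-like cursor: returns [nextCursor, batchOfKeys];
    // a next cursor of 0 means the iteration is complete.
    scan(cursor, count) {
      const batch = keys.slice(cursor, cursor + count);
      const next = cursor + count >= keys.length ? 0 : cursor + count;
      return [next, batch];
    },
    dump(key) { return data[key]; },          // serialized value
    restore(key, value) { data[key] = value; },
    del(key) { delete data[key]; },
  };
}

// Move every key starting with `prefix` from src to dst, one small
// batch at a time, so memory use stays bounded by the batch size.
function migrateKeys(src, dst, prefix, batchSize) {
  let cursor = 0;
  let moved = 0;
  do {
    const [next, batch] = src.scan(cursor, batchSize);
    for (const key of batch) {
      if (key.startsWith(prefix)) {
        dst.restore(key, src.dump(key)); // copy to the target shard
        src.del(key);                    // then remove from the source
        moved++;
      }
    }
    cursor = next;
  } while (cursor !== 0);
  return moved;
}
```

With a real client you would replace the stub with actual SCAN/DUMP/RESTORE calls (or a single MIGRATE per key, which combines the copy and delete atomically), but the batching loop is the same.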