How to remove files after n-minutes?


I have a small project in which I generate files that the user can request from the browser. I want to remove the files from the FS after n minutes. What is the most modern way to do this task, other than cron and bash?


Why do you even want to store the contents in the FS if you are generating them and using them only for a short interval?

How about using an in-memory DB or a Redis-like cache layer, where you store the contents of the file as text and serve the file from the API with the right Content-Disposition/Content-Type headers instead? That way the cleanup is much easier and probably much faster.


+1 for Redis in this situation. It has expiration built in: when you create the file, you give it a key (i.e. the filename), the data (in your case text or bytes), and a TTL, and it will automatically purge the entry for you after that many seconds.
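For illustration, here is a minimal in-process sketch of those semantics. It is a toy stand-in, not Redis itself; with a real server you would call redis-py's `setex(key, ttl, value)` instead:

```python
import time

class TTLCache:
    """Toy in-process stand-in for Redis SETEX semantics: store a
    value under a key with a time-to-live, and treat it as gone
    once the TTL has elapsed."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def setex(self, key, ttl_seconds, value):
        # Mirrors redis-py's r.setex(key, ttl, value) argument order.
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazy purge, like an expired Redis key
            return None
        return value

cache = TTLCache()
cache.setex("report-42.xlsx", 0.1, b"...file bytes...")
print(cache.get("report-42.xlsx") is not None)  # True while the TTL holds
time.sleep(0.15)
print(cache.get("report-42.xlsx"))              # None: entry has expired
```

The key name and TTL here are made up; the point is only the create-with-expiry flow described above.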


+1, Redis is the better fit in this situation.


Is it possible to use Redis if my library (xlsxwriter) only supports writing to the FS?
Is there a more lightweight alternative to Redis? I have no more than 100 files per day.


xlsxwriter supports a parameterised way of opening the FD for the workbook.

import io
import xlsxwriter

output = io.BytesIO()
workbook = xlsxwriter.Workbook(output, {'in_memory': True})

With this, the contents are written into the output variable (once you call workbook.close()), which you can then handle any way you see fit.
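After the workbook is closed, the in-memory buffer holds the complete file. A dependency-free sketch of that last step, using a plain `BytesIO` in place of the xlsxwriter output (the bytes are placeholders, not real .xlsx content):

```python
import io

# Stand-in for the buffer xlsxwriter wrote into.
output = io.BytesIO()
output.write(b"PK...xlsx bytes...")

data = output.getvalue()  # the whole file as a single bytes object
print(len(data))          # size of the payload you could cache or stream
```

`data` is what you would hand to a cache with a TTL, or return from a download handler with the right Content-Type header.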

There are a few alternatives, such as CouchDB and others. But Redis seems to be the easiest of them all.


Redis is extremely lightweight and easy to spin up with Docker.

I would suggest using the redis:alpine image.


And there are two good async libraries: aredis and aioredis.


You could start another Python process with multiprocessing.Process that checks the timestamps of the files and removes the old ones. It would be about 10 lines of code.

If you need a more sophisticated solution (e.g. sending messages from Sanic to the process), you could wrap it up with ZeroMQ. No need to wrestle with databases in any case.