Is it possible to define a custom max request size per endpoint?


I was wondering if it is possible to define a small max request size for all endpoints, but allow a higher value for one specific "upload" endpoint.

This would ensure that most endpoints don't risk receiving an overly large request, while still allowing a higher limit for the dedicated endpoint.

I saw that an endpoint with stream=True can define a custom max request size, but I was wondering whether this is also possible for a non-streaming endpoint.

Also, does it work to define a low REQUEST_MAX_SIZE for the whole project and a higher one for a specific endpoint, or does the general configuration trigger first?

Thank you for the help.

The general idea is that you should have a low setting for the whole project and use streaming handlers for the few places where larger data is expected. A streaming handler can raise the limit as high as needed and is not subject to the general limit.
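The `REQUEST_MAX_SIZE` and `stream=True` names in this thread point at Sanic, so here is a minimal sketch of that idea under that assumption: a small global limit, plus one streaming upload route that consumes the body chunk by chunk instead of buffering it. The app name, route path, and 1 MB figure are illustrative, and the exact streaming API differs between Sanic versions, so treat this as a starting point rather than a drop-in configuration.

```python
from sanic import Sanic, response

app = Sanic("UploadDemo")  # illustrative app name

# Keep the global limit small so ordinary endpoints reject heavy bodies.
# 1 MB here is an illustrative value, not a recommendation.
app.config.REQUEST_MAX_SIZE = 1_000_000

@app.post("/upload", stream=True)  # illustrative route
async def upload(request):
    received = 0
    while True:
        chunk = await request.stream.read()
        if chunk is None:  # stream exhausted
            break
        received += len(chunk)
        # ...write `chunk` to disk or forward it elsewhere,
        # instead of accumulating the whole body in memory...
    return response.json({"bytes_received": received})
```

Because the handler only ever holds one chunk at a time, memory use stays flat regardless of how large the upload is, which is exactly why the streaming approach sidesteps the denial-of-service concern that motivates a low global limit.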

Using non-streaming handlers with large requests makes your service vulnerable to denial-of-service attacks, because someone can send many large requests to fill up all your RAM and crash your server. Thus, any large upload should always be streamed rather than buffered whole in memory.



I'm re-opening this because I don't see how this can work when combining streaming with file uploads.

Right now, when I upload a file, I can get access to its details by calling request.files.get('name'), which exposes its "name", "type", and "body".

The body is what causes the problem here, when the data is too big.

But if I use stream=True, I can't find in the documentation how to access the name, content type, and body of the uploaded file; I only get the raw data sent.

Ideally, I would go with something like this when using stream=True:

name = request.files.get('name').name
content_type = request.files.get('name').type
body = await request.files.get('name').read() # This would be the change

Is something like this possible? How can I do that?
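One thing worth noting: with stream=True you receive the raw request body, multipart framing included, so request.files is not populated and the name, filename, and content type have to be recovered from each part's own headers after splitting the stream on the multipart boundary. As a hedged sketch of that header step only (this is plain stdlib parsing, not a Sanic API, and the header bytes below are an illustrative example), something like this could recover the fields the snippet above is asking for:

```python
from email.parser import BytesParser

def parse_part_headers(raw_headers: bytes) -> dict:
    """Extract the form-field name, filename, and content type from the
    header block of a single multipart/form-data part.

    `raw_headers` is everything from the start of the part up to and
    including the blank line that separates headers from the part body.
    """
    msg = BytesParser().parsebytes(raw_headers)
    return {
        # The field name lives in the Content-Disposition header.
        "name": msg.get_param("name", header="content-disposition"),
        # get_filename() also reads Content-Disposition.
        "filename": msg.get_filename(),
        # The part's own Content-Type header.
        "type": msg.get_content_type(),
    }

# Illustrative header block for one uploaded part:
headers = (
    b'Content-Disposition: form-data; name="avatar"; filename="cat.png"\r\n'
    b'Content-Type: image/png\r\n'
    b'\r\n'
)
info = parse_part_headers(headers)
# info["name"] -> "avatar", info["filename"] -> "cat.png",
# info["type"] -> "image/png"
```

A full solution would also need to split the incoming chunks on the boundary string (taken from the request's Content-Type header) and stream each part's body to disk as it arrives; a streaming multipart parser library could handle that framing instead of hand-rolled splitting.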