HTTP headers get inserted at the start of a CSV download in production but not locally

I have rows of dicts and want to make them available for download as a CSV file. This works perfectly in my local development environment, but on the server the HTTP headers get prepended to the beginning of the file, and I can’t figure out why.

Sample code:

import csv
import io

import arrow
from sanic import response
from sanic.views import HTTPMethodView


class DownloadCSVView(HTTPMethodView):
    async def get(self, request):
        rows = [
            dict(a=1, b=2, c=3),
            dict(a=7, b=8, c=9),
            dict(a=4, b=5, c=6),
        ]
        fieldnames = rows[0].keys()

        async def streaming_fn(response):
            # build the whole CSV in an in-memory buffer, then send it
            # to the client as a single chunk
            data = io.StringIO()
            writer = csv.DictWriter(
                data,
                fieldnames=fieldnames,
                extrasaction='ignore'
            )
            writer.writeheader()
            for row in rows:
                writer.writerow(row)

            await response.write(data.getvalue())

        content_type = 'text/csv'

        return response.stream(
            streaming_fn,
            content_type=content_type,
            headers={
                'Content-Disposition': 'attachment; filename="foo-{}.csv";'.format(
                    arrow.utcnow().format('YYYY-MM-DD-HH-MM-SS')
                ),
                'Content-Type': content_type,
            }
        )

CSV file in local dev:

a,b,c
1,2,3
7,8,9
4,5,6

CSV file when hosted:

HTTP/1.1 200 OK
Content-Disposition: attachment; filename="foo-2020-10-22-08-10-13.csv";
Content-Type: text/csv
Set-Cookie: session=2d43dxxxxxxxxxxxxxxxxxx; Path=/; HttpOnly; expires=Thu, 29-Oct-2020 08:09:06 GMT; Max-Age=604800; SameSite=None; Secure
Transfer-Encoding: chunked
Connection: close

511f
a,b,c
1,2,3
7,8,9
4,5,6

I can’t figure out whether Sanic is doing this, or whether it’s the hosting / Cloudflare / something else that isn’t present in my local dev. (The 511f line looks like a chunk-size marker from HTTP chunked transfer encoding, so it’s as if a complete HTTP response is being sent as the body of another response.)

Also, if you have suggestions on how to write this better, that would be great. I use StringIO with csv only because that’s how I got things working from the examples in the docs, but if there are better ways to write a CSV file, I’d love to know about them.
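For instance, would a per-row approach like this be preferable? Untested sketch: it drains the StringIO after each row, so each chunk is written as soon as it is generated instead of buffering the whole file first.

import csv
import io

from sanic import response
from sanic.views import HTTPMethodView


class DownloadCSVView(HTTPMethodView):
    async def get(self, request):
        rows = [
            dict(a=1, b=2, c=3),
            dict(a=7, b=8, c=9),
        ]
        fieldnames = rows[0].keys()

        async def streaming_fn(resp):
            buffer = io.StringIO()
            writer = csv.DictWriter(
                buffer,
                fieldnames=fieldnames,
                extrasaction='ignore'
            )

            def drain():
                # take whatever the csv writer produced so far and
                # reset the buffer for the next row
                chunk = buffer.getvalue()
                buffer.seek(0)
                buffer.truncate(0)
                return chunk

            writer.writeheader()
            await resp.write(drain())
            for row in rows:
                writer.writerow(row)
                await resp.write(drain())

        return response.stream(streaming_fn, content_type='text/csv')

For three rows it obviously makes no difference, but I imagine it would avoid holding a large result set in memory.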

Any help would be greatly appreciated!

If you are using a CDN like Cloudflare, this could be something introduced in transit at the edge node.

To verify, hit your origin directly and see if the problem persists. If it happens at the edge but not at the origin, there’s a misconfiguration at the edge. If it happens at both, review the differences between your local dev environment and the deployed environment to identify what’s going on.
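For example, something along these lines prints the response exactly as the origin sends it, framing included (a sketch: the host, port, and path are placeholders, and it assumes the origin answers plain HTTP; for HTTPS you would wrap the socket with ssl):

import socket

# Placeholder origin address and path; substitute your real ones.
HOST, PORT, PATH = 'origin.example.com', 80, '/download.csv'

request = (
    'GET {} HTTP/1.1\r\n'
    'Host: {}\r\n'
    'Connection: close\r\n'
    '\r\n'
).format(PATH, HOST).encode()

with socket.create_connection((HOST, PORT)) as sock:
    sock.sendall(request)
    raw = b''
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        raw += chunk

# The raw bytes show exactly where the headers end and the body
# begins, so doubled headers or stray chunk-size lines are easy
# to spot.
print(raw.decode(errors='replace'))

If the doubled headers show up even here, the origin stack itself is producing them.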

I made a page rule to bypass the CDN and still had the same problem. Then I tried running curl against localhost on both the server and my local machine, and I think I’ve found the culprit (not sure why I didn’t think of doing this until now):

  • I develop on local running the sanic module directly
  • Server uses gunicorn with uvicorn worker

I have just reproduced this in local dev by running the app under gunicorn with the uvicorn worker, and then tested again with the uvicorn command line directly: same issue.

So I suppose this is a uvicorn issue, but I’m not entirely sure how to fix it.

I have opened an issue with uvicorn because I think this is more related to them than to Sanic, though if it has something to do with compatibility between the frameworks, it may be worth looking into on your side as well.

I used a much simpler example on the GitHub issue, based on sample code that comes directly from the Sanic docs:
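(Reproduced approximately here; the point is that a plain stream handler straight from the docs shows the same behavior under uvicorn.)

from sanic import Sanic
from sanic.response import stream

app = Sanic(__name__)


@app.route('/')
async def test(request):
    async def sample_streaming_fn(response):
        await response.write('foo,')
        await response.write('bar')

    return stream(sample_streaming_fn, content_type='text/csv')


if __name__ == '__main__':
    # python app.py   -> served by Sanic's own server, clean output
    # uvicorn app:app -> headers and chunk framing end up in the body
    app.run(host='127.0.0.1', port=8000)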

Looks like this issue. And it might indeed not be uvicorn, but rather a Sanic problem.

Let me get back and take a look at this one again.

This is being released in #1957 for 20.9 and 19.12.

I am using sanic==20.6.3 with Python 3.8.1.

The fix will be available in either 20.9.1 or 19.12.3. It will not be backported to 20.6. It should hopefully be available some time today.
