Sanic logs streaming

Hi @all, I’m working on streaming Docker logs with Sanic. I created a streaming endpoint that streams the logs:

import docker
from sanic.response import stream

async def test(request):
    container_name = 'postgres'
    # from_env() reads DOCKER_HOST etc.; to target the socket explicitly use
    # docker.DockerClient(base_url='unix://var/run/docker.sock')
    client = docker.from_env()
    images = client.images.list(name=container_name)
    containers = client.containers.list(filters={'ancestor': images[0].id})
    logs_container = client.containers.get(containers[0].id)

    async def sample_streaming_fn(response):
        logs = logs_container.logs(follow=True, tail=1, timestamps=True, stream=True)
        for line in logs:
            await response.write(line)

    return stream(sample_streaming_fn, content_type='text/csv')

But I don’t understand how I can check whether the streaming works correctly. How can I stream log data from Sanic?

I am not familiar with what API your code is using. Are you trying to set up a syslog receiver and then output that as a stream in an endpoint?

Looking real quick, it seems that sample_streaming_fn has no way to read or wait for something to come in; it would just drain whatever is there and then complete. Not knowing what tools you are trying to achieve this with, I cannot say how this could be done.
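One common way to let the handler actually wait for new lines, instead of a blocking `for` loop stalling the event loop, is to run the blocking log iterator in a worker thread and hand lines over through an `asyncio.Queue`. A minimal sketch of that pattern, with `fake_logs` standing in for the Docker log iterator (all names here are illustrative, not docker-py API):

```python
import asyncio
import threading
import time

def fake_logs():
    # Stand-in for the blocking docker-py log iterator (follow=True).
    for i in range(3):
        time.sleep(0.01)  # simulate waiting for a new log line
        yield f"log line {i}\n".encode()

def pump(loop, queue, log_iter):
    # Runs in a worker thread: blocking reads happen here, off the event loop.
    for line in log_iter:
        loop.call_soon_threadsafe(queue.put_nowait, line)
    loop.call_soon_threadsafe(queue.put_nowait, None)  # sentinel: iterator finished

async def consume(log_iter):
    loop = asyncio.get_running_loop()
    queue = asyncio.Queue()
    threading.Thread(target=pump, args=(loop, queue, log_iter), daemon=True).start()
    chunks = []
    while True:
        line = await queue.get()  # yields to the event loop while waiting
        if line is None:
            break
        chunks.append(line)       # in a real handler: await response.write(line)
    return chunks

chunks = asyncio.run(consume(fake_logs()))
print(b"".join(chunks).decode())
```

In the real endpoint you would pass `container.logs(follow=True, stream=True)` as `log_iter` and replace the `chunks.append` with `await response.write(line)`; the queue is what gives the coroutine something to await.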

Yes, I am trying to read the Docker logs row by row and write them to the stream, but no data is written to the response.
I simplified it:

from sanic.response import stream

async def docker_stream(request):
    async def sample_streaming_fn(response):
        controller = DockerController(request.args)
        container = controller.logs_stream()
        # Note: this iterator blocks the event loop while waiting for new lines
        for log in container.logs(follow=True, tail=1, timestamps=True, stream=True):
            await response.write(log.decode('utf-8'))
    return stream(sample_streaming_fn, content_type='text/event-stream')
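To check that the streaming function itself works, you can call it directly with fake objects and assert on what it wrote, without starting Sanic or Docker at all. A sketch under that assumption (`FakeResponse` and `FakeContainer` are hypothetical stand-ins for Sanic's response and the docker-py container):

```python
import asyncio

class FakeResponse:
    # Collects whatever the streaming fn writes, like the real response would send.
    def __init__(self):
        self.chunks = []
    async def write(self, data):
        self.chunks.append(data)

class FakeContainer:
    # Stand-in for the docker-py container: logs() yields raw byte lines.
    def logs(self, **kwargs):
        yield b"2019-01-01T00:00:00Z first line\n"
        yield b"2019-01-01T00:00:01Z second line\n"

async def sample_streaming_fn(response, container):
    # Same body as the handler's inner function, with the container injected.
    for log in container.logs(follow=True, tail=1, timestamps=True, stream=True):
        await response.write(log.decode('utf-8'))

resp = FakeResponse()
asyncio.run(sample_streaming_fn(resp, FakeContainer()))
print(resp.chunks)
```

For an end-to-end check against the running server, `curl -N http://localhost:8000/docker_stream` disables curl's buffering so each chunk prints as it arrives.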