Failed to Obtain Process Lock on Linux But Successful on Windows

Hi all,

I have the following simplified code, which creates a lock named "locks" (a multiprocessing.Lock) on shared_ctx in the main process. When the index handler tries to acquire it via request.app.shared_ctx.locks on my Linux system, the request hangs (just leaving that line of code there is enough to reproduce it).

from sanic import Sanic
from sanic import request
from sanic.response import html
from multiprocessing import Lock
import asyncio  # needed for asyncio.sleep below

app = Sanic(__name__)

@app.main_process_start
async def init(app: Sanic):
    app.shared_ctx.locks = Lock()

@app.route("/index", methods=["get"])
async def index(request: request.Request):
    print("enter")
    while not request.app.shared_ctx.locks.acquire(block=False):  # non-blocking attempt
        print("1")
        await asyncio.sleep(0.1)
    print("exit")
    request.app.shared_ctx.locks.release()
    return html("sss")

if __name__ == '__main__':
    app.run("0.0.0.0", 80, fast=True, debug=True)

Here is the result on the terminal:

[2023-06-10 06:01:55 -0400] [51615] [DEBUG] Process ack: Sanic-Server-2-0 [51615]
[2023-06-10 06:01:55 -0400] [51615] [INFO] Starting worker [51615]
enter
enter
enter
enter

This code seems to work fine on Windows but not on Linux. Has anyone seen a similar issue before? Any advice on how to resolve it would be appreciated.

Thank you!

Try this:

from multiprocessing import Manager

@app.main_process_start
async def init(app: Sanic):
    app.shared_ctx.locks = Manager().Lock()
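For anyone curious why this helps, here is a standalone sketch (independent of Sanic; all the names in it are my own) showing that a Manager lock proxy can be handed to processes started with the "spawn" method and still provide mutual exclusion. As far as I know, recent Sanic versions start their workers with spawn, which is why a lock that only works via fork inheritance runs into trouble:

```python
# Standalone demo, not Sanic-specific: four spawned processes each
# increment a shared counter 1000 times, guarded by a Manager lock
# proxy that is passed to them as an ordinary argument.
import multiprocessing as mp

def worker(lock, counter):
    for _ in range(1000):
        with lock:                # the proxy supports the context manager protocol
            counter.value += 1    # read-modify-write, safe only under the lock

if __name__ == "__main__":
    ctx = mp.get_context("spawn")                # force the spawn start method
    with mp.Manager() as manager:
        lock = manager.Lock()                    # proxy to a lock living in the manager process
        counter = ctx.Value("i", 0, lock=False)  # raw shared int, no built-in lock
        procs = [ctx.Process(target=worker, args=(lock, counter)) for _ in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(counter.value)                     # 4000: every increment was serialized
```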



Thank you! With this code the problem is solved. I will still try to understand why Windows can succeed without Manager().Lock().

If you do, please tell me!
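One difference you can observe directly (a sketch, not a full explanation of Sanic's internals): a plain multiprocessing.Lock refuses to be pickled outside of process creation, because it is only meant to be shared by inheritance, while a Manager lock is a proxy object that is designed to be pickled and sent between processes:

```python
# Sketch: a plain Lock can only be shared by inheritance at process
# creation time, so pickling it on its own raises RuntimeError; a
# Manager lock is a proxy that pickles without complaint.
import pickle
from multiprocessing import Lock, Manager

try:
    pickle.dumps(Lock())
except RuntimeError as exc:
    print("plain Lock:", exc)   # "Lock objects should only be shared between processes through inheritance"

with Manager() as manager:
    blob = pickle.dumps(manager.Lock())   # works: proxies are picklable
    print("Manager lock proxy pickled to", len(blob), "bytes")
```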