Running PyTorch with Apache mod_wsgi

TL;DR: make sure to add the magic directive WSGIApplicationGroup %{GLOBAL} to the Apache config, otherwise import torch will hang.

I tried to integrate my PyTorch AI model with my Apache website, so I could play with it interactively. I chose to use a raw WSGI script, since I did not want to invest in creating a full-blown Django or Flask solution. By the way, here’s a tutorial on how to integrate PyTorch with Flask.

The ‘hello-world’ WSGI script worked, but importing torch caused the WSGI process to hang, eventually returning a “gateway timeout” to the client.
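For reference, a hello-world WSGI script is essentially the following (a reconstruction, not my exact file; the application callable follows the standard WSGI signature):

```python
# Minimal WSGI script. This version works fine under mod_wsgi's defaults.
# Adding "import torch" at the top of this file is what made the process
# hang when the script ran in a sub-interpreter.

def application(environ, start_response):
    body = b'hello, world\n'
    start_response('200 OK', [('Content-Type', 'text/plain'),
                              ('Content-Length', str(len(body)))])
    return [body]
```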

After a few hours, I found the reason. By default, mod_wsgi runs each application in a Python sub-interpreter, and apparently PyTorch cannot run in a sub-interpreter, a known limitation of many C extension modules. To prevent mod_wsgi from using a sub-interpreter, the application should run in daemon mode as part of the global application group. The working version of my Apache virtual host config contains the following WSGI-related directives:

WSGIScriptAlias /api/wsgi /var/www/mysite/api/wsgi.py
WSGIDaemonProcess mysite processes=2 threads=5 display-name=ivk-wsgi
WSGIApplicationGroup %{GLOBAL}
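With that config in place, the script at /var/www/mysite/api/wsgi.py is imported once per daemon process, in the main interpreter, so heavy imports like torch belong at module level rather than per request. A sketch of what such a script can look like (the torch lines are commented out and the predict stub is illustrative; substitute your own model loading and inference):

```python
# Sketch of /var/www/mysite/api/wsgi.py under the config above.
# Heavy imports and model loading happen once, at daemon start-up:
# import torch
# model = torch.load('/var/www/mysite/api/model.pt')
import json

def predict(text):
    # Placeholder for the real inference call, e.g. model(encode(text)).
    return {'input': text, 'label': 'stub'}

def application(environ, start_response):
    # Read the request body and run it through the model.
    length = int(environ.get('CONTENT_LENGTH') or 0)
    text = environ['wsgi.input'].read(length).decode('utf-8')
    body = json.dumps(predict(text)).encode('utf-8')
    start_response('200 OK', [('Content-Type', 'application/json'),
                              ('Content-Length', str(len(body)))])
    return [body]
```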

Additional notes:

  • The WSGI process still shows up as ‘apache2’ when listing processes via ps -A; it shows up as ‘ivk-wsgi’ (the configured display-name) when using ps -ax.
  • The WSGI process cannot run as root; it runs as www-data.
  • In a Docker container, gdb by default refuses to attach to another user’s processes, even if you are root.
  • To overcome that, use the --privileged switch as follows: docker exec --privileged -it container bash

The problem with the hanging torch import will affect any framework served through mod_wsgi, including Django and Flask, as evidenced by this StackOverflow question:
https://stackoverflow.com/questions/62788479/how-to-use-pytorch-in-flask-and-run-it-on-wsgi-mod-for-apache2.

So, even if I had gone ahead with the Flask tutorial, I would have faced the same problem, with more moving parts to debug.

PS. Maybe I should have listened to the advice to use gunicorn instead of mod_wsgi, but using Apache modules seemed cleaner, and “gunicorn” also has pronunciation issues. Do you render it as “gunny corn”, “goony corn”, or “gee unicorn” (answer)? Anyway, I ended up using mod_wsgi.
