What is the best, production-safe way to handle long-running file processing in Django, ideally with parallel processing?

I'm working on a Django app where users upload large .txt files that need to be parsed and stored in a PostgreSQL database. The files can contain hundreds of thousands of lines, and processing them can take a long time.
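To give a sense of the work involved, the parsing step is essentially a loop over lines with batched inserts, roughly like this (a simplified sketch; the `Record` model, field names, and tab-separated format are placeholders, not my actual schema):

```python
from myapp.models import Record  # placeholder model

def process_file(path, batch_size=1000):
    """Parse the uploaded .txt file and bulk-insert rows in batches."""
    batch = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            fields = line.rstrip("\n").split("\t")  # illustrative line format
            batch.append(Record(col_a=fields[0], col_b=fields[1]))
            if len(batch) >= batch_size:
                Record.objects.bulk_create(batch)
                batch.clear()
    if batch:  # flush the final partial batch
        Record.objects.bulk_create(batch)
```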

Currently, I'm using threading.Thread with an in-memory tasks_registry to process the file and stream task status. It works in development, but I now realize this approach is not safe in production, especially with Gunicorn and multiple workers.
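Simplified, the current approach looks like this (an illustrative sketch, not my exact code; `process_file` is the parser above and the status strings are just examples):

```python
import threading
import uuid

# In-memory registry: task_id -> status. This is lost on restart and is
# not shared across Gunicorn workers, which is exactly the problem.
tasks_registry = {}

def start_background_processing(path):
    task_id = str(uuid.uuid4())
    tasks_registry[task_id] = "running"

    def worker():
        try:
            process_file(path)
            tasks_registry[task_id] = "done"
        except Exception:
            tasks_registry[task_id] = "failed"

    # Daemon thread dies whenever the worker process exits or is recycled.
    threading.Thread(target=worker, daemon=True).start()
    return task_id  # the frontend polls/streams status with this id
```

What I need: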

  • Upload the file and process it in parallel for better performance

  • Send a response to the frontend after the upload completes, while the processing continues in the background

  • Safely handle this under Gunicorn or another WSGI server

Any recommendations for architecture or proven patterns would be very helpful.
