Can I use AWS SQS in a Django project with the django-celery-results backend?

Disclaimer: there is a lot here that I don't understand

My requirement

I need to be able to get the result of a Celery task, and I need the status to change to 'SUCCESS' when the task completes successfully.

For example:

I need to be able to get the result of x + y after executing add.delay(1,2) on the task below.

myapp/tasks.py

from celery import shared_task
from time import sleep

@shared_task
def add(x, y):
    sleep(10)
    return x + y
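
Concretely, this is what I want to be able to do once the task has run (a sketch of the desired behaviour; the timeout value is just something I picked):

# Desired end-to-end behaviour (sketch; the timeout value is arbitrary)
from myapp.tasks import add

r = add.delay(1, 2)
result = r.get(timeout=30)  # should block until the worker stores the result
print(r.status)             # expected: 'SUCCESS'
print(result)               # expected: 3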

Is AWS SQS suitable for my needs?

I read the Celery documentation page Using Amazon SQS, and at the very bottom it says the following about results:

Results

Multiple products in the Amazon Web Services family could be a good candidate to store or publish results with, but there’s no such result backend included at this point.

Question:

Does this mean that django-celery-results cannot be used with AWS SQS?
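
For reference, by "installed and set" (see my settings below) I mean the standard django-celery-results wiring; this is a sketch, and the exact lines in my project may differ slightly:

# ebdjango/settings.py (excerpt, sketch)
INSTALLED_APPS = [
    # ... my other apps ...
    'django_celery_results',  # provides the 'django-db' result backend
]

# the backend's tables are created with:
#   python manage.py migrate django_celery_results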

More context below


What am I doing in practice?

  1. I look at my AWS queue (it shows 3 messages available)
  2. In my local terminal, I run celery -A ebdjango worker --loglevel=INFO (see Celery output below)
  3. In my PyCharm Python console connected to my Django project, I run r = add.delay(1, 2)
  4. r is an AsyncResult object:
>>> r = add.delay(1,2)
>>> r
<AsyncResult: b69c4287-5c82-4873-aa8c-227547511233>
  5. In AWS, my "Messages available" count went from 3 to 4
  6. Locally, in my terminal, nothing happened (I expected SQS to deliver the message back to my local worker. Is this wrong?)
  7. I inspect r and see this:

    >>> r.id
    'b69c4287-5c82-4873-aa8c-227547511233'
    >>> r.status
    'PENDING'
    >>> r.result
    >>> type(r.result)
    <class 'NoneType'>
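
One way I can think of to check whether anything ever reaches the django-db backend is to query the table that django-celery-results writes to (a sketch; TaskResult is that app's model, and r is the AsyncResult from above):

# In the same Django/PyCharm console, after add.delay(1, 2)
from django_celery_results.models import TaskResult

# If the worker had stored a result, I'd expect a row with my task id here
print(TaskResult.objects.filter(task_id=r.id).exists())
print(list(TaskResult.objects.values_list('task_id', 'status', 'result')))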

ebdjango/settings.py

...
AWS_ACCESS_KEY_ID = "XXXXXXXXXXXXXXXXXXX"
AWS_SECRET_ACCESS_KEY = "YYYYYYYYYYYYYYYYYYYYYYYYYYY"

CELERY_BROKER_URL = "sqs://"
CELERY_BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-west-2',
    'visibility_timeout': 3600,
    'predefined_queues': {
        'eb-celery-queue': {
            'url': 'https://sqs.us-west-2.amazonaws.com/12345678910/eb-celery-queue',
            'access_key_id': AWS_ACCESS_KEY_ID,
            'secret_access_key': AWS_SECRET_ACCESS_KEY,
        }
    }
}

CELERY_SEND_EVENTS = False
CELERY_ENABLE_REMOTE_CONTROL = False
CELERY_TASK_DEFAULT_QUEUE = 'eb-celery-queue'
CELERY_WORKER_CONCURRENCY = 1
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_CONTENT_ENCODING = 'utf-8'
CELERY_RESULT_BACKEND = 'django-db'  # Note: I have django-celery-results installed and configured
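
For completeness, the CELERY_* settings above are loaded through the usual Celery app module. This is a sketch of ebdjango/celery.py following the standard layout from the Celery Django guide; my actual file may differ slightly:

# ebdjango/celery.py (sketch, standard layout)
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ebdjango.settings')

app = Celery('ebdjango')

# Pick up all CELERY_* settings (including CELERY_RESULT_BACKEND) from Django settings
app.config_from_object('django.conf:settings', namespace='CELERY')

# Find tasks.py modules in installed apps (this is how the tasks listed in the worker output get registered)
app.autodiscover_tasks()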

Celery worker output at startup:

(eb-virt) C:\Users\Jarad\Documents\PyCharm\DEVOPS\ebdjango>celery -A ebdjango worker --loglevel=INFO
[2021-08-27 14:35:31,914: WARNING/MainProcess] No hostname was supplied. Reverting to default 'None'

 -------------- celery@Inspiron v5.1.2 (sun-harmonics)
--- ***** -----
-- ******* ---- Windows-10-10.0.19041-SP0 2021-08-27 14:35:31
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         ebdjango:0x1d64e4d4630
- ** ---------- .> transport:   sqs://localhost//
- ** ---------- .> results:
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> eb-celery-queue  exchange=eb-celery-queue(direct) key=eb-celery-queue


[tasks]
  . ebdjango.celery.debug_task
  . homepage.tasks.add
  . homepage.tasks.count_widgets
  . homepage.tasks.cu
  . homepage.tasks.mul
  . homepage.tasks.rename_widget
  . homepage.tasks.xsum

[2021-08-27 14:35:31,981: WARNING/MainProcess] No hostname was supplied. Reverting to default 'None'
[2021-08-27 14:35:31,981: INFO/MainProcess] Connected to sqs://localhost//
[2021-08-27 14:35:32,306: WARNING/MainProcess] ...
[2021-08-27 14:35:32,307: INFO/MainProcess] celery@Inspiron ready.

I do notice that 1) the results: line in the startup banner is empty (as if no result backend is defined) and 2) task events are OFF, which might be because task events aren't supported for SQS, but I don't know that for certain. Setting CELERY_SEND_EVENTS appears to have no effect on the task events output here.
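
To rule out the settings simply not being picked up, I suppose I could also ask the Celery app directly what it thinks the result backend is (a sketch; app is the instance from the ebdjango/celery.py sketched above):

# In the Django/PyCharm console
from ebdjango.celery import app

print(app.conf.result_backend)      # I'd expect 'django-db' if CELERY_RESULT_BACKEND was applied
print(app.conf.task_default_queue)  # should be 'eb-celery-queue'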
