How to configure DB for Django, Celery, and SQS

I'm trying to offload a Monte Carlo simulation task to Celery and save its results to a PostgreSQL database on AWS (RDS).

from views.py:

newResults = MonteCarloResultTrue.objects.create(htmlCompanyArray="[]")
viewRun = runMonteCarloAsync.delay(checkedCompaniesIDs, newResults.id)

The object is created, but the task in tasks.py never updates it:

@app.task(bind=True)
def runMonteCarloAsync(self, checkedCompaniesIDs, calc_id):
    newResults = MonteCarloResultTrue.objects.get(id=calc_id)
    newResults.htmlCompanyArray = "[asdf]"
    newResults.save()
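One thing worth guarding against with this pattern: if the worker picks up the task before the transaction that created the row has committed (for example with `ATOMIC_REQUESTS` enabled), the `.get(id=calc_id)` in the task can raise `DoesNotExist`, or the task's save can be overwritten when the view's transaction finishes. A common guard, sketched here against the code above on the assumption that the view runs inside a transaction, is to enqueue only after commit:

```python
from django.db import transaction

# Sketch: enqueue the task only once the new row is committed,
# so the worker's .get(id=calc_id) is guaranteed to find it.
newResults = MonteCarloResultTrue.objects.create(htmlCompanyArray="[]")
transaction.on_commit(
    lambda: runMonteCarloAsync.delay(checkedCompaniesIDs, newResults.id)
)
```

This is not the cause of the Procfile issue below, but it avoids a separate race that produces very similar symptoms.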

How can I update the DB from the Celery task? Do I need to explicitly point Celery at the database? Here is the relevant part of settings.py:

CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_TASK_DEFAULT_QUEUE = 'django-queue-dev'

CELERY_BROKER_URL = 'sqs://{0}:{1}@'.format(
    urllib.parse.quote(AWS_ACCESS_KEY_ID, safe=''),
    urllib.parse.quote(AWS_SECRET_ACCESS_KEY, safe='')
)

CELERY_BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',
    'polling_interval': 20
}
CELERY_RESULT_BACKEND = 'django-db'
CELERY_CACHE_BACKEND = 'django-cache'
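As an aside, the `urllib.parse.quote(..., safe='')` calls in the broker URL above matter because AWS secret keys frequently contain `/` and `+`, which would otherwise break URL parsing. A small self-contained demonstration, using made-up credentials:

```python
import urllib.parse

# Hypothetical credentials for illustration only.
access_key = "AKIAEXAMPLE"
secret_key = "abc/def+ghi"  # AWS secrets often contain '/' and '+'

# safe='' forces '/' to be percent-encoded too (quote() leaves it
# alone by default), so the credentials cannot corrupt the URL.
broker_url = "sqs://{0}:{1}@".format(
    urllib.parse.quote(access_key, safe=""),
    urllib.parse.quote(secret_key, safe=""),
)

print(broker_url)  # sqs://AKIAEXAMPLE:abc%2Fdef%2Bghi@
```

Note also that the `'django-db'` result backend only works if the `django_celery_results` app is in `INSTALLED_APPS` and its migrations have been applied.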

What am I missing?

It turned out the problem was with my Procfile: no worker process was actually running, so the tasks were queued but never executed. I changed the worker line to:

celery_worker: celery -A rvm worker --loglevel=INFO

and the task can now save to the DB (using Celery 5.x).
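For reference, a complete two-process Procfile along these lines might look like the sketch below; the web line is an assumption (the post only shows the worker line), and `rvm` is the project module name from the command above:

```
web: gunicorn rvm.wsgi --log-file -
celery_worker: celery -A rvm worker --loglevel=INFO
```

On most platforms the worker process also has to be explicitly started or scaled up; a Procfile entry alone does not guarantee the dyno/instance is running.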
