How to save Celery data into Django DB
I'm running a Django app on AWS with a PostgreSQL database, using SQS as the Celery broker. I'm trying to offload a Monte Carlo simulation onto Celery, but I can't get the task to save its results to my DB.
My view looks like:
runMonteCarloAsync.delay(checkedCompaniesIDs)
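That call sits in a stripped-down view along these lines; the view name, the response, and where checkedCompaniesIDs comes from are placeholders rather than my exact code:
from django.http import JsonResponse
from rvm.tasks import runMonteCarloAsync  # import path is a placeholder

def run_simulation(request):
    # placeholder: in reality the IDs come from the submitted form
    checkedCompaniesIDs = request.POST.getlist('companies')
    runMonteCarloAsync.delay(checkedCompaniesIDs)
    return JsonResponse({'status': 'queued'})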
The task looks like:
@app.task(bind=True)
def runMonteCarloAsync(self, checkedCompaniesIDs):
    # Do some montecarlo stuff
    data = []
    newResults = MonteCarloResultTrue(data=data)
    newResults.save()
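For completeness, MonteCarloResultTrue is roughly a model like this; the field types here are an assumption, not the exact definition:
from django.db import models

class MonteCarloResultTrue(models.Model):
    # simplified sketch; assuming the simulation output is stored as JSON
    data = models.JSONField()
    created_at = models.DateTimeField(auto_now_add=True)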
Here is my settings.py:
CELERY_accept_content = ['application/json']
CELERY_task_serializer = 'json'
CELERY_TASK_DEFAULT_QUEUE = 'django-queue-dev'
CELERY_BROKER_URL = 'sqs://{0}:{1}@'.format(
    urllib.parse.quote(AWS_ACCESS_KEY_ID, safe=''),
    urllib.parse.quote(AWS_SECRET_ACCESS_KEY, safe='')
)
CELERY_BROKER_TRANSPORT_OPTIONS = {
    "region": "us-east-1",
    'polling_interval': 20
}
CELERY_RESULT_BACKEND = 'django-db'
CELERY_CACHE_BACKEND = 'django-cache'
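The Celery app the Procfile points at lives in rvm/settings/celery.py and follows the standard Django integration; this is an approximation of it rather than a verbatim copy:
# rvm/settings/celery.py -- approximate reconstruction, not verbatim
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'rvm.settings')  # settings path is an assumption

app = Celery('rvm')
# Pull every CELERY_-prefixed setting out of settings.py
app.config_from_object('django.conf:settings', namespace='CELERY')
# Register tasks.py modules from installed apps with the worker
app.autodiscover_tasks()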
Procfile:
web: python manage.py runserver
celery_worker: celery worker -A rvm.settings.celery.app --concurrency=1 --loglevel=INFO -n worker.%%h
celery_beat: celery
I can see the messages hitting SQS, but there are no new DB entries. I feel like I'm missing something, but I can't figure out what.