I'm trying to create some asynchronous tasks with Celery in my own Django app.
settings.py:

```python
BROKER_URL = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
```
celery.py:

```python
from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'provcon.settings')

app = Celery('provcon')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
```
project __init__.py:

```python
from __future__ import absolute_import
from .celery import app as celery_app
```
tasks.py:

```python
from __future__ import absolute_import
from celery import shared_task
```

From my views I call the asynchronous task:

```python
from .tasks import carga_ftp

@login_required(login_url='/login/')
def archivoview(request):
    usuario = request.user
    if request.method == 'POST':
        form = ProcFTPForm(usuario, request.POST)
        if form.is_valid():
            form.save()
            proc = Lista_Final()
            lista = proc.archivos()
            # call asynchronous task
            carga_ftp.delay()
            return HttpResponseRedirect('/resumen/')
    else:
        form = ProcFTPForm(usuario)
    return render_to_response('archivo.html', {'form': form},
                              context_instance=RequestContext(request))
```
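For reference, the question never shows the body of `carga_ftp`, only that it creates database objects. A minimal sketch of how such a task would be declared with `shared_task` (the body here is a hypothetical placeholder, not the author's code):

```python
from __future__ import absolute_import
from celery import shared_task

@shared_task
def carga_ftp():
    # Hypothetical body: the real task reportedly creates
    # database objects from the FTP file listing.
    pass
```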
When I call the task from a `python manage.py shell` session, the worker executes it and creates the database objects without any problem.
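For example, a shell session like the following goes through (`myapp` is a hypothetical stand-in for the app that defines the task):

```
$ python manage.py shell
>>> from myapp.tasks import carga_ftp
>>> carga_ftp.delay()
<AsyncResult: ...>
```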
But when the same task is triggered from the view, it never executes.

Any idea why the task works from the shell but not from the view?
As a background check, verify that Redis (the broker) is actually running:

```
$ redis-cli ping
PONG
```

Next, check whether the Celery worker is running.
If it is not, start it:

```
$ celery -A provcon worker -l info
```
Then trigger the task from your Django app again. If it goes through now, the problem was simply that no Celery worker was running.
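To confirm delivery end to end, you can inspect the `AsyncResult` that `.delay()` returns. A minimal sketch, assuming a result backend is configured (for example `CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'`, which the settings above do not include) and that `myapp.tasks` is a hypothetical path to the task module:

```python
from myapp.tasks import carga_ftp  # hypothetical import path

result = carga_ftp.delay()
print(result.id)      # this id should show up in the worker's log
print(result.status)  # stays PENDING if no worker ever consumes the queue
```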