I have tasks that I run with Celery in my application. I set it up without any trouble in my development environment, and it worked perfectly with Redis as the broker. Yesterday I deployed the code to my server and set everything up again, but Celery beat does not discover the tasks. The code is exactly the same.
My celery_conf.py file (initially celery.py):
# coding: utf-8
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'vertNews.settings')
app = Celery('vertNews')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
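(For comparing the two hosts, task registration can also be checked without starting beat. The following is a sketch of such a check, assuming it is run from the project root with the project's virtualenv active:)

```shell
# Sketch of a registration check (assumption: run from the project root,
# same virtualenv as beat). Importing the app and forcing module discovery
# prints every task Celery knows about, so dev and server can be compared.
python -c "
from vertNews.celery_conf import app
app.loader.import_default_modules()
print('\n'.join(sorted(app.tasks)))
"
```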
The Celery configuration in my settings:
# Celery Configuration
CELERY_TASK_ALWAYS_EAGER = False
CELERY_BROKER_URL = SECRETS['celery']['broker_url']
CELERY_RESULT_BACKEND = SECRETS['celery']['result_backend']
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE
The __init__.py of the root app:
# coding: utf-8
from __future__ import absolute_import, unicode_literals
from .celery_conf import app as celery_app
__all__ = ['celery_app']
My tasks:
# coding=utf-8
from __future__ import unicode_literals, absolute_import
import logging
from celery.schedules import crontab
from celery.task import periodic_task
from .api import fetch_tweets, delete_tweets
logger = logging.getLogger(__name__)
@periodic_task(
    run_every=(crontab(minute=10, hour='0, 6, 12, 18, 23')),
    name="fetch_tweets_task",
    ignore_result=True)
def fetch_tweets_task():
    logger.info("Tweet download started")
    fetch_tweets()
    logger.info("Tweet download and summarization finished")
@periodic_task(
    run_every=(crontab(minute=13, hour=13)),
    name="delete_tweets_task",
    ignore_result=True)
def delete_tweets_task():
    logger.info("Tweet deletion started")
    delete_tweets()
    logger.info("Tweet deletion finished")
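(As an aside: the `celery.task.periodic_task` decorator belongs to the old pre-4.0 API and is deprecated. An equivalent schedule can instead be declared in settings; the following is a sketch, assuming the `CELERY` settings namespace configured above, so `CELERY_BEAT_SCHEDULE` maps to `beat_schedule`:)

```python
# Sketch: the same two schedules expressed as a Celery 4 beat schedule
# (picked up as beat_schedule via the 'CELERY' settings namespace).
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'fetch_tweets_task': {
        'task': 'fetch_tweets_task',
        'schedule': crontab(minute=10, hour='0,6,12,18,23'),
    },
    'delete_tweets_task': {
        'task': 'delete_tweets_task',
        'schedule': crontab(minute=13, hour=13),
    },
}
```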
The output when I run it on the remote server (not working):
(trendiz) [email protected]:~/projects/verticals-news/src$ celery -A vertNews beat -l debug
Trying import production.py settings...
celery beat v4.0.2 (latentcall) is starting.
__ - ... __ - _
LocalTime -> 2017-04-03 13:55:49
Configuration ->
. broker -> redis://localhost:6379//
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%DEBUG
. maxinterval -> 5.00 minutes (300s)
[2017-04-03 13:55:49,770: DEBUG/MainProcess] Setting default socket timeout to 30
[2017-04-03 13:55:49,771: INFO/MainProcess] beat: Starting...
[2017-04-03 13:55:49,785: DEBUG/MainProcess] Current schedule:
[2017-04-03 13:55:49,785: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes
[2017-04-03 13:55:49,785: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
The output when I run it on the dev server (working):
LocalTime -> 2017-04-03 14:16:19
Configuration ->
. broker -> redis://localhost:6379//
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%DEBUG
. maxinterval -> 5.00 minutes (300s)
[2017-04-03 14:16:19,919: DEBUG/MainProcess] Setting default socket timeout to 30
[2017-04-03 14:16:19,919: INFO/MainProcess] beat: Starting...
[2017-04-03 14:16:19,952: DEBUG/MainProcess] Current schedule:
<ScheduleEntry: fetch_tweets_task fetch_tweets_task() <crontab: 36 0, 6, 12, 18, 22 * * * (m/h/d/dM/MY)>
<ScheduleEntry: delete_tweets_task delete_tweets_task() <crontab: 13 13 * * * (m/h/d/dM/MY)>
[2017-04-03 14:16:19,952: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes
[2017-04-03 14:16:19,953: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
I am using Python 3.5 and Celery 4.0.2 in both environments.