
Limiting fetched data in django-celery acting as a consumer from a RabbitMQ queue acting as a producer

I want to retrieve 10 data messages in every period of a django-celery periodic task from a RabbitMQ queue containing 100000 data messages. Everything works well, but I don't know how to stop fetching data once 10 data messages have been retrieved in the given period. Here is my task.py:

    import pika
    # django-celery / Celery 3.x style imports (adjust to your Celery version)
    from celery.schedules import crontab
    from celery.task import periodic_task


    @periodic_task(run_every=(crontab(minute='*/1')), name="task", ignore_result=True)
    def task():

        connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
        channel = connection.channel()
        channel.queue_declare(queue='name')

        def callback(ch, method, properties, body):
            # do sth
            pass

        # limit unacknowledged messages in flight to 10 for this channel
        channel.basic_qos(prefetch_count=10, global_qos=True)

        channel.basic_consume(queue='name', on_message_callback=callback)
        channel.start_consuming()

and here is my RabbitMQ producer:

    import pika

    connection = pika.BlockingConnection(
        pika.ConnectionParameters(host='localhost'))
    channel = connection.channel()
    channel.queue_declare(queue='name')
    msg = '...'  # placeholder: the actual message payload is defined elsewhere
    for i in range(10000000):
        channel.basic_publish(exchange='', routing_key='name', body=msg)
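
For reference, one possible way to cap each run at 10 messages is a minimal sketch like the one below, assuming pika's BlockingConnection API; MAX_MESSAGES and fetch_batch are illustrative names, not part of the code above. It pulls messages one at a time with basic_get and stops after 10 or when the queue is empty:

    import pika

    MAX_MESSAGES = 10  # illustrative cap per periodic run

    def fetch_batch():
        connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
        channel = connection.channel()
        channel.queue_declare(queue='name')

        for _ in range(MAX_MESSAGES):
            # basic_get pulls a single message; it returns (None, None, None)
            # when the queue is currently empty.
            method_frame, properties, body = channel.basic_get(queue='name', auto_ack=False)
            if method_frame is None:
                break
            # do sth with body, then acknowledge it
            channel.basic_ack(delivery_tag=method_frame.delivery_tag)

        connection.close()

Unlike prefetch_count, which only caps how many unacknowledged messages the broker pushes at once, this loop simply stops asking after 10 messages and closes the connection until the next periodic run.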
