Laravel 5.2 Queue and Jobs - Not Pushing to Jobs DB


Solution 1

Well... I am dumb....

php artisan config:clear

I didn't clear the config cache, so Laravel was still reading the old cached queue driver.... wow...

Solution 2

In Laravel 6.x, edit `.env` and change `QUEUE_CONNECTION=sync` to `QUEUE_CONNECTION=database`; after that, dispatched jobs are inserted into the jobs table.
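For reference, the relevant `.env` line would look like this (note the key name changed between versions: Laravel 5.x uses `QUEUE_DRIVER`, Laravel 6.x and later use `QUEUE_CONNECTION`):

```ini
# .env on Laravel 6.x and later
QUEUE_CONNECTION=database

# .env on Laravel 5.x (the version in the question)
QUEUE_DRIVER=database
```

After editing `.env`, run `php artisan config:clear` so a cached config does not keep serving the old value.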

Solution 3

As

php artisan config:clear

wasn't working for me, I was able to make it work using:

php artisan config:cache
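The difference between the two commands, as I understand it: `config:clear` only deletes the cached config file, while `config:cache` deletes it and writes a fresh one from the current `.env` and `config/` files. Either way, a stale cached `QUEUE_DRIVER=sync` value stops being used:

```shell
php artisan config:clear   # removes bootstrap/cache/config.php
php artisan config:cache   # rebuilds the cache from .env and config/*.php
```

One caveat: once the config is cached, `env()` calls outside the `config/` directory return null, so queue settings should be read through `config/queue.php` rather than `env()` directly.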

Author: Brian Logan

Updated on February 06, 2020

Comments

  • Brian Logan
    Brian Logan about 4 years

    EDIT 2:

    Here are the steps we are going through:

    1. Schedule is Run (creating CollectHistoricalData Jobs for each company)
    2. CollectHistoricalData should be pushed to Queue (jobs table)
    3. CollectHistoricalData calls ApiDaemon::GetCompanyWithQuery($company, $query), a function on a separate class that is referenced in a couple of other places too.
    4. GetCompanyWithQuery collects the data and inserts it into the database.

    It runs all the way through fine, but the hang-up is that rather than inserting the jobs into the jobs table, it just runs them synchronously, one after another.


    EDIT 1:

    The .env file is set to use the database QUEUE_DRIVER; I have even tried hard-coding it in the config/queue.php file.
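    For completeness, hard-coding the driver would look something like this (a sketch; the `database` connection must also exist in the `connections` array of `config/queue.php`, which it does by default):

```php
// config/queue.php
'default' => env('QUEUE_DRIVER', 'database'),

// or, to ignore .env entirely:
'default' => 'database',
```

    If the config has ever been cached, neither change takes effect until `php artisan config:clear` is run.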


    We are using Laravel 5.2 for a project. In this project, we need to cURL a URL every hour and save the data to the database. At first we were using cron jobs, basically firing off thousands of cURL requests in about a minute, which would crash PHP due to the load.

    We decided to move over to Laravel's jobs and queues, without success. We are using the database driver for our jobs, and have tried numerous approaches to getting the jobs into the database so that our daemon workers can process them.
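    For anyone setting this up, a minimal checklist for the database driver on Laravel 5.2 (these are the standard artisan commands of that era; table and queue names are the defaults):

```shell
php artisan queue:table         # generate the migration for the jobs table
php artisan queue:failed-table  # optional: migration for the failed_jobs table
php artisan migrate             # create the tables
php artisan queue:work --daemon # start a daemon worker (5.2 syntax)
```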

    Here is our code right now, we are using the Kernel.php $schedule to start the thing off, so we don't have hundreds of requests attempting to happen an hour, which results in tens of thousands of cURLs.

    Kernel.php Schedule:

        $schedule
            ->call(function () {
                $items = DB::select('{selecting certain things to run}');
                foreach ($items as $q) {
                    $this->dispatch(new CollectHistoricalData(Company::find($q->company_id), ApiQuery::find($q->query_id)));
                }
            })
            ->hourly()
            ->name('Historical Pulls')
            ->withoutOverlapping()
            ->before(function() {
                $this->startTime = Carbon::now();
            })
            ->after(function () {
                mail({mail us a report afterward});
            });
    

    When this runs, it is sitting there running them all one by one, rather than pushing them to the Jobs table that was created.

    CollectHistoricalData.php:

    <?php
    
    namespace App\Jobs;
    
    use App\Helpers\Daemons\ApiDaemon;
    use App\Jobs\Job;
    use App\Models\Company;
    use App\Models\ApiQuery;
    use Illuminate\Queue\SerializesModels;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Contracts\Queue\ShouldQueue;
    
    class CollectHistoricalData extends Job implements ShouldQueue
    {
        use InteractsWithQueue, SerializesModels;
    
        protected $company, $query;
    
    
        /**
         * CollectHistoricalData constructor.
         * @param Company $company
         * @param ApiQuery $query
         */
        public function __construct(Company $company, ApiQuery $query)
        {
            $this->company = $company;
            $this->query = $query;
        }
    
        /**
         * Execute the job.
         *
         * @return void
         */
        public function handle()
        {
            mail({let us know what started and when});
            ApiDaemon::GetCompanyWithQuery($this->company, $this->query);
        }
    
        public function failed()
        {
            mail({mail us letting us know it failed});
        }
    
    
    }
    

    The job references another class with the function inside it (since that code is a hefty beast all on its own), plus there are about 20 of these, so it is easier to reference the class than to recreate all 20 classes as Jobs.

    TL;DR

    We have a schedule that is supposed to push a job, which references a function in another class, to the jobs table, but instead it runs them one after another, slowly. What is causing this?

    • jszobody
      jszobody almost 8 years
      what's the QUEUE_DRIVER specified in your .env file?
  • InnisBrendan
    InnisBrendan over 5 years
    This worked for me. I'm not sure what this did. Can someone explain?
  • 4unkur
    4unkur over 4 years
    @grez Your configuration info is probably cached in production, so you have to clear the cache in order to have changes take effect