
[5.x] Show workload also per queue when balancing is disabled #966

Merged: 4 commits merged into laravel:5.x from the workload-per-queue branch on Feb 14, 2021

Conversation

arjanwestdorp (Contributor)

When a single supervisor handles multiple queues with balancing disabled, Horizon only shows the total number of jobs and the total wait time for that supervisor. When you use this configuration to split your queues by priority (for example high,default,low), you can't see the details per queue, so when the workload spikes you have no idea which queue is causing it. For example, a spike on the 'low' queue is probably not a problem, because any new jobs coming in on the 'high' or 'default' queue would still be processed quickly.

To get better insight into the workload per queue when 'balance' => false, this PR adds details about each individual queue of a supervisor.

This only affects setups where balancing is disabled; for all other balancing strategies the UI is unchanged.

[screenshot]
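For context, the kind of setup this targets is a supervisor that handles several queues in priority order with balancing turned off. A minimal, illustrative config/horizon.php excerpt (the queue names and process count are just examples):

'environments' => [
    'production' => [
        'supervisor-1' => [
            'connection' => 'redis',
            'queue' => ['high', 'default', 'low'], // worked in priority order
            'balance' => false,                    // balancing disabled
            'processes' => 10,
            'tries' => 1,
        ],
    ],
],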

@taylorotwell (Member)

Are the times actually accurate per queue? The algolia queue is near the bottom of the priority order in your screenshot but shows just a few seconds of wait with 7,321 jobs, while the low queue just below it has a wait of 5 minutes.

@arjanwestdorp (Contributor, Author)

I had manually modified the runtime of the queues while testing the feature, which is why the numbers in the screenshot were a bit off.

Below is a test to verify the waiting times:

// Dispatch a job on each queue to calculate the `runtime` value per queue.
for ($i = 0; $i < 1; $i++) {
    dispatch(fn() => sleep(10))->onQueue('low');
    dispatch(fn() => sleep(8))->onQueue('algolia');
    dispatch(fn() => sleep(6))->onQueue('images');
    dispatch(fn() => sleep(4))->onQueue('default');
    dispatch(fn() => sleep(2))->onQueue('high');
}

Next, I paused Horizon and dispatched 100 jobs on each queue to verify that the waiting times were 100 * sleep_in_seconds:

for ($i = 0; $i < 100; $i++) {
    dispatch(fn() => sleep(10))->onQueue('low');
    dispatch(fn() => sleep(8))->onQueue('algolia');
    dispatch(fn() => sleep(6))->onQueue('images');
    dispatch(fn() => sleep(4))->onQueue('default');
    dispatch(fn() => sleep(2))->onQueue('high');
}

Expected results:

| Queue   | Calculation      | Expected wait              |
|---------|------------------|----------------------------|
| High    | 2 seconds * 100  | 200 seconds (~3 minutes)   |
| Default | 4 seconds * 100  | 400 seconds (~7 minutes)   |
| Images  | 6 seconds * 100  | 600 seconds (~10 minutes)  |
| Algolia | 8 seconds * 100  | 800 seconds (~13 minutes)  |
| Low     | 10 seconds * 100 | 1000 seconds (~17 minutes) |
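In other words (my own sketch of the estimate, not Horizon's internal code), each queue's expected wait in this table is simply the number of pending jobs multiplied by the measured average runtime for that queue:

// Hypothetical illustration of the per-queue estimate used in the table above.
$pendingJobs    = 100; // jobs dispatched to the 'low' queue
$averageRuntime = 10;  // measured average runtime per job, in seconds

$expectedWait = $pendingJobs * $averageRuntime; // 1000 seconds, roughly 17 minutes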

This will give the following output on the dashboard:
[screenshot]

@taylorotwell (Member) commented Feb 13, 2021

I'm not sure this is showing accurately for me. I put 500 jobs on two queues (high and low)... each job sleeps for 1 second. Yet, Horizon says a 3-minute wait time on both queues. But no jobs from the low queue will be worked until the high queue is totally empty, so how can that be accurate?

In this case, since it is last, the wait time for the "low" queue would actually be the total amount of time to process both queues (5-6 minutes).

[screenshot]

@arjanwestdorp (Contributor, Author) commented Feb 13, 2021

I see what you mean, sorry for misunderstanding you the first time. I was showing the time it would take to process that specific queue's own jobs, not the time before that queue would actually be empty.
I've now added the 'time to clear' of the preceding queue(s), so each queue now shows how long it actually takes before that queue is emptied:

[screenshot]
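To illustrate the idea (a minimal sketch of the cumulative calculation, not the actual code from this PR): with balancing disabled the queues are worked strictly in order, so the wait shown for a queue is its own time to clear plus the time to clear of every queue before it.

// Sketch: cumulative wait per queue when queues are processed in priority order.
// The per-queue "time to clear" values are taken from the expected results above.
$timeToClear = [
    'high'    => 200, // seconds needed to process this queue's own jobs
    'default' => 400,
    'images'  => 600,
    'algolia' => 800,
    'low'     => 1000,
];

$cumulativeWait = [];
$elapsed = 0;

foreach ($timeToClear as $queue => $seconds) {
    // A queue is only empty once every higher-priority queue has been cleared too.
    $elapsed += $seconds;
    $cumulativeWait[$queue] = $elapsed;
}

// $cumulativeWait['low'] is 3000 seconds: the 'low' queue only empties after
// the four queues before it have been processed as well.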

@taylorotwell taylorotwell merged commit 4cb7adf into laravel:5.x Feb 14, 2021
@arjanwestdorp arjanwestdorp deleted the workload-per-queue branch February 15, 2021 07:50
@driesvints (Member)

Cool addition, thanks @arjanwestdorp!
