
Allow dcurl to affect the thread pool size of libtuv #149

Closed
marktwtn opened this issue May 7, 2019 · 4 comments · Fixed by #159
Comments

@marktwtn
Collaborator

marktwtn commented May 7, 2019

There are two methods to set the thread pool size of libtuv:

  1. Change the default size (which is 4) in the libtuv source code.
  2. Use the environment variable UV_THREADPOOL_SIZE to override the default size (see the sketch below).
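
For reference, method 2 does not have to be done from the shell; the process itself can set the variable, as long as that happens before the first uv_queue_work() call creates the pool. A minimal sketch (not dcurl code; the size 8 is arbitrary):

```c
#define _POSIX_C_SOURCE 200112L
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Must run before libtuv lazily spawns its worker threads. */
    if (setenv("UV_THREADPOOL_SIZE", "8", 1 /* overwrite */) != 0) {
        perror("setenv");
        return 1;
    }
    printf("UV_THREADPOOL_SIZE=%s\n", getenv("UV_THREADPOOL_SIZE"));
    /* ... subsequent uv_queue_work() calls would use 8 workers ... */
    return 0;
}
```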

Based on the earlier review feedback on the pull request that attempted method 1:

I dislike such modifications. The default value matches libtuv's motivation and purpose on IoT devices. We can rely on runtime configuration to calibrate CPU resources instead. Also, consider the case when Hyper-Threading is enabled.

Objective:
Offer an acceptable alternative way to set the thread pool size and apply it to the current design.

@marktwtn marktwtn added the enhancement New feature or request label May 7, 2019
@marktwtn marktwtn self-assigned this May 7, 2019
@jserv
Member

jserv commented May 8, 2019

You might learn from xmrig by searching for affinity.
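
For context, "affinity" here means pinning worker threads to specific CPU cores. A small, Linux-only sketch of the idea (not taken from xmrig or dcurl):

```c
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

/* Pin the calling thread to a single CPU core (Linux-specific). */
static int pin_to_cpu(int cpu)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(cpu, &set);
    return pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

int main(void)
{
    if (pin_to_cpu(0) != 0) {
        fprintf(stderr, "failed to set CPU affinity\n");
        return 1;
    }
    puts("current thread pinned to CPU 0");
    return 0;
}
```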

@wusyong wusyong added this to the Azalea milestone May 13, 2019
@marktwtn
Collaborator Author

I would like to add a new function in src/threadpool.c of libtuv.
The new function sets the thread pool size to the value passed in as its argument.
It will be called by the dcurl_entry() function in src/dcurl.c of dcurl.

@marktwtn
Collaborator Author

marktwtn commented May 22, 2019

I would like to add a new function in src/threadpool.c of libtuv.
The new function sets the thread pool size to the value passed in as its argument.
It will be called by the dcurl_entry() function in src/dcurl.c of dcurl.

Continuing from the last comment.
If the new function is implemented, there will be two functions that can set the thread pool size.

  • uv_queue_work(): sets the size from the default or the environment variable.
  • new_function(int poolSize): sets the size from the passed-in argument.

Whichever of the two is called first determines the thread pool size.

If we keep going this way, there is a concern.
For example, if tangle-accelerator, a dcurl test case, or any other part of the source code also uses the libtuv APIs and calls uv_queue_work() first, the thread pool size will not be the one you expect when you later call new_function(int poolSize) to set it.
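
For what it's worth, a minimal sketch of what such a setter could look like inside libtuv's src/threadpool.c, assuming the pool is created lazily on the first uv_queue_work() call; the variable names are illustrative, not libtuv's actual internals:

```c
/* Hypothetical sketch for libtuv src/threadpool.c; names are illustrative. */
static unsigned int preset_nthreads; /* 0 means "use default/env behavior" */
static int pool_initialized;         /* set once the workers are spawned   */

/* Preset the pool size before the pool exists.
 * Returns 0 on success, -1 if the pool was already initialized
 * (e.g. uv_queue_work() was called first) or the size is invalid. */
int uv_set_threadpool_size(unsigned int pool_size)
{
    if (pool_initialized || pool_size == 0)
        return -1;
    preset_nthreads = pool_size;
    return 0;
}
```

Failing (or being ignored) once the pool exists is exactly how the ordering concern above would have to be handled.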

marktwtn added a commit to marktwtn/dcurl that referenced this issue May 27, 2019
In the initialization phase of dcurl,
the thread pool size is preset to (maximum number of processors - 1).
The thread pool is then initialized in the first call of dcurl_entry().

Close DLTcollab#149.
marktwtn added a commit to marktwtn/dcurl that referenced this issue May 27, 2019
In the initialization phase of dcurl,
the thread pool size is preset to (maximum number of processors - 1)
with the new libtuv API uv_set_threadpool_size().
The thread pool is then initialized in the first call of dcurl_entry()
with the preset size.

Close DLTcollab#149.
@marktwtn
Collaborator Author

I created a new API, uv_set_threadpool_size(), to preset the thread pool size of libtuv.

The size is preset in dcurl_init(),
and the actual thread pool initialization happens in the first call of dcurl_entry().

Since the number of threads used for PoW is assigned when dcurl_entry() is called,
the preset size is (maximum hardware thread count - 1), which can satisfy any thread count that dcurl_entry() requests.
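
As a rough illustration of the final design (not the exact PR #159 code), the presetting step in dcurl_init() could look like this; sysconf() is one way to query the processor count, and everything except the uv_set_threadpool_size() name is illustrative:

```c
/* Illustrative excerpt of the presetting step in dcurl_init() (src/dcurl.c). */
#include <stdbool.h>
#include <unistd.h>

int uv_set_threadpool_size(unsigned int pool_size); /* new libtuv API */

bool dcurl_init(void)
{
    long nproc = sysconf(_SC_NPROCESSORS_ONLN); /* online processors */
    if (nproc < 2)
        nproc = 2; /* keep at least one worker thread */

    /* Preset the pool to (max processors - 1) so any thread count later
     * requested by dcurl_entry() fits; the pool itself is only created
     * on the first dcurl_entry() call. */
    uv_set_threadpool_size((unsigned int) (nproc - 1));

    /* ... the rest of the existing initialization ... */
    return true;
}
```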
