Integrate MinIO into Triton Inference Server
Create two folders for storing the data from the container.
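The two folders can be created on the host like this (the names `data` and `config` match the volume mounts used in the `docker run` command below; adjust them if yours differ):

```shell
# Create host folders that will back the container's volumes:
#   data   -> /data         (object storage)
#   config -> /root/.minio  (MinIO configuration)
mkdir -p data config
```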
Pull the image first:
docker pull minio/minio:RELEASE.2022-03-24T00-43-44Z
Launch a container via command line:
docker run -p 9000:9000 -p 9001:9001 --name minio1 \
  -v data:/data \
  -v config:/root/.minio \
  minio/minio:RELEASE.2022-03-24T00-43-44Z \
  server /data --console-address "0.0.0.0:9001"
Alternatively (recommended), use Docker Compose:
docker-compose up
Open 0.0.0.0:9000 in a browser to reach the console and log in with:
- User name: user
- Password: user123456
Make sure the MinIO server is running before you try to log in.
I set MINIO_ROOT_USER and MINIO_ROOT_PASSWORD in the docker-compose file:
- MINIO_ROOT_USER: user
- MINIO_ROOT_PASSWORD: user123456
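A minimal `docker-compose.yml` with these credentials might look like the following sketch (the actual file in this repo may differ; image tag and volume paths are taken from the `docker run` command above):

```yaml
version: "3"
services:
  minio1:
    image: minio/minio:RELEASE.2022-03-24T00-43-44Z
    ports:
      - "9000:9000"
      - "9001:9001"
    environment:
      MINIO_ROOT_USER: user
      MINIO_ROOT_PASSWORD: user123456
    volumes:
      - ./data:/data
      - ./config:/root/.minio
    command: server /data --console-address "0.0.0.0:9001"
```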
I set a default bucket so that it is created automatically when the container starts.
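One common way to create a default bucket automatically is a one-shot `mc` (MinIO client) sidecar in the compose file. This is a sketch of that pattern, not necessarily how this repo does it; the bucket name `models` is an assumption:

```yaml
  createbuckets:
    image: minio/mc
    depends_on:
      - minio1
    entrypoint: >
      /bin/sh -c "
      mc alias set local http://minio1:9000 user user123456;
      mc mb -p local/models;
      exit 0;
      "
```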
Triton will connect to the model bucket of the local MinIO container.
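Triton reads S3-compatible model repositories via an `s3://host:port/bucket` URL plus AWS-style credential environment variables. A sketch of the corresponding service (the Triton image tag and the bucket name `models` are assumptions; check docker-compose-triton.yml for the actual values):

```yaml
  triton:
    image: nvcr.io/nvidia/tritonserver:22.03-py3
    environment:
      # Must match MINIO_ROOT_USER / MINIO_ROOT_PASSWORD above
      AWS_ACCESS_KEY_ID: user
      AWS_SECRET_ACCESS_KEY: user123456
    # Point Triton at the local MinIO bucket instead of real AWS S3
    command: tritonserver --model-repository=s3://minio1:9000/models
```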
There is no need to set anything up on AWS cloud services; everything runs on the host.
You can download a test model via the fetch_models.sh script.
One command:
docker-compose -f docker-compose-triton.yml up