
[bug] Add out of the box support for deployment_directories #1131

Merged
rahul-tuli merged 3 commits into main from is-a-directory-bug
Jul 20, 2023

Conversation

rahul-tuli
Member

@rahul-tuli rahul-tuli commented Jul 20, 2023

Some of our pipelines did not support using directories (containing a model.onnx) out of the box for deployment with the server. Note that this used to work for certain tasks, since a few pipelines did implement the logic to process deployment directories.

This PR propagates that support to all pipelines: it updates the model_to_path function (used by all pipelines) to accept model directories as valid inputs. The change was tested with a YOLOv5 model, since the object-detection pipelines did not previously support model directories.
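
For reference, a rough sketch of the kind of directory resolution described above (illustrative only; the function and constant names below are placeholders, not the exact model_to_path implementation):

```python
import os

# Assumed default file name inside a deployment directory (placeholder).
_MODEL_ONNX_NAME = "model.onnx"


def resolve_model_path(model_path: str) -> str:
    """Accept either an .onnx file or a deployment directory containing one."""
    if os.path.isdir(model_path):
        candidate = os.path.join(model_path, _MODEL_ONNX_NAME)
        if not os.path.isfile(candidate):
            raise FileNotFoundError(
                f"{model_path} does not contain a {_MODEL_ONNX_NAME} file"
            )
        return candidate
    return model_path
```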

Test:

model:

$ sparsezoo.download "zoo:cv/detection/yolov5-n/pytorch/ultralytics/coco/pruned30-none-vnni" --save-dir ~/test-model
INFO:root:Downloading files from model 'zoo:cv/detection/yolov5-n/pytorch/ultralytics/coco/pruned30-none-vnni'
downloading...: 100%|██████████| 3.70M/3.70M [00:00<00:00, 7.56MB/s]
downloading...: 100%|██████████| 7.24M/7.24M [00:00<00:00, 10.8MB/s]
downloading...: 100%|██████████| 19.0M/19.0M [00:01<00:00, 11.2MB/s]
downloading...: 100%|██████████| 292M/292M [00:27<00:00, 11.2MB/s]
downloading...: 100%|██████████| 926/926 [00:00<00:00, 58.0kB/s]
downloading...: 100%|██████████| 1.09k/1.09k [00:00<00:00, 407kB/s]
downloading...: 100%|██████████| 1.72k/1.72k [00:00<00:00, 648kB/s]
downloading...: 100%|██████████| 7.24M/7.24M [00:00<00:00, 11.0MB/s]
Download results
====================

Model(stub=zoo:cv/detection/yolov5-n/pytorch/ultralytics/coco/pruned30-none-vnni) downloaded to /home/rahul/test-model

deepsparse.server:

$ deepsparse.server --task yolo --model_path ~/test-model/deployment
2023-07-20 11:21:28 deepsparse.server.server INFO     config_path: /tmp/tmpnn7b8zkv/server-config.yaml
2023-07-20 11:21:28 deepsparse.server.server INFO     Using config: ServerConfig(num_cores=None, num_workers=None, integration='local', engine_thread_pinning='core', pytorch_num_threads=1, endpoints=[EndpointConfig(name='yolo', route='/predict', task='yolo', model='/home/rahul/test-model/deployment', batch_size=1, logging_config=PipelineSystemLoggingConfig(enable=True, inference_details=SystemLoggingGroup(enable=False, target_loggers=[]), prediction_latency=SystemLoggingGroup(enable=True, target_loggers=[])), data_logging=None, bucketing=None, kwargs={})], loggers={}, system_logging=ServerSystemLoggingConfig(enable=True, request_details=SystemLoggingGroup(enable=False, target_loggers=[]), resource_utilization=SystemLoggingGroup(enable=False, target_loggers=[])))
2023-07-20 11:21:28 deepsparse.server.server INFO     torch.set_num_threads(1)
2023-07-20 11:21:28 deepsparse.server.server INFO     NM_BIND_THREADS_TO_CORES=1
2023-07-20 11:21:28 deepsparse.server.server INFO     NM_BIND_THREADS_TO_SOCKETS=0
2023-07-20 11:21:28 deepsparse.server.server INFO     Built context: Context(num_cores=10, num_streams=1, scheduler=Scheduler.elastic)
2023-07-20 11:21:28 deepsparse.server.server INFO     Built ThreadPoolExecutor with 1 workers
2023-07-20 11:21:28 deepsparse.loggers.build_logger INFO     Created default logger: PythonLogger
2023-07-20 11:21:28 deepsparse.loggers.build_logger INFO     System Logging: enabled for groups: ['yolo/prediction_latency']
2023-07-20 11:21:29 deepsparse.server.server INFO     Initializing pipeline for 'yolo'
DeepSparse, Copyright 2021-present / Neuralmagic, Inc. version: 1.6.0.20230607 COMMUNITY | (7a67b14b) (release) (optimized) (system=avx512, binary=avx512)
2023-07-20 11:21:30 deepsparse.server.server INFO     Adding endpoints for 'yolo'
2023-07-20 11:21:30 deepsparse.server.server INFO     Added '/predict' endpoint
2023-07-20 11:21:30 deepsparse.server.server INFO     Added '/predict/from_files' endpoint
2023-07-20 11:21:30 deepsparse.server.server INFO     Added endpoints: ['/openapi.json', '/docs', '/docs/oauth2-redirect', '/redoc', '/', '/config', '/status', '/healthcheck', '/health', '/ping', '/endpoints', '/endpoints', '/predict', '/predict/from_files']
INFO:     Started server process [94072]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:5543 (Press CTRL+C to quit)
^CINFO:     Shutting down
INFO:     Waiting for application shutdown.
INFO:     Application shutdown complete.
INFO:     Finished server process [94072]
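
With the server running, an inference request can be sent to the /predict/from_files endpoint, e.g. (hedged example: the multipart field name "request" and the image path are assumptions, not taken from this PR):

```python
import requests

# Assumed local image file to run detection on (placeholder path).
image_path = "sample_image.jpg"

url = "http://0.0.0.0:5543/predict/from_files"
with open(image_path, "rb") as f:
    # The "request" field name mirrors common deepsparse.server usage,
    # but is an assumption here rather than something shown in this PR.
    response = requests.post(url, files=[("request", f)])

print(response.json())
```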

@rahul-tuli rahul-tuli marked this pull request as ready for review July 20, 2023 15:42
Member

@bfineran bfineran left a comment

May need to extend the behavior in the future to allow for names besides model.onnx. LGTM for existing support.

@rahul-tuli rahul-tuli merged commit 3e44bdd into main Jul 20, 2023
7 checks passed
@rahul-tuli rahul-tuli deleted the is-a-directory-bug branch July 20, 2023 15:56
@rahul-tuli
Member Author

> May need to extend the behavior in the future to allow for names besides model.onnx. LGTM for existing support.

That would be very easy; a one-line change would do, since pathlib has a function for that.
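
For illustration, one possible shape of that extension (the helper name and glob pattern are assumptions, not code from this PR):

```python
from pathlib import Path


def find_onnx_file(deployment_dir: str) -> str:
    """Return the first .onnx file found in a deployment directory."""
    matches = sorted(Path(deployment_dir).glob("*.onnx"))
    if not matches:
        raise FileNotFoundError(f"No .onnx file found in {deployment_dir}")
    return str(matches[0])
```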
