[server] Update readmes to no longer use the deprecated pathway + update pathway as per new UX docs #1592

Merged: 3 commits, Feb 9, 2024
12 changes: 4 additions & 8 deletions src/deepsparse/server/README.md
@@ -18,15 +18,15 @@ Usage: deepsparse.server [OPTIONS] COMMAND [ARGS]...

1. `deepsparse.server --config_file [OPTIONS] <config path>`

-2. `deepsparse.server task [OPTIONS] <task>`
+2. `deepsparse.server --task [OPTIONS] <task>`

Examples for using the server:

`deepsparse.server --config_file server-config.yaml`

-`deepsparse.server task question_answering --batch-size 2`
+`deepsparse.server --task question_answering --batch-size 2`

-`deepsparse.server task question_answering --host "0.0.0.0"`
+`deepsparse.server --task question_answering --host "0.0.0.0"`

Example config.yaml for serving:

@@ -63,10 +63,6 @@ Usage: deepsparse.server [OPTIONS] COMMAND [ARGS]...

Options:
--help Show this message and exit.

Commands:
config Run the server using configuration from a .yaml file.
task Run the server using configuration with CLI options, which can...
```
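The usage text above collapses the old sub-commands into options on a single entry point. As a rough sketch of that shape (the real CLI is built with Click; `argparse` is a stdlib stand-in here, and the option names are taken from the usage text, not from the actual implementation):

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    # Hypothetical sketch of the new single-entry-point UX: --task and
    # --config_file are plain options on one command, not sub-commands.
    parser = argparse.ArgumentParser(prog="deepsparse.server")
    parser.add_argument("--task")
    parser.add_argument("--config_file")
    parser.add_argument("--batch-size", type=int, default=1)
    return parser


def validate(args: argparse.Namespace) -> None:
    # Mirrors the guard in main(): one of the two pathways must be chosen.
    if args.task is None and args.config_file is None:
        raise ValueError("Must specify either --task or --config_file. Found neither")
```

With this shape, `deepsparse.server --task question_answering --batch-size 2` parses cleanly, while invoking with neither option fails fast.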
---
<h3>Note on the latest server release</h3>
@@ -104,7 +100,7 @@ Example CLI command for serving a single model for the **question answering** ta

```bash
deepsparse.server \
-task question_answering \
+--task question_answering \
--model_path "zoo:nlp/question_answering/bert-base/pytorch/huggingface/squad/12layer_pruned80_quant-none-vnni"
```
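Once a server like the one above is running, a client POSTs JSON to it. A hypothetical sketch of building such a request (the port `5543` and the `/predict` route are assumptions about common deepsparse.server defaults, not taken from this diff):

```python
import json
from urllib.request import Request


def build_qa_request(question: str, context: str) -> Request:
    # Hypothetical client request: host, port, and route are assumptions,
    # not confirmed by this PR.
    body = json.dumps({"question": question, "context": context}).encode("utf-8")
    return Request(
        "http://localhost:5543/predict",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Against a live server, `urllib.request.urlopen(build_qa_request(...))` would send it and return the model's answer.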

106 changes: 5 additions & 101 deletions src/deepsparse/server/cli.py
@@ -11,16 +11,7 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""
There are two sub-commands for the server:
1. `deepsparse.server config [OPTIONS] <config path>`
2. `deepsparse.server task [OPTIONS] <task>
```
"""

import os
import warnings
from tempfile import TemporaryDirectory
from typing import Optional, Union

@@ -223,6 +214,7 @@ def main(
# if the --model_path option is provided, use that
# otherwise if the argument is given and --model_path is not used, use the
# argument instead

if model and model_path == "default":
model_path = model
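The comment above describes a precedence rule that can be captured as a small pure function (a sketch of the described behavior only; the names are illustrative, not the real signature):

```python
def resolve_model_path(model_argument, model_path_option):
    """Sketch of the precedence described above: an explicit --model_path
    wins; otherwise a positional model argument replaces the 'default'
    placeholder."""
    if model_argument and model_path_option == "default":
        return model_argument
    return model_path_option
```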

@@ -236,6 +228,10 @@
if task is None and config_file is None:
raise ValueError("Must specify either --task or --config_file. Found neither")

if config_file is not None:
server = _fetch_server(integration=integration, config=config_file)
server.start_server(host, port, log_level, hot_reload_config=hot_reload_config)

if task is not None:
cfg = ServerConfig(
num_cores=num_cores,
@@ -263,98 +259,6 @@ def main(
host, port, log_level, hot_reload_config=hot_reload_config
)

if config_file is not None:
server = _fetch_server(integration=integration, config=config_file)
server.start_server(host, port, log_level, hot_reload_config=hot_reload_config)


@main.command(
context_settings=dict(
token_normalize_func=lambda x: x.replace("-", "_"), show_default=True
),
)
@click.argument("config-path", type=str)
@HOST_OPTION
@PORT_OPTION
@LOG_LEVEL_OPTION
@HOT_RELOAD_OPTION
def config(
config_path: str, host: str, port: int, log_level: str, hot_reload_config: bool
):
"[DEPRECATED] Run the server using configuration from a .yaml file."
warnings.simplefilter("always", DeprecationWarning)
warnings.warn(
"Using the `config` sub command is deprecated. "
"Use the `--config_file` argument instead.",
category=DeprecationWarning,
)


@main.command(
context_settings=dict(
token_normalize_func=lambda x: x.replace("-", "_"), show_default=True
),
)
@click.argument(
"task",
type=click.Choice(SupportedTasks.task_names(), case_sensitive=False),
)
@MODEL_OPTION
@BATCH_OPTION
@CORES_OPTION
@WORKERS_OPTION
@HOST_OPTION
@PORT_OPTION
@LOG_LEVEL_OPTION
@HOT_RELOAD_OPTION
@INTEGRATION_OPTION
def task(
task: str,
model_path: str,
batch_size: int,
num_cores: int,
num_workers: int,
host: str,
port: int,
log_level: str,
hot_reload_config: bool,
integration: str,
):
"""
[DEPRECATED] Run the server using configuration with CLI options,
which can only serve a single model.
"""

warnings.simplefilter("always", DeprecationWarning)
warnings.warn(
"Using the `task` sub command is deprecated. "
"Use the `--task` argument instead.",
category=DeprecationWarning,
)

cfg = ServerConfig(
num_cores=num_cores,
num_workers=num_workers,
integration=integration,
endpoints=[
EndpointConfig(
task=task,
name=f"{task}",
model=model_path,
batch_size=batch_size,
)
],
loggers={},
)

with TemporaryDirectory() as tmp_dir:
config_path = os.path.join(tmp_dir, "server-config.yaml")
with open(config_path, "w") as fp:
yaml.dump(cfg.dict(), fp)

server = _fetch_server(integration=integration, config=config_path)
server.start_server(host, port, log_level, hot_reload_config=hot_reload_config)
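The `task` sub-command deleted above worked by serializing a one-endpoint config to a temporary file and then delegating to the config pathway. A stdlib-only sketch of that pattern (JSON stands in for the YAML the real code writes, and the names are hypothetical):

```python
import json
import os
from tempfile import TemporaryDirectory


def write_single_endpoint_config(task: str, model_path: str, batch_size: int, tmp_dir: str) -> str:
    # Mirrors the removed flow: build a config with exactly one endpoint,
    # dump it to a file inside tmp_dir, and return the path for the server
    # to load. JSON replaces YAML here only to keep the sketch stdlib-only.
    cfg = {
        "endpoints": [
            {"task": task, "name": task, "model": model_path, "batch_size": batch_size}
        ]
    }
    config_path = os.path.join(tmp_dir, "server-config.json")
    with open(config_path, "w") as fp:
        json.dump(cfg, fp)
    return config_path
```

Funneling the single-model case through the same file-based config pathway is what lets this PR delete the duplicate server-startup logic.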


def _fetch_server(integration: str, config: Union[ServerConfig, str]):
if isinstance(config, str):
10 changes: 5 additions & 5 deletions src/deepsparse/transformers/README.md
@@ -118,7 +118,7 @@ inference = qa_pipeline(question="What's my name?", context="My name is Snorlax"
Spinning up:
```bash
deepsparse.server \
-task question-answering \
+--task question-answering \
--model_path "zoo:nlp/question_answering/bert-base/pytorch/huggingface/squad/12layer_pruned80_quant-none-vnni"
```

@@ -162,7 +162,7 @@ inference = opt_pipeline("Who is the president of the United States?")
Spinning up:
```bash
deepsparse.server \
-task text-generation \
+--task text-generation \
--model_path zoo:opt-1.3b-opt_pretrain-pruned50_quantW8A8
```

@@ -210,7 +210,7 @@ inference = sa_pipeline("I hate it!")
Spinning up:
```bash
deepsparse.server \
-task sentiment-analysis \
+--task sentiment-analysis \
--model_path "zoo:nlp/sentiment_analysis/bert-base/pytorch/huggingface/sst2/pruned80_quant-none-vnni"
```

@@ -263,7 +263,7 @@ inference = tc_pipeline(
Spinning up:
```bash
deepsparse.server \
-task text-classification \
+--task text-classification \
--model_path "zoo:nlp/text_classification/distilbert-none/pytorch/huggingface/qqp/pruned80_quant-none-vnni"
```

@@ -316,7 +316,7 @@ inference = tc_pipeline("Drive from California to Texas!")
Spinning up:
```bash
deepsparse.server \
-task token-classification \
+--task token-classification \
--model_path "zoo:nlp/token_classification/bert-base/pytorch/huggingface/conll2003/pruned90-none"
```

2 changes: 1 addition & 1 deletion src/deepsparse/yolact/README.md
@@ -121,7 +121,7 @@ If a `--model_filepath` arg isn't provided, then `zoo:cv/segmentation/yolact-dar
Spinning up:
```bash
deepsparse.server \
-task yolact \
+--task yolact \
--model_path "zoo:cv/segmentation/yolact-darknet53/pytorch/dbolya/coco/pruned82_quant-none"
```

2 changes: 1 addition & 1 deletion src/deepsparse/yolo/README.md
@@ -120,7 +120,7 @@ If a `--model_filepath` arg isn't provided, then `zoo:cv/detection/yolov5-s/pyto
Spinning up:
```bash
deepsparse.server \
-task yolo \
+--task yolo \
--model_path "zoo:cv/detection/yolov5-s/pytorch/ultralytics/coco/pruned_quant-aggressive_94"
```
