
Allow selection of multiple outputs with SoftmaxSampling #155

Merged
merged 4 commits on Aug 5, 2022

Conversation

nv-alaiacano
Collaborator

This enables users to choose more than one output column in SoftmaxSampling, for cases where it's desirable to return more than just the item_id.

It also makes a slight change to the output schema: the output schema now matches the selected columns from the input schema. Previously we hard-coded the output name as ordered_ids and set is_list=True and is_ragged=True; now those settings, along with the dtype, are passed through from the input.

I also added some tests and found a potential issue with _relevance_col_name and dependencies - see the TODO note in the test file.

Note that this is a somewhat breaking change if anyone is setting _input_col="a string" manually. If that is a concern, I can rework the signature to maintain backwards compatibility.
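For illustration, here is a minimal usage sketch of selecting multiple output columns with SoftmaxSampling. The column names (item_id, item_category, score) are placeholders and the Ensemble/Schema import paths are assumed from the repo layout; this is a sketch of the intended usage, not code from this PR.

import numpy as np

from merlin.schema import ColumnSchema, Schema
from merlin.systems.dag.ensemble import Ensemble
from merlin.systems.dag.ops.softmax_sampling import SoftmaxSampling

# Request schema with two candidate columns plus a relevance score.
request_schema = Schema(
    [
        ColumnSchema("item_id", dtype=np.int32),
        ColumnSchema("item_category", dtype=np.int32),
        ColumnSchema("score", dtype=np.float32),
    ]
)

# Select more than one column to carry through the sampling op; previously
# only a single item_id-style column came out (renamed to "ordered_ids").
ordering = ["item_id", "item_category"] >> SoftmaxSampling(
    relevance_col="score", topk=10, temperature=20.0
)

ensemble = Ensemble(ordering, request_schema)
# With this change the output schema mirrors the selected input columns
# (names, dtypes, is_list/is_ragged) rather than a hard-coded list column.
print(ensemble.graph.output_schema.column_names)  # expected: ["item_id", "item_category"]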

@nv-alaiacano nv-alaiacano added the bug and enhancement labels Jul 30, 2022
@nv-alaiacano nv-alaiacano self-assigned this Jul 30, 2022
@github-actions

Documentation preview

https://nvidia-merlin.github.io/systems/review/pr-155

@nvidia-merlin-bot

CI Results
GitHub pull request #155 of commit d53206034d9050d24cf2f8822a36de6e369d19b0, no merge conflicts.
Running as SYSTEM
Setting status of d53206034d9050d24cf2f8822a36de6e369d19b0 to PENDING with url https://10.20.13.93:8080/job/merlin_systems/167/console and message: 'Pending'
Using context: Jenkins
Building on master in workspace /var/jenkins_home/workspace/merlin_systems
using credential fce1c729-5d7c-48e8-90cb-b0c314b1076e
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/systems # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/systems
 > git --version # timeout=10
using GIT_ASKPASS to set credentials login for merlin-systems user + githubtoken
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/systems +refs/pull/155/*:refs/remotes/origin/pr/155/* # timeout=10
 > git rev-parse d53206034d9050d24cf2f8822a36de6e369d19b0^{commit} # timeout=10
Checking out Revision d53206034d9050d24cf2f8822a36de6e369d19b0 (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d53206034d9050d24cf2f8822a36de6e369d19b0 # timeout=10
Commit message: "add more tests"
 > git rev-list --no-walk 6ebf6cba52aa6dc19397f0ff9c451d63a86b5dd7 # timeout=10
[merlin_systems] $ /bin/bash /tmp/jenkins8950905476297710301.sh
PYTHONPATH=:/usr/local/lib/python3.8/dist-packages/:/usr/local/hugectr/lib:/var/jenkins_home/workspace/merlin_systems/systems
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.2, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/merlin_systems/systems, configfile: pyproject.toml
plugins: anyio-3.6.1, xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 67 items

tests/unit/test_version.py . [ 1%]
tests/unit/examples/test_serving_ranking_models_with_merlin_systems.py . [ 2%]
[ 2%]
tests/unit/systems/test_ensemble.py .... [ 8%]
tests/unit/systems/test_ensemble_ops.py F. [ 11%]
tests/unit/systems/test_export.py . [ 13%]
tests/unit/systems/test_graph.py . [ 14%]
tests/unit/systems/test_inference_ops.py ... [ 19%]
tests/unit/systems/test_model_registry.py . [ 20%]
tests/unit/systems/test_op_runner.py .... [ 26%]
tests/unit/systems/test_tensorflow_inf_op.py ... [ 31%]
tests/unit/systems/dag/ops/test_softmax_sampling.py ................. [ 56%]
tests/unit/systems/fil/test_fil.py .......................... [ 95%]
tests/unit/systems/fil/test_forest.py ... [100%]

=================================== FAILURES ===================================
____________________________ test_softmax_sampling _____________________________

tmpdir = local('/tmp/pytest-of-jenkins/pytest-8/test_softmax_sampling0')

@pytest.mark.skipif(not TRITON_SERVER_PATH, reason="triton server not found")
def test_softmax_sampling(tmpdir):
    request_schema = Schema(
        [
            ColumnSchema("movie_ids", dtype=np.int32),
            ColumnSchema("output_1", dtype=np.float32),
        ]
    )

    combined_features = {
        "movie_ids": np.random.randint(0, 10000, 100).astype(np.int32),
        "output_1": np.random.random(100).astype(np.float32),
    }

    request = make_df(combined_features)

    ordering = ["movie_ids"] >> SoftmaxSampling(relevance_col="output_1", topk=10, temperature=20.0)

    ensemble = Ensemble(ordering, request_schema)
    ens_config, node_configs = ensemble.export(tmpdir)
  response = _run_ensemble_on_tritonserver(
        tmpdir, ensemble.graph.output_schema.column_names, request, "ensemble_model"
    )

tests/unit/systems/test_ensemble_ops.py:52:


tests/unit/systems/utils/triton.py:39: in _run_ensemble_on_tritonserver
with run_triton_server(tmpdir) as client:
/usr/lib/python3.8/contextlib.py:113: in __enter__
return next(self.gen)


modelpath = local('/tmp/pytest-of-jenkins/pytest-8/test_softmax_sampling0')

@contextlib.contextmanager
def run_triton_server(modelpath):
    """This function starts up a Triton server instance and returns a client to it.

    Parameters
    ----------
    modelpath : string
        The path to the model to load.

    Yields
    ------
    client: tritonclient.InferenceServerClient
        The client connected to the Triton server.

    """
    cmdline = [
        TRITON_SERVER_PATH,
        "--model-repository",
        modelpath,
        "--backend-config=tensorflow,version=2",
    ]
    env = os.environ.copy()
    env["CUDA_VISIBLE_DEVICES"] = "0"
    with subprocess.Popen(cmdline, env=env) as process:
        try:
            with grpcclient.InferenceServerClient("localhost:8001") as client:
                # wait until server is ready
                for _ in range(60):
                    if process.poll() is not None:
                        retcode = process.returncode
                      raise RuntimeError(f"Tritonserver failed to start (ret={retcode})")

E RuntimeError: Tritonserver failed to start (ret=1)

merlin/systems/triton/utils.py:46: RuntimeError
----------------------------- Captured stderr call -----------------------------
I0730 02:18:42.785842 25572 pinned_memory_manager.cc:240] Pinned memory pool is created at '0x7f1766000000' with size 268435456
I0730 02:18:42.786609 25572 cuda_memory_manager.cc:105] CUDA memory pool is created on device 0 with size 67108864
I0730 02:18:42.789135 25572 model_repository_manager.cc:1191] loading: 0_softmaxsampling:1
I0730 02:18:42.896239 25572 python.cc:2388] TRITONBACKEND_ModelInstanceInitialize: 0_softmaxsampling (GPU device 0)
0730 02:18:45.025322 25612 pb_stub.cc:301] Failed to initialize Python stub: KeyError: ('input_col',)

At:
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/dag/ops/softmax_sampling.py(47): from_config
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/dag/op_runner.py(33): __init__
/tmp/pytest-of-jenkins/pytest-8/test_softmax_sampling0/0_softmaxsampling/1/model.py(59): initialize

E0730 02:18:45.371715 25572 model_repository_manager.cc:1348] failed to load '0_softmaxsampling' version 1: Internal: KeyError: ('input_col',)

At:
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/dag/ops/softmax_sampling.py(47): from_config
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/dag/op_runner.py(33): __init__
/tmp/pytest-of-jenkins/pytest-8/test_softmax_sampling0/0_softmaxsampling/1/model.py(59): initialize

E0730 02:18:45.371902 25572 model_repository_manager.cc:1551] Invalid argument: ensemble 'ensemble_model' depends on '0_softmaxsampling' which has no loaded version
I0730 02:18:45.372017 25572 server.cc:556]
+------------------+------+
| Repository Agent | Path |
+------------------+------+
+------------------+------+

I0730 02:18:45.372129 25572 server.cc:583]
+---------+-------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Backend | Path | Config |
+---------+-------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------+
| python | /opt/tritonserver/backends/python/libtriton_python.so | {"cmdline":{"auto-complete-config":"false","min-compute-capability":"6.000000","backend-directory":"/opt/tritonserver/backends","default-max-batch-size":"4"}} |
+---------+-------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------+

I0730 02:18:45.372216 25572 server.cc:626]
+-------------------+---------+------------------------------------------------------------------------------------------------------------------+
| Model | Version | Status |
+-------------------+---------+------------------------------------------------------------------------------------------------------------------+
| 0_softmaxsampling | 1 | UNAVAILABLE: Internal: KeyError: ('input_col',) |
| | | |
| | | At: |
| | | /var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/dag/ops/softmax_sampling.py(47): from_config |
| | | /var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/dag/op_runner.py(33): __init__ |
| | | /tmp/pytest-of-jenkins/pytest-8/test_softmax_sampling0/0_softmaxsampling/1/model.py(59): initialize |
+-------------------+---------+------------------------------------------------------------------------------------------------------------------+

I0730 02:18:45.435663 25572 metrics.cc:650] Collecting metrics for GPU 0: Tesla P100-DGXS-16GB
I0730 02:18:45.436527 25572 tritonserver.cc:2138]
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Option | Value |
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| server_id | triton |
| server_version | 2.22.0 |
| server_extensions | classification sequence model_repository model_repository(unload_dependents) schedule_policy model_configuration system_shared_memory cuda_shared_memory binary_tensor_data statistics trace |
| model_repository_path[0] | /tmp/pytest-of-jenkins/pytest-8/test_softmax_sampling0 |
| model_control_mode | MODE_NONE |
| strict_model_config | 1 |
| rate_limit | OFF |
| pinned_memory_pool_byte_size | 268435456 |
| cuda_memory_pool_byte_size{0} | 67108864 |
| response_cache_byte_size | 0 |
| min_supported_compute_capability | 6.0 |
| strict_readiness | 1 |
| exit_timeout | 30 |
+----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

I0730 02:18:45.436562 25572 server.cc:257] Waiting for in-flight requests to complete.
I0730 02:18:45.436570 25572 server.cc:273] Timeout 30: Found 0 model versions that have in-flight inferences
I0730 02:18:45.436580 25572 server.cc:288] All models are stopped, unloading models
I0730 02:18:45.436587 25572 server.cc:295] Timeout 30: Found 0 live models and 0 in-flight non-inference requests
error: creating server: Internal - failed to load all models
W0730 02:18:46.463029 25572 metrics.cc:468] Unable to get energy consumption for GPU 0. Status:Success, value:0
W0730 02:18:46.463090 25572 metrics.cc:507] Unable to get memory usage for GPU 0. Memory usage status:Success, value:0. Memory total status:Success, value:0
=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/nvtabular/framework_utils/__init__.py:18
/usr/local/lib/python3.8/dist-packages/nvtabular/framework_utils/__init__.py:18: DeprecationWarning: The nvtabular.framework_utils module is being replaced by the Merlin Models library. Support for importing from nvtabular.framework_utils is deprecated, and will be removed in a future version. Please consider using the models and layers from Merlin Models instead.
warnings.warn(

tests/unit/examples/test_serving_ranking_models_with_merlin_systems.py: 1 warning
tests/unit/systems/test_ensemble.py: 2 warnings
tests/unit/systems/test_export.py: 1 warning
tests/unit/systems/test_inference_ops.py: 2 warnings
tests/unit/systems/test_op_runner.py: 4 warnings
/usr/local/lib/python3.8/dist-packages/cudf/core/frame.py:384: UserWarning: The deep parameter is ignored and is only included for pandas compatibility.
warnings.warn(

tests/unit/systems/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/triton/export.py:304: UserWarning: Column x is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(

tests/unit/systems/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/triton/export.py:304: UserWarning: Column y is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(

tests/unit/systems/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/triton/export.py:304: UserWarning: Column id is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(

tests/unit/systems/fil/test_fil.py::test_binary_classifier_default[sklearn_forest_classifier-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_binary_classifier_with_proba[sklearn_forest_classifier-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_multi_classifier[sklearn_forest_classifier-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_regressor[sklearn_forest_regressor-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_model_file[sklearn_forest_regressor-checkpoint.tl]
/usr/local/lib/python3.8/dist-packages/sklearn/utils/deprecation.py:103: FutureWarning: Attribute n_features_ was deprecated in version 1.0 and will be removed in 1.2. Use n_features_in_ instead.
warnings.warn(msg, category=FutureWarning)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED tests/unit/systems/test_ensemble_ops.py::test_softmax_sampling - Runti...
============ 1 failed, 66 passed, 19 warnings in 255.50s (0:04:15) =============
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/systems/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[merlin_systems] $ /bin/bash /tmp/jenkins2329962312142991326.sh

@karlhigley
Contributor

@nv-alaiacano I think one of the ensemble tests needs to be updated to match these changes (based on what I see from Jenkins)
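For context, the failing ensemble test (tests/unit/systems/test_ensemble_ops.py::test_softmax_sampling in the log above) builds its expectations from the op's output schema, so with the pass-through schema the expected output column becomes the selected input column itself. A hedged sketch of that expectation follows; it is illustrative, not the actual test update.

import numpy as np

from merlin.schema import ColumnSchema, Schema
from merlin.systems.dag.ensemble import Ensemble
from merlin.systems.dag.ops.softmax_sampling import SoftmaxSampling

request_schema = Schema(
    [
        ColumnSchema("movie_ids", dtype=np.int32),
        ColumnSchema("output_1", dtype=np.float32),
    ]
)

ordering = ["movie_ids"] >> SoftmaxSampling(relevance_col="output_1", topk=10, temperature=20.0)
ensemble = Ensemble(ordering, request_schema)

# The output schema now passes the selected column through by name instead of
# emitting a hard-coded "ordered_ids" list column.
assert ensemble.graph.output_schema.column_names == ["movie_ids"]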

@nvidia-merlin-bot

CI Results
GitHub pull request #155 of commit 1f34a8c694bb69a73def361db2a91b7dda9bceef, no merge conflicts.
Running as SYSTEM
Setting status of 1f34a8c694bb69a73def361db2a91b7dda9bceef to PENDING with url https://10.20.13.93:8080/job/merlin_systems/169/console and message: 'Pending'
Using context: Jenkins
Building on master in workspace /var/jenkins_home/workspace/merlin_systems
using credential fce1c729-5d7c-48e8-90cb-b0c314b1076e
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/systems # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/systems
 > git --version # timeout=10
using GIT_ASKPASS to set credentials login for merlin-systems user + githubtoken
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/systems +refs/pull/155/*:refs/remotes/origin/pr/155/* # timeout=10
 > git rev-parse 1f34a8c694bb69a73def361db2a91b7dda9bceef^{commit} # timeout=10
Checking out Revision 1f34a8c694bb69a73def361db2a91b7dda9bceef (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1f34a8c694bb69a73def361db2a91b7dda9bceef # timeout=10
Commit message: "fix typos in sampling op"
 > git rev-list --no-walk 4cdf663e0c0d43574a0442d77c901f1d6d9f2815 # timeout=10
[merlin_systems] $ /bin/bash /tmp/jenkins8518254697382694194.sh
PYTHONPATH=:/usr/local/lib/python3.8/dist-packages/:/usr/local/hugectr/lib:/var/jenkins_home/workspace/merlin_systems/systems
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.2, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/merlin_systems/systems, configfile: pyproject.toml
plugins: anyio-3.6.1, xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 67 items

tests/unit/test_version.py . [ 1%]
tests/unit/examples/test_serving_ranking_models_with_merlin_systems.py . [ 2%]
[ 2%]
tests/unit/systems/test_ensemble.py .... [ 8%]
tests/unit/systems/test_ensemble_ops.py .. [ 11%]
tests/unit/systems/test_export.py . [ 13%]
tests/unit/systems/test_graph.py . [ 14%]
tests/unit/systems/test_inference_ops.py ... [ 19%]
tests/unit/systems/test_model_registry.py . [ 20%]
tests/unit/systems/test_op_runner.py .... [ 26%]
tests/unit/systems/test_tensorflow_inf_op.py ... [ 31%]
tests/unit/systems/dag/ops/test_softmax_sampling.py ..............FFF [ 56%]
tests/unit/systems/fil/test_fil.py .......................... [ 95%]
tests/unit/systems/fil/test_forest.py ... [100%]

=================================== FAILURES ===================================
_____________________ test_softmax_from_config[input_col1] _____________________

input_cols = 'input_col1'

@pytest.mark.parametrize("input_cols", ["input_col1", ["input_col1"], ["input_col1", "input_col2"]])
def test_softmax_from_config(input_cols):
    parameters = {
        "relevance_col": "rel_col",
        "input_col": input_cols,
        "temperature": 10.0,
        "topk": 2,
    }
    config = {"params": json.dumps(parameters)}
    SoftmaxSampling.__init__ = MagicMock(return_value=None)
  s = SoftmaxSampling.from_config(config)

tests/unit/systems/dag/ops/test_softmax_sampling.py:153:


cls = <class 'merlin.systems.dag.ops.softmax_sampling.SoftmaxSampling'>
config = {'params': '{"relevance_col": "rel_col", "input_col": "input_col1", "temperature": 10.0, "topk": 2}'}

@classmethod
def from_config(cls, config):
    """Load operator and properties from Triton config"""
    parameters = json.loads(config.get("params", ""))
    relevance_col = parameters["relevance_col"]
  input_cols = parameters["input_cols"]

E KeyError: 'input_cols'

merlin/systems/dag/ops/softmax_sampling.py:47: KeyError
____________________ test_softmax_from_config[input_cols1] _____________________

input_cols = ['input_col1']

@pytest.mark.parametrize("input_cols", ["input_col1", ["input_col1"], ["input_col1", "input_col2"]])
def test_softmax_from_config(input_cols):
    parameters = {
        "relevance_col": "rel_col",
        "input_col": input_cols,
        "temperature": 10.0,
        "topk": 2,
    }
    config = {"params": json.dumps(parameters)}
    SoftmaxSampling.__init__ = MagicMock(return_value=None)
  s = SoftmaxSampling.from_config(config)

tests/unit/systems/dag/ops/test_softmax_sampling.py:153:


cls = <class 'merlin.systems.dag.ops.softmax_sampling.SoftmaxSampling'>
config = {'params': '{"relevance_col": "rel_col", "input_col": ["input_col1"], "temperature": 10.0, "topk": 2}'}

@classmethod
def from_config(cls, config):
    """Load operator and properties from Triton config"""
    parameters = json.loads(config.get("params", ""))
    relevance_col = parameters["relevance_col"]
  input_cols = parameters["input_cols"]

E KeyError: 'input_cols'

merlin/systems/dag/ops/softmax_sampling.py:47: KeyError
____________________ test_softmax_from_config[input_cols2] _____________________

input_cols = ['input_col1', 'input_col2']

@pytest.mark.parametrize("input_cols", ["input_col1", ["input_col1"], ["input_col1", "input_col2"]])
def test_softmax_from_config(input_cols):
    parameters = {
        "relevance_col": "rel_col",
        "input_col": input_cols,
        "temperature": 10.0,
        "topk": 2,
    }
    config = {"params": json.dumps(parameters)}
    SoftmaxSampling.__init__ = MagicMock(return_value=None)
  s = SoftmaxSampling.from_config(config)

tests/unit/systems/dag/ops/test_softmax_sampling.py:153:


cls = <class 'merlin.systems.dag.ops.softmax_sampling.SoftmaxSampling'>
config = {'params': '{"relevance_col": "rel_col", "input_col": ["input_col1", "input_col2"], "temperature": 10.0, "topk": 2}'}

@classmethod
def from_config(cls, config):
    """Load operator and properties from Triton config"""
    parameters = json.loads(config.get("params", ""))
    relevance_col = parameters["relevance_col"]
  input_cols = parameters["input_cols"]

E KeyError: 'input_cols'

merlin/systems/dag/ops/softmax_sampling.py:47: KeyError
=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/nvtabular/framework_utils/__init__.py:18
/usr/local/lib/python3.8/dist-packages/nvtabular/framework_utils/__init__.py:18: DeprecationWarning: The nvtabular.framework_utils module is being replaced by the Merlin Models library. Support for importing from nvtabular.framework_utils is deprecated, and will be removed in a future version. Please consider using the models and layers from Merlin Models instead.
warnings.warn(

tests/unit/examples/test_serving_ranking_models_with_merlin_systems.py: 1 warning
tests/unit/systems/test_ensemble.py: 2 warnings
tests/unit/systems/test_export.py: 1 warning
tests/unit/systems/test_inference_ops.py: 2 warnings
tests/unit/systems/test_op_runner.py: 4 warnings
/usr/local/lib/python3.8/dist-packages/cudf/core/frame.py:384: UserWarning: The deep parameter is ignored and is only included for pandas compatibility.
warnings.warn(

tests/unit/systems/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/triton/export.py:304: UserWarning: Column x is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(

tests/unit/systems/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/triton/export.py:304: UserWarning: Column y is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(

tests/unit/systems/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/triton/export.py:304: UserWarning: Column id is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(

tests/unit/systems/fil/test_fil.py::test_binary_classifier_default[sklearn_forest_classifier-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_binary_classifier_with_proba[sklearn_forest_classifier-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_multi_classifier[sklearn_forest_classifier-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_regressor[sklearn_forest_regressor-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_model_file[sklearn_forest_regressor-checkpoint.tl]
/usr/local/lib/python3.8/dist-packages/sklearn/utils/deprecation.py:103: FutureWarning: Attribute n_features_ was deprecated in version 1.0 and will be removed in 1.2. Use n_features_in_ instead.
warnings.warn(msg, category=FutureWarning)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED tests/unit/systems/dag/ops/test_softmax_sampling.py::test_softmax_from_config[input_col1]
FAILED tests/unit/systems/dag/ops/test_softmax_sampling.py::test_softmax_from_config[input_cols1]
FAILED tests/unit/systems/dag/ops/test_softmax_sampling.py::test_softmax_from_config[input_cols2]
============ 3 failed, 64 passed, 19 warnings in 249.00s (0:04:08) =============
Build step 'Execute shell' marked build as failure
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/systems/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[merlin_systems] $ /bin/bash /tmp/jenkins7369514289499627307.sh
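The three failures above come down to a key-name mismatch: the parametrized test serializes its parameters under "input_col" while from_config reads parameters["input_cols"]. Based on the follow-up commit message ("fix typo in tests"), the correction appears to be on the test side. Below is a hedged sketch of the corrected test, assuming the exported key is "input_cols" and that from_config builds the op via the constructor; it is not the actual change.

import json
from unittest.mock import MagicMock

import pytest

from merlin.systems.dag.ops.softmax_sampling import SoftmaxSampling


@pytest.mark.parametrize("input_cols", ["input_col1", ["input_col1"], ["input_col1", "input_col2"]])
def test_softmax_from_config(input_cols):
    # Use the key that from_config actually reads ("input_cols"); the
    # original test misspelled it as "input_col".
    parameters = {
        "relevance_col": "rel_col",
        "input_cols": input_cols,
        "temperature": 10.0,
        "topk": 2,
    }
    config = {"params": json.dumps(parameters)}
    SoftmaxSampling.__init__ = MagicMock(return_value=None)
    op = SoftmaxSampling.from_config(config)
    # With __init__ mocked out, we only check that from_config no longer
    # raises KeyError and attempts to construct the op.
    assert op is not None
    SoftmaxSampling.__init__.assert_called_once()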

@nv-alaiacano nv-alaiacano force-pushed the laiacano/softmax-multiple-output branch from 1f34a8c to e9cfe4f on August 5, 2022 14:23
@nvidia-merlin-bot

CI Results
GitHub pull request #155 of commit e9cfe4f17817c2412f6791c81ad28f33a6437e3a, no merge conflicts.
Running as SYSTEM
Setting status of e9cfe4f17817c2412f6791c81ad28f33a6437e3a to PENDING with url https://10.20.13.93:8080/job/merlin_systems/204/console and message: 'Pending'
Using context: Jenkins
Building on master in workspace /var/jenkins_home/workspace/merlin_systems
using credential fce1c729-5d7c-48e8-90cb-b0c314b1076e
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/systems # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/systems
 > git --version # timeout=10
using GIT_ASKPASS to set credentials login for merlin-systems user + githubtoken
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/systems +refs/pull/155/*:refs/remotes/origin/pr/155/* # timeout=10
 > git rev-parse e9cfe4f17817c2412f6791c81ad28f33a6437e3a^{commit} # timeout=10
Checking out Revision e9cfe4f17817c2412f6791c81ad28f33a6437e3a (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e9cfe4f17817c2412f6791c81ad28f33a6437e3a # timeout=10
Commit message: "fix typo in tests"
 > git rev-list --no-walk a063aafca186aaf8d0351aabd791ced14c0bb9fc # timeout=10
[merlin_systems] $ /bin/bash /tmp/jenkins969718016192929246.sh
PYTHONPATH=:/usr/local/lib/python3.8/dist-packages/:/usr/local/hugectr/lib:/var/jenkins_home/workspace/merlin_systems/systems
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.2, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/merlin_systems/systems, configfile: pyproject.toml
plugins: anyio-3.6.1, xdist-2.5.0, forked-1.4.0, cov-3.0.0
collected 69 items

tests/unit/test_version.py . [ 1%]
tests/unit/examples/test_serving_ranking_models_with_merlin_systems.py . [ 2%]
[ 2%]
tests/unit/systems/test_ensemble.py .... [ 8%]
tests/unit/systems/test_ensemble_ops.py .. [ 11%]
tests/unit/systems/test_export.py . [ 13%]
tests/unit/systems/test_graph.py . [ 14%]
tests/unit/systems/test_inference_ops.py ... [ 18%]
tests/unit/systems/test_model_registry.py . [ 20%]
tests/unit/systems/test_op_runner.py .... [ 26%]
tests/unit/systems/test_tensorflow_inf_op.py .... [ 31%]
tests/unit/systems/dag/ops/test_softmax_sampling.py ................. [ 56%]
tests/unit/systems/fil/test_fil.py .......................... [ 94%]
tests/unit/systems/fil/test_forest.py .... [100%]

=============================== warnings summary ===============================
../../../../../usr/local/lib/python3.8/dist-packages/nvtabular/framework_utils/__init__.py:18
/usr/local/lib/python3.8/dist-packages/nvtabular/framework_utils/__init__.py:18: DeprecationWarning: The nvtabular.framework_utils module is being replaced by the Merlin Models library. Support for importing from nvtabular.framework_utils is deprecated, and will be removed in a future version. Please consider using the models and layers from Merlin Models instead.
warnings.warn(

tests/unit/examples/test_serving_ranking_models_with_merlin_systems.py: 1 warning
tests/unit/systems/test_ensemble.py: 2 warnings
tests/unit/systems/test_export.py: 1 warning
tests/unit/systems/test_inference_ops.py: 2 warnings
tests/unit/systems/test_op_runner.py: 4 warnings
/usr/local/lib/python3.8/dist-packages/cudf/core/frame.py:384: UserWarning: The deep parameter is ignored and is only included for pandas compatibility.
warnings.warn(

tests/unit/systems/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/triton/export.py:304: UserWarning: Column x is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(

tests/unit/systems/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/triton/export.py:304: UserWarning: Column y is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(

tests/unit/systems/test_export.py::test_export_run_ensemble_triton[tensorflow-parquet]
/var/jenkins_home/workspace/merlin_systems/systems/merlin/systems/triton/export.py:304: UserWarning: Column id is being generated by NVTabular workflow but is unused in test_name_tf model
warnings.warn(

tests/unit/systems/fil/test_fil.py::test_binary_classifier_default[sklearn_forest_classifier-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_binary_classifier_with_proba[sklearn_forest_classifier-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_multi_classifier[sklearn_forest_classifier-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_regressor[sklearn_forest_regressor-get_model_params4]
tests/unit/systems/fil/test_fil.py::test_model_file[sklearn_forest_regressor-checkpoint.tl]
/usr/local/lib/python3.8/dist-packages/sklearn/utils/deprecation.py:103: FutureWarning: Attribute n_features_ was deprecated in version 1.0 and will be removed in 1.2. Use n_features_in_ instead.
warnings.warn(msg, category=FutureWarning)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
================= 69 passed, 19 warnings in 261.57s (0:04:21) ==================
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://github.com/gitapi/repos/NVIDIA-Merlin/systems/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[merlin_systems] $ /bin/bash /tmp/jenkins16976367098549610751.sh
